<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Demand Forecasting and Material Requirement Planning Optimization using Open Source Tools</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Jorge Ivan Romero-Gelvez</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Edisson Alexander Delgado-Sierra</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jorge Aurelio Herrera-Cuartas</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Olmer Garcia-Bedoya</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Universidad de Bogotá Jorge Tadeo Lozano</institution>
          ,
          <addr-line>Bogotá</addr-line>
          ,
          <country country="CO">Colombia</country>
        </aff>
      </contrib-group>
      <fpage>94</fpage>
      <lpage>107</lpage>
      <abstract>
<p>The purpose of this work is to contribute to the wider use of hybrid models for solving MRP problems with stochastic demand over the main stock-keeping units. The methodology first applies SARIMA (Seasonal Autoregressive Integrated Moving Average), long short-term memory (LSTM) networks, and Fb-prophet as forecasting methods to predict the demand for the master production schedule; it then applies an integer programming model written in JuMP (Julia for Mathematical Programming) to solve the MRP using the lot-for-lot (L4L) approach. The main contribution of this work is to show a way to solve dynamic demand problems with a combined forecasting-MIP approach.</p>
      </abstract>
      <kwd-group>
        <kwd>SARIMA</kwd>
        <kwd>LSTMN</kwd>
        <kwd>Fb-prophet</kwd>
        <kwd>Forecasting</kwd>
        <kwd>MIP</kwd>
        <kwd>JuMP</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        The heart of a material requirements planning (MRP) system is the demand forecast for the main stock-keeping units. However, forecasting is a complicated, time-consuming, and inaccurate task. The randomness in the forecast has motivated improvements using a variety of tools, such as the Box-Jenkins methodology [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] and machine learning techniques [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], and has even led to a change in the paradigm of their use in material requirements planning [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]. In the forecasting literature we note studies of lot-sizing techniques and their performance [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], and Box-Jenkins implementations for MRP have been used by several authors [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], although that case study had no values for inventory holding cost or ordering cost. Lee worked on EOQ lot-sizing calculation errors [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], and there are several applications in Order-Up-To (OUT) policies [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] and machine learning [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ].
      </p>
      <p>
        The increasing use of computers in industry in the 1960s allowed companies to improve complex scheduling and inventory control [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], which led Orlicky and others to develop a manufacturing planning approach based on independent demand for the main items and dependent demand for the materials [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ].
      </p>
      <p>
        The development and adoption of material requirements planning was slow, but in 1972 the American Production and Inventory Control Society (APICS) started to implement this model and initiated the MRP crusade that continues to our time [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. The early development of MRP (today also known as "little mrp" or MRP I) refers to a model that starts from a production plan and focuses on obtaining the optimal requirements for every stock-keeping unit involved in the bill of materials (also called BOM) [
        <xref ref-type="bibr" rid="ref29">29</xref>
        ]. The improvement of MRP that came years later is called MRP II, but the letters MRP in MRP II stand for Manufacturing Resources Planning, to make it clear that resources are considered in addition to materials as in MRP. The word "resource" is used to emphasize that any type of productive capability can be considered, not just machines. This work uses only the MRP I model, focusing the effort on forecasting demand and managing large data sets for every SKU in the company's product portfolio.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Literature review and basic background</title>
      <p>This section describes the basic concepts of material requirement planning, demand-driven material requirement planning, and the forecasting techniques needed to solve the dynamic demand problem presented in the case study.</p>
      <sec id="sec-2-1">
        <title>Fundamentals of Material requirement planning</title>
        <p>This subsection first describes each part of a production system and then focuses on MRP implementation. A production plan describes in detail the quantities of the principal SKUs (final products to sell) and of the subassembly SKUs from which they are produced, the exact time of production, and the lot sizing. The production plan can be divided into the master production schedule (MPS), material requirement planning (MRP), and the detailed plan of jobs on the production floor. See Fig. 1.</p>
        <p>
          The APICS dictionary defines MRP as "a set of techniques that uses bill of material data, inventory data, and the master production schedule to calculate requirements for materials" [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ]. MRP requires three basic inputs: first, the master production schedule; second, a bill of material (BOM) stating, for each SKU (part number), which other SKUs are required as direct components; and third, the actual inventory level of every SKU. See Fig. 2 ((a) BOM for a simple example; (b) the basic MRP record form [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ]). According to [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ], an MRP system serves a central role in material planning and control. It translates the overall plans for production into the detailed individual steps necessary to accomplish those plans. It provides information for developing capacity plans, and it links to the systems that actually get the production accomplished [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ].
        </p>
      </sec>
      <sec id="sec-2-2">
        <title>Lot-for-lot MRP optimization model</title>
        <p>
          The formulation of the MRP is based on the model proposed by [
          <xref ref-type="bibr" rid="ref29">29</xref>
          ] and takes the following form:
        </p>
        <p>minimize Σ_{i=1}^{P} Σ_{t=1}^{T} (T − t) X_{i,t}</p>
        <p>subject to</p>
        <p>Σ_{τ=1}^{t−LT(i)} X_{i,τ} + I(i,0) ≥ Σ_{τ=1}^{t} [ D(i,τ) + Σ_{j=1}^{P} R(i,j) X_{j,τ} ],
X_{i,t} − δ_{i,t} LS(i) ≥ 0,
δ_{i,t} − X_{i,t}/M ≥ 0,
X_{i,t} ≥ 0,
δ_{i,t} ∈ {0, 1},
for all i ∈ {1, ..., P} and t ∈ {1, ..., T}.</p>
        <p>P: number of SKUs
T: number of time buckets (i.e., the planning horizon)
LT(i): lead time for each SKU i
R(i,j): number of units of SKU i needed to make one unit of SKU j
D(i,t): external demand for SKU i in period t
I(i,0): initial inventory of every SKU i
LS(i): minimum lot size for every SKU i
M: a large number</p>
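        <p>As an illustration of the netting logic that the optimization model encodes, the following pure-Python sketch performs a plain lot-for-lot explosion for a hypothetical two-SKU BOM. All data here is invented for illustration; the paper's actual model is solved with JuMP later on, and this sketch assumes component requirements follow parent order releases and that components are indexed after their parents.</p>
        <p>
```python
# Minimal lot-for-lot MRP netting sketch (illustrative toy data, not the
# paper's case study). SKU 0 is the finished product, SKU 1 a component,
# with R[1][0] = 2 (two units of SKU 1 per unit of SKU 0).

P, T = 2, 4                      # number of SKUs, planning horizon (buckets)
D = [[10, 20, 15, 5],            # external demand per SKU per period
     [0, 0, 0, 0]]
R = [[0, 0],                     # R[i][j]: units of SKU i per unit of SKU j
     [2, 0]]
I0 = [5, 0]                      # initial inventory
LT = [0, 1]                      # lead time per SKU (in periods)

def lot_for_lot(D, R, I0, LT):
    P, T = len(D), len(D[0])
    X = [[0] * T for _ in range(P)]      # planned order releases
    # process SKUs in BOM order: finished goods first, then components
    for i in range(P):
        inv = I0[i]
        for t in range(T):
            # gross requirement = external demand + dependent demand
            gross = D[i][t] + sum(R[i][j] * X[j][t] for j in range(P))
            net = max(0, gross - inv)
            inv = max(0, inv - gross)
            # lot for lot: order exactly the net requirement, released
            # LT[i] periods earlier; needs inside the lead time are dropped
            release = t - LT[i]
            if net > 0 and release >= 0:
                X[i][release] = net
    return X

X = lot_for_lot(D, R, I0, LT)
print(X[0])  # finished product: the initial 5 units cover part of period 1
print(X[1])  # component orders, offset by the 1-period lead time
```
        </p>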
        <p>
          The planning horizon is calculated as in [
          <xref ref-type="bibr" rid="ref25">25</xref>
          ], and the overall costs have also been found to increase significantly with forecast error [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ].
        </p>
      </sec>
      <sec id="sec-2-3">
        <title>Forecasting traditional methods</title>
        <p>The core of the MPS is the forecasting job of predicting future demand. This work uses the Box-Jenkins methodology, applying the seasonal ARIMA model to obtain the future demand of the principal SKUs.</p>
        <p>
          Time series analysis. According to [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ], time series methods are called common methods because they require no information other than the past values of the series. "Time series" refers to a collection of observations of an economic or physical phenomenon drawn at discrete points in time. The idea is that past information can be used to forecast future values of the series. In time series analysis we try to isolate the patterns that arise most frequently; these include the following:
– Trend: the series exhibits a stable pattern of growth or decline.
– Seasonality: a seasonal pattern is one that repeats at fixed intervals.
– Cycles: cyclical variation is similar to seasonality, except that the duration and magnitude of each cycle vary. Cycles are associated with economic variations beyond seasonal fluctuations.
– Randomness: a random series is one with no recognizable pattern in the data. One can generate random series that nevertheless appear to have a specific structure; in fact, random data fluctuate around a fixed average.
        </p>
        <p>
          Box-Jenkins methodology. This method is due to two well-known statisticians [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ]. The proposed models are based on exploiting the autocorrelation structure of a time series. Box-Jenkins models are also known as ARIMA models, an acronym for autoregressive integrated moving average. The autocorrelation function plays a central role in the development of these models; this is the characteristic that distinguishes the ARIMA approach from the other methods mentioned above. We denote the time series of interest as D1, D2, ..., Dn and assume initially that the series is stationary, so that E[Di] = μ and Var(Di) = σ² for all i ∈ {1, 2, ..., n}. Practically speaking, under stationarity there is no growth or decline in the series, and the variation remains relatively constant. It is important to note that stationarity does not imply independence: Di and Dj can be dependent random variables when i ≠ j even though their marginal density functions are the same.
        </p>
        <p>The assumption of stationarity implies that the marginal distributions of two observations separated by the same time interval are the same. This means that Dt and Dt+1 have the same joint distribution as Dt+m and Dt+m+1 for any m ≥ 1, which in turn implies that the covariance of Dt and Dt+1 is exactly the covariance of Dt+m and Dt+m+1. The covariance of two observations therefore depends only on the number of periods separating them. In this context the covariance is also known as the autocovariance, since we are comparing two values of the same series separated by a fixed lag. Let Cov(Dt+m, Dt+m+k) be the covariance of Dt+m and Dt+m+k, given by Cov(Dt+m, Dt+m+k) = E(Dt+m Dt+m+k) − E(Dt+m)E(Dt+m+k) for all k, t ≥ 1.</p>
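        <p>The autocovariance just defined can be estimated directly from a sample. The following pure-Python sketch (the toy demand series is invented, not the paper's data) computes the lag-k sample autocovariance and the lag-1 autocorrelation:</p>
        <p>
```python
# Sample lag-k autocovariance of a series, matching
# Cov(D_t, D_{t+k}) = E[D_t D_{t+k}] - E[D_t] E[D_{t+k}]
def autocovariance(series, k):
    n = len(series)
    mean = sum(series) / n
    # average product of deviations separated by k periods
    return sum((series[t] - mean) * (series[t + k] - mean)
               for t in range(n - k)) / (n - k)

D = [12, 15, 11, 14, 13, 16, 12, 15]   # toy demand series
gamma0 = autocovariance(D, 0)           # lag 0 equals the sample variance
gamma1 = autocovariance(D, 1)
rho1 = gamma1 / gamma0                  # lag-1 autocorrelation, in [-1, 1]
```
        </p>
        <p>Plotting rho_k against k gives the sample autocorrelation function used in the model-identification step discussed below.</p>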
        <p>There are five main steps needed to build Box-Jenkins forecasting models:
1. Data transformation: the Box-Jenkins method starts from a stationary time series, and several preliminary steps are needed to make sure the series is stationary. Differencing eliminates trend and seasonality; however, even when the mean of the series is relatively fixed, the variance may not be constant, in which case a transformation of the data is required.
2. Model identification: this step determines the most appropriate ARIMA model. Identifying the type of model is both art and science; it is difficult, if not impossible, to identify the model by examining the series alone. It is much more effective to study the sample autocorrelations and partial autocorrelations, looking for patterns that match those of known processes. In some cases the autocorrelation structure will point to a simple AR or MA process, but it is more common to mix the two kinds of terms to obtain a better fit.
3. Parameter estimation: once the appropriate model has been identified, the optimal values of its parameters (i.e., a0, a1, ..., ap and b0, b1, ..., bq) must be determined. This step is usually carried out by least-squares fitting or by the maximum likelihood method, and is performed by a computer program.
4. Forecasting: once the model has been identified and the optimal parameter values determined, the model provides forecasts of future values of the series. Box-Jenkins models are most effective at providing one-step forecasts, but can also provide multi-step forecasts.
5. Evaluation: the pattern of the residuals (forecast errors) gives useful information about the quality of the model. The residuals should form a white-noise (i.e., random) process with zero mean; patterns in the residuals suggest that the model can be improved.</p>
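        <p>Step 1 (differencing to remove trend and seasonality) can be illustrated with a short sketch; the series below are synthetic examples, not the case-study data:</p>
        <p>
```python
# Step 1 (data transformation) sketch: first differencing removes a linear
# trend, seasonal differencing removes a fixed-period pattern. Toy data only.
def difference(series, lag=1):
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# linear trend D_t = 5 + 2t: the first difference is the constant 2
trend = [5 + 2 * t for t in range(10)]
d1 = difference(trend)

# a quarterly pattern repeated exactly: the lag-4 difference is all zeros
seasonal = [10, 20, 15, 5] * 3
d4 = difference(seasonal, lag=4)
```
        </p>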
      </sec>
      <sec id="sec-2-4">
        <title>Long short-term memory networks</title>
        <p>
          The bright idea of introducing self-loops to produce paths where the gradient can flow for long durations is a core contribution of the first long short-term memory (LSTM) model [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ]. Even for an LSTM with fixed parameters, the time scale of integration can change based on the input sequence, because the time constants are themselves gated.
        </p>
        <p>
          The LSTM has been hugely successful in many applications, such as unconstrained handwriting recognition [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ], speech recognition [
          <xref ref-type="bibr" rid="ref7 ref9">7, 9</xref>
          ], handwriting generation [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ], machine translation (Sutskever et al., 2014), image captioning [
          <xref ref-type="bibr" rid="ref14 ref28 ref30">14, 28, 30</xref>
          ], and parsing [
          <xref ref-type="bibr" rid="ref27">27</xref>
          ].
        </p>
        <p>The LSTM block diagram is illustrated in Fig. 3. The corresponding
forward propagation equations are given below, for a shallow recurrent network
architecture.</p>
        <p>Cells are connected recurrently to each other, replacing the usual hidden
units of ordinary recurrent networks. An input feature is computed with a
regular artificial neuron unit. Its value can be accumulated into the state if the
sigmoidal input gate allows it. The state unit has a linear self-loop whose weight
is controlled by the forget gate. The output of the cell can be shut off by the
output gate. All the gating units have a sigmoid nonlinearity, while the input
unit can have any squashing nonlinearity. The state unit can also be used as an
extra input to the gating units. The black square indicates a delay of a single
time step.</p>
        <p>f_i^(t) = σ( b_i^f + Σ_j U^f_{i,j} x_j^(t) + Σ_j W^f_{i,j} h_j^(t−1) ) (1)
s_i^(t) = f_i^(t) s_i^(t−1) + g_i^(t) σ( b_i + Σ_j U_{i,j} x_j^(t) + Σ_j W_{i,j} h_j^(t−1) ) (2)
g_i^(t) = σ( b_i^g + Σ_j U^g_{i,j} x_j^(t) + Σ_j W^g_{i,j} h_j^(t−1) ) (3)
h_i^(t) = tanh( s_i^(t) ) q_i^(t) (4)
q_i^(t) = σ( b_i^o + Σ_j U^o_{i,j} x_j^(t) + Σ_j W^o_{i,j} h_j^(t−1) ) (5)
where f_i^(t) is the forget gate, s_i^(t) the cell state, g_i^(t) the external input gate, h_i^(t) the output, and q_i^(t) the output gate; σ is the logistic sigmoid, and b, U, and W denote the biases, input weights, and recurrent weights of each unit.</p>
      </sec>
      <sec id="sec-2-5">
        <title>The prophet forecasting model</title>
        <p>
          We use Prophet open-source software in Python [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ] based in a decomposable
time series model [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ] components: trend, seasonality, and holidays. They are
combined in the following equation:
        </p>
        <p>y(t) = g(t) + s(t) + h(t) + t</p>
        <p>Were g(t) is the trend function which models non periodic changes in the
value of the time series, s(t) represents periodic changes, and h(t) represents
the effects of holidays which occur on potentially irregular schedules over one
or more days. The error term t represents any idiosyncratic changes that are
not accommodated by the model; later we will make the parametric assumption
that t is normally distributed.
3</p>
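        <p>As a concrete toy illustration of this additive decomposition, the sketch below fixes each component by hand. In Prophet these functions are fitted from data; the particular forms here (a linear trend, a sinusoidal seasonality, a single holiday at t = 6, zero noise) are assumptions for illustration only.</p>
        <p>
```python
import math

# Sketch of the decomposable model y(t) = g(t) + s(t) + h(t) + eps_t
def g(t):                        # trend: non-periodic growth
    return 100 + 2 * t

def s(t, period=12):             # seasonality: periodic component
    return 10 * math.sin(2 * math.pi * t / period)

def h(t, holidays=(6,)):         # holiday effect on (possibly irregular) dates
    return 15 if t in holidays else 0

def y(t, eps=0.0):               # eps stands in for the idiosyncratic error
    return g(t) + s(t) + h(t) + eps

demand = [y(t) for t in range(12)]
```
        </p>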
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Solution method</title>
      <p>
        Open source tools in optimization are increasing popularity over the last years
and can be observed in works like [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ] for the optimization process in supply
chain management and [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ] in machine learning applications. We describe our
approach for using open source tools in MRP with dynamic demand as follows.
– Forecasting Model selection: In order to use the method that generates
less error t. First, we apply the SARIMA model through the ASTSA library
1 , from the R programming language. In addition, Fb-prophet 2 and
LstmKeras 3 are also used with the Python programming language. Next, we
select the method with the least error of them.
– JuMP-Julia for Mathematical Programming4: JuMP it is a package
for mathematical optimization and operations research.
– Cbc Solver5: Cbc it is a free solver, that let us solve linear and integer
optimization problems.
– IDE: IPython/IR/IJulia/Jupyter notebooks and Google-Colab.
1 https://cran.r-project.org/web/packages/astsa/index.html
2 https://facebook.github.io/prophet/docs/quickstart.html
3 https://keras.io/
4 http://www.juliaopt.org/JuMP.jl/v0.13/
5 https://github.com/coin-or/Cbc
(4)
(5)
(6)
      </p>
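      <p>The model-selection step above (choosing the method with the least error) can be sketched as picking the candidate with the lowest mean absolute percentage error on a hold-out period. All numbers below are invented for illustration; they are not the paper's actual forecasts or errors.</p>
      <p>
```python
# Select the forecasting method with the least error (illustrative numbers).
# MAPE = mean absolute percentage error over the hold-out actuals.
def mape(actual, forecast):
    n = len(actual)
    return sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / n * 100

actual = [100, 120, 90, 110]          # hold-out demand (toy)
candidates = {                        # toy forecasts per method
    "SARIMA":     [98, 118, 95, 108],
    "LSTM":       [102, 115, 92, 112],
    "Fb-prophet": [110, 130, 80, 100],
}
errors = {name: mape(actual, f) for name, f in candidates.items()}
best = min(errors, key=errors.get)    # method with the least error
```
      </p>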
      <p>In this paper we propose a methodology based on a combination of forecasting and MIP techniques to help managers in the fragrance industry decide on the best inventory and supply policies. The steps of the methodology are shown in Fig. 4.</p>
      <p>The methodology is applied to a particular manufacturing case study: helping managers in the fragrance industry achieve their goals of minimizing inventory and avoiding significant deviations in their forecasts. First, the forecasting job is performed using the Box-Jenkins methodology through the ASTSA package in R, the Fb-prophet package, and LSTM with Keras in Python; next, the master production schedule is created based on these forecasts, and the MRP optimization is performed to obtain the final MRP plan. The results of the MRP model are evaluated by comparing them with the real demand and improved by adjusting the historical demand records.</p>
    </sec>
    <sec id="sec-4">
      <title>Results</title>
      <p>We apply the solution method to a company dedicated to the creation, development, manufacture, and marketing of flavors and fragrances for the Colombian industry. The step-by-step application is described as follows:</p>
      <sec id="sec-4-1">
        <title>Forecasting the principal SKU</title>
        <p>First, the historical demand data is plotted in order to isolate the data patterns. We then apply the SARIMA model using the R language (Fig. 5), as well as the Fb-prophet package and LSTM using Keras on Google Colab (Fig. 6).</p>
        <p>Table 1 shows the different forecasts generated and their associated errors in the last row. The techniques with the lowest associated error are SARIMA and LSTM, so the predicted demand generated by them, shown in Table 1, is used.</p>
        <p>As can be seen in Table 2, the future demand for 9 months is the principal input for the MRP formulation, and it is represented in the set D. ((a) Historical demand data plot.)</p>
        <p>((b) Forecast for 2019, monthly.)
In this step we use the Julia programming language and the JuMP package for mathematical optimization; the code for the problem is as follows.
using JuMP,Cbc,NamedArrays,DataFrames
filas=size(D,1)               # number of SKUs (rows of the demand matrix)
col=size(D,2)                 # number of time buckets (columns)
mrp=Model(solver=CbcSolver())
@variables mrp begin
    x[1:filas,1:col]&gt;=0       # planned order quantity X[i,t]
    d[1:filas,1:col]&gt;=0       # lot indicator delta[i,t]
end
T=col
# objective: weight (T-t) pushes production as late as possible
@objective(mrp,Min,sum( x[i,j]*((T-j)) for i=1:filas, j=1:col))
# inventory balance: cumulative receipts plus initial inventory must cover
# cumulative external demand plus dependent (BOM-exploded) demand
for i=1:filas,t=1:col
    @constraint(mrp, sum(x[i,s] for s=1:t-LT[i,1])+I[i,1]&gt;=sum(D[i,s]
        +sum(R[i,j]*x[j,s] for j=1:filas) for s=1:t))
end
@constraint(mrp,x-d.*LS.&gt;=0)                   # minimum lot size when a lot is opened
@constraint(mrp,d-x/1000000000000000.&gt;=0)      # force delta above 0 whenever x &gt; 0
status=solve(mrp)
println(getobjectivevalue(mrp))
println(DataFrame(getvalue(x)))
println(DataFrame(getvalue(d)))
We use the R(i,j) bill of materials shown in Table 3.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Discussion and future investigation</title>
      <p>Future research should address demand-driven MRP (DDMRP) and machine learning techniques for forecasting. The first direction concerns methods for maintaining minimum and maximum inventory intervals for every SKU; the second concerns methods that help us obtain better and more accurate forecasts of future demand.</p>
    </sec>
    <sec id="sec-6">
      <title>Conclusions</title>
      <p>The use of open-source optimization and forecasting tools makes it possible to solve optimization and predictive problems efficiently.</p>
      <p>The forecasting problem is the principal problem in an MRP-based company and can be solved at scale using open-source tools.</p>
      <p>The biggest contribution of this work is to show how to solve a real forecasting and optimization problem using three programming languages in order to reduce error and improve the time schedule for every stock-keeping unit.</p>
      <p>The company uses this implementation to plan its manufacturing process at a larger scale, including every product in its portfolio.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Altendorfer</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Felberbauer</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jodlbauer</surname>
          </string-name>
          , H.:
          <article-title>Effects of forecast errors on optimal utilisation in aggregate production planning with stochastic customer demand</article-title>
          .
          <source>International Journal of Production Research</source>
          <volume>54</volume>
          (
          <issue>12</issue>
          ),
          <fpage>3718</fpage>
          -
          <lpage>3735</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Bousqaoui</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Achchab</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tikito</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          :
          <article-title>Machine learning applications in supply chains: long short-term memory for demand forecasting</article-title>
          .
          <source>In: International Conference of Cloud Computing Technologies and Applications</source>
          . pp.
          <fpage>301</fpage>
          -
          <lpage>317</lpage>
          . Springer (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Box</surname>
            ,
            <given-names>G.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jenkins</surname>
            ,
            <given-names>G.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reinsel</surname>
            ,
            <given-names>G.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ljung</surname>
            ,
            <given-names>G.M.:</given-names>
          </string-name>
          <article-title>Time series analysis: forecasting and control</article-title>
          . John Wiley &amp; Sons (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Cox</surname>
            ,
            <given-names>J.F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Blackstone</surname>
            ,
            <given-names>J.H.</given-names>
          </string-name>
          :
          <article-title>APICS dictionary</article-title>
          .
          <source>Amer Production &amp; Inventory</source>
          (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Disney</surname>
            ,
            <given-names>S.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Farasyn</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lambrecht</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Towill</surname>
            ,
            <given-names>D.R.</given-names>
          </string-name>
          , Van de Velde, W.:
          <article-title>Taming the bullwhip effect whilst watching customer service in a single supply chain echelon</article-title>
          .
          <source>European Journal of Operational Research</source>
          <volume>173</volume>
          (
          <issue>1</issue>
          ),
          <fpage>151</fpage>
          -
          <lpage>172</lpage>
          (
          <year>2006</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Graves</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Generating sequences with recurrent neural networks</article-title>
          .
          <source>arXiv preprint arXiv:1308.0850</source>
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Graves</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jaitly</surname>
          </string-name>
          , N.:
          <article-title>Towards end-to-end speech recognition with recurrent neural networks</article-title>
          .
          <source>In: International conference on machine learning</source>
          . pp.
          <fpage>1764</fpage>
          -
          <lpage>1772</lpage>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Graves</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liwicki</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fernández</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bertolami</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bunke</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schmidhuber</surname>
            ,
            <given-names>J.:</given-names>
          </string-name>
          <article-title>A novel connectionist system for unconstrained handwriting recognition</article-title>
          .
          <source>IEEE transactions on pattern analysis and machine intelligence</source>
          <volume>31</volume>
          (
          <issue>5</issue>
          ),
          <fpage>855</fpage>
          -
          <lpage>868</lpage>
          (
          <year>2008</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Graves</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mohamed</surname>
            ,
            <given-names>A.r.</given-names>
          </string-name>
          , Hinton, G.:
          <article-title>Speech recognition with deep recurrent neural networks</article-title>
          .
          <source>In: 2013 IEEE international conference on acoustics, speech and signal processing</source>
          . pp.
          <fpage>6645</fpage>
          -
          <lpage>6649</lpage>
          . IEEE (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Harvey</surname>
            ,
            <given-names>A.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Peters</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Estimation procedures for structural time series models</article-title>
          .
          <source>Journal of Forecasting</source>
          <volume>9</volume>
          (
          <issue>2</issue>
          ),
          <fpage>89</fpage>
          -
          <lpage>108</lpage>
          (
          <year>1990</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Hochreiter</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schmidhuber</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Long short-term memory</article-title>
          .
          <source>Neural Computation</source>
          <volume>9</volume>
          (
          <issue>8</issue>
          )
          ,
          <fpage>1735</fpage>
          -
          <lpage>1780</lpage>
          (
          <year>1997</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Hopp</surname>
            ,
            <given-names>W.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Spearman</surname>
            ,
            <given-names>M.L.</given-names>
          </string-name>
          :
          <article-title>Factory physics</article-title>
          . Waveland Press (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Jacobs</surname>
            ,
            <given-names>F.R.</given-names>
          </string-name>
          :
          <article-title>Manufacturing planning and control for supply chain management</article-title>
          .
          <source>McGraw-Hill</source>
          (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Kiros</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Salakhutdinov</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zemel</surname>
            ,
            <given-names>R.S.</given-names>
          </string-name>
          :
          <article-title>Unifying visual-semantic embeddings with multimodal neural language models</article-title>
          .
          <source>arXiv preprint arXiv:1411.2539</source>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Adam Jr.</surname>
            ,
            <given-names>E.E.</given-names>
          </string-name>
          :
          <article-title>Forecasting error evaluation in material requirements planning (MRP) production-inventory systems</article-title>
          .
          <source>Management Science</source>
          <volume>32</volume>
          (
          <issue>9</issue>
          ),
          <fpage>1186</fpage>
          -
          <lpage>1205</lpage>
          (
          <year>1986</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shih</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          :
          <article-title>Optimal forecast biasing in theoretical inventory models</article-title>
          .
          <source>The International Journal of Production Research</source>
          <volume>27</volume>
          (
          <issue>5</issue>
          )
          ,
          <fpage>809</fpage>
          -
          <lpage>830</lpage>
          (
          <year>1989</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>Q.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Disney</surname>
            ,
            <given-names>S.M.</given-names>
          </string-name>
          :
          <article-title>Revisiting rescheduling: MRP nervousness and the bullwhip effect</article-title>
          .
          <source>International Journal of Production Research</source>
          <volume>55</volume>
          (
          <issue>7</issue>
          ),
          <fpage>1992</fpage>
          -
          <lpage>2012</lpage>
          (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Mabert</surname>
            ,
            <given-names>V.A.</given-names>
          </string-name>
          :
          <article-title>The early road to material requirements planning</article-title>
          .
          <source>Journal of Operations Management</source>
          <volume>25</volume>
          (
          <issue>2</issue>
          ),
          <fpage>346</fpage>
          -
          <lpage>356</lpage>
          (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Moreno</surname>
            ,
            <given-names>R.H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Garcia</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          , et al.:
          <article-title>Model of neural networks for fertilizer recommendation and amendments in pasture crops</article-title>
          .
          <source>In: 2018 ICAI Workshops (ICAIW)</source>
          . pp.
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          . IEEE (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Nahmias</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cheng</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          :
          <article-title>Production and operations analysis</article-title>
          , vol.
          <volume>6</volume>
          . McGraw-Hill, New York (
          <year>2005</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Orlicki</surname>
            ,
            <given-names>J.A.</given-names>
          </string-name>
          :
          <article-title>Material requirements planning: the new way of life in production and inventory management</article-title>
          .
          <source>McGraw-Hill</source>
          (
          <year>1975</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <surname>Priore</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ponte</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rosillo</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>de la Fuente</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Applying machine learning to the dynamic selection of replenishment policies in fast-changing supply chain environments</article-title>
          .
          <source>International Journal of Production Research</source>
          pp.
          <fpage>1</fpage>
          -
          <lpage>15</lpage>
          (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          23.
          <string-name>
            <surname>Ptak</surname>
            ,
            <given-names>C.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Demand Driven Material Requirements Planning (DDMRP)</article-title>
          . Industrial Press, Incorporated (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          24.
          <string-name>
            <surname>Romero-Gelvez</surname>
            ,
            <given-names>J.I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gonzales-Cogua</surname>
            ,
            <given-names>W.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Herrera-Cuartas</surname>
            ,
            <given-names>J.A.</given-names>
          </string-name>
          :
          <article-title>Cvrptw model for cargo collection with heterogeneous capacity-fleet</article-title>
          .
          <source>In: International Conference on Applied Informatics</source>
          . Springer (
          <year>2019</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          25.
          <string-name>
            <surname>Sridharan</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Berry</surname>
            ,
            <given-names>W.L.</given-names>
          </string-name>
          :
          <article-title>Master production scheduling make-to-stock products: a framework for analysis</article-title>
          .
          <source>The International Journal of Production Research</source>
          <volume>28</volume>
          (
          <issue>3</issue>
          ),
          <fpage>541</fpage>
          -
          <lpage>558</lpage>
          (
          <year>1990</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          26.
          <string-name>
            <surname>Taylor</surname>
            ,
            <given-names>S.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Letham</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          :
          <article-title>Forecasting at scale</article-title>
          .
          <source>The American Statistician</source>
          <volume>72</volume>
          (
          <issue>1</issue>
          ),
          <fpage>37</fpage>
          -
          <lpage>45</lpage>
          (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          27.
          <string-name>
            <surname>Vinyals</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kaiser</surname>
            ,
            <given-names>Ł.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Koo</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Petrov</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sutskever</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hinton</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Grammar as a foreign language</article-title>
          .
          <source>In: Advances in neural information processing systems</source>
          . pp.
          <fpage>2773</fpage>
          -
          <lpage>2781</lpage>
          (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          28.
          <string-name>
            <surname>Vinyals</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Toshev</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bengio</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Erhan</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Show and tell: A neural image caption generator</article-title>
          .
          <source>In: Proceedings of the IEEE conference on computer vision and pattern recognition</source>
          . pp.
          <fpage>3156</fpage>
          -
          <lpage>3164</lpage>
          (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          29.
          <string-name>
            <surname>Voß</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Woodruff</surname>
            ,
            <given-names>D.L.</given-names>
          </string-name>
          :
          <article-title>Introduction to computational optimization models for production planning in a supply chain</article-title>
          , vol.
          <volume>240</volume>
          . Springer Science &amp; Business Media
          (
          <year>2006</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          30.
          <string-name>
            <surname>Xu</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ba</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kiros</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cho</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Courville</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Salakhutdinov</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zemel</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bengio</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          :
          <article-title>Show, attend and tell: Neural image caption generation with visual attention</article-title>
          .
          <source>arXiv preprint arXiv:1502.03044</source>
          (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>