**2. Deep-Learning Models**

A major drawback of a traditional artificial neural network (ANN) is that, during learning, it cannot use information from past time steps while processing the current one. A recurrent neural network (RNN) overcomes this limitation and is one of the deep-learning models designed to handle sequential data: to preserve information, it recursively carries the state learned at previous time steps forward to the current time step. However, an RNN is susceptible to the vanishing-gradient problem and therefore struggles to remember long-term dependencies.
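For illustration, the sketch below (NumPy; the function name `rnn_forward`, the dimensions, and the weight scales are illustrative assumptions, not taken from this study) shows the plain RNN recurrence in which the hidden state is rebuilt from the previous one at every time step:

```python
import numpy as np

def rnn_forward(x_seq, w_x, w_h, b):
    """Plain RNN: the hidden state h_t is recomputed from h_{t-1} at every step."""
    h = np.zeros(w_h.shape[0])
    for x_t in x_seq:                         # iterate over the time steps
        h = np.tanh(w_x @ x_t + w_h @ h + b)  # h_t depends recursively on h_{t-1}
    return h                                  # final hidden state summarizes the sequence

# Toy example: a sequence of 20 steps with 4 features each, 8 hidden units.
rng = np.random.default_rng(0)
x_seq = rng.normal(size=(20, 4))
w_x = rng.normal(scale=0.1, size=(8, 4))
w_h = rng.normal(scale=0.1, size=(8, 8))
b = np.zeros(8)
print(rnn_forward(x_seq, w_x, w_h, b).shape)  # (8,)
```

Because the gradient is multiplied by the recurrent weights' Jacobian at every step, its magnitude can shrink rapidly over long sequences, which is the vanishing-gradient problem noted above.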

LSTMs are a special type of RNN designed to learn both long- and short-term dependencies [15]. Compared with a traditional neural network, an LSTM unit contains a 'memory cell' that can retain information over long periods of time [28]. Figure 3 shows a schematic diagram of an LSTM cell. A set of gates is used to regulate the cell and hidden states; three gates are used, namely the input, forget, and output gates. The functionality of each gate is summarized as follows.
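One way to make the gate functionality explicit is the standard LSTM formulation below, written with the symbols defined after Figure 3; the exact variant used in [15,28] may differ in minor details:

```latex
\begin{aligned}
f_t &= \sigma\!\left(w_f \cdot [h_{t-1}, x_t] + b_f\right) && \text{(forget gate)}\\
i_t &= \sigma\!\left(w_i \cdot [h_{t-1}, x_t] + b_i\right) && \text{(input gate)}\\
\tilde{c}_t &= \tanh\!\left(w_c \cdot [h_{t-1}, x_t] + b_c\right) && \text{(candidate cell state)}\\
c_t &= f_t * c_{t-1} + i_t * \tilde{c}_t && \text{(cell-state update)}\\
o_t &= \sigma\!\left(w_o \cdot [h_{t-1}, x_t] + b_o\right) && \text{(output gate)}\\
h_t &= o_t * \tanh(c_t) && \text{(hidden state)}
\end{aligned}
```

Here *x_t* is the input at time step *t*, *h_{t-1}* and *c_{t-1}* are the previous hidden and cell states, and [*h_{t-1}*, *x_t*] denotes their concatenation.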


**Figure 3.** Schematic diagram of an LSTM cell.

Here, *wf*, *wi*, *wc*, and *wo* are weight matrices, and *bf*, *bi*, *bc*, and *bo* are the biases of the individual gates. *σ* denotes the sigmoid activation function, \* denotes element-wise multiplication, and + denotes element-wise addition.
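For concreteness, a minimal NumPy sketch of a single LSTM cell update following the equations above is given below; the function names, dimensions, and weight layout are illustrative assumptions rather than the configuration used in this work:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, w_f, w_i, w_c, w_o, b_f, b_i, b_c, b_o):
    """One LSTM cell update; each weight matrix acts on the concatenation [h_{t-1}, x_t]."""
    z = np.concatenate([h_prev, x_t])   # [h_{t-1}, x_t]
    f_t = sigmoid(w_f @ z + b_f)        # forget gate: what to discard from c_{t-1}
    i_t = sigmoid(w_i @ z + b_i)        # input gate: what new information to store
    c_hat = np.tanh(w_c @ z + b_c)      # candidate cell state
    c_t = f_t * c_prev + i_t * c_hat    # element-wise (*) mix of old and new memory
    o_t = sigmoid(w_o @ z + b_o)        # output gate: what to expose from the cell
    h_t = o_t * np.tanh(c_t)            # new hidden state
    return h_t, c_t

# Toy dimensions: 4 input features, 8 hidden units.
rng = np.random.default_rng(1)
n_in, n_hid = 4, 8
w = lambda: rng.normal(scale=0.1, size=(n_hid, n_hid + n_in))
b = np.zeros(n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, w(), w(), w(), w(), b, b, b, b)
print(h.shape, c.shape)  # (8,) (8,)
```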
