Page 90 - 2023-Vol19-Issue2

86 |                                                               Swide & Marhoon

Fig. 3. A block diagram of long short-term memory.

Fig. 4. The framework for LSTM power consumption forecasting. [22]
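The cell-state update, output gate, and hidden-state equations given below as Eqs. (4)–(6) can be sketched for a single time step in NumPy. This is a minimal illustration, not the paper's implementation: it assumes the forget gate f_t, input gate i_t, and candidate state c̃_t take the same form as the output gate in Eq. (5), and all weight shapes and values are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts of per-gate parameters
    (hypothetical names; gate keys are 'f', 'i', 'c', 'o')."""
    # Forget gate, input gate, and candidate state, assumed to take
    # the same form as the output gate in Eq. (5)
    f_t = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])
    i_t = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])
    c_tilde = np.tanh(W["c"] @ x_t + U["c"] @ h_prev + b["c"])
    # Eq. (4): new cell state mixes the old state and the candidate
    c_t = f_t * c_prev + i_t * c_tilde
    # Eq. (5): output gate
    o_t = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])
    # Eq. (6): new hidden state
    h_t = o_t * np.tanh(c_t)
    return h_t, c_t

# Illustrative shapes: 3 input features, 4 hidden units
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = {g: rng.standard_normal((n_hid, n_in)) * 0.1 for g in "fico"}
U = {g: rng.standard_normal((n_hid, n_hid)) * 0.1 for g in "fico"}
b = {g: np.zeros(n_hid) for g in "fico"}
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.standard_normal(n_in), h, c, W, U, b)
print(h.shape, c.shape)
```

Because o_t lies in (0, 1) and tanh(c_t) in (−1, 1), every component of the hidden state h_t is strictly bounded in magnitude by 1, regardless of the cell-state magnitude.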

c_t = f_t · c_{t-1} + i_t · c̃_t                        (4)

o_t = σ(W_o · x_t + U_o · h_{t-1} + b_o)                (5)

h_t = o_t · tanh(c_t)                                   (6)

    The prediction strategy framework using the LSTM model is shown in Fig. 4. Three major parts make up this model:

    • Preprocessing of the initial data occurs mostly at the input layer;

    • The training of the data and parameter optimization are done using the hidden layer;

    • Using the model that was developed in the hidden layer, the output layer predicts the data.

    The dataset was used to test the prediction performance of a multivariate LSTM model by calculating the root-mean-squared error (RMSE). The X and Y parameters index the data used for training and testing, and the preprocessing stage prepares the data for the LSTM model. The neurons used in the input layer are those defined by the supervised-learning setup. The Adam optimizer is chosen as the gradient-based optimizer for the Mean Squared Error (MSE) loss, and the accuracy and loss of the model are tested after fitting. The model runs through all of the epochs while the training loss is recorded; after the model's internal states are updated, a final model is created that covers the forecasting of multivariate data. For purposes of evaluation, the RMSE is calculated from the squared differences between the actual and predicted values.

    With increased independence between the trees, prediction is more effective than with bagging. It is also quicker, since every tree is trained using a subset of features. In contrast, bagged decision trees select partition variables greedily so as to minimize error; as a result, even bagged trees can share many structural similarities, and their predictions are in fact tightly correlated. For an ensemble of predictions from multiple models to perform well, the predictions of the sub-models should therefore be uncorrelated or only weakly correlated. RF alters the method by which the sub-trees are learned so as to reduce the correlation between all sub-tree predictions: whereas a standard learning algorithm is permitted to choose the optimum split point from every variable, the RF algorithm modifies this function to assess only a random subset of features at each split. The RF steps are as follows:

    1. Create a random subset within the sample (bootstrapping).

    2. Choose a random set of features for each node's best split inside the decision tree.

    3. Build a decision-tree model for each subset.

    4. Average the forecasts of all decision trees to obtain the final forecasting result.

                    IV. RESULT AND DISCUSSION

    A. Results of Energy Consumption Prediction Using Deep Learning and RF Models:

    This section discusses a variety of experimental findings for all versions of the models created using LSTM and RNN enhancement. A comparative analysis was also carried out using