Page 89 - 2023-Vol19-Issue2
85 | Swide & Marhoon
trees and SVM. Deep neural networks can also be used in a number of other ways [18]. Kim et al. (2019) used a state-explainable autoencoder to forecast household electricity usage from data collected over a five-year period [19].

Polson et al. (2020) proposed DL-EVT, a combination of deep learning and extreme value theory, to address the abrupt peaks, troughs, and fast price shifts in energy markets caused by variations in supply and demand under intraday constraints. The peaks and troughs of the levels are not captured by this method [20]. Huang et al. (2021) constructed a hybrid model using deep neural networks to predict short-term electricity prices [21].

III. THE PROPOSED APPROACH

The suggested strategy consists of two primary models that, when combined in two steps as shown in Figure (2), can manage the energy consumption pattern throughout the day. In the first stage, the historical data collected by smart meters and smart sensors is preprocessed using data analytics. The utility then employs the suggested deep learning (DL) prediction model on this standardized data to forecast the consumer's hourly load usage.

Before discussing the forecast, it is necessary to understand how crucial data analysis and preparation are for time series, since the outcome of the forecast may depend on them. Preprocessing the data into the correct format is therefore the primary problem in prediction.

Fig. 2. shows the proposed model's general flowchart.

A. Historical data description

Data from the Kaggle Western Europe Power Consumption dataset, covering 2016 to 2018, were gathered in order to forecast Denmark's overall electricity load. The weather information was acquired from the Power Data Access Viewer released by the National Aeronautics and Space Administration (NASA) for latitude 55.9562° north and longitude 10.0320° east.

To achieve error-free prediction, all the data gathered from the two sources were preprocessed, namely the data on the total amount of electricity used from 2016 to 2018. Attributes such as date-time, temperature, pressure, humidity, air density, wind speed, wind direction, and load (total energy consumed) were finally taken into account.

B. Data analytics process

The historical data acquired from smart meters and smart sensors is first grouped in a matrix, where each row contains an observation for one (24-hour) day. This is the data preparation stage. Each column conveys a different input/output feature. The proposed model takes into account the following input features: the month item, which is coded as an 8-digit binary value for the entire dataset; the day item; the outside temperature; and the preceding load consumption value. The final column contains the actual load consumption, which is the output. The data are standardized so that each feature column has a mean of zero and a standard deviation of one.

The standardization method reduces the reliance on arbitrary scales and usually enhances the performance of the model. The dataset is then split into training and testing sets with percentages of 80% and 20%, respectively.

C. Model for deep learning prediction

LSTM stands for long short-term memory; it is a special type of recurrent neural network (RNN). Using memory units capable of updating the previous hidden state, this model preserves long-term memory, and each neuron receives feedback from it. The output of an RNN depends not only on the weight and input of the current neuron, but also on the input from the preceding neuron. Long-term sequences of temporal relationships can be learned owing to this functionality. The exploding- and vanishing-gradient issues that afflict traditional RNN training are resolved by the LSTM's own memory storage unit and gate mechanism. The input, output, and forget gates and the cell state are the four crucial components of the LSTM model's internal structure. Fig. 3 shows how an LSTM cell-block is constructed.
The cell's function is parameterized by the following equations:

f_t = σ(W_f X_t + U_f h_{t-1} + b_f)    (1)
i_t = σ(W_i X_t + U_i h_{t-1} + b_i)    (2)
c̃_t = tanh(W_c X_t + U_c h_{t-1} + b_c)    (3)
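As a concrete illustration of the gate equations, the sketch below implements one LSTM cell step with NumPy. This is a minimal sketch under stated assumptions, not the paper's implementation: the output gate o_t and the updates of the cell state c_t and hidden state h_t are assumed to follow the standard LSTM formulation, since only the forget gate, input gate, and candidate state appear in equations (1)-(3) above.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid, the function written as sigma in the equations."""
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b hold the input weights, recurrent
    weights, and biases for the forget (f), input (i), candidate (c),
    and output (o) gates."""
    f_t = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])      # Eq. (1)
    i_t = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])      # Eq. (2)
    c_tilde = np.tanh(W["c"] @ x_t + U["c"] @ h_prev + b["c"])  # Eq. (3)
    # Standard LSTM completion (assumed; not shown in Eqs. (1)-(3)):
    o_t = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])
    c_t = f_t * c_prev + i_t * c_tilde   # new cell state
    h_t = o_t * np.tanh(c_t)             # new hidden state
    return h_t, c_t

# Tiny usage example with random parameters: 4 input features
# (e.g. month, day, temperature, previous load), 3 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = {g: rng.standard_normal((n_hid, n_in)) for g in "fico"}
U = {g: rng.standard_normal((n_hid, n_hid)) for g in "fico"}
b = {g: np.zeros(n_hid) for g in "fico"}
h, c = lstm_cell_step(rng.standard_normal(n_in),
                      np.zeros(n_hid), np.zeros(n_hid), W, U, b)
print(h.shape, c.shape)
```

Because h_t is the product of a sigmoid output (between 0 and 1) and a tanh of the cell state (between -1 and 1), every hidden-state component stays strictly inside (-1, 1), which is one reason standardized inputs suit this cell well.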