ReLU

The model was fine-tuned by adjusting its layer configuration to improve accuracy and precision. The Deep Neural Network comprises three hidden layers in addition to the input and output layers, as shown in Table 8 for DS1 and Table 9 for DS2, and applies the Rectified Linear Unit (ReLU) activation function, which maps each pre-activation x to max(0, x).
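The sketch below illustrates this architecture in Keras: an input layer, three ReLU-activated hidden layers, and an output layer. The layer widths, input dimension, and sigmoid output head are illustrative assumptions only; the actual per-dataset configurations are those given in Table 8 (DS1) and Table 9 (DS2).

```python
import tensorflow as tf

# Minimal sketch of the described architecture: three hidden layers
# with ReLU between the input and output layers. All sizes below are
# assumptions; see Table 8 (DS1) and Table 9 (DS2) for the actual values.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(30,)),              # assumed feature count
    tf.keras.layers.Dense(64, activation="relu"),    # hidden layer 1
    tf.keras.layers.Dense(32, activation="relu"),    # hidden layer 2
    tf.keras.layers.Dense(16, activation="relu"),    # hidden layer 3
    tf.keras.layers.Dense(1, activation="sigmoid"),  # assumed binary output
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

ReLU is a common default for hidden layers because it is cheap to compute and mitigates the vanishing-gradient problem that saturating activations such as sigmoid or tanh suffer from in deeper networks.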
