*7.2. Correlation Analysis for S2SATRA*

Tables 6 and 7 present the correlation between each attribute *A<sub>i</sub>* for F1 until F12 and the outcome *Q<sub>ki</sub><sup>2SAT</sup>*. Here, *H*<sub>0</sub> denotes the null hypothesis that there is no correlation between the attribute *A<sub>i</sub>* and *Q<sub>ki</sub><sup>2SAT</sup>*; if a correlation exists between an attribute and the outcome, the decision on *H*<sub>0</sub> is "Reject", and "Accept" means otherwise [39]. In other words, the aim of this analysis is to determine which *A<sub>i</sub>* will be chosen to represent *C<sub>i</sub>* in *Q<sub>ki</sub><sup>2SAT</sup>*. Based on Table 8, most of the attributes selected in S2SATRA have a high correlation with *Q<sub>ki</sub><sup>2SAT</sup>*. The non-correlated attributes are disregarded before the learning phase of the HNN. The main concern with the conventional logic mining model is that the attributes *A<sub>i</sub>* that construct *C<sub>i</sub>* may be chosen purely at random. For example, in F12, a logic mining model without a supervised layer might choose *A*<sub>6</sub> and *A*<sub>8</sub> to construct *C<sub>i</sub>* and would then have to learn unnecessary attributes that lead to *Q<sub>ki</sub><sup>2SAT</sup>* = 1. In this context, HNN-2SAT learns a non-optimal *Q<sub>ki</sub><sup>2SAT</sup>* that corresponds to a dataset with no correlation to the final outcome. Hence, the effectiveness of knowledge extraction in logic mining is reduced dramatically because one of the *C<sub>i</sub>* is not correlated with the desired outcome. Based on these results, the correlation layer is vital to prevent S2SATRA from choosing the wrong attributes.
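The attribute-screening step described above can be sketched in a few lines. This is a minimal, illustrative implementation only: the toy bipolar dataset, the Pearson correlation measure, and the cut-off `threshold=0.5` are assumptions for the sake of the example, not values taken from S2SATRA itself. Attributes whose correlation with the outcome is too weak (i.e., where *H*<sub>0</sub> would be accepted) are filtered out before they can reach the learning phase of the HNN.

```python
# Sketch of a correlation (supervised) layer: keep only attributes A_i
# whose correlation with the outcome Q is strong enough to reject H0
# (no correlation). Dataset and threshold below are illustrative.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def select_attributes(attributes, outcome, threshold=0.5):
    """Reject H0 (and keep A_i) when |r| >= threshold (assumed cut-off)."""
    return {name: pearson(col, outcome)
            for name, col in attributes.items()
            if abs(pearson(col, outcome)) >= threshold}

# Toy bipolar data (assumed): A1 tracks the outcome, A2 is uncorrelated.
attrs = {"A1": [1, -1, 1, 1, -1, -1],
         "A2": [1, 1, -1, 1, -1, 1]}
Q = [1, -1, 1, 1, -1, -1]
print(select_attributes(attrs, Q))  # A1 survives; A2 is disregarded
```

Under this sketch, only the correlated attribute would be passed on to construct a clause *C<sub>i</sub>*, which is the behaviour the paragraph attributes to the correlation layer.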

**Figure 2.** Synaptic weight analysis for F1: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 3.** Synaptic weight analysis for F2: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 4.** Synaptic weight analysis for F3: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 5.** Synaptic weight analysis for F4: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 6.** Synaptic weight analysis for F5: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 7.** Synaptic weight analysis for F6: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 8.** Synaptic weight analysis for F7: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 9.** Synaptic weight analysis for F8: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 10.** Synaptic weight analysis for F9: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 11.** Synaptic weight analysis for F10: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 12.** Synaptic weight analysis for F11: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

**Figure 13.** Synaptic weight analysis for F12: (**a**) *C<sub>i</sub><sup>(1)</sup>*; (**b**) *C<sub>i</sub><sup>(2)</sup>*; and (**c**) *C<sub>i</sub><sup>(3)</sup>*.

