#### 4.2.1. Multicollinearity Test

Before establishing the model, it is necessary to determine whether multicollinearity exists among the variables; if it does, the partial least squares model is the better choice, so a collinearity diagnosis must be carried out first. The variance inflation factor (VIF) measures the severity of multicollinearity: the higher the VIF, the more serious the multicollinearity. Usually, the largest VIF among all independent variables is used as the indicator. If this value is greater than 10, the corresponding independent variable is approximately a linear combination of the other variables, which distorts the least squares estimates; the accuracy and reliability of a linear regression analysis can then no longer be guaranteed, that is, the multicollinearity is serious and the data are not suitable for linear regression analysis. On this basis, SPSS was used to perform the test, and the results are reported in Table 3.


**Table 3.** Results of multicollinearity test.

Examining the multicollinearity among the influencing factors of FECO, the maximum variance inflation factor among the independent variables is 4.56, well below the threshold of 10. It can therefore be concluded that there is no serious collinearity among the variables and hence no multicollinearity problem. Had the problem been serious, the least squares model could not have been chosen, and constructing a Tobit model would have been the better option.
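Although the diagnosis above was run in SPSS, an equivalent VIF check can be reproduced with statsmodels. The following is a minimal sketch, assuming the independent variables are held in a pandas DataFrame named `X` (a hypothetical name; the column names of the actual influencing factors are not specified here).

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(X: pd.DataFrame) -> pd.DataFrame:
    """Compute the variance inflation factor for each independent variable."""
    # Add a constant so each VIF is computed from a regression with an intercept.
    exog = sm.add_constant(X)
    rows = []
    for i, name in enumerate(exog.columns):
        if name == "const":
            continue  # the intercept's VIF is not informative
        rows.append({"variable": name,
                     "VIF": variance_inflation_factor(exog.values, i)})
    return pd.DataFrame(rows)

# Hypothetical usage: X holds the influencing factors of FECO as columns.
# vifs = vif_table(X)
# print(vifs)                       # flag any variable with VIF > 10
# print("max VIF:", vifs["VIF"].max())
```

The decision rule follows the text: if the largest reported VIF exceeds 10, ordinary least squares estimation is considered unreliable for these data.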

#### 4.2.2. Unit Root Test

Panel data may contain non-stationary series, which undermines the credibility of the final estimation results and creates the problem of spurious regression. To obtain more robust regression results, a unit root test is conducted on the panel data before parameter estimation. The most commonly used methods are the Im-Pesaran-Shin (IPS) test, the Augmented Dickey-Fuller (ADF) test, and the Levin-Lin-Chu (LLC) test, all of which take the existence of a unit root as the null hypothesis. To avoid the inaccuracy that reliance on a single method may introduce, this paper combines the three methods to verify whether a unit root is present in the panel data; only if all three tests are passed can the sample data be regarded as free of a unit root. The results are shown in Table 4.



**Table 4.** Results of unit root test.

Note: \*\* and \*\*\* denote *p* < 0.05 and *p* < 0.01, respectively.

As can be seen from Table 4, the level series of all variables fail to reject the null hypothesis, indicating that each variable contains a unit root, while the first-order differences of each variable reject the null hypothesis at the 1% significance level. This indicates that the first-order difference series contain no unit root, and hence that the panel data are integrated of order one, I(1), and can proceed to the next step of regression analysis.
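The panel unit root tests themselves were run in standard econometric software; as an illustrative stand-in, the sketch below applies the ADF test series by series with statsmodels and repeats it on the first differences, mirroring the levels-versus-differences comparison in Table 4. The long-format DataFrame `panel_df`, the `province` identifier column, and the placeholder regressor names are assumptions for illustration; the pooled IPS and LLC panel statistics are not reproduced here.

```python
import pandas as pd
from statsmodels.tsa.stattools import adfuller

def adf_pvalue(series: pd.Series) -> float:
    """ADF test p-value; the null hypothesis is that the series has a unit root."""
    return adfuller(series.dropna(), autolag="AIC")[1]

def unit_root_summary(panel: pd.DataFrame, variables: list[str],
                      id_col: str = "province") -> pd.DataFrame:
    """Average ADF p-values across cross-sectional units for each variable,
    in levels and in first differences (a simplified per-series check,
    not the pooled IPS/LLC panel statistics)."""
    rows = []
    for var in variables:
        by_unit = panel.groupby(id_col)[var]
        rows.append({
            "variable": var,
            "level p": by_unit.apply(adf_pvalue).mean(),
            "first-diff p": by_unit.apply(lambda s: adf_pvalue(s.diff())).mean(),
        })
    return pd.DataFrame(rows)

# Hypothetical usage: panel_df is sorted by year within each province;
# "X1" and "X2" stand in for the actual influencing-factor columns.
# summary = unit_root_summary(panel_df, ["FECO", "X1", "X2"])
# print(summary)  # first-difference p-values below 0.01 are consistent with I(1) data
```

In this simplified check, large p-values in levels together with small p-values in first differences correspond to the levels-versus-differences pattern reported in Table 4.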
