1. Introduction
Owing to the development of information management systems (such as student card systems) in colleges and universities, it has become convenient for teachers to collect and analyze students’ behaviors, which is one of the most important ways to learn about students’ study and living habits on campus. For example, a student who wants to obtain a high Grade Point Average (GPA) may live a very regular life (such as going to the library at certain times) [1,2], since she needs to work hard on her selected courses. Students’ behaviors tell us whether they intend to spend more time on their studies [3,4]. This motivates us to develop methods that predict students’ performance from their behaviors. The performance prediction task pays particular attention to students who are potential under-performers; it aims to give educators early feedback so that they can take immediate action to improve students’ performance.
The research problem addressed in this work belongs to Educational Data Mining (EDM), a technology for mining latent information from massive learner behavior data; data mining more broadly has been widely applied in scientific research, business, finance, and other fields [5]. EDM has many applications in education, such as building learner feature models [6] and recommending courses or learning paths to students according to their learning behaviors [7]. The purpose of most of these methods is to improve students’ academic performance and promote their well-rounded development. In this work, we focus on students’ behaviors in order to predict their performance, since this allows us to identify students’ learning difficulties in advance and address them promptly with an intervention. At the same time, personalized guidance can be provided to promote students’ comprehensive development. In addition, because students’ behaviors are intuitive, we can judge outcomes directly and quickly, instead of discovering students’ learning and life problems only at the end of the semester.
To study this problem, many researchers have proposed utilizing different technologies, such as statistical analysis, data mining, and questionnaire surveys, to predict students’ performance from their behavior data. For example, Fei et al. [8] introduced a temporal model to predict the performance of at-risk students by formulating an activity sequence from historical behaviors on a Massive Open Online Course (MOOC) platform. Another study [9] adopted discussion behaviors in online forums to predict students’ final performance using different data mining approaches. Although these existing methods have achieved great success in predicting students’ performance, they still have the following limitations: (1) they mainly rely on manually extracting statistical features from pre-stored data, which delays predicting students’ achievement and identifying their problems; and (2) owing to the limited representation capability of these manually extracted features, they capture only a shallow understanding of students’ behaviors.
On the one hand, predicting students’ performance in a timely manner helps education managers (such as teachers) identify learners’ problems and adjust their education policies or teaching methods accordingly. Consider a freshman who has just graduated from high school. She may work hard in the first semester because she carries over her high-school study habits. From the second semester onward, however, she may be distracted by other college activities, such as club or class activities, and may also be affected by growing laziness. If her problems are only discovered at the end of the semester, she will already have fallen behind in many courses. A timely prediction method helps avoid such situations. To achieve this goal, we regard the performance prediction problem as a sequence classification task in which students’ behaviors over a short period are taken into account.
On the other hand, traditional manually extracted features have limited representation capability, whereas deep neural networks have achieved great success owing to their ability to extract highly representative features from various sequences. For example, a recent study [10] adopted Gated Recurrent Units (GRUs) with an attention mechanism to model a user’s sequential behavior and capture the user’s main intention in the current session, combining the two into a unified session representation. Another study [11] introduced a neural network architecture that can process input sequences and questions, form episodic memories, and generate relevant answers. However, these existing works mainly target research problems in Natural Language Processing (NLP) and recommender systems. Leveraging the power of Recurrent Neural Networks (RNNs) to model students’ performance remains largely unexplored.
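To make the attention mechanism described above concrete, the following minimal PyTorch sketch (our own illustration, not the released code of [10]; all module and variable names are hypothetical) weights GRU hidden states against the last hidden state and pools them into a unified session representation:

```python
import torch
import torch.nn as nn

class AttentiveGRUEncoder(nn.Module):
    """Encode a behavior sequence by attending over GRU hidden states
    with respect to the last hidden state."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.w1 = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w2 = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, seq):                       # seq: (batch, seq_len) integer ids
        h, _ = self.gru(self.embed(seq))          # h: (batch, seq_len, hidden)
        h_last = h[:, -1:, :]                     # query: the last hidden state
        scores = self.v(torch.sigmoid(self.w1(h) + self.w2(h_last)))  # (batch, seq_len, 1)
        alpha = torch.softmax(scores, dim=1)      # attention weights over time steps
        context = (alpha * h).sum(dim=1)          # weighted sum: the "main intention"
        return torch.cat([context, h_last.squeeze(1)], dim=-1)  # unified representation

# Toy usage: a batch of two behavior sequences of length 5.
encoder = AttentiveGRUEncoder(vocab_size=20)
batch = torch.randint(1, 20, (2, 5))
print(encoder(batch).shape)                       # torch.Size([2, 128])
```

Concatenating the attention-pooled context with the last hidden state mirrors the idea of combining the sequential behavior with the user’s current main intention.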
Based on the above observations, we first treat the student performance prediction task as a short-term sequence classification problem. Then, we propose a two-stage classification algorithm, consisting of a hybrid sequence encoder and a Support Vector Machine (SVM) classifier, which extracts characteristics from students’ recent behavior sequences to predict their performance. Concretely, to discover useful sequential features from students’ behaviors, we introduce an attention-based Hierarchical Recurrent Neural Network (HRNN) that models their short-term goals by assigning higher weights to the behaviors most relevant to the students’ latest behaviors; these weighted behaviors are later combined into a unified sequence representation. We then feed the learned features into the classic SVM algorithm to obtain our final SPC framework.
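The two-stage design can be sketched roughly as follows, reusing the AttentiveGRUEncoder illustrated above as a stand-in for the first-stage encoder and scikit-learn’s SVC as the second-stage classifier; the toy data, shapes, and training details are assumptions, not the exact implementation:

```python
import numpy as np
import torch
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, recall_score

# Stage 1: encode each short-term behavior sequence into a fixed-length
# feature vector with a (pre-trained) attention-based recurrent encoder.
def extract_features(encoder, sequences):
    encoder.eval()
    with torch.no_grad():
        return encoder(sequences).numpy()

# Hypothetical encoder and toy data: sequences of length 30, labels
# 0 (under-performing) or 1 (well-performing).
encoder = AttentiveGRUEncoder(vocab_size=20)
train_seqs = torch.randint(1, 20, (150, 30))
test_seqs = torch.randint(1, 20, (50, 30))
y_train = np.random.randint(0, 2, 150)
y_test = np.random.randint(0, 2, 50)

# Stage 2: train the classic SVM classifier on the learned sequence features.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(extract_features(encoder, train_seqs), y_train)
y_pred = clf.predict(extract_features(encoder, test_seqs))
print(accuracy_score(y_test, y_pred), recall_score(y_test, y_pred, average="macro"))
```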
To the best of our knowledge, we are the first to treat student performance prediction as a sequence classification task and to leverage RNN techniques to investigate student behavior patterns. The main contributions of this work are summarized as follows:
We target predicting student achievement from short-term behavior sequences and propose the SPC model to explore the latent information in students’ recent behaviors.
We apply an attention-based HRNN to learn student behavior characteristics from only their current short-term behaviors, which allows us to make timely performance predictions.
We conduct extensive experiments on the real-world student smart card dataset to demonstrate the superiority of our proposed method.
The remainder of this paper is organized as follows:
Section 2 explains the main features of the methods used in our research and outlines the proposed method;
Section 3 reports and interprets the experiments conducted on the real-world dataset; and, finally, conclusions are drawn in
Section 4.
Author Contributions
Conceptualization: X.W.; Methodology: X.Y.; Software: L.G.; Validation: F.L.; Resources: L.X.; Writing—original draft preparation: X.Y.; Writing—review and editing: X.W. and L.G.; Supervision: L.G.; Visualization: X.Y.; Funding acquisition: X.W. All authors have read and agreed to the published version of the manuscript.
Funding
This research was supported by the Social Science Planning Research Project of Shandong Province (No. 19CTQJ06), the National Natural Science Foundation of China (Nos. 61602282 and 61772321), and the China Postdoctoral Science Foundation (No. 2016M602181).
Acknowledgments
The authors thank all reviewers for their valuable and constructive comments.
Conflicts of Interest
The authors declare no conflicts of interest regarding the publication of this article.
References
- Cao, Y.; Gao, J.; Lian, D.F.; Rong, Z.H.; Shi, J.T.; Wang, Q.; Wu, Y.F.; Yao, H.X.; Zhou, T. Orderliness predicts academic performance: Behavioural analysis on campus lifestyle. J. R. Soc. Interface 2018, 15, 20180210. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Jovanovic, J.; Mirriahi, N.; Gašević, D.; Dawson, S.; Pardo, A. Predictive power of regularity of pre-class activities in a flipped classroom. Comput. Educ. 2019, 134, 156–168. [Google Scholar] [CrossRef]
- Carter, A.S.; Hundhausen, C.D.; Adesope, O. The Normalized Programming State Model: Predicting Student Performance in Computing Courses Based on Programming Behavior. In Proceedings of the Eleventh Annual International Conference on International Computing Education Research, Omaha, NE, USA, 9–13 August 2015. [Google Scholar]
- Conard, M.A. Aptitude is not enough: How personality and behavior predict academic performance. J. Res. Pers. 2006, 40, 339–346. [Google Scholar] [CrossRef]
- Romero, C.; Ventura, S. Data mining in education. Wiley Interdiscip. Rev.-Data Min. Knowl. Discov. 2013, 3, 12–27. [Google Scholar] [CrossRef]
- Amershi, S.; Conati, C. Combining unsupervised and supervised classification to build user models for exploratory learning environments. JEDM 2009, 1, 18–71. [Google Scholar]
- Hou, Y.F.; Zhou, P.; Xu, J.; Wu, D.O. Course recommendation of MOOC with big data support: A contextual online learning approach. In Proceedings of the 2018 IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), Honolulu, HI, USA, 15–19 April 2018. [Google Scholar]
- Fei, M.; Yeung, D.Y. Temporal models for predicting student dropout in massive open online courses. ICDMW 2015, 256–263. [Google Scholar] [CrossRef]
- Romero, C.; López, M.I.; Luna, J.M.; Ventura, S. Predicting students’ final performance from participation in online discussion forums. Comput. Educ. 2013, 68, 458–472. [Google Scholar] [CrossRef]
- Li, J.; Ren, P.J.; Chen, Z.M.; Ren, Z.C.; Lian, T.; Ma, J. Neural attentive session-based recommendation. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management, Singapore, 6–10 November 2017; pp. 1419–1428. [Google Scholar] [CrossRef] [Green Version]
- Kumar, A.; Irsoy, O.; Ondruska, P.; Iyyer, M.; Bradbury, J.; Gulrajani, I.; Zhong, V.; Paulus, R.; Socher, R. Ask me anything: Dynamic memory networks for natural language processing. In Proceedings of the 33rd International Conference on Machine Learning (ICML), New York, NY, USA, 24 June 2016; pp. 1378–1387. [Google Scholar]
- Meghji, A.F.; Mahoto, N.A.; Unar, M.A.; Shaikh, M.A. Predicting student academic performance using data generated in higher educational institutes. 3C Tecnología 2019, 8, 366–383. [Google Scholar] [CrossRef]
- You, J.W. Identifying significant indicators using LMS data to predict course achievement in online learning. Internet High. Educ. 2016, 29, 23–30. [Google Scholar] [CrossRef]
- Conijn, R.; Snijders, C.; Kleingeld, A.; Matzat, U. Predicting student performance from LMS data: A comparison of 17 blended courses using Moodle LMS. IEEE Trans. Learn. Technol. 2016, 10, 17–29. [Google Scholar] [CrossRef]
- Di Mitri, D.; Scheffel, M.; Drachsler, H.; Börner, D.; Ternier, S.; Specht, M. Learning pulse: A machine learning approach for predicting performance in self-regulated learning using multimodal data. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference, New York, NY, USA, 13–17 March 2017; pp. 188–197. [Google Scholar] [CrossRef] [Green Version]
- Ma, Y.L.; Cui, C.R.; Nie, X.S.; Yang, G.P.; Shaheed, K.; Yin, Y.L. Pre-course student performance prediction with multi-instance multi-label learning. Sci. China Inf. Sci. 2019, 62, 29101. [Google Scholar] [CrossRef] [Green Version]
- Zhou, M.Y.; Ma, M.H.; Zhang, Y.K.; Sui, K.X.; Pei, D.; Moscibroda, T. EDUM: Classroom education measurements via large-scale WiFi networks. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Heidelberg, Germany, 12–16 September 2016; pp. 316–327. [Google Scholar] [CrossRef]
- Sweeney, M.; Rangwala, H.; Lester, J.; Johri, A. Next-term student performance prediction: A recommender systems approach. arXiv 2016, arXiv:1604.01840. [Google Scholar]
- Meier, Y.; Xu, J.; Atan, O.; Van der Schaar, M. Predicting grades. IEEE Trans. Signal Process. 2015, 64, 959–972. [Google Scholar] [CrossRef] [Green Version]
- Lee, J.; Cho, K.; Hofmann, T. Fully character-level neural machine translation without explicit segmentation. TACL 2017, 5, 365–378. [Google Scholar] [CrossRef] [Green Version]
- Johnson, M.; Schuster, M.; Le, Q.V.; Krikun, M. Google’s multilingual neural machine translation system: Enabling zero-shot translation. TACL 2017, 5, 339–351. [Google Scholar] [CrossRef] [Green Version]
- Ren, P.J.; Chen, Z.M.; Li, J.; Ren, Z.C.; Ma, J.; De Rijke, M. RepeatNet: A repeat aware neural recommendation machine for session-based recommendation. In Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 4806–4813. [Google Scholar] [CrossRef]
- Donkers, T.; Loepp, B.; Ziegler, J. Sequential user-based recurrent neural network recommendations. In Proceedings of the Eleventh ACM Conference on Recommender Systems, Como, Italy, 27–31 August 2017; pp. 152–160. [Google Scholar] [CrossRef]
- Liu, B.; Lane, I. Attention-based recurrent neural network models for joint intent detection and slot filling. arXiv 2016, arXiv:1609.01454. [Google Scholar]
- Chung, Y.A.; Wu, C.C.; Shen, C.H.; Lee, H.Y.; Lee, L. Unsupervised learning of audio segment representations using sequence-to-sequence recurrent neural networks. In Proceedings of the Interspeech 2016, Stockholm, Sweden, 20–24 August 2017; pp. 765–769. [Google Scholar] [CrossRef] [Green Version]
- Zhu, Y.; Li, H.; Liao, Y.K.; Wang, B.D.; Guan, Z.Y.; Liu, H.F.; Cai, D. What to Do Next: Modeling User Behaviors by Time-LSTM. In Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, Melbourne, Australia, 19–25 August 2017; pp. 3602–3608. [Google Scholar] [CrossRef] [Green Version]
- Lykourentzou, I.; Giannoukos, I.; Mpardis, G.; Nikolopoulos, V.; Loumos, V. Early and dynamic student achievement prediction in e-learning courses using neural networks. J. Am. Soc. Inf. Sci. Technol. 2009, 60, 372–380. [Google Scholar] [CrossRef]
- Ramanathan, L.; Parthasarathy, G.; Vijayakumar, K.; Lakshmanan, L.; Ramani, S. Cluster-based distributed architecture for prediction of student’s performance in higher education. Clust. Comput. 2019, 22, 1329–1344. [Google Scholar] [CrossRef]
- Raga, R.C.; Raga, J.D. Early Prediction of Student Performance in Blended Learning Courses Using Deep Neural Networks. In Proceedings of the 2019 International Symposium on Educational Technology (ISET), Hradec Kralove, Czech Republic, 2–4 July 2019; pp. 39–43. [Google Scholar]
- Askinadze, A.; Liebeck, M.; Conrad, S. Predicting Student Test Performance based on Time Series Data of eBook Reader Behavior Using the Cluster-Distance Space Transformation. In Proceedings of the 26th International Conference on Computers in Education (ICCE), Manila, Philippines, 26–30 November 2018. [Google Scholar]
- Okubo, F.; Yamashita, T.; Shimada, A.; Taniguchi, Y.; Shin’ichi, K. On the Prediction of Students’ Quiz Score by Recurrent Neural Network. CEUR Workshop Proc. 2018, 2163. [Google Scholar]
- Yang, T.Y.; Brinton, C.G.; Joe-Wong, C.; Chiang, M. Behavior-based grade prediction for MOOCs via time series neural networks. IEEE J. Selected Topics Signal Process. 2017, 11, 716–728. [Google Scholar] [CrossRef]
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv 2014, arXiv:1412.3555. [Google Scholar]
- Peng, Z.C.; Hu, Q.H.; Dang, J.W. Multi-kernel SVM based depression recognition using social media data. Int. J. Mach. Learn. Cybern. 2019, 10, 43–57. [Google Scholar] [CrossRef]
- Zhou, Z.H.; Liu, X.H.; Jiang, Y.R.; Mai, J.G.; Wang, Q.N. Real-time onboard SVM-based human locomotion recognition for a bionic knee exoskeleton on different terrains. In Proceedings of the 2019 Wearable Robotics Association Conference (WearRAcon), Scottsdale, AZ, USA, 25–27 March 2019; pp. 34–39. [Google Scholar]
- Reddy, U.J.; Dhanalakshmi, P.; Reddy, P.D.K. Image Segmentation Technique Using SVM Classifier for Detection of Medical Disorders. ISI 2019, 24, 173–176. [Google Scholar] [CrossRef] [Green Version]
- Hussain, M.; Zhu, W.H.; Zhang, W.; Abidi, S.M.R.; Ali, S. Using machine learning to predict student difficulties from learning session data. Artif. Intell. Rev. 2019, 52, 381–407. [Google Scholar] [CrossRef]
- Al-Sudani, S.; Palaniappan, R. Predicting students’ final degree classification using an extended profile. Educ. Inf. Technol. 2019, 24, 2357–2369. [Google Scholar] [CrossRef] [Green Version]
- Vapnik, V.N. An overview of statistical learning theory. IEEE Trans. Neural Netw. 1999, 10, 988–999. [Google Scholar] [CrossRef] [Green Version]
- Oloruntoba, S.A.; Akinode, J.L. Student academic performance prediction using support vector machine. Int. J. Eng. Sci. Res. Technol. 2017, 6, 588–597. [Google Scholar]
- Urrutia-Aguilar, M.E.; Fuentes-García, R.; Martínez, V.D.M.; Beck, E.; León, S.O.; Guevara-Guzmán, R. Logistic Regression Model for the Academic Performance of First-Year Medical Students in the Biomedical Area. Creative Educ. 2016, 7, 2202. [Google Scholar] [CrossRef] [Green Version]
- Arsad, P.M.; Buniyamin, N.; Ab Manan, J.L. A neural network students’ performance prediction model (NNSPPM). In Proceedings of the 2013 IEEE International Conference on Smart Instrumentation, Measurement and Applications (ICSIMA), Kuala Lumpur, Malaysia, 25–27 November 2013; pp. 1–5. [Google Scholar]
- Kumar, S.A.; Vijayalakshmi, M.N. Efficiency of decision trees in predicting student’s academic performance. Comput. Sci. Inf. Technol. 2011, 2, 335–343. [Google Scholar]
- Liaw, A.; Wiener, M. Classification and regression by randomForest. R News 2002, 2, 18–22. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Tan, Y.K.; Xu, X.X.; Liu, Y. Improved recurrent neural networks for session-based recommendations. In Proceedings of the 1st Workshop on Deep Learning for Recommender Systems, Boston, MA, USA, 15 September 2016; pp. 17–22. [Google Scholar]
- Yi, M.X. Analysis and sequencing of the influence factors of the value of review on the low carbon Library Line Based on SPSS data analysis. In Proceedings of the 2018 International Conference on Advances in Social Sciences and Sustainable Development (ASSSD 2018), Fuzhou, China, 7–8 April 2018. [Google Scholar]
Figure 1.
The general framework and dataflow of SPC for student performance prediction.
Figure 2.
The graphical model of SPC, where the first-stage classifier is the attention-based HRNN and the second-stage classifier is the SVM. Note that the sequence feature is represented by the concatenation of the two vectors, and the output denotes the learned student sequential features.
Figure 3.
Accuracy and recall of the SPC model with the impact of different sequence lengths.
Figure 4.
Accuracy and recall of the SPC model with the impact of different feature dimensions.
Table 1.
Details of the dataset utilized in our experiments.
Table Number | Table | Fields |
---|---|---|
1 | books lending data | student id, book lending time, book name, book ISBN |
2 | card data | student id, category of consumption, position of consumption, way of consumption, time of consumption, amount of consumption, balance after consumption |
3 | dormitory open/close data | student id, time of entering/leaving dormitory, direction of entering/leaving dormitory |
4 | library open/close data | student id, library gate number, time of entering/leaving library |
5 | students’ achievement data | student id, faculty number, grade ranking |
Table 2.
Samples of the books lending data.
Samples | Id | Time | Name | ISBN |
---|---|---|---|---|
1 | 27604 | “2014/10/31” | “Visual C++” | “TP312C 535” |
2 | 19774 | “2014/11/01” | “Photoshop CS4” | “TP391.41 678.4” |
3 | 2200 | “2015/03/10” | “HTML+CSS” | “H313-42 695” |
Table 3.
Samples of the card data.
Samples | Id | Category | Position | Way | Time | Amount | Balance |
---|---|---|---|---|---|---|---|
1 | 11488 | POS consumption | position21 | canteen | 2014/11/12 15:11:46 | 5.1 | 54.22 |
2 | 11488 | POS consumption | position188 | shower | 2014/11/13 21:30:50 | 2.3 | 133.2 |
3 | 11488 | POS consumption | position829 | library | 2014/11/14 07:40:20 | 2.0 | 45.62 |
Table 4.
Samples of the dormitory open/close data.
Samples | Id | Time | Direction (0 = in, 1 = out) |
---|---|---|---|
1 | 30901 | 2014/02/24 08:21:50 | 1 |
2 | 30901 | 2014/02/24 10:10:39 | 0 |
3 | 30901 | 2014/02/24 11:14:18 | 1 |
Table 5.
Samples of the library open/close data.
Samples | Id | Gate | Time |
---|---|---|---|
1 | 1826 | entering gate 5 | 2014/10/10 10:14 |
2 | 14164 | entering gate 5 | 2014/10/10 10:15 |
3 | 27218 | entering gate 5 | 2014/10/10 10:16 |
Table 6.
Samples of the students’ achievement data.
Samples | Id | Faculty | Rank |
---|---|---|---|
1 | 11448 | 8 | 2233 |
2 | 14164 | 8 | 2234 |
3 | 27218 | 8 | 2237 |
Table 7.
Statistics of the resulting dataset.
# Students | # Devices | # Sequences | Avg. Length |
---|---|---|---|
9207 | 15 | 126,032 | 51.6 |
Table 8.
Samples of students’ behavioral sequences.
Id | Sequences |
---|---|
11548 | 1,canteen,library,canteen,0,water,water,1,canteen,library,supermarket,0,shower,water, … |
2027 | canteen,water,canteen,0,shower,1,school hospital,school bus,gate 1,water,supermarket,canteen, … |
11548 | shower,1,library,0,1,canteen,canteen,0,water,water,1,print center,school office,supermarket,0, … |
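As a hedged illustration of how sequences like those in Table 8 could be fed to a recurrent encoder, the sketch below builds a token vocabulary and right-pads each sequence to a fixed length (the padding scheme and maximum length are our assumptions, not the paper’s preprocessing):

```python
import torch

raw_sequences = [
    "1,canteen,library,canteen,0,water,water,1,canteen,library,supermarket,0,shower,water",
    "canteen,water,canteen,0,shower,1,school hospital,school bus,gate 1,water,supermarket,canteen",
]

# Build a vocabulary over behavior tokens; index 0 is reserved for padding.
tokens = sorted({t for s in raw_sequences for t in s.split(",")})
vocab = {t: i + 1 for i, t in enumerate(tokens)}

def encode(seq, max_len=20):
    ids = [vocab[t] for t in seq.split(",")][:max_len]
    return ids + [0] * (max_len - len(ids))        # right-pad to a fixed length

batch = torch.tensor([encode(s) for s in raw_sequences])
print(batch.shape)                                  # torch.Size([2, 20])
```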
Table 9.
Different types of manually extracted features.
Consumption Habits (1–4) | Studying Habits (5–8) | Living Habits (9–19) |
---|---|---|
average pos amount | borrow book frequency | lunch days |
average market amount | early library frequency | dinner days |
average canteen amount | late library frequency | early breakfast frequency |
printing center frequency | average library duration | average water frequency |
| | early dorm frequency |
| | late dorm frequency |
| | average dorm duration |
| | spring shower frequency weekly |
| | summer shower frequency weekly |
| | fall shower frequency weekly |
| | winter shower frequency weekly |
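For illustration, a few of the manually extracted features in Table 9 could be computed from the raw records with pandas, as in the following sketch (the toy records, column names, and the 21:00 “late library” cutoff are assumptions):

```python
import pandas as pd

# Toy records loosely following the fields in Tables 3 and 5 (values are made up).
card = pd.DataFrame({
    "id": [11488, 11488, 2027],
    "way": ["canteen", "shower", "canteen"],
    "time": pd.to_datetime(["2014-11-12 15:11", "2014-11-13 21:30", "2014-11-14 11:40"]),
    "amount": [5.1, 2.3, 4.0],
})
library = pd.DataFrame({
    "id": [11488, 2027, 2027],
    "time": pd.to_datetime(["2014-10-10 10:14", "2014-10-10 21:40", "2014-10-11 21:55"]),
})

# "average canteen amount": mean spending per student on canteen transactions.
avg_canteen = (card[card["way"] == "canteen"]
               .groupby("id")["amount"].mean()
               .rename("average canteen amount"))

# "late library frequency": number of library entries after 21:00 (assumed cutoff).
late_library = (library[library["time"].dt.hour >= 21]
                .groupby("id").size()
                .rename("late library frequency"))

features = pd.concat([avg_canteen, late_library], axis=1).fillna(0)
print(features)
```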
Table 10.
Performance comparison of the SPC model with baseline methods on our dataset (△ denotes the relative improvement of SPC over each baseline method).
Methods | Accuracy | △Accuracy | Recall | △Recall |
---|---|---|---|---|
Logistic Regression | 59.29 | 46.57% | 33.33 | 144.73% |
Bayesian | 59.23 | 46.72% | 33.33 | 144.73% |
Decision Tree | 59.62 | 45.76% | 33.34 | 144.66% |
Random Forest | 59.22 | 46.74% | 33.33 | 144.73% |
SVM+LH | 59.80 | 45.32% | 33.33 | 144.73% |
SVM+CH | 60.77 | 43.00% | 33.72 | 141.90% |
SVM+SH | 59.35 | 46.42% | 33.40 | 144.22% |
SPC | 86.90 | -% | 81.57 | -% |
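Assuming △ denotes the relative improvement of SPC over each baseline, the reported values can be reproduced; for example, for logistic regression:

\[
\triangle_{\mathrm{Accuracy}} = \frac{86.90 - 59.29}{59.29} \times 100\% \approx 46.57\%,
\qquad
\triangle_{\mathrm{Recall}} = \frac{81.57 - 33.33}{33.33} \times 100\% \approx 144.73\%.
\]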
Table 11.
Comparison results for different observation cycles.
Cycle | Accuracy | Recall | # Sequences | Avg. length |
---|---|---|---|---|
week | 86.90% | 81.57% | 126,032 | 51.6 |
month | 71.40% | 56.88% | 44,943 | 170 |
Table 12.
KMO and Bartlett’s test, where Approx. Chi-square denotes the approximate chi-square statistic, df denotes the degrees of freedom, and Sig. denotes the significance (probability) value of the test.
Test | Statistic | Value |
---|---|---|
Kaiser–Meyer–Olkin Measure of Sampling Adequacy | | 0.906 |
Bartlett’s Test of Sphericity | Approx. Chi-square | 9009817.326 |
 | df | 1225 |
 | Sig. | 0.000 |
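The KMO and Bartlett statistics in Table 12 are standard outputs of a factor-analysis suitability check; a rough Python sketch using the factor_analyzer package on a 50-feature matrix (placeholder data, not the actual dataset) might look like the following:

```python
import numpy as np
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# X: an (n_students, n_features) matrix of behavior features (placeholder data here;
# 50 features match df = 50 * 49 / 2 = 1225 in Table 12).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))

chi_square, p_value = calculate_bartlett_sphericity(X)   # Bartlett's test of sphericity
kmo_per_feature, kmo_total = calculate_kmo(X)            # Kaiser-Meyer-Olkin measure

print(f"Bartlett chi-square={chi_square:.3f}, Sig.={p_value:.3f}, KMO={kmo_total:.3f}")
```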
Table 13.
Total variance explained (extraction method: principal component analysis).
Component | Initial Eigenvalues: Total | Initial Eigenvalues: % of Variance | Initial Eigenvalues: Cumulative % | Extraction Sums of Squared Loadings: Total | Extraction Sums of Squared Loadings: % of Variance | Extraction Sums of Squared Loadings: Cumulative % |
---|---|---|---|---|---|---|
1 | 13.128 | 26.256 | 26.256 | 13.128 | 26.256 | 26.256 |
2 | 6.723 | 13.446 | 39.702 | 6.723 | 13.446 | 39.702 |
3 | 5.631 | 11.262 | 50.964 | 5.631 | 11.262 | 50.964 |
4 | 4.296 | 8.592 | 59.556 | 4.296 | 8.592 | 59.556 |
5 | 3.669 | 7.339 | 66.894 | 3.669 | 7.339 | 66.894 |
6 | 2.646 | 5.292 | 72.187 | 2.646 | 5.292 | 72.187 |
7 | 1.806 | 3.611 | 75.798 | 1.806 | 3.611 | 75.798 |
8 | 1.690 | 3.380 | 79.178 | 1.690 | 3.380 | 79.178 |
9 | 1.643 | 3.286 | 82.464 | 1.643 | 3.286 | 82.464 |
10 | 1.150 | 2.299 | 84.764 | 1.150 | 2.299 | 84.764 |
11 | 0.950 | 1.901 | 86.664 | | | |
12 | 0.695 | 1.390 | 88.054 | | | |
13 | 0.496 | 0.992 | 89.046 | | | |
14 | 0.462 | 0.923 | 89.969 | | | |
15 | 0.428 | 0.856 | 90.826 | | | |
16 | 0.381 | 0.763 | 91.859 | | | |
17 | 0.357 | 0.713 | 92.302 | | | |
18 | 0.345 | 0.690 | 92.991 | | | |
19 | 0.303 | 0.607 | 93.598 | | | |
20 | 0.270 | 0.539 | 94.138 | | | |
21 | 0.226 | 0.452 | 94.590 | | | |
22 | 0.223 | 0.447 | 95.037 | | | |
23 | 0.198 | 0.396 | 95.433 | | | |
24 | 0.174 | 0.347 | 95.780 | | | |
25 | 0.155 | 0.310 | 96.091 | | | |
26 | 0.150 | 0.300 | 96.390 | | | |
27 | 0.145 | 0.290 | 96.680 | | | |
28 | 0.132 | 0.264 | 96.944 | | | |
29 | 0.121 | 0.242 | 97.186 | | | |
30 | 0.117 | 0.235 | 97.421 | | | |
31 | 0.111 | 0.222 | 97.643 | | | |
32 | 0.105 | 0.209 | 97.853 | | | |
33 | 0.104 | 0.208 | 98.060 | | | |
34 | 0.094 | 0.188 | 98.248 | | | |
35 | 0.089 | 0.177 | 98.426 | | | |
36 | 0.082 | 0.164 | 98.590 | | | |
37 | 0.076 | 0.152 | 98.742 | | | |
38 | 0.074 | 0.147 | 98.889 | | | |
39 | 0.068 | 0.137 | 99.026 | | | |
40 | 0.064 | 0.127 | 99.153 | | | |
41 | 0.060 | 0.120 | 99.273 | | | |
42 | 0.058 | 0.116 | 99.390 | | | |
43 | 0.054 | 0.108 | 99.498 | | | |
44 | 0.053 | 0.106 | 99.604 | | | |
45 | 0.043 | 0.085 | 99.689 | | | |
46 | 0.039 | 0.079 | 99.768 | | | |
47 | 0.036 | 0.071 | 99.839 | | | |
48 | 0.031 | 0.061 | 99.900 | | | |
49 | 0.026 | 0.053 | 99.953 | | | |
50 | 0.023 | 0.047 | 100.000 | | | |
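The eigenvalues and cumulative variance in Table 13 are standard principal component analysis outputs; a minimal scikit-learn sketch on a placeholder, standardized 50-feature matrix:

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder standardized feature matrix with 50 columns, as in Table 13.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))

pca = PCA(n_components=50)
pca.fit(X)

eigenvalues = pca.explained_variance_               # "Total" column (for standardized features)
pct_variance = pca.explained_variance_ratio_ * 100  # "% of Variance" column
cumulative = np.cumsum(pct_variance)                # "Cumulative %" column
print(eigenvalues[:3], cumulative[:3])
```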
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).