Article

Rediscovering the Uptake of Dashboard Feedback: A Conceptual Replication of Foung (2019)

1 School of Journalism, Writing and Media, The University of British Columbia, Vancouver, BC V6T 1Z4, Canada
2 Department of English Language Education, The Education University of Hong Kong, Hong Kong
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(23), 16169; https://doi.org/10.3390/su142316169
Submission received: 1 October 2022 / Revised: 6 November 2022 / Accepted: 29 November 2022 / Published: 3 December 2022
(This article belongs to the Special Issue Language Education in the Age of AI and Emerging Technologies)

Abstract

Learning analytics has been widely used in the context of language education. Among the studies that have used this approach, many have developed a dashboard that aims to provide students with recommendations based on data so that they can act on these suggestions and improve their performance. To further our understanding of dashboard research, this study aims to replicate an earlier study using a new data mining strategy, association rule mining, to explore if the new strategy can (1) generate comparable results; and (2) provide new insights into feedback uptake in dashboard systems. The original study was conducted with 423 students at a Hong Kong university and implemented a dashboard for a suite of first-year composition courses. It used a classification tree to identify factors that could predict the uptake of tool-based and general recommendations made by the dashboard. After performing association rule mining with the original data set, this study found that this approach allowed for the identification of additional useful factors associated with the uptake of general and tool-based recommendations with a higher accuracy rate. The results of this study provide new insights for dashboard research and showcase the potential use of association rule mining in the context of language education.

1. Introduction

1.1. Learning Analytics and Dashboards

This study is a replication of a learning analytics study by Foung [1]. Learning analytics refers to “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” [2]. In practice, it concerns the use of data to inform teaching and learning. Some researchers have conducted learning analytics to better understand how students learn in a course. For example, Macfadyen and Dawson [3] conducted a study at a Canadian university to understand how students’ engagement predicts their performance. Such information can be useful for teachers so that they can focus on certain aspects of teaching and improve the learning outcomes of students. Others have developed learning analytics tools that use data to provide information for students or teachers. For example, OnTask adopts a data-driven approach to provide personalized feedback [4]. Another tool, Threadz, was developed to display forum activities visually to inform students and teachers about the interactions on discussion forums [5].
Dashboards, or early alert systems, are one common type of learning analytics tool [6]. They aim to alert students at an early stage of their learning so that they can act and improve. Examples include the dashboards provided by common learning management systems, such as Moodle, Blackboard and Canvas [7,8]. In addition to dashboards offered by learning management systems, some researchers are interested in developing dashboards for their specific contexts. For example, Lu, Huang, Huang and Yang [9] designed a system that sends notifications to students to arrange face-to-face meetings. Chaudy and Connolly [10] also examined the use of dashboard systems and whether various progress indicators can help students progress. They concluded that the time it takes to understand these indicators plays a role in their success. Lonn, Aguilar and Teasley [11] examined students’ perceptions of dashboard systems and confirmed that a positive perception of the tool is important in motivating students to take its suggestions.
There is no scarcity of studies on dashboards and their development or on learning analytics, more generally. To move forward in this line of research, researchers can adopt analytics-based methods in other disciplines, which is one objective of this study.

1.2. Data Mining

Within the domain of research on learning analytics in language education, data mining is used in a variety of ways. While learning analytics adopts a broad perspective concerning the use of data to inform teaching and learning, data mining refers to the computerized methods that aim to detect patterns in data [12]. A range of data mining strategies have been used, including anomaly detection, classification, regression, clustering, summarization and association rule learning [13]. Among these strategies, classification is a technique that aims to assign objects to pre-defined categories [14]. Researchers in this field continue to look for robust methods to enhance classification techniques [15,16]. Some common examples of classification in educational research include classification trees, logistic regression and artificial neural networks.
Association is another technique, one more commonly used in other fields of research such as marketing and operations management [12]. It is also known as market basket analysis. It aims to uncover interesting relationships hidden in large data sets [17]. In practice, businesses may be interested in whether customers who buy certain goods at a shop are more likely to buy related goods. If they uncover the relationships between two products, store managers can place them together to potentially boost sales. Association analysis can also be useful in educational research, but its use has been limited. The few existing studies have a broad scope (e.g., across computer science courses [18] and at the program level [19]), which leaves opportunities for wider applications to teaching and learning activities.
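To make the measures used later in this paper concrete, the Support, Confidence and Lift of an association rule can be illustrated with a minimal market-basket sketch in Python. The basket data below are invented for illustration only:

```python
# Toy transaction database: each set is one shopping basket (invented data).
transactions = [
    {"bread", "butter"},
    {"bread", "butter", "milk"},
    {"bread", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

def support(itemset, db):
    """Fraction of transactions that contain every item in `itemset`."""
    return sum(itemset <= t for t in db) / len(db)

def confidence(antecedent, consequent, db):
    """Conditional support: how often the consequent appears given the antecedent."""
    return support(antecedent | consequent, db) / support(antecedent, db)

def lift(antecedent, consequent, db):
    """Confidence relative to the consequent's baseline support; >1 means a positive association."""
    return confidence(antecedent, consequent, db) / support(consequent, db)

# Rule {bread} => {butter}: bought together in 3 of 5 baskets.
print(support({"bread", "butter"}, transactions))       # 0.6
print(confidence({"bread"}, {"butter"}, transactions))  # ~0.75
print(lift({"bread"}, {"butter"}, transactions))        # ~0.94, i.e., below 1
```

Here the Lift is below 1, so despite the reasonably high Confidence, bread buyers are in fact slightly less likely than average to buy butter; this is why Lift is used as an additional filter in the analysis below.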

1.3. Replication Studies

While dashboard studies are common in language education, it is essential to continue expanding this line of research. One possible way to expand dashboard research is to conduct replication studies. In general, academic research must offer the potential for others to “authenticate and replicate” [20]. There are many types of replication research. One common approach in pure science is the exact replication or reproduction study [21]. Exact replication studies attempt to follow the protocols and procedures of the original study as closely as possible. Reproduction studies come the closest; in fact, they use the original data set to attempt to reproduce the original study, e.g., [22]. Other close replication studies may alter one or two elements of the original research.
Conceptual replication, however, only aims to replicate the original idea of the research. It may alter a number of research elements to answer a comparable research question [23]. Generally, some replication studies aim to verify the results of the original study so as to give readers confidence, e.g., [22], while other replication studies attempt to extend the original study and provide new insights in the same area of interest. The present replication study falls into the latter category. It is an instance of conceptual replication, which alters the data analysis strategy and aims to extend the original study.

1.4. Context—Foung [1]

The original study was performed by Foung [1] and published in the Journal of Applied Research in Higher Education in 2019. It was conducted at a university in Hong Kong with 423 students. The original study reported students’ responses to a dashboard, Course Diagnostic Report (CDR), designed specifically for the suite of first-year composition courses at the research site. The objective was to examine if students’ behaviours would change after receiving feedback and suggestions from the dashboard system. More details about CDR can be found in Section 2 of this paper. In the original study, students were invited to try CDR by entering their first assessment results into the tool and receiving feedback and suggestions from it. After students tried the tool, the research team asked them to complete a questionnaire about (1) their perceptions of the tool; (2) the likelihood of taking tool-based suggestions; (3) the likelihood of taking general suggestions; (4) their attitudes towards the course; (5) linguistic self-confidence; (6) classroom anxiety; and (7) demographic information. Parts (4)–(6) were taken from the Motivational State Questionnaire developed by Guilloteaux and Dörnyei (2008). More information about the questionnaire can be found in Section 2. To examine how students acted on the recommendations, descriptive statistics and classification tree techniques were employed.
Results suggested that students are generally more likely to act on tool-based recommendations provided by the dashboard (e.g., completing certain course-specific online tasks) than general recommendations (e.g., visiting online activities from the self-access centre of the university). The classification tree for predicting the uptake of tool-based recommendations achieved error rates of 6.7–13.5%, which was considered acceptable. The factors identified in this classification included the likelihood of taking up general recommendations, attitude toward the tool, and average English public exam scores. The classification tree predicting the uptake rate of general recommendations had error rates of 6.0–11.3%. It also used three factors: the likelihood of taking up tool-based recommendations, attitude toward the tool and attitude toward the course. The study suggested that students with “average” abilities are best positioned to see the potential of acting on suggestions and to perform better (compared to those with “outstanding” or “less than satisfactory” ability levels). The study also discussed why expected grade was not a predictor, as previous studies had found it to be: the expectations of students did not vary a lot, so expected grades did not matter much.
While the original study seems to build a strong case for the uptake of suggestions, there are other possibilities within the context of data mining that allow one to examine the relevance of other factors, including that with association rule mining analysis. Therefore, there is room for the current study to extend the original study by re-analyzing the data using this new approach. In particular, this replication study aims to answer the following questions:
  • Can replicating Foung’s [1] study using association rule mining produce comparable results?
  • Can the use of association rule mining provide new perspectives in the understanding of dashboard-related research?

2. Materials and Methods

2.1. Procedures

In the original study, the research team conducted class visits after receiving teachers’ permission. During these class visits, they presented the features of CDR and gave students access to it. Then, students had approximately 5–10 min to try it out. They were then given a questionnaire to evaluate the tool.
In the current replication study, the original data set was adopted. The data set was first subject to basic cleaning (e.g., excluding blank questionnaire responses). Then, domain scores were computed and converted to categorical variables, the same as those used in the original study. More details on the variables can be found in Section 2.4. After these processing and cleaning procedures, the current research team conducted an association rule analysis to identify new perspectives. In other words, the only major difference between the current replication and the original study was the data analysis approach, which changed from classification tree analysis to association rule mining.

2.2. Dashboard—CDR

As this study aimed to replicate a study about a dashboard, it is important to describe the dashboard so that the readers can better understand its design and its implications for the uptake of feedback. CDR is a dashboard that aims to provide students with diagnostic feedback based on their first assessment and motivate them to take action to improve their performance. In practice, it is a standalone webpage hosted online as an independent site. After accessing the site, students can see their overall grade on their first assessment, as well as their scores on each of its components (e.g., content, organization, language, referencing).
Then, the page tells students if they are “above average”, “below average” or in the “bottom 5%” for each assessment component. Based on this, the system provides them with various recommendations. Depending on the assessment component, they can be (1) course-based recommendations (e.g., to revisit a certain online activity within the course materials) or (2) general recommendations (e.g., to visit certain tools or resources on the internet or provided by the university). Figure 1 presents a screenshot of the tool from the original study. For example, if a student’s referencing grade is below average, he or she will be asked to visit the research site’s referencing guide—a general recommendation. If students are achieving at an above-average level, a generic recommendation and encouragement are provided.

2.3. Participants

This study adopts Foung’s [1] data set, so the participants in this study are the same as those in the original study. All 423 study participants were either first- or second-year students taking one of the first-year English courses offered by the university. There was an almost equal distribution of male and female students (49.6% male and 50.4% female). These students were completing different programs, including business, engineering, applied science, design and health sciences. The students had various levels of English proficiency. Their proficiency was measured based on the secondary school exit exam in Hong Kong, the Hong Kong Diploma of Secondary Education (HKDSE). Approximately 57.4% of the participants achieved an equivalent of 5.48–5.68 overall band scores on the International English Language Testing System (IELTS). Others achieved the equivalent of a 6.31–7.77 overall band score on the IELTS. Students were enrolled in different first-year composition courses based on their proficiency. Therefore, the researchers believed that their language proficiency did not affect their uptake of suggestions.
All students were taking one of the three first-year English courses offered by the university. These courses help students develop their academic literacy skills. There are approximately three assessments in each course. All of the students in the original study had completed the first assessment and received their grades.

2.4. Instruments

The instrument used in the original study included 33 questions representing six key domains and an additional 12 questions on demographic information (e.g., gender and students’ course grades). The objective of this replication study is to re-analyze the data using association rule analysis rather than a classification tree. Therefore, the first part of the results from the original study—on the students’ general perception of the tool—is beyond the scope of this study. For a general idea, however, readers can refer to Table 1, which shows the descriptive statistics from the original study.
Aspects 1–3 capture the uptake of recommendations in different respects (i.e., perception of the tool, tool-based actions and general recommendations). Readers can find out more about their theoretical foundations in the original paper. Aspects 4–6 come from a well-validated instrument, the Motivational State Questionnaire [24].
To facilitate association rule mining, the domain scores for the six key domains were computed and then converted to categorical variables. The same process was followed for three other demographic variables: gender, public exam scores and expected course grade. These procedures were the same as the original study, in which classification tree analysis was completed with categorical variables. Details of these variables can be found in Table 2 below.
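The score-to-category conversion described above can be sketched minimally in Python. The cut points, variable names and sample values below are hypothetical, as the paper does not report the exact thresholds used:

```python
# Hypothetical sketch of converting domain scores to categorical variables.
# The thresholds (2.5 and 3.5 on a 1-5 Likert-style scale) are illustrative
# only; the original study's actual cut points are not reported here.
def categorize_domain_score(score, low=2.5, high=3.5):
    """Map a numeric domain score onto a categorical level."""
    if score < low:
        return "negative"
    elif score <= high:
        return "neutral"
    return "positive"

# Invented example responses for one student.
responses = {"attitude_to_course": 4.2, "linguistic_self_confidence": 2.1}
categories = {k: categorize_domain_score(v) for k, v in responses.items()}
print(categories)  # {'attitude_to_course': 'positive', 'linguistic_self_confidence': 'negative'}
```

Converting every variable to categories in this way is what allows each student record to be treated as a "basket" of attribute=value items in the association rule analysis.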

2.5. Data Analysis

The objective of this paper was to replicate Foung’s [1] study to see if an association rule mining approach could provide a new perspective on how students take up the recommendations made by CDR. Because of this objective, the current study did not perform descriptive statistics, as was done in the original study. Instead, this study adopted the Apriori algorithm for association rule mining with R version 4.0.3 and library arules version 1.7. To align with the original study, the rule mining process focused on two key factors: (1) the uptake of tool-based recommendations and (2) the uptake of general recommendations.
As previously discussed, there is no common practice for choosing an optimal rule or an evaluation indicator. Therefore, this study used 0.5 as the cut-off for Support. This is higher than in some previous studies (e.g., Support = 0.05–0.1 in [19] and Support = 0.01 in [18]); the authors believed that this would allow for the discovery of rules that are more common and useful. The Confidence cut-off was set at 0.95, which is also higher than in previous studies (e.g., Confidence = 0.8 in [18]). In fact, this level of accuracy (i.e., Confidence) is higher than that of the original study, which had an accuracy rate of 86.5–94.0%. Moreover, based on the recommendation in [14], rules must achieve a Lift greater than 1. The rules that meet these criteria are examined further in the following section. Rule evaluation depends on the situation [25]. The current study aimed to determine the factors associated with the uptake of recommendations, not associations between commodities. Therefore, the generated rules were discussed collectively instead of being considered individually.
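The mining step with these thresholds can be sketched in Python. The study itself used the Apriori implementation in the R arules package; the brute-force search below is an equivalent exhaustive approach for small item sets, and the synthetic student records are invented for illustration:

```python
from itertools import combinations

def mine_rules(db, items, min_support=0.5, min_confidence=0.95, min_lift=1.0, max_len=3):
    """Exhaustive association rule search over a small categorical data set.

    Thresholds mirror those used in this study (Support > 0.5,
    Confidence > 0.95, Lift > 1). Each transaction in `db` is a set of
    attribute=value items.
    """
    n = len(db)

    def supp(itemset):
        return sum(itemset <= t for t in db) / n

    rules = []
    for k in range(2, max_len + 1):
        for candidate in combinations(sorted(items), k):
            s = set(candidate)
            s_supp = supp(s)
            if s_supp <= min_support:
                continue  # infrequent itemsets cannot yield qualifying rules
            for consequent in candidate:  # single-item consequents, as in uptake rules
                antecedent = s - {consequent}
                conf = s_supp / supp(antecedent)
                lft = conf / supp({consequent})
                if conf > min_confidence and lft > min_lift:
                    rules.append((antecedent, consequent, s_supp, conf, lft))
    return rules

# Synthetic records: one set of attribute=value items per student (invented data).
db = [{"general=yes", "exam=avg", "tool=yes"}] * 8 + \
     [{"general=no", "exam=high", "tool=no"}] * 2
items = set().union(*db)
rules = mine_rules(db, items)
for ant, cons, s, c, l in rules:
    print(f"{sorted(ant)} => {cons}  support={s:.2f} confidence={c:.2f} lift={l:.2f}")
```

In this toy data set the search recovers, among others, a rule of the same shape as the study's top rule for tool-based uptake: students who take general recommendations and have average exam scores also take tool-based recommendations.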

3. Results

This study aimed to replicate the original study and explore whether the use of an association rule approach can provide a new perspective on the uptake of recommendations. In particular, this study derived rules for the factors associated with the uptake of tool-based recommendations and general recommendations. A total of 4 rules for tool-based and 14 rules for general recommendations met the following criteria: Support > 0.5, Confidence > 0.95 and Lift > 1.
Among the rules for tool-based recommendations, the rule with the highest Confidence level achieved a value of 0.97. This rule included two factors—uptake of general recommendations and public exam scores—and had a Support level of 0.53. This means that if a student takes up general recommendations and has an average public exam score, there is a 97% chance that he or she will take up tool-based recommendations. The other three rules included the following factors: a positive perception of the tool and a positive attitude toward language learning. The level of Support for the other rules ranged from 0.51 to 0.53. Table 3 presents an overview of these rules predicting the uptake of tool-based recommendations.
Because this is a replication study, it is also important to highlight the results of the original study. The classification tree used in the original study included only three factors: uptake of general recommendations, attitude toward the tool, and public exam score. This replication study therefore identified the same three factors for the uptake of tool-based recommendations, along with one additional factor, attitude toward language learning.
Among the 14 rules for the uptake of general recommendations, the rule with the highest Confidence level reached a value of 0.97 and included three factors: uptake of tool-based recommendations, attitude toward language learning, and linguistic self-confidence. In practice, this rule occurred 506 times, with a Support level of 0.72. Therefore, if a student adopts tool-based recommendations and has a positive attitude toward language learning and high linguistic self-confidence, there is a 97% chance that he or she will take up general recommendations. The remaining rules included a positive perception of the tool, an average score on the public exam and a high expected grade in the course. The Support levels for the remaining rules ranged from 0.51 to 0.84. Table 4 presents the details of these rules for the uptake of general recommendations.
In the original study, the classification tree predicting uptake of general recommendations included three factors: the uptake of tool-based recommendations, neutral and positive attitudes toward the tool and a positive attitude toward the course. It is interesting to note that the current study did not identify attitude towards the course (as a motivational construct) as a factor, but instead considered two other motivational constructs: attitude toward language learning and linguistic self-confidence.

4. Discussion

4.1. RQ1: Are the Results from This Replication Study Comparable to Foung [1]?

The current study attempted to replicate Foung’s [1] study using a new analysis strategy, association rule mining. The results of the current study were largely comparable to those of the original study. The research team set the accuracy level (i.e., Confidence) cut-off at above 0.95, which is higher than that of the original analysis, which yielded an accuracy level of 86.5–94.0%. While the original accuracy rate was satisfactory when compared to previous similar studies [26], the results of the current study showed a higher degree of accuracy.
In addition to its higher accuracy rate, this replication study identified more factors associated with the uptake of feedback. On the one hand, the three factors associated with tool-based recommendation uptake and the three factors associated with general recommendations in the original study were all identified in the current replication. On the other hand, additional useful factors (one for tool-based recommendations and three for general recommendations) were also identified. With more meaningful factors identified, a greater variety of actions can be taken by teachers and students (discussed in Section 5). Therefore, it is reasonable to argue that the current replication study produced results comparable to, or even better than, the original study.

4.2. RQ2: Can Association Rule Mining Provide New Perspectives?

This section answers the second research question: whether association rule mining can provide new perspectives. Beyond the comparability of results discussed above, the current study suggests that new insights can indeed be derived through association rule mining. In particular, it suggests that attitude toward the course can play a role in the uptake of tool-based recommendations, while linguistic self-confidence, average public exam scores and expected grade can do the same for general recommendations.
The “attitude toward the course” factor aligns with more recent studies on the sense-making process of students who use dashboards. In particular, de Quincey et al. [27] argue that trusting the information on a dashboard system is a key factor that allows students to make sense of it. To a certain extent, attitude toward the course echoes this, as trust in the system originates in trust in the course (i.e., attitude). Another recent study by Jivet et al. [28] argues that “reference frames”, which include students’ goals and past performance, can be important factors when students decide whether to take recommendations made by a dashboard. In the current study, high expected grades, linguistic self-confidence and public exam scores echo this finding. For example, when reading a recommendation, students with a certain goal consider their linguistic self-confidence and past public exam scores to determine whether to take it in order to achieve their expected grade. In fact, if they perceive that they have sufficiently high previous exam scores (i.e., past performance) to achieve their expected grades (i.e., linguistic self-confidence), they tend to take the recommendations. This can help explain how the new factors are relevant to the uptake of recommendations and how this replication study provides new perspectives through association rule mining.

5. Implications and Conclusions

5.1. Implications for Dashboard Designers

One key issue in the design of CDR was the lower uptake of general recommendations. As reported in the original study, students are “pragmatic” in doing extra work [29] for courses, while general recommendations do not seem to be relevant to their grades. One way to mitigate this is to unleash the full potential of association rules in the current study. Based on association Rule 14 for general recommendations (see Table 4 above), the dashboard could encourage students to take up general recommendations by stating, “83% of students [Support] who take tool-based recommendations, like you, also take up general recommendations. How about getting started now by visiting this link?” Such a simple message could motivate students to take general recommendations based on a kind of intangible peer pressure derived from association rules. The same could also be applied to assessment-based rules (e.g., “61% of students who want a good grade, have high levels of confidence and take tool-based suggestions are likely to take general suggestions as well. Why don’t you join this 61%?”). Based on association rules, dashboard designers have the flexibility to relate the uptake of general suggestions to assessment-related suggestions (e.g., expected grade). Once again, this motivates students to take general recommendations, which may not be popular.
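A dashboard could template such rule-based nudges directly from mined rules. The sketch below is illustrative only: the function and parameter names are our own invention, and the wording and the 83% figure follow the example above:

```python
# Illustrative sketch of rendering an association rule as a dashboard nudge.
# Function and parameter names are hypothetical; the message wording follows
# the example given in the text.
def rule_to_nudge(support_pct, antecedent_desc, consequent_desc, link="this link"):
    """Render a mined rule as a peer-referenced encouragement message."""
    return (
        f"{support_pct}% of students who {antecedent_desc}, like you, also "
        f"{consequent_desc}. How about getting started now by visiting {link}?"
    )

message = rule_to_nudge(83, "take tool-based recommendations",
                        "take up general recommendations")
print(message)
```

Because the message is generated from the rule's own Support value and item descriptions, designers can swap in any qualifying rule (e.g., an assessment-based one) without rewriting the dashboard copy.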

5.2. Implications for Language Teachers

This replication study shows the potential of association rules in explicating students’ uptake of feedback. One implication for language teachers mirrors that for dashboard designers: rule-based messages can motivate students to learn. Language teachers can use the aforementioned rules to encourage students to take up general recommendations or, more generally, to engage with other activities (based on rules derived from other association rule analyses).
Another implication is that association rule analysis may outperform other data mining techniques in terms of comprehensibility. Some previous studies discuss how teachers can communicate findings from data mining to inform their teaching, e.g., [30]. These past studies mainly focus on the comprehensibility of other data mining techniques, such as artificial neural networks, logistic regression and classification trees. Many conclude that comprehensibility is important for language teachers to communicate the results to students [31]. Based on the results of the current study, it seems that association rule analyses can also be easily understood by language teachers.

5.3. Implications for Big Data Analytics in Language Education

Because of the new insights brought by association rule mining, one obvious implication of the current study is that this approach can be potentially used in more contexts in big data analytics within language education. Previous studies on the use of data mining in language learning have not focused much on association rule mining [32]. Instead, many have focused on more “common” techniques, such as classification trees, data visualization and cluster analysis. While they have made significant contributions to SLA research, this study shows that association rule mining can extend those lines of research by providing a greater variety of rules for deliberation. In this replication study, the association rule mining process successfully uncovered rules (e.g., on motivational constructs) that were not unveiled in the classification tree analysis but are useful for practitioners. Additionally, unlike traditional association rule mining in marketing research, the use of association rule mining in SLA research helps uncover the relationship between student attributes and learning outcomes (e.g., feedback uptake in the current study). Discovering these attributes made the analysis more flexible, as it was not necessary to consider how these attributes could potentially be “placed” together, as in market basket analysis in marketing research. This shows that association rule mining can contribute to SLA research in a slightly different manner than it does in marketing research.

5.4. Conclusions

This study aimed to conceptually replicate a study on the uptake of feedback received from a dashboard within the context of a first-year composition course. By re-analyzing the original data with association rule mining (i.e., market basket analysis), this replication study showed that association rule mining can identify useful rules that include more factors associated with students’ uptake of feedback. This suggests that association rule analysis can help dashboard designers and language teachers better communicate data mining results and provide more avenues for SLA researchers to explore the use of data mining within the field.

Author Contributions

Conceptualization, methodology, formal analysis: D.F.; writing—original draft preparation, D.F. and L.K.; writing—review and editing: D.F. and L.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The original study was approved by the institutional review board.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the original study.

Data Availability Statement

The data presented in this study are available upon reasonable request from the corresponding author.

Acknowledgments

The authors are grateful to all reviewers and the project team members in the original CDR project.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Foung, D. Making Good Suggestions in Analytics-Based Early Alert Systems: Shaping Minds and Changing Behaviours. J. Appl. Res. High. Educ. 2019, 12, 109–123. [Google Scholar] [CrossRef]
  2. Society for Learning Analytics Research What Is Learning Analytics? In What is Learning Analytics. Available online: https://www.solaresearch.org/about/what-is-learning-analytics/ (accessed on 1 December 2022).
  3. Macfadyen, L.P.; Dawson, S. Mining LMS Data to Develop an “Early Warning System” for Educators: A Proof of Concept. Comput. Educ. 2010, 54, 588–599. [Google Scholar] [CrossRef]
  4. Pardo, A.; Bartimote, K.; Shum, S.B.; Dawson, S.; Gao, J.; Gašević, D.; Leichtweis, S.; Liu, D.; Martínez-Maldonado, R.; Mirriahi, N.; et al. OnTask: Delivering Data-Informed, Personalized Learning Support Actions. J. Learn. Anal. 2018, 5, 235–249. [Google Scholar] [CrossRef] [Green Version]
  5. Froehlich, F. Social Network Analysis as a Progressive Tool for Learning Analytics: A Sample of Teacher Education Students’ Course Interactions with Threadz on the Canvas Platform. Ph.D. Thesis, University of British Columbia, Vancouver, BC, Canada, 2020. [Google Scholar]
  6. Essa, A.; Ayad, H. Student Success System: Risk Analytics and Data Visualization Using Ensembles of Predictive Models. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012; pp. 158–161. [Google Scholar]
  7. Subramanian, P.; Zainuddin, N.; Alatawi, S.; Javabdeh, T. A Study of Comparison between Moodle and Blackboard Based on Case Studies for Better LMS. J. Inf. Syst. Res. Innov. 2014, 6, 26–33. [Google Scholar]
  8. Munguia, P.; Brennan, A.; Taylor, S.; Lee, D. A Learning Analytics Journey: Bridging the Gap between Technology Services and the Academic Need. Internet High. Educ. 2020, 46, 100744. [Google Scholar] [CrossRef]
  9. Lu, O.H.; Huang, J.C.; Huang, A.Y.; Yang, S.J. Applying Learning Analytics for Improving Students Engagement and Learning Outcomes in an MOOCs Enabled Collaborative Programming Course. Interact. Learn. Environ. 2017, 25, 220–234. [Google Scholar] [CrossRef]
  10. Chaudy, Y.; Connolly, T. Specification and Evaluation of an Assessment Engine for Educational Games: Empowering Educators with an Assessment Editor and a Learning Analytics Dashboard. Entertain. Comput. 2018, 27, 209–224. [Google Scholar] [CrossRef] [Green Version]
  11. Lonn, S.; Aguilar, S.J.; Teasley, S.D. Investigating Student Motivation in the Context of a Learning Analytics Intervention during a Summer Bridge Program. Comput. Hum. Behav. 2015, 47, 90–97. [Google Scholar] [CrossRef]
  12. Romero, C.; Ventura, S. Data Mining in Education. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2013, 3, 12–27. [Google Scholar] [CrossRef]
  13. Foung, D. Mining Students’ Data to Identify at-Risk Students in an Academic English Course: A Comparison of Two Classification Techniques from a Language Teacher and Statistical Perspective. CALL-EJ 2019, 20, 77–91. [Google Scholar]
  14. Tan, P.-N.; Steinbach, M.; Karpatne, A.; Kumar, V. Introduction to Data Mining, 2nd ed.; Pearson Education, Inc.: New York, NY, USA, 2019; ISBN 978-0-13-312890-1. [Google Scholar]
  15. Nasiri, E.; Berahmand, K.; Li, Y. Robust Graph Regularization Nonnegative Matrix Factorization for Link Prediction in Attributed Networks. Multimed. Tools Appl. 2022, 1–24. [Google Scholar] [CrossRef]
  16. Berahmand, K.; Mohammadi, M.; Saberi-Movahed, F.; Li, Y.; Xu, Y. Graph Regularized Nonnegative Matrix Factorization for Community Detection in Attributed Networks. IEEE Trans. Netw. Sci. Eng. 2022, 1–14. [Google Scholar] [CrossRef]
  17. Agrawal, R.; Imieliński, T.; Swami, A. Mining Association Rules between Sets of Items in Large Databases. In Proceedings of the 1993 ACM SIGMOD International Conference on Management of Data; Association for Computing Machinery, New York, NY, USA, 1 June 1993; pp. 207–216. [Google Scholar]
  18. Alangari, N.; Alturki, R. Association Rule Mining in Higher Education: A Case Study of Computer Science Students. In Smart Infrastructure and Applications: Foundations for Smarter Cities and Societies; Mehmood, R., See, S., Katib, I., Chlamtac, I., Eds.; EAI/Springer Innovations in Communication and Computing; Springer International Publishing: Cham, Switzerland, 2020; pp. 311–328. ISBN 978-3-030-13705-2. [Google Scholar]
  19. Ahmed, S.; Paul, R.; Hoque, A.S.M.L. Knowledge Discovery from Academic Data Using Association Rule Mining. In Proceedings of the 2014 17th International Conference on Computer and Information Technology (ICCIT), Dhaka, Bangladesh, 22–23 December 2014; pp. 314–319. [Google Scholar]
  20. Franklin, M.I. Understanding Research: Coping with the Quantitative–Qualitative Divide; Routledge: London, UK, 2012; ISBN 978-0-203-11886-3. [Google Scholar]
  21. Miłkowski, M.; Hensel, W.M.; Hohol, M. Replicability or Reproducibility? On the Replication Crisis in Computational Neuroscience and Sharing Only Relevant Detail. J. Comput. Neurosci. 2018, 45, 163–172. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Foung, D.; Kohnke, L. The Development and Validation of the Feedback in Learning Scale (FLS): A Replication Study. Educ. Res. Eval. 2022, 27, 164–187. [Google Scholar] [CrossRef]
  23. Makel, M.C.; Plucker, J.A. An Introduction to Replication Research in Gifted Education: Shiny and New Is Not the Same as Useful. Gift. Child Q. 2015, 59, 157–164. [Google Scholar] [CrossRef]
  24. Guilloteaux, M.J.; Dörnyei, Z. Motivating Language Learners: A Classroom-Oriented Investigation of the Effects of Motivational Strategies on Student Motivation. TESOL Q. 2008, 42, 55–77. [Google Scholar] [CrossRef]
  25. Linoff, G.S.; Berry, M.J.A. Data Mining Techniques: For Marketing, Sales, and Customer Relationship Management; John Wiley & Sons: Hoboken, NJ, USA, 2011; ISBN 978-1-118-08745-9. [Google Scholar]
  26. Shahiri, A.M.; Husain, W. A Review on Predicting Student’s Performance Using Data Mining Techniques. Procedia Comput. Sci. 2015, 72, 414–422. [Google Scholar] [CrossRef] [Green Version]
  27. de Quincey, E.; Briggs, C.; Kyriacou, T.; Waller, R. Student Centred Design of a Learning Analytics System. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge; Association for Computing Machinery, New York, NY, USA, 4 March 2019; pp. 353–362. [Google Scholar]
  28. Jivet, I.; Scheffel, M.; Schmitz, M.; Robbers, S.; Specht, M.; Drachsler, H. From Students with Love: An Empirical Study on Learner Goals, Self-Regulated Learning and Sense-Making of Learning Analytics in Higher Education. Internet High. Educ. 2020, 47, 100758. [Google Scholar] [CrossRef]
  29. Huon, G.; Spehar, B.; Adam, P.; Rifkin, W. Resource Use and Academic Performance among First Year Psychology Students. High. Educ. 2007, 53, 1–27. [Google Scholar] [CrossRef]
  30. Foung, D.; Chen, J.; Lin, L.H. Unveiling the Inconvenient Truth: The Innovation Process in Implementing a University Dashboard. In Enhancing Learning Design for Innovative Teaching in Higher Education; IGI Global: Hershey, PA, USA, 2020; pp. 162–181. [Google Scholar]
  31. Cruz-Jesus, F.; Castelli, M.; Oliveira, T.; Mendes, R.; Nunes, C.; Sa-Velho, M.; Rosa-Louro, A. Using Artificial Intelligence Methods to Assess Academic Achievement in Public High Schools of a European Union Country. Heliyon 2020, 6, e04081. [Google Scholar] [CrossRef]
  32. Warschauer, M.; Yim, S.; Lee, H.; Zheng, B. Recent Contributions of Data Mining to Language Learning Research. Annu. Rev. Appl. Linguist. 2019, 39, 93–112. [Google Scholar] [CrossRef]
Figure 1. Diagnostic report and recommendations.
Table 1. Descriptive Statistics (adapted from [1]).

(Index) Aspects to Be Assessed [No. of Items] / Items | N | Min. | Max. | M | Std. Dev.
(1) Perception of the tool [5]
Tool: User-friendly | 423 | 1 | 5 | 3.87 | 0.84
Usefulness: Comparison of results | 423 | 1 | 5 | 3.49 | 0.86
Usefulness: Recommendations | 423 | 1 | 5 | 3.29 | 0.82
Usefulness: Know Strengths and Weaknesses | 423 | 1 | 5 | 3.34 | 0.85
Having a similar tool in other courses | 423 | 1 | 5 | 3.71 | 0.90
(2) Tool-based action to be taken [4]
Course-based Online Activities | 423 | 1 | 4 | 3.11 | 0.75
Books | 423 | 1 | 4 | 2.91 | 0.74
Online (not based on the course) resources | 423 | 1 | 4 | 3.28 | 0.70
Online courses (MOOC) | 423 | 1 | 4 | 3.08 | 0.74
(3) Additional action to be taken [5]
More time for preparation of next assessment | 423 | 1 | 4 | 3.06 | 0.55
More attentive in class | 423 | 1 | 4 | 2.93 | 0.56
Meeting the teacher | 423 | 1 | 4 | 2.73 | 0.63
More exercises to improve proficiency | 423 | 1 | 4 | 2.92 | 0.52
(4) Attitudes to Course [9] | 423 | 1.00 | 4.78 | 3.14 | 0.57
(5) Linguistic Self-confidence [7] | 423 | 1.29 | 4.86 | 3.10 | 0.53
(6) Classroom Anxiety [3] | 423 | 1.00 | 5.00 | 2.66 | 0.83
Table 2. Converted Variables for Association Rule Analysis (adapted from [1]).

Aspect | Aspects to Be Assessed | Conversion
1 | Perception of the tool | 1–2: Negative; 3: Neutral; 4–5: Positive
2 | Tool-based action to be taken | 1–2: Unlikely; 3–4: Likely
3 | Additional action to be taken | 1–2: Unlikely; 3–4: Likely
4 | Attitudes about the course | 1–2: Not Good; 3–5: Good
5 | Linguistic self-confidence | 1–2: Not Good; 3–5: Good
6 | Classroom anxiety | 1–2: Not Anxious; 3–5: Anxious
7 | Gender | 1: Male; 2: Female
8 | Grades on public English exam (HKDSE) | 4: Average; 5 or above: Good
9 | Expected grades for the course (course-based) | 0–2.5: Average; 3–4.5: Good
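The score conversions reported in Table 2 amount to a simple discretization of raw Likert responses into categorical labels. A minimal sketch of such a conversion follows; the function name and aspect keys are our own illustrative choices, not identifiers from the original study.

```python
def convert_likert(score, aspect):
    """Discretize a raw Likert score using the cut-offs reported in Table 2.

    `aspect` is one of the hypothetical keys below, each mapping to the
    conversion rule for the corresponding aspect group.
    """
    if aspect == "tool_perception":
        # 1-2: Negative; 3: Neutral; 4-5: Positive
        return "Negative" if score <= 2 else ("Neutral" if score == 3 else "Positive")
    if aspect in ("tool_action", "additional_action"):
        # 1-2: Unlikely; 3-4: Likely
        return "Unlikely" if score <= 2 else "Likely"
    if aspect in ("course_attitudes", "self_confidence"):
        # 1-2: Not Good; 3-5: Good
        return "Not Good" if score <= 2 else "Good"
    if aspect == "anxiety":
        # 1-2: Not Anxious; 3-5: Anxious
        return "Not Anxious" if score <= 2 else "Anxious"
    raise ValueError(f"unknown aspect: {aspect}")
```

For example, a rating of 4 on a tool-perception item would be converted to "Positive", while a rating of 2 on an action item would be converted to "Unlikely".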
Table 3. Association Rules Identified for Tool-based Recommendations.

No. | Factors | Support | Confidence | Lift
1 | Taking General Recommendation = Likely; Public Exam Score = Average | 0.53 | 0.97 | 1.09
2 | Tool Perception = Positive; Taking General Recommendation = Likely | 0.51 | 0.96 | 1.09
3 | Tool Perception = Positive | 0.53 | 0.95 | 1.08
Table 4. Association Rules Identified for General Recommendations.

No. | Factors | Support | Confidence | Lift
1 | Tool-based Recommendations = Likely; Attitudes to Course = Good; Linguistic Self-confidence = Good | 0.72 | 0.97 | 1.07
2 | Tool-based Recommendations = Likely; Attitudes to Course = Good; Linguistic Self-confidence = Good; Expected Grade = Good | 0.58 | 0.96 | 1.07
3 | Tool-based Recommendations = Likely; Attitudes to Course = Good; Expected Grade = Good | 0.61 | 0.96 | 1.07
4 | Tool-based Recommendations = Likely; Attitudes to Course = Good | 0.77 | 0.96 | 1.06
5 | Tool-based Recommendations = Likely; Attitudes to Tool = Positive | 0.51 | 0.96 | 1.06
6 | Attitudes to Course = Good; Public Exam Score = Average | 0.51 | 0.96 | 1.06
7 | Tool-based Recommendations = Likely; Linguistic Self-confidence = Good | 0.75 | 0.96 | 1.06
8 | Tool-based Recommendations = Likely; Linguistic Self-confidence = Good; Expected Grade = Good | 0.91 | 0.96 | 1.06
9 | Attitudes to Tool = Positive | 0.53 | 0.95 | 1.06
10 | Tool-based Recommendations = Likely; Expected Grade = Good | 0.67 | 0.95 | 1.05
11 | Attitudes to Course = Good; Linguistic Self-confidence = Good | 0.77 | 0.95 | 1.05
12 | Attitudes to Course = Good; Linguistic Self-confidence = Good; Expected Grade = Good | 0.62 | 0.95 | 1.05
13 | Tool-based Recommendations = Likely; Public Exam Score = Average | 0.53 | 0.95 | 1.05
14 | Tool-based Recommendations = Likely | 0.84 | 0.95 | 1.05
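The support, confidence, and lift values reported in Tables 3 and 4 follow the standard association-rule definitions: support is the proportion of records containing both the antecedent and the consequent, confidence is that proportion relative to records containing the antecedent, and lift is confidence divided by the baseline frequency of the consequent. A minimal sketch of these computations, assuming a hypothetical encoding of each student record as a set of attribute–value items:

```python
def rule_metrics(transactions, antecedent, consequent):
    """Compute (support, confidence, lift) for the rule antecedent -> consequent.

    transactions: list of sets of items, e.g.
        {"ToolPerception=Positive", "GeneralRecommendation=Likely"}.
    antecedent / consequent: sets of items.
    """
    n = len(transactions)
    # Count records containing the antecedent, the consequent, and both.
    n_a = sum(1 for t in transactions if antecedent <= t)
    n_c = sum(1 for t in transactions if consequent <= t)
    n_ac = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = n_ac / n
    confidence = n_ac / n_a
    lift = confidence / (n_c / n)  # > 1 means a positive association
    return support, confidence, lift
```

For instance, with records `[{"A", "B"}, {"A", "B"}, {"A"}, {"B"}]`, the rule A → B has support 0.5 and confidence 2/3, with lift below 1 because B alone occurs in three of four records.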
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Foung, D.; Kohnke, L. Rediscovering the Uptake of Dashboard Feedback: A Conceptual Replication of Foung (2019). Sustainability 2022, 14, 16169. https://doi.org/10.3390/su142316169
