Supporting Learning Analytics Adoption: Evaluating the Learning Analytics Capability Model in a Real-World Setting
Abstract
1. Introduction
2. Background
2.1. Learning Analytics Adoption
2.2. Designing a Capability Model for Learning Analytics
2.3. Evaluation of a Design Science Research Artefact
3. Materials and Methods
3.1. Research Goal and Methods
3.2. Pluralistic Walk-Throughs
- We organized multiple sessions with stakeholders from different institutions as participants; in each session, all participants came from a single institution. They were the capability model’s target users—policymakers, senior managers, program directors, learning analysts, et cetera (characteristic 1). Each pluralistic walk-through lasted between two and three hours.
- The participants used the capability model to complete a planning task. A digital version of the model was available to support the task (characteristic 2); details on the digital tool are provided in Section 3.6.
- During the pluralistic walk-through, participants were asked to solve a planning task: plan the implementation of an LA program at their institution to reach a predefined goal they had for LA (characteristic 3). This task resulted in a ‘roadmap’ in which the implementation process for the next two years was planned.
- During the pluralistic walk-throughs, one researcher represented the designers of the capability model. Another researcher took notes on noteworthy situations while the participants performed the task (characteristic 4). We also video-recorded each walk-through so that the sessions could be transcribed and analyzed.
- At the end of each pluralistic walk-through, there was a group discussion moderated by the researchers. During the discussion, participants elaborated on decisions made during the process and discussed whether the capability model provided sufficient support to complete the task (characteristic 5).
- The capability model positively contributes to the adoption of LA by Dutch HEIs.
- The operational descriptions provided by the model help to make the adoption of LA more concrete.
- The capability model is complete. That is, there are no missing capabilities that are important to the adoption of LA by HEIs.
3.3. Expert Evaluation
3.4. Survey
3.5. Participants
3.5.1. Participants for Pluralistic Walk-Throughs
3.5.2. Participants for Expert Evaluation
3.6. Tools and Pilot Session
3.7. Analysis
4. Results
4.1. Effectiveness of the Capability Model
4.2. Perceived Usefulness of the Capability Model
4.3. Completeness of the Capability Model
5. Discussion
5.1. Research Outcomes
5.2. Implications for Research and Practice
5.3. Limitations
5.4. Future Work
Supplementary Materials
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
- All in all, I find the Learning Analytics Capability Model a useful model.
- I would like to be able to use the Learning Analytics Capability Model in my work.
- The Learning Analytics Capability Model provides me with more useful insights and feedback than other similar models I tried/used.
- The Learning Analytics Capability Model enables me to get insights in the capabilities necessary for the successful uptake of learning analytics in my institution.
- The information provided by the Learning Analytics Capability Model helps me identify what capabilities need to be (further) developed at my institution.
- The Learning Analytics Capability Model provides relevant information on how to operationalize learning analytics capabilities.
- The Learning Analytics Capability Model is easy to understand.
- The use of the Learning Analytics Capability Model is intuitive enough.
- The Learning Analytics Capability Model is overburdened with information.
- (Open question) The Learning Analytics Capability Model is (not) helping me implement learning analytics at scale at my institution because ...
| Evaluation Criteria | Method for Data Collection | Method for Data Analysis |
|---|---|---|
| EC1: Is the capability model effective? | | |
| EC2: Is the capability model perceived as useful? | | |
| EC3: Is the capability model complete? | | |
| Name | Type | Country | Participants | Intended Improvement |
|---|---|---|---|---|
| Alpha | University of Applied Sciences | Belgium | 8 | Learning outcomes |
| Bravo | University of Applied Sciences | Netherlands | 4 | Learning environment |
| Charlie | University of Technology | Netherlands | 4 | Learning process |
| Delta | Institution for Senior Secondary Vocational Education | Netherlands | 4 | To be determined |
| Echo | University of Applied Sciences | Netherlands | 6 | Learning process |
Sessions per institution: Alpha (1), Bravo (1), Charlie (2), Delta (2), Echo (2).

| Category | Capability | Mentions per Session | Total |
|---|---|---|---|
| Data | Data usage | 1, 3, 3, 2, 2 | 11 |
| Data | Feedback on analytics | | 0 |
| Data | Quality | 0, 1 ! | 1 |
| Data | Reporting | 2, 2 | 4 |
| Data | Sourcing and integration | 3, 3, 4 *, 4, 7 * | 21 |
| Management | Capability development | 2, 1, 0, 2 !, 0, 1 ! | 6 |
| Management | Culture and readiness | 1, 1, 1, 1 | 4 |
| Management | Evidence-based and theory-driven | 1, 1, 1, 1 | 4 |
| Management | External environment | 1, 1 | 2 |
| Management | Funding and investment | 1, 1 | 2 |
| Management | Identifying benefits | 1, 1, 1 | 3 |
| Management | Implementation, Deployment, and Application | 3, 5, 6, 6, 7, 7, 6, 7 * | 47 |
| Management | Performance monitoring | 1, 1, 1, 1, 1 | 5 |
| Management | Policies and CoP | 1, 1, 2 *, 1, 1 | 6 |
| Management | Responsibility and accountability | 1, 1 | 2 |
| Management | Strategy | 1, 0, 1 ! | 2 |
| People | Collaboration | | 0 |
| People | Combined skills and knowledge | 1 | 1 |
| People | Communication | 1, 2 * | 3 |
| People | Stakeholder Identification and Engagement | 1, 1, 3, 5 *, 7, 2, 2 | 21 |
| People | Training | 2, 2 | 4 |
| Privacy & Ethics | Ethics | 0, 1 !, 0, 1 !, 0, 1 ! | 3 |
| Privacy & Ethics | Human decision-making | | 0 |
| Privacy & Ethics | Legal compliance | 1, 1, 2, 3 * | 7 |
| Privacy & Ethics | Security | | 0 |
| Privacy & Ethics | Transparency | | 0 |
| Technology | Automation | 1, 1 | 2 |
| Technology | Connectivity | | 0 |
| Technology | Infrastructure | 1, 1, 2, 5 *, 2, 2, 2, 2 | 17 |
| Technology | System characteristics | | 0 |
| Total | | 12, 14, 26, 38, 21, 33, 15, 19 | 178 |
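The counts in this table are tallies of capability mentions coded from the walk-through transcripts. As a minimal illustration of how such tallies can be derived (a sketch written for this text under assumed data structures, not the authors' actual analysis tooling; the session labels and fragments below are hypothetical), a few lines of Python suffice:

```python
# Hypothetical sketch: tallying coded transcript fragments per capability.
# The (session, capability) pairs below are invented for illustration only.
from collections import Counter

# Each coded fragment is a (session, capability) pair produced during coding.
coded_fragments = [
    ("Alpha-1", "Data usage"),
    ("Alpha-1", "Sourcing and integration"),
    ("Charlie-2", "Implementation, Deployment, and Application"),
    ("Charlie-2", "Stakeholder Identification and Engagement"),
    ("Echo-2", "Infrastructure"),
]

mentions_per_capability = Counter(cap for _, cap in coded_fragments)
mentions_per_session = Counter(session for session, _ in coded_fragments)

# Print capability tallies in descending order, as summarized in the table.
for capability, count in mentions_per_capability.most_common():
    print(f"{capability}: {count}")
print(f"Total mentions: {sum(mentions_per_capability.values())}")
```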
| Question | Mean | Std. Dev. | Min. | Max. |
|---|---|---|---|---|
| Q1: Useful | 4.2 | 0.4 | 4 | 5 |
| Q2: Use in work | 4.0 | 0.4 | 3 | 5 |
| Q3: Comparison other models | 3.2 | 0.4 | 3 | 4 |
| Q4: Insights | 4.3 | 0.5 | 3 | 5 |
| Q5: Identify | 4.3 | 0.5 | 4 | 5 |
| Q6: Operationalize | 3.7 | 0.8 | 2 | 5 |
| Q7: Easy to understand | 3.9 | 0.8 | 2 | 5 |
| Q8: Intuitive | 3.6 | 0.7 | 2 | 5 |
| Q9: Overburdened with information | 2.9 | 0.9 | 1 | 4 |
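The means and standard deviations above are plain descriptive statistics over 5-point Likert responses (commonly coded 1 = strongly disagree to 5 = strongly agree). The following sketch shows the computation under hypothetical response vectors; the study's raw data are not reproduced here:

```python
# Hypothetical sketch: descriptive statistics for 5-point Likert items.
# The response vectors below are invented; they are not the study's data.
from statistics import mean, stdev

responses = {
    "Q1: Useful": [4, 4, 5, 4, 4, 4],
    "Q7: Easy to understand": [4, 3, 5, 2, 4, 5],
    "Q9: Overburdened with information": [2, 3, 4, 1, 3, 4],
}

for item, scores in responses.items():
    print(f"{item}: mean={mean(scores):.1f}, sd={stdev(scores):.1f}, "
          f"min={min(scores)}, max={max(scores)}")
```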