Redesigning Assessments for AI-Enhanced Learning: A Framework for Educators in the Generative AI Era
Abstract
1. Introduction
1.1. Research Gap
1.2. Purpose and Contribution of the Study
1.3. Research Questions
2. Literature Review
AI-Resistant Assessment
Gen AI in Higher Education
3. Methodology
3.1. Context of the Study
3.2. Semi-Structured Interviews
3.3. Focus Group Sessions
- Experience with AI-Resistant Assessment Design: Priority was given to faculty members with substantial experience adapting assessments to AI considerations. This included tangible modifications to assessment practices or insightful reflections on integrating Gen AI into teaching and evaluation.
- Openness to Collaboration: Participants were chosen based on their willingness to engage collaboratively, ensuring meaningful contributions to group discussions. This criterion emphasized participants who proactively shared best practices and actively engaged in problem-solving.
3.4. Document Analysis: Redesigned Assessment Analysis
- Incorporation of AI-Resistant Design: Elements that directly reflected the training focus, including task complexity, prompts designed to promote critical thinking, and strategies for assessing student originality and engagement.
- Alignment with Workshop Principles: How closely each assessment adhered to the workshop’s objectives, particularly redesigning assessments to integrate AI considerations.
3.5. Data Collection
3.6. Data Analysis Procedures
3.6.1. Triangulation Process
3.6.2. Reporting the Themes That Emerged in This Study
3.7. Trustworthiness
3.8. Ethical Considerations
4. Results
4.1. Research Question 1
4.1.1. Preparing Students for Future Work
4.1.2. Technological Adaptation
4.1.3. Academic Integrity
4.1.4. Institution Policy
4.1.5. Ethical Considerations
4.2. Research Question 2
4.2.1. Maintaining Academic Integrity
4.2.2. Time and Resource Demands
4.2.3. Equity and Accessibility Concerns
4.2.4. Resistance to Change
4.2.5. Lack of Clear Guidelines and Training
4.3. Research Question 3
4.3.1. Against: No Use of Gen AI in Assessments
4.3.2. Avoid: Assessments with Which Gen AI Currently Struggles
4.3.3. Adopt: Incorporating Gen AI in Assessments
4.3.4. Explore: Gen AI Partner in Assessments
4.4. Types of New Assessment Approaches
4.4.1. Product–Process Assessment
4.4.2. Competency-Based Assessment
4.4.3. Authentic Assessment: Real-World Applications in Education
5. Discussion
5.1. Implications for Future Research and Practice
5.2. Practical Implications
6. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Script | Theme |
---|---|
F 5: Well, it’s been a mix of trial and error, honestly. For example, in one course, I introduced an assignment where students used AI to generate a draft of their research question and then refine it through peer feedback and their own critical thinking. Another time, I asked them to analyze the output of an AI-generated summary of an article and compare it with the original content to identify gaps or biases. These activities helped them engage with the material on a deeper level while also evaluating the tool itself. I’ve also done some group projects where students collaborate to create content using generative AI and then reflect on the process and outcomes. | Attitude: Adopt; Develop students; Motivation: Explore; Explore; Type of new assignments: Product–process; Avoid; Explore: co-author; Human–AI collaboration |
F 11: Absolutely. One example that comes to mind is from a creative writing class I teach. I asked students to use ChatGPT to generate a story prompt and then write their own short story based on that prompt. Afterward, they had to reflect on how AI’s ideas influenced their creative process: what they kept, what they changed, and why. It was fascinating because, um, some students said AI pushed them to think outside the box, while others said they found its suggestions too generic and wanted to make them more personal. That reflective component was really insightful and showed me how students are engaging with these tools on multiple levels. | Product–process; Motivation; Explore |
Theme | Subtheme | Quotation | Research Question |
---|---|---|---|
Preparing Students for Future Work | Fostering Higher-Order Thinking Skills | In my course, AI can answer factual questions, but it can’t evaluate or synthesize ideas. That is where we want students to shine. (F17) | Motivations |
Preparing Students for Future Work | Enhancing Innovation and Creativity | I ask them to use AI creatively, like brainstorming alternative solutions, to show their innovative thinking. (F23) | Motivations |
Technological Adaptation | Preparing Learners for AI-Enhanced Workplaces | I have to prepare my students in my courses to be familiar with AI tools; if I do not do so, they will be behind AI, and you know AI is everywhere now. (F2) | Motivations |
Ethical Considerations | Balancing Innovation with Learning Integrity | Students might rely too heavily on AI, risking the undermining of their learning processes. | Challenges |
Ethical Considerations | Promoting Responsible AI Use | Encouraging learners to use AI ethically ensures they develop the judgment needed to navigate these tools effectively. | Challenges |
Assessment Redesign | Creating AI-Resistant Assessments | Redesigning assessments to focus on the process and not just the product helps in mitigating over-reliance on AI. | Redesign Practices |
Assessment Redesign | Product–Process Collaboration | For creative writing, students used AI for initial prompts and reflected on how they revised it, emphasizing personal input over generic AI suggestions. (F11) | Redesign Practices |
Assessment Redesign | Co-Design with Students | Including students in redesigning assessments that integrate AI encourages transparency and ownership. | Redesign Practices |
Engagement | Exploring AI as a Creative Partner | Group projects where students collaborate to create content using AI helped them reflect on the strengths and limitations of the tools. (F5) | Motivations |
Institutional Policy | Faculty and Student Training | Workshops on ethical AI integration help clarify the boundaries and expectations for using these tools. | Challenges |
Theme | Subtheme | Quotation
---|---|---
Preparing Students for Future Work | Fostering Higher-Order Thinking Skills | In my course, AI can answer factual questions, but it can’t evaluate or synthesize ideas. That is where we want students to shine. (F17)
Preparing Students for Future Work | Enhancing Innovation and Creativity | I ask them to use AI creatively, like brainstorming alternative solutions, to show their innovative thinking. (F23)
Preparing Students for Future Work | Active Learning Engagement | Students need to collaborate with AI in ways that emphasize their original contributions and unique problem-solving skills.
Technological Adaptation | Preparing Learners for AI-Enhanced Workplaces | I have to prepare my students in my courses to be familiar with AI tools; if I do not do so, they will be behind AI, and you know AI is everywhere now. (F2)
Technological Adaptation | Integrating AI Tools for Enhanced Learning | Using tools like ChatGPT in projects allowed students to explore real-world tasks and learn collaboratively.
Academic Integrity | Addressing Plagiarism Challenges | We are exploring ways to assess originality and ensure students engage deeply with the material rather than outsourcing their learning to AI.
Academic Integrity | Authentic Assessments to Mitigate Misuse | We design tasks requiring critical thinking and creativity, skills that AI tools currently cannot authentically replicate.
Institutional Policy | Developing AI Usage Guidelines | Policies need to guide ethical AI use while promoting innovation and maintaining academic standards.
Institutional Policy | Faculty and Student Training | Workshops on ethical AI integration help clarify the boundaries and expectations for using these tools.
Ethical Considerations | Balancing Innovation with Learning Integrity | Students might rely too heavily on AI, risking the undermining of their learning processes.
Ethical Considerations | Promoting Responsible AI Use | Encouraging learners to use AI ethically ensures they develop the judgment needed to navigate these tools effectively.
Assessment Redesign | Creating AI-Resistant Assessments | Redesigning assessments to focus on the process and not just the product helps in mitigating over-reliance on AI.
Assessment Redesign | Co-Design with Students | Including students in redesigning assessments that integrate AI encourages transparency and ownership.
Theme | Subtheme | Quotation |
---|---|---|
Maintaining Academic Integrity | Distinguishing AI from Student Work | It’s becoming increasingly difficult to identify where the student’s work ends and the AI’s begins. (F5) |
Time and Resource Demands | Strain on Faculty Time and Efforts | We’re expected to innovate while balancing heavy workloads—there’s just not enough time to experiment and redesign. (FG4) |
Time and Resource Demands | Limited Institutional Resources | Developing innovative assessments requires resources that are simply not available in most cases. (Observation from focus groups) |
Equity and Accessibility Concerns | Inequality in AI Access | Not all students are familiar with AI, and some don’t even have reliable Internet. It’s hard to make assessments equitable under these circumstances. (F13) |
Equity and Accessibility Concerns | Digital Infrastructure Limitations | Students from rural areas face issues even accessing online resources, let alone experimenting with AI. (Paraphrased from focus group discussions) |
Resistance to Change | Faculty Reluctance to Innovate | Some colleagues don’t see the urgency of redesigning assessments, and students often push back on tasks that demand more effort. (F25) |
Resistance to Change | Learner Resistance to New Formats | Traditional exams are what they’re used to—when we introduce AI-resistant tasks, there’s often backlash. (Observation from interviews) |
Lack of Clear Guidelines and Training | Absence of Institutional Frameworks | We’re left to figure this out on our own; there’s no institutional framework to guide us. (FG1) |
Lack of Clear Guidelines and Training | Insufficient Professional Development | There’s minimal training provided for understanding how to integrate or avoid AI in assessments. (General comment from faculty) |
Appendix B. Sample of Redesigned Assignments
References
Variable | Category | Frequency | %
---|---|---|---
Gender | Male | 89 | 57.4
Gender | Female | 66 | 42.6
Age (Years) | 25–35 | 43 | 27.7
Age (Years) | 36–45 | 68 | 43.9
Age (Years) | 46–55 | 25 | 16.1
Age (Years) | 56+ | 19 | 12.3
Frequency of Gen AI Use | Daily | 58 | 37.4
Frequency of Gen AI Use | Weekly | 63 | 40.6
Frequency of Gen AI Use | Monthly | 19 | 12.3
Frequency of Gen AI Use | Occasionally | 15 | 9.7
Discipline | Medical Sciences | 15 | 9.7
Discipline | Humanities and Educational Sciences | 30 | 19.4
Discipline | Engineering Sciences | 24 | 15.5
Discipline | Social Sciences | 35 | 22.6
Discipline | Natural Sciences (Physics, Math, etc.) | 28 | 18
Discipline | Business and Communication | 23 | 14.8
University | An Najah National University | 25 | 16.1
University | Birzeit University | 22 | 14.2
University | Hebron University | 16 | 10.3
University | Al-Quds University | 17 | 11
University | Ministry of Higher Education | 15 | 9.7
University | Palestine Technical University | 20 | 12.9
University | Arab American University | 20 | 12.9
University | Palestine Ahliya University | 20 | 12.9
Variable | Category | Frequency | %
---|---|---|---
Gender | Male | 38 | 62.3
Gender | Female | 23 | 37.7
Age (Years) | 25–35 | 11 | 18
Age (Years) | 36–45 | 19 | 29.5
Age (Years) | 46–55 | 21 | 36.1
Age (Years) | 56+ | 10 | 16.4
Frequency of Gen AI Use | Daily | 18 | 29.5
Frequency of Gen AI Use | Weekly | 16 | 26.2
Frequency of Gen AI Use | Monthly | 12 | 19.7
Frequency of Gen AI Use | Occasionally | 15 | 24.6
Discipline | Medical Sciences | 7 | 11.5
Discipline | Humanities and Educational Sciences | 12 | 19.7
Discipline | Engineering Sciences | 10 | 16.4
Discipline | Social Sciences | 8 | 13.1
Discipline | Natural Sciences (Physics, Math, etc.) | 13 | 21.3
Discipline | Business and Communication | 11 | 18
Assessment Type | Example | Purpose of Assessment | Gen AI Integration Level
---|---|---|---
Against | Exams, oral exams, discussion | Evaluate students’ lower-order skills | Gen AI use not allowed
Avoid | Performance-based assessment, personal reflection, individual portfolio, … | Evaluate higher-order skills | Lower level of Gen AI integration
Adopt | Brainstorming, ideation | Higher-order thinking | Higher level of Gen AI integration (partial or full)
Explore | Problem-solving, co-design, learner–AI partnership, creativity in learning | Gen AI as a partner in assignments | Higher level of Gen AI integration (partial or full)