Rethinking Assessment: The Future of Examinations in Higher Education
Abstract
1. Introduction
2. Background to the Study
2.1. Aligning Learning and Assessments
2.2. Three Domains of Learning
2.2.1. Cognitive Domain
2.2.2. Affective Domain
2.2.3. Psychomotor Domain
2.3. Practices Used in Physical Exams
3. Learning Taxonomies—Application in Exams
3.1. Bloom (1956)
3.2. Bloom’s Taxonomy Revised—Taxonomy of Anderson et al. (2001)
3.3. The SOLO Taxonomy
3.4. Six Facets of Understanding (Wiggins and McTighe)
3.5. Taxonomy of Significant Learning (Fink)
4. Challenges of Transition to Online Exams
4.1. How to Prevent Academic Misconduct during Online Exams?
4.2. Use of Software and Video Proctoring
4.3. Exam Design
4.4. Additional or Other Assessments
4.5. Using Honour Codes, Warnings, and Penalties
5. Methodology
6. Research Findings
6.1. Voice of Teachers
6.1.1. A Common Approach to Conduct Online Exams
6.1.2. The Preparation of Online Exam Papers Is More Time-Consuming than That of Physical Exams
I spend a lot more time setting up online exam questions because students copy, right? They have WhatsApp groups etc. They simply chat and circulate answers. So, to avoid that, I try my best to customize the questions so that different students will get different parameters in the questions. So, I use student index numbers and change the values in certain questions so that students will get different diagrams and values as answers, which will prevent the students from copying… and I have found that effective. So, it takes a lot of time to prepare exam questions. In physical exams, you prepare only one paper…
6.1.3. Conducting Exams and Enabling Security Measures in Online Exams via Various Tools
In the proctoring tool, students must sign the agreement form, and then take a picture with their ID and read their details with the camera and microphone on. Once it is opened, they will have to keep the application open till the end of the exam. During that period, we randomly capture the screen and record the sounds. In that case, students have to switch on the camera and unmute the microphone. We make students aware of what will happen if they cheat. Before exams, we explain [it to them], so I think because of the proctoring tools and the awareness, they did not cheat…
6.1.4. Student Worries over Technical Issues in Online Exams
6.1.5. Challenges to Assessment Security and Academic Integrity in Online Exams
They try to escape always…I had one incident where we captured a screen of a student pausing (the camera). I don’t know what caused it but they used something to cheat, because that student’s pause is always the same for the entire two hours of the examination…
6.1.6. Viva as a Popular Alternative Assessment
I know it’s difficult and time-consuming, but if we can have follow-up exams with a short viva session, we can tally the answers and check whether it is the same student who has answered the exam questions… It is good because we can test whether the student has knowledge or not… I think, in some foreign countries, they stick to only MCQ questions, and they randomize the questions and answers so that the students will end up receiving a different set of questions and answers.
In my subject, I can do a lot of practical stuff (presentations, reports, surveys, etc.)… so I can mitigate plagiarism because it involves individual work… so in my case, it’s really difficult to copy or plagiarise…
6.1.7. Back to Physical Exams after the Pandemic
The government has a responsibility to expand internet access to all, treating it as an essential commodity. If properly set, online exams are a good way of continuing exams in the future as well. We cannot expect written exams to stay in the future, because you can see all over the world that there are so many other options coming in for examinations and studies. Installing checks and installing methods to maintain integrity (in online exams) would be the way to solve this problem in my opinion…
6.2. Voice of Students
6.2.1. Similar Experiences, Different Institutes
6.2.2. Improving Student Experience in Online Exams
We don’t have a good exam system. Video conference systems are good for lectures, but (during exams) if we can break the students into 10 and put them into breakout rooms and assign invigilators that would be good…
6.2.3. Greatest Challenges Faced in Lockdown
Before (the pandemic) we had friends studying together—there was a study group. This time we had to do everything alone, and that was challenging.
We were scared whether we could finish it on time, and what the questions will be like because definitely, they will not be like exam questions in physical exams…and we did not do even one practical in the laboratory. So, it was difficult to answer theory questions based on practicals, because we did not have a proper idea about how to do the experiments…
6.2.4. Students Prefer Alternate Exams
Quizzes and assignments will help polish our knowledge and they will support our exams. So, final marks can be taken [computed] from all these activities, so if there are any issues with online exams the [negative] effects will be reduced with these options.
Participant 8: Reports and presentations will be more effective than exams.
Participant 11: I would say viva and MCQs are very good, but in elaborate assignments where you do a lot of research, you learn a lot… so there’s some sort of hard work going into that assignment…
Participant 10: Group projects are better than exams…because we did a math lab project and I learned many things I couldn’t have learned by studying for an exam…
6.2.5. Let’s Go Back to “Normal”
7. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Anu, V. Online Learning Challenges & Solutions Read All about It. 2021. Available online: https://www.embibe.com/exams/online-learning-challenges-and-solutions/amp/ (accessed on 19 June 2021).
- Quacquarelli Symonds. How COVID-19 Is Impacting Prospective International Students. 2020. Available online: https://www.qs.com/portfolio-items/how-covid-19-is-impacting-prospective-international-students-across-the-globe/ (accessed on 15 June 2021).
- Southwell, D.; Gannaway, D.; Orrell, J.; Chalmers, D.; Abraham, C. Strategies for effective dissemination of the outcomes of teaching and learning projects. J. High. Educ. Policy Manag. 2010, 32, 55–67.
- Gamage, K.A.A.; de Silva, E.K.; Gunawardhana, N. Online delivery and assessment during COVID-19: Safeguarding academic integrity. Educ. Sci. 2020, 10, 301.
- QAA. ‘No Detriment’ Policies: An Overview. 2020. Available online: https://www.qaa.ac.uk/docs/qaa/guidance/no-detriment-policies-an-overview.pdf (accessed on 25 June 2021).
- QQI. Guiding Principles for Alternative Assessment (Devised in Response to the COVID-19 Emergency Restrictions). 2020. Available online: https://www.qqi.ie/Downloads/Guiding%20Principles%20for%20Alternative%20Assessment%20%28COVID-19%29%2018-11-20.pdf (accessed on 24 July 2021).
- Rummer, R.; Schweppe, J.; Schwede, A. Open-book versus closed-book tests in university classes: A field experiment. Front. Psychol. 2019, 10, 463.
- Black, P.J. University examinations. Phys. Educ. 1968, 3, 93–99.
- Ilgaz, H.; Afacan Adanır, G. Providing online exams for online learners: Does it really matter for them? Educ. Inf. Technol. 2020, 25, 1255–1269.
- Stowell, J.; Bennett, D. Effects of online testing on student exam performance and test anxiety. J. Educ. Comput. Res. 2010, 42, 161–171.
- Suryani, A.W. Individualized Excel-Based Exams to Prevent Students from Cheating. J. Account. Bus. Educ. 2016, 5, 14–24.
- Hartley, J.; Nicholls, L. Time of day, exam performance and new technology. Br. J. Educ. Technol. 2008, 39, 555–558.
- Tippins, N.T.; Beaty, J.; Drasgow, F.; Gibson, W.M.; Pearlman, K.; Segall, D.O. Unproctored Internet testing in employment settings. Pers. Psychol. 2006, 59, 189–225.
- Laine, K.; Anderson, M. Electronic Exam in Electronics Studies. In Proceedings of the 44th SEFI Conference, Tampere, Finland, 12–15 September 2016; pp. 12–15.
- Bloemers, W.; Oud, A.; Dam, K.V. Cheating on Unproctored Internet Intelligence Tests: Strategies and Effects. Pers. Assess. Decis. 2016, 2, 21–29.
- King, C.G.; Guyette, R.W.; Piotrowski, C. Online exams and cheating: An empirical analysis of business students’ views. J. Educ. Online 2009, 6, 1–11.
- Gil-Jaurena, I.; Softic, S.K. Aligning learning outcomes and assessment methods: A web tool for e-learning courses. Int. J. Educ. Technol. High. Educ. 2016, 13, 17.
- Poutasi, K. SPANZ 2017. Available online: https://www.nzqa.govt.nz/assets/About-us/Future-State/NZQA-SPANZ-address-2017.pdf (accessed on 25 July 2021).
- Cartner, H.; Hallas, J. Aligning assessment, technology, and multi-literacies. E-Learn. Digit. Media 2020, 17, 131–147.
- Airasian, P.W.; Miranda, H. The role of assessment in the revised taxonomy. Theory Pract. 2002, 41, 249–254.
- Ajjawi, R.; Tai, J.; Nghia, T.L.H.; Boud, D.; Johnson, L.; Patrick, C.J. Aligning assessment with the needs of work-integrated learning: The challenges of authentic assessment in a complex context. Assess. Eval. High. Educ. 2020, 45, 304–316.
- Guerrero-Roldán, A.E.; Noguera, I. A model for aligning assessment with competences and learning activities in online courses. Internet High. Educ. 2018, 38, 36–46.
- Hoque, E.M. Three Domains of Learning: Cognitive, Affective and Psychomotor. J. EFL Educ. Res. 2016, 2, 45–52.
- O’Neill, G.; Murphy, F. Guide to Taxonomies of Learning. UCD Teaching and Learning. 2010. Available online: http://www.ucd.ie/t4cms/UCDTLA0034.pdf (accessed on 14 July 2021).
- Clay, B. Is This a Trick Question? A Short Guide to Writing Effective Test Questions. Kansas Curriculum Center, USA. 2001. Available online: https://kgi.contentdm.oclc.org/digital/collection/p16884coll42/id/147/ (accessed on 5 July 2021).
- Bloom, B.; Engelhart, M.; Furst, E.; Hill, W.; Krathwohl, D. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I; Longmans, Green & Co.: New York, NY, USA; Toronto, ON, Canada, 1956.
- Bissell, A.N.; Lemons, P.P. A new method for assessing critical thinking in the classroom. BioScience 2006, 56, 66–72.
- Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching and Assessing; Longman: New York, NY, USA, 2001.
- Leung, C.F. Assessment for learning: Using SOLO taxonomy to measure design performance of Design & Technology students. Int. J. Technol. Des. Educ. 2000, 10, 149–161.
- Newton, G.; Martin, E. Research and Teaching: Blooming, SOLO Taxonomy, and Phenomenography as Assessment Strategies in Undergraduate Science Education. J. Coll. Sci. Teach. 2013, 43, 78–90.
- Lucander, H.; Bondemark, L.; Brown, G.; Knutsson, K. The structure of observed learning outcome (SOLO) taxonomy: A model to promote dental students’ learning. Eur. J. Dent. Educ. 2010, 14, 145–150.
- Wiggins, G.; McTighe, J. Understanding by Design; Association for Supervision and Curriculum Development: Alexandria, VA, USA, 1998; pp. 85–97.
- Branzetti, J.; Gisondi, M.A.; Hopson, L.R.; Regan, L. Aiming Beyond Competent: The Application of the Taxonomy of Significant Learning to Medical Education. Teach. Learn. Med. 2019, 31, 466–478.
- Wuthisatian, R. Student exam performance in different proctored environments: Evidence from an online economics course. Int. Rev. Econ. Educ. 2020, 35, 100196.
- Milone, A.S.; Cortese, A.M.; Balestrieri, R.L.; Pittenger, A.L. The impact of proctored online exams on the educational experience. Curr. Pharm. Teach. Learn. 2017, 9, 108–114.
- Williams, J.B.; Wong, A. The efficacy of final examinations: A comparative study of closed-book, invigilated exams and open-book, open-web exams. Br. J. Educ. Technol. 2009, 40, 227–236.
- Jordan-Fleming, M.K. Excellence in Assessment: Aligning Assignments and Improving Learning. Assess. Update 2017, 29, 10–12.
- Güzer, B.; Caner, H. The Past, Present and Future of Blended Learning: An In-Depth Analysis of Literature. Procedia Soc. Behav. Sci. 2014, 116, 4596–4603.
- Becker, D.; Connolly, J.; Lentz, P.; Morrison, J. Using the Business Fraud Triangle to Predict Academic Dishonesty among Business Students. Acad. Educ. Lead. J. 2006, 10, 37.
- Lancaster, T.; Clarke, R. Rethinking assessment by examination in the age of contract cheating. In Plagiarism across Europe and Beyond; ENAI: Brno, Czech Republic, 2017; pp. 215–228.
- Trost, K. Psst, have you ever cheated? A study of academic dishonesty in Sweden. Assess. Eval. High. Educ. 2009, 34, 367–376.
- Cluskey, G.R.; Ehlen, C.R.; Raiborn, M.H. Thwarting online exam cheating without proctor supervision. J. Acad. Bus. Ethics 2011, 4, 1–8.
- Lin, M.J.; Levitt, S.D. Catching Cheating Students. Economica 2020, 87, 885–900.
- Ryznar, M. Giving an Online Exam (2 September 2020). Indiana University Robert H. McKinney School of Law Research Paper No. 2020-16. Available online: http://doi.org/10.2139/ssrn.3684958 (accessed on 25 July 2021).
- Corrigan-Gibbs, H.; Gupta, N.; Northcutt, C.; Cutrell, E.; Thies, W. Deterring cheating in online environments. ACM Trans. Comput.-Hum. Interact. 2015, 22, 1–23.
- Chirumamilla, A.; Sindre, G.; Nguyen-Duc, A. Cheating in e-exams and paper exams: The perceptions of engineering students and teachers in Norway. Assess. Eval. High. Educ. 2020, 45, 940–957.
- Golden, J.; Kohlbeck, M. Addressing cheating when using test bank questions in online classes. J. Account. Educ. 2020, 52, 100671.
- Karim, N.A.; Shukur, Z. Review of User Authentication Methods in Online Examination. Asian J. Inf. Technol. 2015, 14, 166–175.
- Bearman, M.; Dawson, P.; O’Donnell, M.; Tai, J.; Jorre, T.J.D. Ensuring Academic Integrity and Assessment Security with Redesigned Online Delivery. 2020. Available online: http://dteach.deakin.edu.au/2020/03/23/academic-integrity-online/ (accessed on 25 July 2021).
- QAA. Assessing with Integrity in Digital Delivery. 2020. Available online: https://www.qaa.ac.uk/docs/qaa/guidance/assessing-with-integrity-in-digital-delivery.pdf (accessed on 27 June 2021).
- Ashri, D.; Sahoo, B.P. Open Book Examination and Higher Education during COVID-19: Case of University of Delhi. J. Educ. Technol. Syst. 2021, 50, 73–86.
| Staff | | Students | |
| --- | --- | --- | --- |
| Name | Discipline | Name | Discipline |
| Participant 1 | Civil Engineering | Participant 6 | Electronic Engineering |
| Participant 2 | Electrical and Electronic Engineering | Participant 7 | Civil Engineering |
| Participant 3 | Information and Communication Engineering | Participant 8 | Agriculture and Plantation Engineering |
| Participant 4 | Chemistry and Nanotechnology | Participant 9 | Literature |
| Participant 5 | Communication skills, English, French | Participant 10 | Automobile Technology |
| | | Participant 11 (MSc. student) | Food Science and Technology |
| | | Participant 12 (MSc. student) | Biotechnology |
| | | Participant 13 (Ph.D. student) | Financial Engineering |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Gamage, K.A.A.; Pradeep, R.G.G.R.; de Silva, E.K. Rethinking Assessment: The Future of Examinations in Higher Education. Sustainability 2022, 14, 3552. https://doi.org/10.3390/su14063552