Reality vs. Expectations of Assessment in STEM Education: An Exploratory Case Study of STEM Schools in Egypt
Abstract
1. Introduction
2. Literature Review
2.1. STEM Education
2.2. Assessment for or of Learning
2.3. Assessment in STEM
2.4. STEM Assessment Frameworks
2.5. Assessment Strategies in STEM
3. Methods
3.1. Purpose of the Research
3.2. Research Design
3.3. Context (Setting)
Goals
- 1: Students must demonstrate a deep understanding of the scientific, mathematical and social dimensions of Egypt’s grandest challenges as a country.
- 2: Students must demonstrate understanding of the content and ways of knowing that display scientific, mathematical and technological literacy and subject matter proficiency.
- 3: Students must exhibit self-motivation, self-direction and a hunger for continued learning.
- 4: Students must exhibit the ability to think independently, creatively and analytically.
- 5: Students must exhibit the ability to question, collaborate and communicate at a high level.
- 6: Students must demonstrate the capacity to become socially responsible leaders.
- 7: Students must be able to apply their understanding to advance creativity, innovation and invention with a real-world vision and a consciousness and eye toward a more contemporary Egypt.
- 8: Goals 1–7 must be implemented and viewed through the lens of a Digital Platform. Students must become fluid with technology to ensure that they maximize digital methods of data storage and communication [54], p. 45.
Assessment Structure | |
---|---|
Grades 10 and 11, according to ministerial decree number 382/2012 | Students are awarded a total score, calculated based on four different indicators. |
Grade 12, according to ministerial decree number 238/2013 | The secondary STEM certificate is limited to 3rd-year examinations; a dedicated committee is in charge of developing the final-year examinations. |
3.4. Participants
Total Number of Participants = 22 | |
---|---|
Number of teachers from MoE&TE STEM schools | 19 |
Number of teachers from other aspiring STEM schools | 3 |
Subjects | |
Science | 13 (biology, chemistry, physics) |
Technology | 1 (computer science) |
Engineering | 2 (leading STEAM projects) |
Arts | 1 (music) |
Mathematics | 2 |
Language arts | 3 (2 English and 1 homeroom KG teacher) |
Range of teaching experience in STEM | 2 to 7 years |
Overall teaching experience | 5 to 28 years |
Name * | School * | Subject | Experience in Teaching | Experience in STEM Teaching | Comments |
---|---|---|---|---|---|
Ebtisam | STEM School 1 | Biology | More than 20 years | 10 years | Capstone Coordinator |
Hadeer | STEM School 2 | Chemistry | 15 years | 5 years in 2 STEM schools | |
Omar | STEM School 3 | Computer Science | 18 years | 5 years | |
Dalila | STEM School 4 | English | 20 years | 3 years | |
Kareem | STEM School 5 | Biology | 10 years | 5 years | Capstone coordinator |
Mona | STEM School 6 | Science | 10 years | 1 year | |
4. Data Collection
4.1. Survey
4.2. Semi-Structured Interviews
5. Results
Data Analysis
6. Discussion
7. Limitations and Further Research
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
- What subject do you teach?
- How long have you been teaching in your (STEM) school?
- How long have you been teaching overall?
- Roughly, how many students are there in your school?
- What is your school mission and/or vision?
- How often do you include integrated STEM activities in your classes? Please list the STEM activities that you implement.
- What kind of assessment do you use in your (STEM) school at your disciplinary level: science, math, English, social studies?
- What kind of assessment do you use in your STEM school at the capstone/integrated project level, if any?
- Please share a sample learning outcome and a sample assessment you use to assess that learning outcome.
- Are you the person in charge of designing your own assessments? If not, who and why? Explain if this differs according to grade level.
- How do your assessments in the STEM school differ from those used in your previous non-STEM schools?
- How do your assessments for STEM differ from assessments using a disciplinary approach (e.g., science or mathematics)?
Appendix B
- In what ways do you implement STEM lesson plans in your classroom?
- What are your goals for students when using STEM in your classroom?
- What does assessment mean for you?
- How do you assess student learning in a STEM lesson or unit?
- Use follow-up questions if they do not address content learning, teamwork, engineering design
- Follow-up to find out if they use rubrics
- Follow-up to find out if they use STEM or engineering notebooks and how these are assessed
- In what ways is assessment in STEM different from assessment in science?
- In what ways is STEM assessed in standardized state or national testing?
- What is the most common type of assessment you use in your STEM setting?
- Follow-up question: Which do you believe is the most effective of these assessments in your STEM setting?
- What is the difference between these forms of assessment you use and those used in the mainstream non-STEM education systems?
Appendix C. Poster Level of Proficiency
Criteria | Distinguished | Accomplished | Developing | Pre-Novice |
---|---|---|---|---|
Abstract (Poster) 5% | Distinguished includes all of the “Accomplished” criteria and the following. ___Abstract clearly ties together the entire project from Grand Challenge to chosen solution, design requirements and prototype, testing results and conclusions. ___Abstract alone generates excitement and desire to learn more about the topic. ___Writing is professional, organized and well developed. | ___Abstract is a brief description of the entire work described in the poster. ___Abstract is understandable without reading the entire poster. ___Includes (1) purpose of the study, (2) brief statement of what was done (without including minor details of the methods), (3) brief statement of major findings, and (4) major conclusions. ___Writing is clear and readable. | ___Abstract present and relatively complete but not prepared according to all guidelines. | ___No abstract. |
Introduction (Poster) 20% | Distinguished includes all of the “Accomplished” criteria and the following. ___Moves clearly from a broad view of the Grand Challenge and research on various solutions to an increasingly narrow focus on the team’s chosen solution, design requirements, and prototype, justifying each choice. ___Provides a smooth transition from “what” choices they made and “why” they made them to the upcoming Materials/Methods section (the “how” section of the poster). | ___Connection is made to Egypt’s Grand Challenges. ___Clearly and objectively identifies the problem and summarizes prior solution attempts’ strengths and weaknesses. ___Includes design requirements for a new solution that can be tested. ___Summarizes how the team’s solution was chosen and how it addresses design requirements. | ___Introduction present and relatively complete but does not address all points indicated. | ___No introduction. |
Materials and Methods (Poster) 10% | Distinguished includes all of the “Accomplished” criteria and the following. ___The methods are clear enough to permit a reader to explain the method to another professional and be able to replicate the method. | ___A summary of the test plan for the prototype includes a summary of tests conducted and how they address design requirements. ___Materials lists and/or illustrations are summarized. | ___Test Plan, Methods and Materials lists present and relatively complete but do not address all points indicated. | ___No Methods or no Materials list. |
Results (Poster) 15% | Distinguished includes all of the “Accomplished” criteria and the following. ___Supporting documentation (Capstone Portfolio) contains all data collected and is so well organized it could be handed to a new team to replicate the work with high fidelity. ___The visual representation of the results alone (without the words in the Results section) leads the reader to a conclusion about the results. | ___All types of results are presented, whether positive or negative. ___The Capstone Portfolio is available to show data for tests or scenarios that were conducted. ___Includes a table or figure that is appropriate for the type of results being described. | ___Results present and relatively complete but does not address all points indicated. | ___No results. |
Discussion (Poster) 35% | Distinguished includes all of the “Accomplished” criteria and the following. ___Conclusions are drawn from the test results and then compared with the team’s research on other solutions. ___Recommendations are practical and directed towards a future research, engineering or policy group. Recommendations are clearly informed by the problem, their proposed solution, and their findings. ___Students can articulate specific evidence of learning transfer from two or more of their content classes (learning outcomes) this semester into their Capstone. | ___Discussion ties performance results to the original question being addressed and to the Grand Challenge. ___Proposed solution is supported with robust STEM principles and demonstrates applied learning transfer. ___Analysis is supported by pictures, graphs, charts and other visuals, and test results. ___Recommendations for future study are provided, including specific ways the project could be improved in the future. ___Writing is clear, organized and well developed. It explains, questions or persuades. It is written to meet the needs of the intended audience, and it uses forms that are common among STEM disciplines (e.g., notes, descriptive/narrative accounts, research reports). | ___Discussion present and relatively complete but does not address all points indicated. | ___No discussion. |
Literature Cited (Poster) 5% | Distinguished includes all of the “Accomplished” criteria and the following: ___At least five citations are peer-reviewed publications. | ___Includes only sources cited in the poster text (at least 5 sources). ___Includes only papers actually read by the students. ___Is prepared according to the American Psychological Association (APA) style guidelines. | ___Literature cited present and relatively complete but does not address all points indicated. | ___No literature cited (appropriate only if no citations used in the text). |
(Poster) Title, Name, Affiliation, Size, Layout, Graphics, Tables, Photos, Other Images 10% | Distinguished includes all of the “Accomplished” criteria and the following. ___The poster demonstrates brevity (focused, well synthesized and straight to the point) while targeting a professional audience. ___Visuals are well titled and labeled and tell clear stories with no other supporting text necessary. | ___Text is readable from a distance of about 1 m. ___Title is at top of the poster, short, descriptive of the project and easily readable at a distance of about 2 m (words about 1.5–2.5 cm tall). ___Includes presenter’s name and school’s name in a section about 20–30% smaller than the title. ___Illustrations, tables, figures, photographs or diagrams have unique identification numbers and a key to identify symbols. ___Text includes references to specific graphics or pictures. ___Legends include full explanation and, where appropriate, color keys, scale, etc. ___All images presented in appropriate layout and size relative to text. | ___Elements are present but not meeting all of the requirements in the “Accomplished” column, or some elements are missing. | ___No graphics, tables, photos or other images (the poster should include at least some images). |
Prototype Level of Proficiency | | | | |
Criteria | Distinguished | Accomplished | Developing | Pre-Novice |
Construction of a testable prototype (Prototype) 50% | Distinguished includes all of the “Accomplished” criteria and the following. ___Prototype is directly relevant to the chosen solution (e.g., an actual water treatment step). ___Students can demonstrate the functionality of their prototype or show visual proof of functionality in a different environment (e.g., lab). ___If a software prototype (simulation) is used, modeling software such as LabView, Excel, or a programming language is demonstrated and can be tested. | ___Students can describe design requirements chosen for this prototype. Choice of design requirements is logical and well-reasoned. ___A prototype has been constructed that was suitable for testing. ___If a software prototype (simulation) was used, the selection of modeling methods is logical and justified and students can identify the functions or relationships contained in their software prototype. | ___A prototype or model has been constructed, some justification for the selection of design requirements and modeling approach is provided, but it is incomplete. | ___No prototype or model; or no evidence that constructed prototype or model would facilitate test of any of the design requirements. |
Capstone Portfolio: Prototype Testing and Data Collection Plan (Prototype) 20% | Distinguished includes all of the “Accomplished” criteria and the following. ___Capstone Portfolio contains a test plan that clearly connects every type of test listed to a specific design requirement, and all chosen design requirements are addressed by the test plan. ___Capstone Portfolio clearly communicates how each test in the test plan isolates what is being tested and acknowledges other interfering factors that might affect the results. ___Capstone Portfolio contains measurement methods that have quantified error described explicitly (e.g., +/−0.1 Volts). ___Capstone Portfolio indicates that the total materials expenditures are within budget (evaluated by administration). | ___Capstone Portfolio contains a test plan for the prototype which provides the scenarios to be tested and how they relate to design requirements. ___Capstone Portfolio contains a test plan that describes tests to be conducted in a thorough and clearly understandable manner. ___Capstone Portfolio test plan supports repetition and testing by others. ___Capstone Portfolio indicates that the total materials expenditures are below 1.5 times the budget (evaluated by administration). | ___Testing plan exists and partially describes the testing to be conducted; limited justification of why the tests were selected. | ___Testing plan is missing altogether, or fails to demonstrate any understanding of why tests relate to the design requirements. |
Capstone Portfolio: Testing, data collection and analysis (Prototype) 30% | Distinguished includes all of the “Accomplished” criteria and the following. ___The Materials list is clear enough that a reader can replicate the work precisely. ___Capstone Portfolio accurately records all data collected through the project phases. ___Capstone Portfolio is sufficiently well organized that it can be turned over to another group to replicate the prototype and tests. ___Capstone Portfolio lists 10 learning outcomes from their other subjects that they have transferred and applied in their capstone. Each learning outcome must have one paragraph clearly explaining how this learning outcome was transferred to their Capstone project. | ___Capstone Portfolio contains a material list with cost (includes receipts if purchased) and/or illustrations, needed to replicate the prototype or model. ___Capstone Portfolio demonstrates whether the prototype met design requirements with data from each portion of the test procedure. ___Capstone Portfolio contains analysis supported by graphs, charts and/or other visuals. ___Capstone Portfolio lists 5 learning outcomes from their other subjects that they have transferred and applied in their capstone. Each learning outcome should have one paragraph explaining how this learning outcome was transferred to their capstone. | ___Documentation is provided for some tests that were conducted, but some are not described. ___Analysis of the effectiveness of the design is generally described and has limited support using pictures, graphs, charts and other visuals. | ___No documentation presented for test results, or results presented are not tied to the testing plan in any logical way. |
References
- Freeman, B.; Marginson, S.; Tytler, R. The Age of STEM: Educational Policy and Practice across the World in Science, Technology, Engineering and Mathematics, 1st ed.; Taylor & Francis: London, UK, 2014; p. 303. [Google Scholar] [CrossRef]
- Cunningham, C. Engineering in Elementary STEM Education: Curriculum Design, Instruction, Learning, and Assessment; Teachers College Press: New York, NY, USA, 2018; ISBN 13:978-0807758779. [Google Scholar]
- Faxon-Mills, S.; Hamilton, L.S.; Rudnick, M.; Stecher, B.M. New Assessments, Better Instruction? Designing Assessment Systems to Promote Instructional Improvement; RAND Corporation: Santa Monica, CA, USA, 2013. [Google Scholar]
- Kinash, S.; Knight, D. Assessment @ Bond; Office of Learning and Teaching, Bond University: Gold Coast, QLD, Australia, 2013; ISBN 9781922183118. [Google Scholar]
- Roller, S.A.; Cunningham, E.P.; Marin, K.A. Photographs and Learning Progressions. YC Young Child. 2019, 74, 26–33. [Google Scholar]
- Sato, M.; Wei, R.C.; Darling-Hammond, L. Improving teachers’ assessment practices through professional development: The case of national board certification. Am. Educ. Res. J. 2008, 45, 669–700. [Google Scholar] [CrossRef] [Green Version]
- Baird, J.; Andrich, D.; Hopfenbeck, T.N.; Stobart, G. Assessment and learning: Fields apart? Assess. Educ. Princ. Policy Pract. 2017, 24, 317–350. [Google Scholar]
- Johnson, C.C.; Moore, T.J.; Utley, J.; Breiner, J.; Burton, S.R.; Peters-Burton, E.E.; Walton, J.B. The STEM road map for grades 6–8. In STEM Road Map 2.0; Routledge: London, UK, 2021; pp. 102–132. [Google Scholar]
- National Research Council. STEM Integration in K-12 Education: Status, Prospects, and an Agenda for Research; National Academies Press: Washington, DC, USA, 2014. [Google Scholar]
- Biggs, J. Teaching for Quality Learning at University—What the Student Does, 2nd ed.; Open University Press: London, UK, 2003. [Google Scholar]
- Biggs, J.B.; Tang, C.K. Teaching for Quality Learning at University: What the Student Does, 4th ed.; Open University Press: London, UK, 2011. [Google Scholar]
- Martone, A.; Sireci, S. Evaluating Alignment between Curriculum, Assessment, and Instruction. Rev. Educ. Res. 2009, 79, 1332–1361. Available online: http://rer.aera.net (accessed on 1 August 2022). [CrossRef] [Green Version]
- McMahon, T. Achieving constructive alignment: Putting outcomes first. Aukštojo Moksl. Kokyb. 2006, 3, 10–19. [Google Scholar]
- Borrego, M. Constructive alignment of interdisciplinary graduate curriculum in engineering and science: An analysis of successful IGERT proposals. J. Eng. Educ. 2010, 99, 355–369. [Google Scholar] [CrossRef]
- Chase, A. A report of the STEM education track of the 2017 Assessment Institute. Assess. Update 2018, 30, 6–7. [Google Scholar] [CrossRef]
- El Nagdi, M.; Roehrig, G.H. Gender equity in STEM education: The case of an Egyptian girls’ school. In Theorizing STEM Education in the 21st Century; IntechOpen Publications: London, UK, 2019; pp. 315–317. Available online: https://www.intechopen.com/chapters/67951 (accessed on 1 April 2022).
- National Research Council. Monitoring Progress toward Successful K-12 STEM Education: A Nation Advancing? National Academies Press: Washington, DC, USA, 2013. [Google Scholar]
- Perignat, E.; Katz-Buonincontro, J. STEAM in practice and research: An integrative literature review. Think. Ski. Creat. 2019, 31, 31–43. [Google Scholar] [CrossRef]
- Bybee, R.W. Advancing STEM Education: A 2020 Vision. Technol. Eng. Teach. 2010, 70, 30. [Google Scholar]
- Bybee, R.W. The Case for STEM Education: Challenges and Opportunities; NSTA Press: Arlington, VA, USA, 2013. [Google Scholar]
- Zollman, A. Learning for STEM literacy: STEM literacy for learning. Sch. Sci. Math. 2012, 112, 12–19. [Google Scholar] [CrossRef]
- Roehrig, G.H.; Dare, E.A.; Ellis, J.A.; Ring-Whalen, E. Beyond the basics: A detailed conceptual framework of integrated STEM. Discip. Interdiscip. Sci. Educ. Res. 2021, 3, 1–18. [Google Scholar] [CrossRef]
- Goldstein, H. A response to assessment and learning: Fields apart? Assess. Educ. Princ. Policy Pract. 2017, 24, 388–393. [Google Scholar] [CrossRef]
- Ring, E.A.; Dare, E.A.; Crotty, E.A.; Roehrig, G.H. The evolution of teacher conceptions of STEM education throughout an intensive professional development experience. J. Sci. Teach. Educ. 2017, 28, 444–467. [Google Scholar] [CrossRef]
- Breiner, J.M.; Harkness, S.S.; Johnson, C.C.; Koehler, C.M. What is STEM? A discussion about conceptions of STEM in education and partnerships. Sch. Sci. Math. 2012, 112, 3–11. [Google Scholar] [CrossRef]
- Herschbach, D.R. The STEM initiative: Constraints and challenges. J. STEM Teach. Educ. 2011, 48, 96–122. [Google Scholar] [CrossRef] [Green Version]
- El Nagdi, M.; Roehrig, G. Identity evolution of STEM teachers in Egyptian STEM schools in a time of transition: A case study. Int. J. STEM Educ. 2020, 7, 1–16. [Google Scholar] [CrossRef]
- Roehrig, G.; El-Deghaidy, H.; García-Holgado, A.; Kansan, D. A closer look to STEM education across continents: Insights from a multicultural panel discussion. In Proceedings of the 2022 IEEE Global Engineering Education Conference (EDUCON), Tunis, Tunisia, 28–31 March 2022; pp. 1873–1880. [Google Scholar]
- El-Deghaidy, H. STEAM Methods. In Designing and Teaching the Secondary Science Methods Course; Brill: Leiden, The Netherlands, 2017; pp. 71–87. [Google Scholar]
- Stiggins, R. From formative assessment to assessment for learning: A path to success in standards-based schools. Phi Delta Kappan 2005, 87, 324–328. [Google Scholar] [CrossRef] [Green Version]
- Westbroek, H.B.; van Rens, L.; van den Berg, E.; Janssen, F. A practical approach to assessment for learning and differentiated instruction. Int. J. Sci. Educ. 2020, 42, 955–976. [Google Scholar] [CrossRef] [Green Version]
- Palm, T. Performance assessment and authentic assessment: A conceptual analysis of the literature. Pract. Assess. Res. Eval. 2008, 13, 4. Available online: https://scholarworks.umass.edu/pare/vol13/iss1/4 (accessed on 15 August 2021). [CrossRef]
- Villarroel, V.; Bloxham, S.; Bruna, D.; Bruna, C.; Herrera-Seda, C. Authentic assessment: Creating a blueprint for course design. Assess. Eval. High. Educ. 2018, 43, 840–854. [Google Scholar] [CrossRef] [Green Version]
- Wiggins, G. The case for authentic assessment. Pract. Assess. Res. Eval. 1990, 2, 1–3. [Google Scholar]
- Tan, A.L.; Leong, W.F. Mapping Curriculum Innovation in STEM Schools to Assessment Requirements: Tensions and Dilemmas. Theory Pract. 2014, 53, 11–17. [Google Scholar] [CrossRef]
- Gao, X.; Peishan, L.; Shen, J.; Sun, H. Reviewing assessment of student learning in interdisciplinary STEM education. Int. J. STEM Educ. 2020, 7, 1–14. [Google Scholar] [CrossRef]
- Harwell, M.; Moreno, M.; Phillips, A.; Guzey, S.S.; Moore, T.J.; Roehrig, G.H. A Study of STEM Assessments in Engineering, Science, and Mathematics for Elementary and Middle School Students. Sch. Sci. Math. 2015, 115, 66–74. [Google Scholar] [CrossRef]
- Hansen, M.; Gonzalez, T. Investigating the relationship between STEM learning principles and student achievement in math and science. Am. J. Educ. 2014, 120, 139–171. [Google Scholar] [CrossRef]
- Ing, M. Can parents influence children’s mathematics achievement and persistence in STEM careers? J. Career Dev. 2014, 41, 87–103. [Google Scholar] [CrossRef]
- Seage, S.J.; Türegün, M. The Effects of Blended Learning on STEM Achievement of Elementary School Students. Int. J. Res. Educ. Sci. 2020, 6, 133–140. [Google Scholar] [CrossRef]
- Van der Vleuten, C.P. Assessment in the context of problem-based learning. Adv. Health Sci. Educ. Theory Pract. 2019, 24, 903–914. [Google Scholar] [CrossRef] [Green Version]
- Bicer, A.; Capraro, R.M.; Capraro, M.M. Integrated STEM assessment model. Eurasia J. Math. Sci. Technol. Educ. 2017, 13, 3959–3968. [Google Scholar] [CrossRef]
- Moore, T.J.; Stohlmann, M.S.; Wang, H.H.; Tank, K.M.; Glancy, A.W.; Roehrig, G.H. Implementation and integration of engineering in K-12 STEM education. In Engineering in Pre-College Settings: Synthesizing Research, Policy, and Practices; Purdue University Press: West Lafayette, IN, USA, 2014; pp. 35–60. [Google Scholar]
- Arikan, S.; Erktin, E.; Pesen, M. Development and validation of a STEM competencies assessment framework. Int. J. Sci. Math. Educ. 2020, 20, 1–24. [Google Scholar] [CrossRef]
- Pellegrino, J.W. Proficiency in science: Assessment challenges and opportunities. Science 2013, 340, 320–323. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Septiani, A.; Rustaman, N.Y. Implementation of performance assessment in STEM (Science, Technology, Engineering, Mathematics) education to detect science process skill. In Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2017; Volume 812, p. 012052. [Google Scholar]
- Yin, R.K. Case Study Research: Design and Methods, 5th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
- Creswell, J. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 4th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
- Greene, J.C.; Caracelli, V.J.; Graham, W.F. Toward a conceptual framework for mixed-method evaluation designs. Educ. Eval. Policy Anal. 1989, 11, 255–274. [Google Scholar] [CrossRef]
- U.S. Agency for International Development (USAID). Egypt STEM School Project (ESSP) FINAL REPORT. ESSP Final Report, USAID/Egypt Cooperative Agreement No. AID 263-A-12-00005. 2017. Available online: https://pdf.usaid.gov/pdf_docs/PA00THDB.pdf (accessed on 15 August 2021).
- U.S. Agency for International Development (USAID). Support for STEM Secondary Education: Egypt. U.S. Agency for International Development. 2020. Available online: https://www.usaid.gov/egypt/documents/support-stem-secondary-education (accessed on 23 April 2022).
- Ministerial Decree 382. Rules of Students’ Admission, Study and Assessment for STEM High School. 2012. Available online: http://moe.gov.eg/stem/12-382.pdf (accessed on 15 August 2021).
- Ministerial Decree 238. System of Examination and Certificate of Completion Awarded to STEM High School Student. 2013. Available online: https://manshurat.org/node/2620 (accessed on 15 August 2021).
- Rissmann-Joyce, S.; El Nagdi, M. A case study: Egypt’s first STEM schools: Lessons learned. In Proceedings of the Global Summit on Education (GSE2013), Kuala Lumpur, Malaysia, 11–12 March 2013. [Google Scholar]
- Saldaña, J. The Coding Manual for Qualitative Researchers, 2nd ed.; SAGE: Newcastle upon Tyne, UK, 2013. [Google Scholar]
- Wiggins, G.P.; McTighe, J. Understanding by Design; ASCD: Alexandria, VA, USA, 2005. [Google Scholar]
Classroom Assessment Tools | Type |
---|---|
Brainstorming | Formative |
Presentations | Formative |
Class discussions | Formative |
Quizzes | Summative at the end of a learning unit
Inquiry based assessment | Formative/summative |
Think pair share | Formative |
Mini project-based assessment | Formative |
Individual and group hands-on activities | Formative
KWL (what you Know–what you Want to know–what you have Learned) | Formative |
Gallery walks | Formative |
Exit ticket | Formative/summative
Lab investigation | Formative/summative
Jigsaw activities | Formative |
Students leading class | Formative |
Short written exercises | Formative/summative |
Students debate | Formative |
Reaction discussion/paper to a video | Formative |
Frayer model | Formative/summative
Games | Formative |
Songs | Formative |
Final exams | |
---|---|
At the disciplinary level | At the integrated level |
University readiness test (URT) (evaluated centrally); Concept inventory test (CT) (evaluated centrally) | Capstone project (evaluated by external judges) |
Breakdown of the Different Assessment Modes in the Capstone Projects | |||
---|---|---|---|
Formative and Ongoing | Weight | Summative | Weight |
Journaling (include reflection questions on the process and transfer questions) | 40% | Poster evaluation | 40% |
Portfolio | 10% | Prototype evaluation | 10% |
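Taken together, the tables above fully specify a weighted capstone grade: the poster and prototype rubrics each distribute 100% across their criteria, and the four assessment modes combine at 40/10/40/10. The arithmetic can be sketched as follows; the weights come directly from the tables, but the numeric 0–1 mapping of rubric levels (Pre-Novice through Distinguished) is a hypothetical assumption for illustration, not part of the published scheme.

```python
# Weights below are taken from the paper's tables; the LEVEL_SCORES mapping
# of rubric levels to numbers is a hypothetical assumption for illustration.

POSTER_WEIGHTS = {  # Appendix C poster rubric criteria (weights sum to 1.0)
    "abstract": 0.05, "introduction": 0.20, "materials_methods": 0.10,
    "results": 0.15, "discussion": 0.35, "literature_cited": 0.05,
    "layout_graphics": 0.10,
}

CAPSTONE_WEIGHTS = {  # breakdown of the capstone assessment modes
    "journaling": 0.40, "portfolio": 0.10, "poster": 0.40, "prototype": 0.10,
}

LEVEL_SCORES = {  # hypothetical mapping of rubric levels onto [0, 1]
    "pre-novice": 0.0, "developing": 1 / 3,
    "accomplished": 2 / 3, "distinguished": 1.0,
}


def poster_score(levels: dict) -> float:
    """Weighted poster score in [0, 1] from per-criterion rubric levels."""
    return sum(POSTER_WEIGHTS[c] * LEVEL_SCORES[lvl] for c, lvl in levels.items())


def capstone_total(journaling: float, portfolio: float,
                   poster: float, prototype: float) -> float:
    """Overall capstone score in [0, 1] from the four assessment modes."""
    parts = {"journaling": journaling, "portfolio": portfolio,
             "poster": poster, "prototype": prototype}
    return sum(CAPSTONE_WEIGHTS[m] * s for m, s in parts.items())
```

For example, under this assumed mapping a poster rated “Accomplished” on every criterion scores 2/3, which then enters the overall total at the 40% poster weight.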
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
El Nagdi, M.A.; Roehrig, G.H. Reality vs. Expectations of Assessment in STEM Education: An Exploratory Case Study of STEM Schools in Egypt. Educ. Sci. 2022, 12, 762. https://doi.org/10.3390/educsci12110762