Article
Peer-Review Record

Measuring and Comparing High School Teachers’ and Undergraduate Students’ Knowledge of Complex Systems

Educ. Sci. 2024, 14(8), 837; https://doi.org/10.3390/educsci14080837
by Lin Xiang 1,*, Zitsi Mirakhur 2, Andrew Pilny 3 and Rebecca Krall 1
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3:
Submission received: 14 May 2024 / Revised: 6 July 2024 / Accepted: 29 July 2024 / Published: 1 August 2024
(This article belongs to the Section STEM Education)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

Excellent article and methodology.

Author Response

Comments and Suggestions for Authors

Excellent article and methodology.

Response: Thank you!

Reviewer 2 Report

Comments and Suggestions for Authors

The paper presents an exploratory study of knowledge of complex systems (CS). As the authors show, CSs are a curricularly relevant topic; however, there seems to be only a limited footprint of this in curriculum design. The authors test a questionnaire with a variety of participants and report their findings.

I find the paper very well written and structured, i.e. very readable and understandable.

Content-wise, I (as a science educator) have to confess that I myself would not have had the topic of CS on my radar. Thus, I feel it would be worthwhile to give additional information on curricular considerations and relevance in the Introduction. What would be possible benefits and applications of teaching CSs (besides the obvious reference, "climate change")?

The design and testing of the questionnaire follows the script "theoretical aspects of CS" -> aspect-specific item groups -> testing and evaluation very closely. This is good and shall not be criticized, but I am not sure whether I find it meaningful: the best-case scenario would be that we end up with a diagnostic tool enabling us to identify specific knowledge gaps of our students. However, even in a world totally neglecting the teaching of CSs, more than three quarters of the participants identify complex systems correctly (Figure 3), the knowledge differences between participant groups within certain aspects are negligible (Figure 2, Tables 4 and 5), and the knowledge level is generally (amazingly!) high. Here, I would appreciate a separate round of discussion: what is the point of using this tool, and how would we use it to improve CS teaching?

For the statistical description, I would put a discussion of item correlations on my wish list. E.g. (to challenge the authors, not as a serious proposal...), Figure 2 suggests that it would suffice to just ask about two aspects (e.g., "element" and "stochasticity") to get a valid placement of participants. Also, I feel that it would be important to show (and see) that the item correlation is indeed high (enough) within one aspect but significantly different for items of other aspects.
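As an illustration of the kind of check requested here, within-aspect versus cross-aspect item correlations could be computed along these lines. The Likert responses and item names below are hypothetical, and `numpy.corrcoef` stands in for the authors' actual SPSS/SmartPLS workflow:

```python
# Illustrative only: hypothetical 1-5 Likert responses from six respondents
# to two "element" items and one "stochasticity" item.
import numpy as np

element_1 = [4, 5, 3, 4, 5, 2]
element_2 = [4, 4, 3, 5, 5, 2]
stochasticity_1 = [2, 3, 5, 2, 1, 4]

# Correlation matrix across all three items; rows/columns follow input order.
r = np.corrcoef([element_1, element_2, stochasticity_1])
print(f"within-aspect r(e1, e2) = {r[0, 1]:.2f}")
print(f"cross-aspect  r(e1, s1) = {r[0, 2]:.2f}")
```

Evidence for the structure the reviewer asks about would be a clearly higher within-aspect correlation than cross-aspect correlation.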

Author Response

Comments and Suggestions for Authors

The paper presents an exploratory study of knowledge of complex systems (CS). As the authors show, CSs are a curricularly relevant topic; however, there seems to be only a limited footprint of this in curriculum design. The authors test a questionnaire with a variety of participants and report their findings.

I find the paper very well written and structured, i.e. very readable and understandable.

Response: Thank you!

Content-wise, I (as a science educator) have to confess that I myself would not have had the topic of CS on my radar. Thus, I feel it would be worthwhile to give additional information on curricular considerations and relevance in the Introduction. What would be possible benefits and applications of teaching CSs (besides the obvious reference, "climate change")? The design and testing of the questionnaire follows the script "theoretical aspects of CS" -> aspect-specific item groups -> testing and evaluation very closely. This is good and shall not be criticized, but I am not sure whether I find it meaningful: the best-case scenario would be that we end up with a diagnostic tool enabling us to identify specific knowledge gaps of our students. However, even in a world totally neglecting the teaching of CSs, more than three quarters of the participants identify complex systems correctly (Figure 3), the knowledge differences between participant groups within certain aspects are negligible (Figure 2, Tables 4 and 5), and the knowledge level is generally (amazingly!) high. Here, I would appreciate a separate round of discussion: what is the point of using this tool, and how would we use it to improve CS teaching?

Response: Thank you for the questions and suggestions. The current work has two aims. Our first aim was indeed to develop a diagnostic tool for CS knowledge because there are no existing survey instruments to quickly assess such knowledge in secondary and postsecondary science education. We hope educators (e.g., science teachers and science teacher educators) can use this tool to quickly assess the CS knowledge of their students (e.g., high school students, undergraduates, and science teachers in a professional development session) before teaching CSs. The second aim of the study was to evaluate high school teachers’ and undergraduates’ CS knowledge using the tool. We fully agree that CS is often under-taught in science classes. Therefore, it is important for educators to understand what knowledge learners have developed before formally learning about CSs. We suggest that the survey be used as a pre-assessment tool, and educators may use the assessment results as the foundation for designing meaningful CS learning experiences. We have added these clarifications in the discussion section on pages 15-17, highlighted in green.

For the statistical description, I would put a discussion of item correlations on my wish list. E.g. (to challenge the authors, not as a serious proposal...), Figure 2 suggests that it would suffice to just ask about two aspects (e.g., "element" and "stochasticity") to get a valid placement of participants. Also, I feel that it would be important to show (and see) that the item correlation is indeed high (enough) within one aspect but significantly different for items of other aspects.

Response: Thank you for pointing this out. We agree that we need a discussion of the path coefficients (i.e., correlations) among the survey constructs. Therefore, we have added a discussion of the significance and sizes of the path coefficients among the five survey constructs on page 15, highlighted in green.

In terms of your comments on Figure 2, we suspect that you meant Figure 1. You suggest that it would suffice to ask about just two aspects. We respectfully disagree. As we explained in the manuscript, complex systems are multifaceted. The five selected aspects are related AND distinct. Even though two constructs may exhibit a significant path coefficient, they capture very different features of CSs. For example, the element construct asks about the properties of the CS elements (e.g., Numerosity, Element Type, System Environment), whereas the micro-interaction construct asks about the elements’ behavioral features (e.g., Simultaneous, Multiple causalities, Continuous). The element construct is related to the micro-interaction construct, but the two ask respondents about very different features of CSs. Thus, including all five constructs in the survey enables us to assess respondents’ CS knowledge in a more holistic manner.

Reviewer 3 Report

Comments and Suggestions for Authors

Dear Author/s,

I have analyzed the various sections of your manuscript and evaluated the strengths and weaknesses. The detailed analysis is as follows:

1. Introduction

Strengths

  • Effective argumentation on why knowledge of complex systems is important. Current global issues, such as climate change, pandemics, and biodiversity loss, are referenced.
  • The inclusion of American educational standards (NGSS), which recognize complex systems as a key concept. This adds credibility and shows the connection of the topic to global educational norms.
  • Correctly identified gap in the current state of research regarding high school teachers, which justifies the need for this study.

Weaknesses

  • The statement "Many STEM practitioners and researchers deal with and study CSs" without appropriate citations is too general and can be considered a vague assertion. Such statements should be supported by sources.
  • The authors reference only American educational standards. The title suggests a general study of high school teachers' and undergraduate students' knowledge, not just in the American context. It might be worth expanding the title to include the location of the study, as the participant groups are also based in the United States.

 

2. Theoretical Framework

 

Strengths

  • The authors' statement, "Here, we take the position that not all systems with many constituents or more complicated internal processes are complex systems because it is important to distinguish between complicated systems and complex systems," shows the connection between the researchers' stance and their research question, specifically how participants understand this crucial distinction.
  • The assertion at the end of this section that problems may arise from the inability to differentiate between complicated and complex systems is significant. It underscores the importance of education in complex systems. This assertion is directly related to the research questions, particularly the question about the ability of teachers and students to distinguish between complicated and complex systems. The claim that an instrument to assess knowledge about complex systems is the starting point for designing educational experiences is well justified.

Weaknesses

  • Similar to the introduction, this section focuses primarily on American educational standards. Given that the title of the paper does not explicitly refer to the American context, introducing a broader geographical perspective could enhance the study's universal relevance; alternatively, an explanation of why this is not done could be provided.
  • The assertion could be more prominently highlighted to ensure that readers easily grasp its significance.

 

3. Students’ and Teachers’ Understanding of CSs

 

Strengths

  • A comprehensive literature review on the understanding of complex systems (CS) by students and teachers highlights the importance of the topic and demonstrates that the authors are well-versed in existing research.
  • The main issues faced by students and teachers in understanding CS are clearly identified, such as reductionist perspectives, lack of understanding of decentralization, unpredictability, and randomness in CS.

Weaknesses

  • The statement "Decades of studies have suggested that people often do not develop accurate CS knowledge from daily experiences" is too general and not supported by specific sources at the beginning of this section. It should be more precise.

 

4. The Development of Complex Systems Knowledge Survey (CSKS)

 

Strengths

  • The section is well-organized and provides a detailed description of the CSKS survey development process, explaining the steps of the research and the logic behind the selection of specific constructs.
  • The authors clearly justify the selection of the five constructs (elements, micro-interactions, decentralization, stochasticity, emergence) as crucial for understanding complex systems. They also highlight their importance in distinguishing between complicated and complex systems.
  • The authors reference a wide range of literature, which strengthens the credibility and theoretical solidity of the discussed concepts.

Weaknesses

  • The use of the general term "current literature" is too broad and imprecise. It is unclear which specific studies or articles the authors are referring to, leading to ambiguity. Consider starting with: "This study is guided by specific principles of complex systems theory, including the work of scholars such as [Author, Year], who emphasize the importance of elements, interactions, and emergent behaviors in complex systems."

 

5. Materials and Methods

 

Strengths

  • This section provides a detailed description of the demographics of the study participants, including information on gender, years of teaching experience, educational level of students, and racial affiliation. This helps to understand who the participants were and how diverse the study group was.
  • The justification for selecting lower-year students to examine their knowledge acquired at the high school level is particularly valuable.
  • The data collection process is well-described, including the use of the Qualtrics platform, sending invitation and reminder emails, and the survey closing period. This ensures transparency in the description of the study sample and the research process.
  • The empirical data analysis was conducted using statistical tools SPSS 29 and Smart-PLS 4.1, which are appropriate for survey data analysis and structural equation modeling. This indicates a professional approach to data analysis.
  • The process of survey validation is thoroughly described and includes the assessment of convergent validity, indicator collinearity, and indicator weights and loadings according to the guidelines of Hair et al. [46]. This demonstrates attention to the reliability and accuracy of the research tool.
  • Researchers removed items only when they had low and non-significant outer loadings (< 0.5) and were not essential for the related construct, which aligns with best practices in survey data analysis.
  • Internal reliability analysis was performed only for the 12 binary system identification questions, which is appropriate given their nature as reflective indicators.
  • The analysis of path coefficients and R² values allows for understanding the relationships between constructs, which is crucial for answering the research questions.
  • The scoring system for responses is clear and logical, and the results were converted to percentages for further statistical analysis, facilitating the interpretation of results.
  • The use of t-tests and ANOVA to compare results between respondent groups is appropriate and well-described. The significance level set at 0.05 is standard and widely accepted.
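The group comparisons praised in the last point can be sketched in code. This illustrative snippet uses hypothetical percentage scores and `scipy.stats` (the study itself used SPSS 29), with the same 0.05 significance level:

```python
# Illustrative only: hypothetical CS-knowledge percentage scores.
# The study used SPSS 29; this sketch shows equivalent tests in SciPy.
from scipy import stats

teachers = [72.5, 80.0, 65.0, 77.5, 85.0, 70.0, 82.5, 75.0]
undergrads = [60.0, 67.5, 55.0, 72.5, 62.5, 58.0, 70.0, 65.0]

# Independent-samples t-test (two-tailed), alpha = 0.05
t_stat, p_val = stats.ttest_ind(teachers, undergrads)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}, significant: {p_val < 0.05}")

# One-way ANOVA across more than two groups (e.g., years of study)
year1 = [55.0, 60.0, 58.0, 62.0]
year2 = [63.0, 65.0, 61.0, 67.0]
year3 = [66.0, 70.0, 68.0, 72.0]
f_stat, p_anova = stats.f_oneway(year1, year2, year3)
print(f"F = {f_stat:.2f}, p = {p_anova:.4f}")
```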

Weaknesses

  • At the beginning of the section, there is a general statement without specific sources: "Our target survey respondents were high school teachers and undergraduates." This statement is too general and lacks a description of the study area and its justification.
  • The use of non-probability methods to distribute the survey, based on known teacher networks, is criticized for potential biases in sample selection. There is a lack of detailed explanation of why this method was chosen and what its limitations might be.
  • It is worth noting that both among teachers and students, the group of non-STEM respondents is smaller. This could have influenced the study results, yet there is no explanation of this potential limitation.

 

6. Results

 

6.1. Instrument Validation

 

Strengths

  • This section provides a detailed description of the survey validation process, including the removal of ten Likert items due to low and insignificant outer loadings and unclear language. The example given regarding the ambiguity of time is well justified.
  • The analysis of path coefficients and R² values provides insight into the relationships between constructs and variables, which is crucial for understanding the study results.
  • The decision to retain an item despite its low outer loading when it reflects a key aspect of a construct is well justified and shows that the authors understand the theoretical importance of these items.

 

Weaknesses

  • The use of the Partial Least Squares (PLS) method for analyzing formative indicators is appropriate and in line with best practices. The detailed PLS analysis results are clearly presented in Table 3. It would be beneficial to discuss more thoroughly what the lack of significant path coefficients between the five constructs and the system identification variable means.
  • Figure 1 is clear and well-organized, making it easy to understand the relationships between different constructs. Each path is clearly labeled, and the values of the path coefficients and R² are prominently displayed. For example, a high R² value for "Emergence" (0.489) indicates a strong influence of other constructs on this variable. It should be explained that the R² value for "Complex system identification" (0.092) is low, suggesting that, in this case, the model does not explain the variability in the ability to identify complex systems well. Further analysis and potential model modification are needed to explain this value. Some path coefficients are very low or insignificant, such as the path between "Decentralization" and "Complex system identification" (0.007), indicating a lack of strong influence of this construct on the dependent variable. This should be mentioned. The diagram can serve as inspiration for analyzing alternative or new models. An alternative visualization could emphasize significant paths (e.g., bold lines for significant coefficients), helping to quickly identify key relationships.

 

6.2. Characteristics and Differences of Teachers’ and Undergraduate Students’ Knowledge of CSs

 

Strengths

  • This section provides a detailed comparison of results between teachers and students, allowing for an understanding of the differences in knowledge about complex systems (CS), including results related to elements, micro-interactions, decentralization, stochasticity, and emergence. The statement that teachers have higher scores than students is well justified.
  • The use of t-tests and ANOVA to analyze differences in results is appropriate and well described. The test results are clearly presented in tables, making interpretation easier.
  • Figure 2, which compares the results of STEM and non-STEM students across different years of study, is clear and helps in visualizing the results.

 

6.3. Teachers’ and Undergraduates’ Identification of CSs

 

Strengths

  • This section provides a detailed comparative analysis of the ability of teachers and students to identify complex systems, providing valuable information about the differences in perception between these groups.
  • The use of chi-square tests to analyze differences in responses between teachers and students is appropriate and well justified. The results of these tests are clearly presented and interpreted.
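For readers unfamiliar with the procedure mentioned in the last point, a chi-square comparison of binary identification responses can be sketched as follows. The contingency counts here are hypothetical, and `scipy.stats.chi2_contingency` stands in for the authors' actual tooling:

```python
# Illustrative only: hypothetical counts of responses to one
# system-identification item, split by respondent group.
from scipy.stats import chi2_contingency

#              identified as CS, not identified
contingency = [[85, 15],    # teachers
               [140, 60]]   # undergraduates

# Tests whether identification rates differ between the two groups.
chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```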

 

6.4. Complex System Examples from Teachers and Undergraduates

 

Strengths

  • In this section, the use of chi-square tests to analyze differences between teachers and students, as well as between different groups of teachers, is appropriate and well justified. The results of these tests are clearly presented and interpreted.

 

7. Discussion and Implications

 

Strengths

  • This section discusses the challenges related to developing and validating the Complex Systems (CS) knowledge survey. The authors highlight the difficulties in defining and measuring various aspects of CS. Examples such as the reference to the work of Hair et al. [49] and complexity science theory strengthen the argument.
  • The authors provide constructive conclusions and suggestions for future research, such as the need for further studies on the criteria used by respondents to identify CS and the inclusion of a Likert scale for system identification assessment.
  • The authors emphasize the importance of the study results for education, pointing out the need for precise definitions of CS in curricula and the necessity to design better educational experiences for teachers and students.
  • The authors indicate specific steps to be taken in future research to improve the research tool and enhance the understanding of CS. This demonstrates a thoughtful approach to the continued development of the instrument.

 

Best Regards

Reviewer

Author Response

Comments and Suggestions for Authors

Dear Author/s,

I have analyzed the various sections of your manuscript and evaluated the strengths and weaknesses. The detailed analysis is as follows:

1. Introduction

Strengths

  • Effective argumentation on why knowledge of complex systems is important. Current global issues, such as climate change, pandemics, and biodiversity loss, are referenced.
  • The inclusion of American educational standards (NGSS), which recognize complex systems as a key concept. This adds credibility and shows the connection of the topic to global educational norms.
  • Correctly identified gap in the current state of research regarding high school teachers, which justifies the need for this study.

Weaknesses

  • The statement "Many STEM practitioners and researchers deal with and study CSs" without appropriate citations is too general and can be considered a vague assertion. Such statements should be supported by sources.
  • Response: We have added sources to support the statement on page 1.
  • The authors reference only American educational standards. The title suggests a general study of high school teachers' and undergraduate students' knowledge, not just in the American context. It might be worth expanding the title to include the location of the study, as the participant groups are also based in the United States.
  • Response: Thanks for bringing up this valuable point. In the revised manuscript, we have specified the geographic location of the study in the abstract, introduction, and methods section. Meanwhile, we have made revisions on page 1 to specify that complex systems education is also stressed in other countries, such as Germany and the Netherlands.

2. Theoretical Framework

Strengths

  • The authors' statement, "Here, we take the position that not all systems with many constituents or more complicated internal processes are complex systems because it is important to distinguish between complicated systems and complex systems," shows the connection between the researchers' stance and their research question, specifically how participants understand this crucial distinction.
  • The assertion at the end of this section that problems may arise from the inability to differentiate between complicated and complex systems is significant. It underscores the importance of education in complex systems. This assertion is directly related to the research questions, particularly the question about the ability of teachers and students to distinguish between complicated and complex systems. The claim that an instrument to assess knowledge about complex systems is the starting point for designing educational experiences is well justified.

Weaknesses

  • Similar to the introduction, this section focuses primarily on American educational standards. Given that the title of the paper does not explicitly refer to the American context, introducing a broader geographical perspective could enhance the study's universal relevance, or providing an explanation of why this is not done.
  • Response: Thanks for bringing up this valuable point. In the revised manuscript, we have specified the geographic location of the study in the abstract, introduction, and methods section. We also added the context of Dori et al.’s study, which gathered data from systems scientists and practitioners in the International Council on Systems Engineering (INCOSE), an international organization.
  • The assertion could be more prominently highlighted to ensure that readers easily grasp its significance.
  • Response: Thanks. We have modified the assertion to highlight the significance of the instrument on page 5.

3. Students’ and Teachers’ Understanding of CSs

Strengths

  • A comprehensive literature review on the understanding of complex systems (CS) by students and teachers highlights the importance of the topic and demonstrates that the authors are well-versed in existing research.
  • The main issues faced by students and teachers in understanding CS are clearly identified, such as reductionist perspectives, lack of understanding of decentralization, unpredictability, and randomness in CS.

Weaknesses

  • The statement "Decades of studies have suggested that people often do not develop accurate CS knowledge from daily experiences" is too general and not supported by specific sources at the beginning of this section. It should be more precise.
  • Response: We fully agree that sources are needed to support a statement and believe this paragraph has been written in alignment with this style. The sentence mentioned in your comment serves as the topic sentence of the paragraph, and the rest of the paragraph provides the findings from the existing studies that support and elaborate on the ideas in the topic sentence.

4. The Development of Complex Systems Knowledge Survey (CSKS)

Strengths

  • The section is well-organized and provides a detailed description of the CSKS survey development process, explaining the steps of the research and the logic behind the selection of specific constructs.
  • The authors clearly justify the selection of the five constructs (elements, micro-interactions, decentralization, stochasticity, emergence) as crucial for understanding complex systems. They also highlight their importance in distinguishing between complicated and complex systems.
  • The authors reference a wide range of literature, which strengthens the credibility and theoretical solidity of the discussed concepts.

Weaknesses

  • The use of the general term "current literature" is too broad and imprecise. It is unclear which specific studies or articles the authors are referring to, leading to ambiguity. Consider starting with: "This study is guided by specific principles of complex systems theory, including the work of scholars such as [Author, Year], who emphasize the importance of elements, interactions, and emergent behaviors in complex systems."

Response: Given that complex systems are multifaceted, we decided to include specific studies when introducing each of the five survey constructs in the successive paragraphs of this section to help readers make connections easily. In this revised manuscript, we added two citations to support the “current literature” statement on page 6.

5. Materials and Methods

Strengths

  • This section provides a detailed description of the demographics of the study participants, including information on gender, years of teaching experience, educational level of students, and racial affiliation. This helps to understand who the participants were and how diverse the study group was.
  • The justification for selecting lower-year students to examine their knowledge acquired at the high school level is particularly valuable.
  • The data collection process is well-described, including the use of the Qualtrics platform, sending invitation and reminder emails, and the survey closing period. This ensures transparency in the description of the study sample and the research process.
  • The empirical data analysis was conducted using statistical tools SPSS 29 and Smart-PLS 4.1, which are appropriate for survey data analysis and structural equation modeling. This indicates a professional approach to data analysis.
  • The process of survey validation is thoroughly described and includes the assessment of convergent validity, indicator collinearity, and indicator weights and loadings according to the guidelines of Hair et al. [46]. This demonstrates attention to the reliability and accuracy of the research tool.
  • Researchers removed items only when they had low and non-significant outer loadings (< 0.5) and were not essential for the related construct, which aligns with best practices in survey data analysis.
  • Internal reliability analysis was performed only for the 12 binary system identification questions, which is appropriate given their nature as reflective indicators.
  • The analysis of path coefficients and R² values allows for understanding the relationships between constructs, which is crucial for answering the research questions.
  • The scoring system for responses is clear and logical, and the results were converted to percentages for further statistical analysis, facilitating the interpretation of results.
  • The use of t-tests and ANOVA to compare results between respondent groups is appropriate and well-described. The significance level set at 0.05 is standard and widely accepted.

Weaknesses

  • At the beginning of the section, there is a general statement without specific sources: "Our target survey respondents were high school teachers and undergraduates." This statement is too general and lacks a description of the study area and its justification.

Response: On page 9, we have adapted our language to explain how, in our selection of target respondents or teachers, we are being responsive to the limitations in the existing literature. In contrast, studying undergraduate students allows us to compare our results to existing studies.

  • The use of non-probability methods to distribute the survey, based on known teacher networks, is criticized for potential biases in sample selection. There is a lack of detailed explanation of why this method was chosen and what its limitations might be.

Response: Thank you for catching this oversight on our part. On page 9, we have included language that highlights why we chose non-probability methods and the ways in which this might limit the generalizability of our findings.

  • It is worth noting that both among teachers and students, the group of non-STEM respondents is smaller. This could have influenced the study results, yet there is no explanation of this potential limitation.

Response: We have added a disciplinary focus to our discussion of the disproportionality in the teacher samples on page 9 and page 20. In regard to undergraduates, we would like to point out that there is no significant difference between the STEM (49%) and non-STEM (51%) majors in the undergraduate student samples.

6. Results

6.1. Instrument Validation

Strengths

  • This section provides a detailed description of the survey validation process, including the removal of ten Likert items due to low and insignificant outer loadings and unclear language. The example given regarding the ambiguity of time is well justified.

  • The analysis of path coefficients and R² values provides insight into the relationships between constructs and variables, which is crucial for understanding the study results.
  • The decision to retain an item despite its low outer loading when it reflects a key aspect of a construct is well justified and shows that the authors understand the theoretical importance of these items.

Weaknesses

  • The use of the Partial Least Squares (PLS) method for analyzing formative indicators is appropriate and in line with best practices. The detailed PLS analysis results are clearly presented in Table 3. It would be beneficial to discuss more thoroughly what the lack of significant path coefficients between the five constructs and the system identification variable means.

Response: Agreed. We have added more interpretations on the non-significant path coefficients between the five constructs and the system identification variable in the discussion section on pages 17-18. It is highlighted in green because it was also requested by the other reviewer.

  • Figure 1 is clear and well-organized, making it easy to understand the relationships between different constructs. Each path is clearly labeled, and the values of the path coefficients and R² are prominently displayed. For example, a high R² value for "Emergence" (0.489) indicates a strong influence of other constructs on this variable. It should be explained that the R² value for "Complex system identification" (0.092) is low, suggesting that the model does not well explain the variability in the ability to identify complex systems in this case. Further analysis and potential model modification are needed to explain this value. Some path coefficients are very low or insignificant, such as the path between "Decentralization" and "Complex system identification" (0.007), indicating a lack of strong influence of this construct on the dependent variable. This should be mentioned. The diagram can serve as inspiration for analyzing alternative or new models. An alternative visualization could emphasize significant paths (e.g., bold lines for significant coefficients), helping to quickly identify key relationships.
Response: Thank you for the great suggestions. We have revised the paragraph in the results section to clearly report the path coefficients and R² on page 12, highlighted in yellow, and expanded the discussion of the path coefficients on pages 17-18. We have also improved Figure 1 based on your suggestions on page 13. The revised figure provides a clearer visualization of the key relationships among the model variables.
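As a point of reference for the R² discussion above: the study's model was estimated with PLS-SEM, which requires dedicated software, but the interpretation of R² itself is general. The sketch below uses synthetic data (all variable names and coefficients are invented for illustration) and ordinary least squares to show how R² separates an outcome that the predictors explain well from one they barely explain, mirroring the contrast between 0.489 and 0.092.

```python
# Illustration only (synthetic data, not the study's): how R-squared
# quantifies the share of an outcome's variance explained by predictors.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=(n, 2))  # two hypothetical predictor scores

# Outcome strongly driven by the predictors vs. one driven mostly by noise
y_strong = x @ np.array([0.6, 0.5]) + rng.normal(scale=0.7, size=n)
y_weak = x @ np.array([0.1, 0.05]) + rng.normal(scale=1.0, size=n)

def r_squared(X, y):
    """R² from an OLS fit: 1 - SS_residual / SS_total."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

print(f"strongly explained outcome: R² = {r_squared(x, y_strong):.3f}")
print(f"weakly explained outcome:   R² = {r_squared(x, y_weak):.3f}")
```

A low R², as for "Complex system identification" in Figure 1, means exactly this second situation: most of the outcome's variability lies outside the modeled predictors.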

6.2.  Characteristics and Differences of Teachers’ and Undergraduate Students’ Knowledge of CSs

Strengths

  • This section provides a detailed comparison of results between teachers and students, allowing for an understanding of the differences in their knowledge of complex systems (CS), including results related to elements, micro-interactions, decentralization, stochasticity, and emergence. The statement that teachers have higher scores than students is well justified.
  • The use of t-tests and ANOVA to analyze differences in results is appropriate and well described. The test results are clearly presented in tables, making interpretation easier.
  • Figure 2, which compares the results of STEM and non-STEM students across different years of study, is clear and helps in visualizing the results.
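The group comparisons the review praises can be reproduced in miniature. The sketch below uses made-up scores (group means, sizes, and spreads are all assumptions, not the study's data) to show the two tests named above: an independent-samples t-test for the teacher-undergraduate contrast and a one-way ANOVA across year-of-study groups, via `scipy.stats`.

```python
# Hedged sketch with synthetic scores, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
teachers = rng.normal(loc=7.5, scale=1.2, size=40)     # hypothetical CS scores
undergrads = rng.normal(loc=6.8, scale=1.4, size=120)

# Independent-samples comparison (Welch's t-test, no equal-variance assumption)
t_stat, t_p = stats.ttest_ind(teachers, undergrads, equal_var=False)
print(f"t = {t_stat:.2f}, p = {t_p:.4f}")

# One-way ANOVA across four hypothetical year-of-study groups
years = [rng.normal(loc=m, scale=1.3, size=30) for m in (6.5, 6.7, 6.9, 7.1)]
f_stat, f_p = stats.f_oneway(*years)
print(f"F = {f_stat:.2f}, p = {f_p:.4f}")
```

Welch's variant is used here as a conservative default when group variances may differ; the paper's exact test specification should be taken from its methods section.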

6.3.  Teachers’ and Undergraduates’ Identification of CSs

Strengths

  • This section provides a detailed comparative analysis of teachers' and students' ability to identify complex systems, offering valuable insight into the differences in perception between these groups.
  • The use of chi-square tests to analyze differences in responses between teachers and students is appropriate and well justified. The results of these tests are clearly presented and interpreted.
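For concreteness, a chi-square test of independence on identification responses looks like the following. The contingency counts are invented for illustration (they are not the paper's data); the point is only the shape of the analysis the review endorses.

```python
# Illustrative sketch only: chi-square test of independence on made-up
# counts of how two groups classified a scenario as a complex system.
import numpy as np
from scipy import stats

# Rows: teachers, undergraduates; columns: identified as CS, not identified
observed = np.array([[34, 6],
                     [88, 32]])

chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

With a 2x2 table the test has one degree of freedom; `chi2_contingency` also returns the expected counts under independence, which are worth checking (cells below 5 would argue for Fisher's exact test instead).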

6.4.  Complex System Examples from Teachers and Undergraduates

Strengths

  • In this section, the use of chi-square tests to analyze differences between teachers and students, as well as between different groups of teachers, is appropriate and well justified. The results of these tests are clearly presented and interpreted.

7.  Discussion and Implications

Strengths

  • This section discusses the challenges related to developing and validating the Complex Systems (CS) knowledge survey. The authors highlight the difficulties in defining and measuring various aspects of CS. Examples such as the reference to the work of Hair et al. [49] and complexity science theory strengthen the argument.
  • The authors provide constructive conclusions and suggestions for future research, such as the need for further studies on the criteria used by respondents to identify CS and the inclusion of a Likert scale for system identification assessment.
  • The authors emphasize the importance of the study results for education, pointing out the need for precise definitions of CS in curricula and the necessity to design better educational experiences for teachers and students.
  • The authors indicate specific steps to be taken in future research to improve the research tool and enhance the understanding of CS. This demonstrates a thoughtful approach to the continued development of the instrument.

Round 2

Reviewer 2 Report

Comments and Suggestions for Authors

Thank you for the new version; I find it a significant improvement. I look forward to hearing/reading more about this...
