Article

Lab-Tec@Home: Technological Innovation in Control Engineering Education with Impact on Complex Thinking Competency

by
David Sotelo
1,
José Carlos Vázquez-Parra
2,
Marco Cruz-Sandoval
3 and
Carlos Sotelo
1,*
1
School of Engineering and Sciences, Tecnologico de Monterrey, Monterrey 64849, Mexico
2
Institute for the Future of Education, Tecnologico de Monterrey, Guadalajara 45138, Mexico
3
Center for the Future of Cities, Tecnologico de Monterrey, Monterrey 64849, Mexico
*
Author to whom correspondence should be addressed.
Sustainability 2023, 15(9), 7598; https://doi.org/10.3390/su15097598
Submission received: 17 March 2023 / Revised: 14 April 2023 / Accepted: 27 April 2023 / Published: 5 May 2023
(This article belongs to the Special Issue Technology-Enhanced Science Learning)

Abstract
The objective of this paper is to present the results of the implementation of the Lab-Tec@Home kit, an educational innovation that allows students to set up home laboratories to test and validate basic control engineering concepts. Specifically, this study measures the acquisition and development of students’ perceived achievement of complex thinking competency during this educational practice, considering the value of improving their perception of additional skills while fulfilling the objectives of the innovation. We applied a validated instrument to measure this competency before and after using the Lab-Tec@Home kit, as well as in a control group where this educational innovation was not carried out. Although the results did not show a statistically significant difference between the groups in the level of perception or the improvement achieved, they indicated an impact on the homogeneity of the students’ perceived skills, which is the greatest finding and contribution of this educational tool.

1. Introduction

Undoubtedly, the COVID-19 pandemic challenged public health and the continuity of the economy, industry, and people’s daily lives. Education and academia were also strongly impacted by the isolation measures taken at the height of this health crisis, forcing institutions to innovate or halt their training processes. While theoretical classes found excellent support in various video conferencing tools, practical subjects required innovative proposals, the Lab-Tec@Home kit being one example [1].
Although Lab-Tec@Home was designed as an educational innovation proposed as a response to the restrictions of the pandemic, it should not be discarded post-isolation because of its convenience as a distance learning practice and its value in strengthening and developing skills in control engineering courses. Thus, this article presents the results of a study conducted during the implementation of Lab-Tec@Home to complement a post-pandemic face-to-face class. The intent was to measure its impact on students’ perceived achievement of the complex thinking competency and sub-competencies. The objective was to argue the validity of Lab-Tec@Home as an educational innovation that facilitates engineering processes and improves students’ perception of additional competencies valuable for their professional future.
The hypothesis motivating this study is that Lab-Tec@Home is as valid an option for developing the complex thinking competency as an educational practice in a traditional laboratory.
To achieve this, this article is structured as follows. First, the educational practice is outlined theoretically, considering the elements that make up Lab-Tec@Home and what the students concretely did. Then, the relevance of complex thinking as a lifelong skill is explored. Subsequently, the study presents the methodology, the main results, and a discussion relating them to previous studies.

2. Theoretical Framework

2.1. Background of Lab-Tec@Home

During the COVID-19 pandemic, educational institutions worldwide shut down, affecting more than 60% of students and causing massive disruption to educational systems [2]. Hence, massive online courses were implemented to comply with social distancing measures. However, at the end of the semester, students felt that the learning experience was less effective [3], especially for courses with practical content, such as those required in engineering education. Therefore, based on the experience of universities during the pandemic, it became necessary to develop effective online learning courses with the following attributes (Figure 1):
(a)
Effective learning: This is strongly needed due to students’ perception of virtual work overload. Learning time should be optimized, and appropriate learning activities for practical tasks should be carried out to ensure that engineering students do not lack technical knowledge.
(b)
Effective teaching: This depends directly on teacher training and involvement. Teachers must develop strategies for students to help them counteract the feeling of loss of status as a student.
(c)
Challenge: For superior learning outcomes, the courses must be highly engaging and inspiring. This is achieved when theoretical concepts are applied in real-life contexts.
(d)
Continuity of guidance: Students expect professors to align with the online model. They expect synchronous contact with the professor, informative responses to their questions, and active and interactive online sessions.
(e)
High-level courses: Although education was forced to shift to distance learning, students should receive the same educational quality as face-to-face courses.
(f)
Technological infrastructure: This issue relates to the virtual strategies implemented and the platform used by the educational institution to conduct online courses. Students expect teachers to understand that not everyone has the same technical resources, such as adequate internet bandwidth.
Developed as a novel technological solution to improve the hands-on experience of teaching control engineering during COVID-19, the Lab-Tec@Home project is oriented toward Education 4.0 [4].
Lab-Tec@Home is a laboratory kit designed as an alternative to conventional laboratories, created to overcome the restrictions of social distancing during the pandemic; with the kit, users can develop the competencies and skills required in control engineering. The main aim of the kit was not to create an exact replica of a real greenhouse but rather to assemble a model of a tomato greenhouse that students could use to recognize the components and factors in a feedback control system [1].
To ensure that students comprehend engineering topics through hands-on experience, we considered the relevance of the learning modality provided, seeking to use the Lab-Tec@Home experience as efficiently as possible. Thus, depending on the stage of the pandemic, three different teaching-learning formats were used to deliver the theoretical and practical contents flexibly and accessibly:
  • Online distance learning: During the outbreak of the COVID-19 pandemic, students worked from their places of origin [5]. Therefore, Lab-Tec@Home kits were shipped to students regardless of their location in Mexico so that distance learning with real-time interaction could be carried out; this favored effective student participation [6].
  • Hybrid learning: To respect social distancing while avoiding exclusively massive online learning, students mixed traditional and online classes. Thus, using the Lab-Tec@Home kit, half of the group worked in their own homes while the others performed their activities face to face.
  • Face-to-face learning: In this case, the students attended the course traditionally, supported by information and communication technologies. The Lab-Tec@Home kit allowed students to enhance the active learning experience, avoiding the uneven participation common in a conventional laboratory, where there is usually only one workstation per team.
In this way, although education was forced to shift among learning modalities [7], students using the Lab-Tec@Home kit received the same educational quality regardless of the course delivery modality [8].

2.2. About Lab-Tec@Home

The Lab-Tec@Home kit consists of two training packages, the first associated with continuous control learning and the second with control logic.
In Figure 2, the ‘Research and Roadmap Using Labkit’ is presented. Throughout the control engineering course, students take different theoretical classes and complete learning activities. Then, a pre-laboratory assignment is carried out to allow students to identify the components of the lab kit and have the prototype set up for the assessment. During the solution proposal, learners begin the design cycle, in which programming, component connection, and data collection are performed to reach the main objective of the assignment. Finally, in the results and feedback stage, the professor identifies areas of opportunity in the learning process based on the control strategy proposed by the students.

2.2.1. Continuous Control Kit

Proportional-integral-derivative (PID) control is one of the essential topics in control theory [1]. The real-world context of the lab kit concerns supplementary light-emitting diodes (LEDs) in greenhouses. The kit lets students recognize the elements of a closed control loop (Figure 3 and Table 1).
Ten- and twenty-centimeter jumpers connect the components on a breadboard. Moreover, the required electronic devices include a 10 [kΩ] potentiometer to establish the reference signal (set-point), a 1 [kΩ] resistor, and a 220 [µF] capacitor to store voltage according to the lighting conditions (Figure 4).
Thus, using the data collected from the kit through a USB cable [6] and the open-source Arduino™ software, we conducted process identification by three graphical methods. The lighting conditions must then be regulated through the design and implementation of conventional PID control strategies such as:
  • Ziegler-Nichols 1/4 Decay.
  • Conventional Ziegler-Nichols (IAE servo control).
  • Conventional Ziegler-Nichols (ITAE regulatory).
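As an illustration of how such tuning rules map identified process parameters to controller gains, the following sketch applies the classic Ziegler-Nichols open-loop (reaction-curve) rules to a first-order-plus-dead-time model. The parameter values are hypothetical and not taken from the kit's actual identification data.

```python
# Ziegler-Nichols open-loop (reaction-curve) PID tuning.
# K, tau, theta are hypothetical FOPDT parameters obtained from a
# step-response identification, not values measured on the kit.

def zn_pid(K, tau, theta):
    """Return (Kp, Ti, Td) from the classic Ziegler-Nichols rules."""
    Kp = 1.2 * tau / (K * theta)   # proportional gain
    Ti = 2.0 * theta               # integral time
    Td = 0.5 * theta               # derivative time
    return Kp, Ti, Td

if __name__ == "__main__":
    # Example: process gain 2.0, time constant 10 s, dead time 1 s
    Kp, Ti, Td = zn_pid(K=2.0, tau=10.0, theta=1.0)
    print(f"Kp={Kp:.2f}, Ti={Ti:.2f} s, Td={Td:.2f} s")
```

The quarter-decay closed-loop variant listed above has the same structure, with the ultimate gain and oscillation period replacing the reaction-curve parameters.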

2.2.2. Logic Control Kit

Students use the Lab-Tec@Home control logic kit to explore theoretical concepts related to binary code, Boolean algebra, and Karnaugh maps, i.e., to implement logic functions using logic gates (Figure 5).
To develop the Lab-Tec@Home control logic PCB presented in Figure 6, we used the online open-source EasyEDA design software. The PCB is powered through a female USB connector, which energizes every component of the PCB, such as the buzzer, LEDs, and logic gates.
Thus, proper technology implementation in education, such as the control logic kit shown in Figure 7, enhances the users’ learning experience by maximizing their academic and practical performance [9] without the need to conduct control logic practices in laboratories.
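The kind of exercise the logic kit supports (deriving a Boolean function and checking it against its full truth table before wiring the gates) can be sketched in software. The chosen function, a two-input XOR in sum-of-products form, is illustrative and not one of the kit's actual assignments.

```python
# Verify a sum-of-products expression against its full truth table, the
# way a Karnaugh-map simplification can be checked before wiring logic
# gates. The target function (a 2-input XOR) is an illustrative choice.

from itertools import product

def xor_sop(a, b):
    """Sum-of-products form of XOR: A'B + AB'."""
    return int((not a and b) or (a and not b))

# Enumerate all input combinations, as on a truth table worksheet.
truth_table = {(a, b): xor_sop(a, b) for a, b in product([0, 1], repeat=2)}
print(truth_table)  # {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
```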

2.2.3. Complex Thinking Competency

Complex thinking refers to the ability of a person to analyze situations from a multidimensional and integrated viewpoint [10]. It is considered a cross-cutting or transversal competency because, regardless of the discipline, it provides the cognitive elements to resolve the relevant problems and difficulties that any contemporary professional faces [11]. It is also considered a macro competency because its full deployment comprises four sub-competencies (critical thinking, systemic thinking, scientific thinking, and creative or innovative thinking) that allow perceiving reality profoundly and thoroughly [12].
Considering the sub-competencies comprising complex thinking (Figure 8), it is possible to note the breadth of its cognitive aspects, which allows individuals to be more efficient and accurate when addressing the problems of their environment [13]. At the formative level, complex thinking does not imply a specific pedagogical process; due to its characteristics, it can be developed in different ways and situations, as proposed in this study based on the use of the Lab-Tec@Home kit.

3. Methodology

Two sample groups were considered for this study, experimental and control. The experimental population consisted of 5 groups (classes) with 138 students in which Lab-Tec@Home was implemented. The control population consisted of 2 classes with 32 students who did not use Lab-Tec@Home; they had traditional classes in a conventional laboratory. The implementation occurred during the August–December 2022 semester, with facilitators supporting both groups.
The assignment of groups to the experimental or control condition was carried out randomly to reduce selection bias. To prevent the facilitators from influencing the results or introducing bias, we selected facilitators with the same background and characteristics and distributed them proportionally, i.e., each facilitator supported one experimental and one control group.
The disparity in the number of students between the experimental and control groups is due to enrollment per group as well as to limitations imposed by the ethics committee. Even so, the sample sizes are sufficient for the analyses proposed in this study.
The present study is regulated by an institutional ethics committee and the Writing Lab. As this was an exploratory study, we were asked to limit the demographic information collected from the students, considering that it was not relevant to the final results. It is hoped that, given the positive results presented in this article, a complementary study can be conducted to provide more information.
To assess the perceived proficiency level of complex thinking competency, the validated eComplexity tool [18] was utilized. Its purpose is to gauge the participants’ perception of their mastery level regarding the competency of reasoning for complexity and its sub-competencies. The instrument has undergone both theoretical and statistical validation as well as scrutiny by a team of experts in the field. The experts’ evaluation of the criteria was averaged as follows: clarity (3.31), coherence (3.38), and relevance (3.54). Based on the theoretical and content validation through expert judgment, the eComplexity tool was found to be highly valid and reliable [18]. The tool consists of 25 items divided into four sub-competencies, namely Systemic Thinking, Scientific Thinking, Critical Thinking, and Innovative Thinking. The tool can be self-administered, and each item is assessed using a Likert scale with options ranging from strongly disagree to strongly agree; see Table 2.
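As a minimal sketch of how such a Likert instrument can be scored, the following aggregates item responses into per-sub-competency means. The item-to-sub-competency mapping and the responses shown are hypothetical; the actual assignment of the 25 eComplexity items is given in [18].

```python
# Aggregate Likert responses (1 = strongly disagree ... 5 = strongly agree)
# into per-sub-competency means. Both the responses and the mapping of
# items to sub-competencies below are hypothetical examples.

from statistics import mean

responses = {  # one student's answers: item id -> Likert score
    "q1": 4, "q2": 5, "q3": 3,   # illustrative systemic thinking items
    "q4": 4, "q5": 4,            # illustrative scientific thinking items
}
subcompetency_items = {
    "systemic": ["q1", "q2", "q3"],
    "scientific": ["q4", "q5"],
}

scores = {
    sub: mean(responses[i] for i in items)
    for sub, items in subcompetency_items.items()
}
print(scores)
```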
Thus, the implementation process was carried out in three stages: application of the instrument in a diagnostic manner, implementation of the educational practice with the use of the Lab-Tec@Home kit, application of the instrument as an evaluation of the development achieved. It is important to point out that, although Lab-Tec@Home can be carried out individually or in teams, the diagnosis and evaluation of the acquisition and development of the perception of achievement of the competency and its sub-competencies is something that was carried out individually.
Regarding data processing, computational software applications called R [19] and Rstudio [20] were utilized to conduct a multivariate descriptive statistical analysis. The multivariate descriptive analysis involved analyzing central tendency measures such as means and standard deviations. This analysis was further complemented with bar graphs, violin plots, principal component analysis (PCA), bi-plot analysis of shape (i.e., α = 1), and t-test to measure the significance in mean value differences. The primary goal of the mean analysis was to determine a reference value for the dataset in relation to a particular variable. Additionally, the standard deviation helped determine the degree of dispersion of the remaining values with respect to the mean. The analysis of the violin plot allowed for observing and examining the data distribution. The violin plot combined a box-and-whisker plot (i.e., boxplot) and the empirical kernel density into a single visualization that showed the data structure [21].
On the other hand, principal component analysis is a method that enables understanding students’ performance concerning the sub-competencies. It helps to overcome collinearity issues and simplify the complexity of the data by using a set of independent and uncorrelated variables known as principal components. These components are obtained from the original variables by capturing the maximum variability, and the number of components corresponds to the number of variables [11,22]. The principal component analysis was supported by a biplot analysis, which allowed observing the behavior of students concerning the variables using the two principal components that captured the maximum variability in the data [23]. For the analysis, we used the biplot of shape (i.e., α = 1) to facilitate visualization of the observations (students). Finally, to determine the significance of differences between mean values based on gender, we performed a t-test, considering a confidence level of 90% (i.e., a p-value threshold of 0.10).
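The study performed its analysis in R/RStudio; purely as an illustration of the pipeline described above, the following NumPy sketch computes the per-component explained-variance fractions used in a PCA and a Welch t statistic for comparing group means. The data and group parameters are synthetic.

```python
# Sketch of the analysis pipeline: principal components via the covariance
# eigendecomposition, plus Welch's t statistic for comparing group means.
# The study itself used R/RStudio; this NumPy version and the toy data
# are illustrative only.

import numpy as np

def pca_explained_variance(X):
    """Fraction of total variance captured by each principal component."""
    Xc = X - X.mean(axis=0)                      # center the variables
    cov = np.cov(Xc, rowvar=False)               # covariance matrix
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return eigvals / eigvals.sum()

def welch_t(x, y):
    """Welch's t statistic for two independent samples."""
    nx, ny = len(x), len(y)
    vx, vy = np.var(x, ddof=1), np.var(y, ddof=1)
    return (np.mean(x) - np.mean(y)) / np.sqrt(vx / nx + vy / ny)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 4))          # 4 toy sub-competency scores
    print(pca_explained_variance(X))       # four fractions summing to 1
    print(welch_t(rng.normal(4.1, 0.5, 30), rng.normal(4.0, 0.6, 30)))
```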

4. Results

We calculated the arithmetic mean of the students’ perceptions in both groups at the two measurement moments. Table 3 shows the means and standard deviations for each item evaluated. It shows that, in most cases, there is an improvement in the perceived achievement of the complex thinking competency and its sub-competencies; only the systems thinking sub-competency in the control group showed a regression (4.13 to 4.11). The table also shows that, in the initial diagnosis, students in the experimental group perceived themselves better than those in the control group in the competency and its sub-competencies, which also translates into better results in the final evaluation. The experimental group achieved its highest mean in systems thinking (4.20) and its lowest in scientific thinking (3.89). This pattern is repeated in the control group, with the highest mean for systems thinking (4.11) and the lowest for scientific thinking (3.85).
Figure 9 illustrates the perception of complex thinking competency in the initial and final diagnoses of both groups (experimental and control). It shows that the participants in both groups perceived themselves as improving after their training practices, whether with Lab-Tec@Home or in a conventional laboratory. However, the experimental group achieved higher means than the control group, primarily due to its better results in the initial diagnosis. The figure also shows that the standard deviation of the students’ perceived complex thinking competency was lower in the experimental group.
Table 4 analyzes significant differences in both groups’ initial and final means of perceived complex thinking competency. The table shows no significant differences between the mean values of perceived achievement.
Regarding the level of perception by sub-competencies, the graphs in Figure 10 allow contrasting the initial and final measurements between both groups (experimental and control). This figure corroborates Table 3 by showing that the experimental group presents better means and a greater concentration of responses. Thus, there is a trend toward improvement in the participants’ perceived performance with the use of Lab-Tec@Home.
Table 5 displays the statistically significant differences in the means of the sub-competencies of complex thinking, contrasting both groups (experimental and control). The table shows no significant differences between the mean values of perceived achievement.
Table 6 presents the principal component analysis performed on the experimental and control groups for the sub-competencies of complex thinking. The table shows the first two components capturing the maximum variability in the data. Together Principal Component 1 (PC1) and Principal Component 2 (PC2) captured 84.7% of the variability. PC1 captured 75.4%, and PC2 accounted for 9.2%. Additionally, the table shows that PC1 is strongly correlated with the sub-competency of systems thinking. This component would explain the students’ ability to analyze problematic situations from different perspectives. On the other hand, PC2 highly correlates with innovative thinking, explaining the students’ ability to generate new ideas and explore various options for proposing different and fresh solutions.
The biplot analysis in Figure 11 shows that the students in the experimental group perceived themselves most favorably in the development of each of the sub-competencies, reaching the highest levels of the scale. Some students in the experimental group scored high in the sub-competencies of systemic, critical, and innovative thinking. On the other hand, six students in the control group scored high in critical, scientific, and systemic thinking. In this context, the students outside the confidence ellipses for each group (i.e., at the extreme right) are of concern: these were students with a very low perception of their development of the sub-competencies and, consequently, of the complex thinking competency.

5. Discussion of Results

The first results relate to both groups’ means and standard deviations. Although the experimental group achieved a better score (4.11) than the control group (4.02), the difference was not statistically significant, since the level of development in both cases was the same (+0.08, about 2%). The difference found in the overall result of the complex thinking competency corresponds to the groups being diagnosed at different initial levels, not to a statistically significant difference (Table 5). Figure 9 corroborates these data: although there seems to be a difference, it is due to the starting point of the measurement. Notably, the experimental group had a lower standard deviation, i.e., its responses were more concentrated around the mean than those of the control group.
Table 4 complements this information, showing the significance level (t) of the variations in the complex thinking competency of both groups. As shown in the table, neither group had a statistically significant variation; however, the experimental group was closer to significance. Thus, although there was no statistically significant variation, an improvement in perceived complex thinking can be noted, especially in the group that carried out the Lab-Tec@Home practice.
Concerning the development of the sub-competencies, the highest level in the experimental group was systemic thinking (4.20), although the most improved was scientific thinking, with a variation of more than 3%. In the control group, the highest level was innovative thinking, although the most improved was also scientific thinking, surpassing the experimental group (+6%). This last result is relevant, as it points to an area of opportunity for Lab-Tec@Home, since participants perceived that, in theoretical terms, they learned more with traditional teaching.
Delving into the data, Figure 10 shows the results for each sub-competency, contrasting the initial and final levels of both groups. Something relevant in these violin plots is the distribution of the answers. Systems thinking is the only sub-competency whose perception decreased from diagnosis to final evaluation; this occurred in the control group, where students perceived themselves as less competent (−0.5%) in identifying the parts that make up a problem and understanding how they are interconnected. This was not the case with Lab-Tec@Home; participants improved their perception by 1.6%.
An identifiable problem, however, is that this sub-competency showed a large standard deviation toward the lower part of the scale in the final evaluation, indicating that some students, even after the traditional or Lab-Tec@Home practice, continued to perceive themselves as unskilled in systems thinking. Although it can be concluded that Lab-Tec@Home showed better results in systems thinking, this variation from the control group was not statistically significant (Table 5).
As for scientific thinking, as noted above, although the experimental group achieved a better level of perception (4.20), the best development occurred in the control group (+6%). As shown in Figure 10, the violins of the experimental group showed better levels on the scale but little difference between the initial diagnosis and the final evaluation, in contrast to the control group, where the improvement in the final evaluation was notable: the violin shows a better concentration of responses around the mean and in the upper part of the graph. However, Table 5 shows no statistically significant difference between the two groups.
In the case of critical thinking, the experimental group obtained better results (4.19) than the control group (4.02). However, the initial levels of the two groups were unequal, and analyzing the level of development shows that it was the same in both cases (+2%). So, although the final averages seem very different, this is due to the initial levels of the participants in both groups. Table 5 confirms this, reflecting no statistically significant difference. What is interesting to note in Figure 10 is that the experimental group considerably reduced the results in the lower part of the violin, while the control group did not. Thus, unlike the control group, the experimental group participants generally perceived themselves as having better critical thinking skills.
Finally, concerning innovative thinking, the experimental group’s results were better than the control group’s; however, unlike the previous cases, there was a difference in their level of development: the experimental group improved three times as much as the control group (1.5% vs. 0.5%). As with critical thinking, Figure 10 shows that one of the most significant changes was not in the level attained by the mean but in the reduction of the standard deviation among participants, i.e., a notable decrease in students who perceived themselves as incompetent. Although, as in the previous results, the difference between the two groups was not statistically significant (Table 5), this does not diminish the differences identifiable in the distribution of the students’ perceptions.
For a broader view of these results, Table 6 and Figure 11 present the results of a principal component analysis and the corresponding biplot. The latter corroborates what has been previously pointed out: the greater concentration of the experimental group students’ responses around the mean. The biplot shows that the control group’s responses were dispersed throughout the scale compared to the experimental group’s. Thus, although there was no significant difference, the control group had more students perceiving themselves as incompetent.
In conclusion, these results indicate that Lab-Tec@Home students had a statistically similar level of perceived achievement of the complex thinking competency and its sub-competencies compared to students who performed hands-on practice in a traditional laboratory, which is itself a valuable result. These results resemble those of previous studies [24,25,26], which demonstrate the relevance of developing complex thinking in control engineering processes, although, unlike them, this study achieves it with an educational practice at home. The use of at-home educational practices such as Lab-Tec@Home was already common before the pandemic; however, these practices were usually associated with developing competencies complementary to academic training in classrooms or laboratories. This can be seen in the studies of [27,28,29], which use home practices to develop communication and socialization skills as a tool to reduce educational disadvantage. Thus, this study not only provides valuable results regarding the relationship between this type of practice and the development of complex thinking but is also notable for the originality of doing so through a home practice. It demonstrates that Lab-Tec@Home has an impact on the students’ perceived development of the complex thinking competency and its sub-competencies, and achieves this more uniformly among participants than the group with traditional laboratory practices.
The lack of statistically significant differences does not imply that the results were not relevant but rather that the skill levels achieved were similar across the two groups. This lack of difference is in itself an achievement of Lab-Tec@Home, as it shows that, at the level of skills developed, it can have an impact similar to that achieved with a laboratory. These results open the possibility for future research analyzing other competencies developed with Lab-Tec@Home, as well as how its use impacts technical and professional skills. Undoubtedly, the major finding of this study is the homogeneity achieved in the results of the experimental group, which gives teachers greater certainty that the educational practice has a similar impact on all students. It will be relevant in future research to analyze whether this homogeneity translates into more balanced grades.

6. Conclusions

Undoubtedly, the COVID-19 pandemic challenged all educational institutions, especially training programs that, due to their characteristics or needs, required physical infrastructure that was impossible to access under health restrictions. Thus, Lab-Tec@Home was born: a laboratory kit designed to develop competencies and skills associated with control engineering that students could use from home, compensating for their lack of access to a laboratory. Although the COVID-19 restrictions have passed, this does not imply that the pedagogical tools that gave good results should be discarded, considering that they could have an additional impact on developing complementary skills.
In this sense, the proposed hypothesis is supported, indicating that Lab-Tec@Home is as valid an option for developing the complex thinking competency as an educational practice in a traditional laboratory.
Thus, it can be concluded that Lab-Tec@Home promotes a more equitable level of perception among students, who perceive themselves as generally more competent, as opposed to a traditional group, where not everyone reaches this appreciation of their abilities. On a practical level, this study allows us to point out that, regardless of the impact Lab-Tec@Home may have on acquiring and developing disciplinary competencies associated with control engineering, this educational innovation is valuable in improving the perceived equity of knowledge of complex thinking and its sub-competencies in the classroom.
In conclusion, this research fulfilled its objective indirectly, since the impact identified related not to the level attained or the improvement achieved but to the accessibility of knowledge and the equitable perception of the students using Lab-Tec@Home.

Author Contributions

Conceptualization, D.S. and C.S.; methodology, D.S., C.S., J.C.V.-P. and M.C.-S.; validation, J.C.V.-P. and M.C.-S.; investigation, D.S., C.S., J.C.V.-P. and M.C.-S.; resources, D.S. and C.S.; writing—original draft, C.S., D.S. and J.C.V.-P.; writing—review and editing, C.S., D.S., J.C.V.-P. and M.C.-S.; supervision, D.S. and C.S.; project administration, C.S., D.S., J.C.V.-P. and M.C.-S.; funding acquisition, J.C.V.-P. All authors have read and agreed to the published version of the manuscript.

Funding

The authors acknowledge the financial and technical support of Writing Lab, Institute for the Future of Education, and Tecnologico de Monterrey, Mexico, in the production of this work and the financial support of Challenge-Based Research Funding Program 2022. Project ID # I001-IFE001-C1-T1-E.

Institutional Review Board Statement

Implementation was regulated and approved by the interdisciplinary research group R4C, with the technical support of the Writing Lab of the Institute for the Future of Education of Tecnologico de Monterrey. Approval Code: 2023-02-08-735. Approval Date: August 2022.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available upon reasonable request due to privacy restrictions.

Acknowledgments

The authors acknowledge the financial and technical support of Writing Lab, Institute for the Future of Education, and Tecnologico de Monterrey, Mexico, in the production of this work and the financial support of Challenge-Based Research Funding Program 2022. Project ID # I001-IFE001-C1-T1-E.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Sotelo, D.; Sotelo, C.; Ramirez-Mendoza, R.A.; López-Guajardo, E.A.; Navarro-Duran, D.; Niño-Juárez, E.; Vargas-Martinez, A. Lab-Tec@Home: A cost-effective kit for online control engineering education. Electronics 2022, 11, 907. [Google Scholar] [CrossRef]
  2. Alqahtani, A.Y.; Rajkhan, A.A. E-learning critical success factors during the COVID-19 pandemic: A comprehensive analysis of e-learning managerial perspectives. Educ. Sci. 2020, 10, 216. [Google Scholar] [CrossRef]
  3. Casper, A.A.; Rambo-Hernandez, K.E.; Park, S.; Atadero, R.A. The impact of emergency remote learning on students in engineering and computer science in the United States: An analysis of four universities. J. Eng. Educ. 2022, 111, 703–728. [Google Scholar] [CrossRef]
  4. Carrasco-Navarro, R.; Luque-Vega, L.F.; Nava-Pintor, J.A.; Guerrero-Osuna, H.A.; Carlos-Mancilla, M.A.; Castañeda-Miranda, C.L. MEIoT 2D-CACSET: IoT Two-Dimensional Cartesian Coordinate System Educational Toolkit Align with Educational Mechatronics Framework. Sensors 2022, 22, 4802. [Google Scholar] [CrossRef] [PubMed]
  5. Trojer, L.; Ambele, R.M.; Kaijage, S.F.; Dida, M.A. A review of the Development Trend of Personalized Learning Technologies and its Applications. Int. J. Adv. Sci. Res. Eng. 2022, 8, 75–91. [Google Scholar]
  6. Aljawarneh, S.A. Reviewing and exploring innovative ubiquitous learning tools in higher education. J. Comput. High. Educ. 2020, 32, 57–73. [Google Scholar] [CrossRef]
  7. Wu, J.S.; Chien, T.H.; Chien, L.R.; Yang, C.Y. Using artificial intelligence to predict class loyalty and plagiarism in students in an online blended programming course during the COVID-19 pandemic. Electronics 2021, 10, 2203. [Google Scholar] [CrossRef]
  8. Magyari, A.; Chen, Y. FPGA remote laboratory using IoT approaches. Electronics 2021, 10, 2229. [Google Scholar] [CrossRef]
  9. Guerrero-Osuna, H.A.; Nava-Pintor, J.A.; Olvera-Olvera, C.A.; Ibarra-Pérez, T.; Carrasco-Navarro, R.; Luque-Vega, L.F. Educational Mechatronics Training System Based on Computer Vision for Mobile Robots. Sustainability 2023, 15, 1386. [Google Scholar] [CrossRef]
  10. Tobón, S.; Luna-Nemecio, J. Complex thinking and sustainable social development: Validity and reliability of the complex-21 scale. Sustainability 2021, 13, 6591. [Google Scholar] [CrossRef]
  11. Cruz-Sandoval, M.; Vázquez-Parra, J.; Carlos-Arroyo, M.; Amézquita-Zamora, J. Student Perception of the Level of Development of Complex Thinking: An Approach Involving University Women in Mexico. J. Latinos Educ. 2022, 1–13. [Google Scholar] [CrossRef]
  12. Drucker, J. Sustainability and complexity: Knowledge and authority in the digital humanities. Digit. Scholarsh. Humanit. 2021, 36, ii86–ii94. [Google Scholar] [CrossRef]
  13. Pacheco, C.S.; Herrera, C.I. A conceptual proposal and operational definitions of the cognitive processes of complex thinking. Think. Ski. Creat. 2021, 39, 100794. [Google Scholar] [CrossRef]
  14. Cui, L.; Zhu, Y.; Qu, J.; Tie, L.; Wang, Z.; Qu, B. Psychometric properties of the critical thinking disposition assessment test amongst medical students in China: A cross-sectional study. BMC Med. Educ. 2021, 21, 10. [Google Scholar] [CrossRef]
  15. Suryansyah, S.; Kastolani, W.; Somantri, L. Scientific thinking skills in solving global warming problems. IOP Conf. Ser. Earth Environ. Sci. 2021, 683, 012025. [Google Scholar] [CrossRef]
  16. Zhou, Q. Development of creative thinking skills through aesthetic creativity in middle school educational music course. Think. Ski. Creat. 2021, 40, 100825. [Google Scholar] [CrossRef]
  17. Jaaron, A.A.; Backhouse, C.J. Operationalisation of service innovation: A systems thinking approach. Serv. Ind. J. 2018, 38, 561–583. [Google Scholar] [CrossRef]
  18. Castillo-Martínez, I.; Ramirez-Montoya, M.; Torres-Delgado, G. Reasoning for complexity competency instrument (e-Complexity): Content validation and expert judgment. Assess. Eval. High. Educ. 2021; in press. [Google Scholar]
  19. R Core Team. R: A Language and Environment for Statistical Computing. 2017. Available online: https://www.r-project.org/ (accessed on 7 March 2023).
  20. RStudio Team. RStudio: Integrated Development for R (2022.2.2.485). Available online: http://www.rstudio.com/ (accessed on 7 March 2023).
  21. Hintze, J.L.; Nelson, R.D. Violin plots: A box plot-density trace synergism. Am. Stat. 1998, 52, 181–184. [Google Scholar]
  22. O’Sullivan, D.; Unwin, D. Reducing the number of variables: Principal Component Analysis. In Geographic Information Analysis; John Wiley & Sons: Hoboken, NJ, USA, 2002; pp. 343–355. [Google Scholar]
  23. Gabriel, K.R. The biplot graphic display of matrices with application to principal component analysis. Biometrika 1971, 58, 453–467. [Google Scholar] [CrossRef]
  24. Bangemann, T.; Riedl, M.; Thron, M.; Diedrich, C. Integration of classical components into industrial cyber–physical systems. Proc. IEEE 2016, 104, 947–959. [Google Scholar] [CrossRef]
  25. Adriaensen, A.; Decré, W.; Pintelon, L. Can complexity-thinking methods contribute to improving occupational safety in industry 4.0? A review of safety analysis methods and their concepts. Safety 2019, 5, 65. [Google Scholar] [CrossRef]
  26. Ramírez Montoya, M.S.; Ponce, P.; Ramirez Mendoza, R.A.; Molina, A.; MacCleery, B.; Ascanio, M. From understanding a simple DC motor to developing an electric vehicle AI controller rapid prototype using MATLAB-Simulink, real-time simulation and complex thinking. Front. Educ. 2022, 7, 941972. [Google Scholar]
  27. Hardman, J.C. A community of learners: Cambodians in an adult ESL classroom. Lang. Teach. Res. 1999, 3, 145–166. [Google Scholar] [CrossRef]
  28. Carpenter, G.J.O. The School Success and Adjustment of Young African American Children. Ph.D. Thesis, Miami University, Oxford, OH, USA, 2005. [Google Scholar]
  29. Weir, S.; Errity, D.; McAvinue, L. Factors associated with educational disadvantage in rural and urban areas. Ir. J. Educ. Eireannach Oideachais 2015, 40, 94–110. [Google Scholar]
Figure 1. Learning experience needs and expectations necessary for effective learning.
Figure 2. Roadmap using labkit.
Figure 3. The closed control loop of the continuous control kit.
Figure 4. Circuit of the continuous control kit. (a) Schematic circuit. (b) Continuous control kit.
Figure 5. Logic gates used to develop a control logic kit. (a) AND logic gate. (b) OR logic gate. (c) NOT logic gate. (d) XOR logic gate. (e) NAND logic gate. (f) NOR logic gate.
Figure 6. Schematic circuit to develop control logic kit.
Figure 7. PCB control logic kit.
Figure 8. Sub-competencies of complex thinking. Based on [14,15,16,17].
Figure 9. Results of students’ perceived achievement of complex thinking competency in the experimental and control groups (initial and final diagnoses).
Figure 10. Scaling results of perceived achievement by sub-competency of the complex thinking competency for the experimental and control groups.
Figure 11. Shape biplot (α = 1) comparing the final results of the experimental and control groups in perceived achievement of complex thinking competency.
Table 1. The main components promoting the learning experience for the users.
1  Arduino™ UNO               Controller
2  LED matrix 8 × 8           Actuator
3  Photoresistor 100 [Ω]      Sensor
4  Camping base with circuit  Process
Table 2. Items of the instrument [18].
Systemic thinking
  1. I have the ability to find associations between variables, conditions and constraints in a project.
  2. I identify data from my discipline and other areas that contribute to solve problems.
  3. I participate in projects that have to be solved using inter/multidisciplinary perspectives.
  4. I organize information to solve problems.
  5. I enjoy learning different perspectives on a problem.
  6. I am inclined to use strategies to understand the parts and whole of a problem.
Scientific thinking
  7. I have the ability to identify the essential components of a problem to formulate a research question.
  8. I know the structure and formats for research reports used in my area or discipline.
  9. I identify the structure of a research article used in my area or discipline.
  10. I apply the appropriate analysis methodology to solve a research problem.
  11. I design research instruments consistent with the research method used.
  12. I formulate and test research hypotheses.
  13. I am inclined to use scientific data to analyze research problems.
Critical thinking
  14. I have the ability to critically analyze problems from different perspectives.
  15. I identify the rationale for my own and others’ judgments to recognize false arguments.
  16. I self-evaluate the level of progress and achievement of my goals to make the necessary adjustments.
  17. I use reasoning based on scientific knowledge to make judgments about a problem.
  18. I make sure to review the ethical guidelines of the projects in which I participate.
  19. I appreciate criticism in the development of projects in order to improve them.
Innovative thinking
  20. I know the criteria to determine a problem.
  21. I have the ability to identify variables, from various disciplines, that can help answer questions.
  22. I apply innovative solutions to diverse problems.
  23. I solve problems by interpreting data from different disciplines.
  24. I analyze research problems contemplating the context to create solutions.
  25. I tend to evaluate with critical and innovative sense the solutions derived from a problem.
Table 3. Results of students’ perceived achievement of the complex thinking competency and its sub-competencies in the experimental and control groups (initial and final values).
                     Experimental                   Control
                     Initial        Final           Initial        Final
                     Mean    Sd     Mean    Sd      Mean    Sd     Mean    Sd
Complex thinking     4.03    0.49   4.11    0.46    3.94    0.46   4.02    0.52
Systemic thinking    4.13    0.44   4.20    0.45    4.13    0.41   4.11    0.56
Scientific thinking  3.77    0.64   3.89    0.66    3.63    0.71   3.85    0.64
Critical thinking    4.10    0.51   4.19    0.48    3.94    0.52   4.02    0.54
Innovative thinking  4.10    0.67   4.16    0.57    4.05    0.55   4.07    0.59
Table 4. Results of statistically significant differences (initial and final values) between control and experimental groups in the perceived achievement of complex thinking competency (Student’s t analysis).
Concept                                                     t        df       p-Value
Control Group (Initial Diagnosis and Final Diagnosis)       −0.632   61.298   0.529
Experimental Group (Initial Diagnosis and Final Diagnosis)  −1.464   279.99   0.144
Final Diagnosis (Control vs. Experimental)                  −0.875   43.514   0.352
t = t-test statistic value, df = degrees of freedom, p-value = significance level of the t-test.
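The fractional degrees of freedom reported in Table 4 are characteristic of Welch's t-test, which does not assume equal group variances. Below is a minimal sketch of such a comparison, assuming SciPy; the score vectors are hypothetical, not the study's data.

```python
# Minimal sketch: Welch's unequal-variance t-test, the variant whose fractional
# degrees of freedom appear in Table 4. The scores below are hypothetical.
from scipy.stats import ttest_ind

control_final = [3.8, 4.1, 4.3, 3.9, 4.0, 4.5, 3.7, 4.2]       # hypothetical
experimental_final = [4.0, 4.2, 4.1, 4.4, 3.9, 4.3, 4.2, 4.1]  # hypothetical

t_stat, p_value = ttest_ind(control_final, experimental_final, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A p-value above the 0.05 threshold means no statistically significant difference.
```

With p-values above 0.05, as reported in Tables 4 and 5, the null hypothesis of equal means is not rejected.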
Table 5. Results of significant differences between control and experimental groups in the perception of final achievement of the sub-competencies of complex thinking (Student’s t).
Concept                                                         t        df       p-Value
Systemic Thinking Final Diagnosis (Control vs. Experimental)    −0.819   40.805   0.417
Scientific Thinking Final Diagnosis (Control vs. Experimental)  −0.303   47.451   0.762
Critical Thinking Final Diagnosis (Control vs. Experimental)    −1.647   42.979   0.106
Innovative Thinking Final Diagnosis (Control vs. Experimental)  −0.740   45.068   0.4627
t = t-test statistic value, df = degrees of freedom, p-value = significance level of the t-test.
Table 6. Principal component analysis matrix: analysis of perceived achievement of the sub-competencies of complex thinking (control and experimental groups).
Concept                 PC1      PC2      PC3      PC4
Systemic Thinking       −0.511    0.202   −0.143    0.823
Scientific Thinking     −0.494   −0.519    0.693   −0.058
Critical Thinking       −0.498   −0.409   −0.689   −0.328
Innovative Thinking     −0.495    0.722    0.148   −0.451
Standard Deviation       1.737    0.607    0.575    0.529
Proportion of Variance   0.754    0.092    0.082    0.070
Cumulative Proportion    0.754    0.847    0.930    1.00
PC = Principal Component.
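As a consistency check, the Proportion of Variance row in Table 6 follows from the component standard deviations: with four standardized variables the total variance is 4, so each proportion is the squared standard deviation divided by 4. A short sketch (small discrepancies versus the published table arise from the rounding of the standard deviations):

```python
# Minimal sketch: recovering the proportion-of-variance row of Table 6 from the
# reported component standard deviations. Four standardized variables give a
# total variance of 4; rounded inputs cause tiny deviations from the table.
import numpy as np

sd = np.array([1.737, 0.607, 0.575, 0.529])  # standard deviations from Table 6
prop_var = sd ** 2 / 4.0                     # variance explained by each PC
print(np.round(prop_var, 3))                 # close to 0.754, 0.092, 0.082, 0.070
print(np.round(np.cumsum(prop_var), 3))      # close to 0.754, 0.847, 0.930, 1.00
```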
