Article

Design and Development of a Self-Diagnostic Mobile Application for Learning Progress in Non-Face-to-Face Practice Learning

Semin Kim 1 and Hyung-Jin Mun 2,*
1 Department of Computer Education, Jeonju National University of Education, 50 Seohakro Wansangu, Jeonju-si 55101, Korea
2 Department of Information and Communication Engineering, Sungkyul University, 53 Sungkyul University-Ro, Manan-gu, Anyang-si 14097, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(22), 10816; https://doi.org/10.3390/app112210816
Submission received: 2 October 2021 / Revised: 12 November 2021 / Accepted: 12 November 2021 / Published: 16 November 2021

Abstract

Due to COVID-19, non-face-to-face (non-F2F) learning is being conducted at educational sites around the world. Unlike theoretical subjects, for which a variety of non-F2F learning content is available, practical subjects involve many learning activities that learners must complete, so learners can face considerable difficulty. Therefore, this research designs and develops a mobile application that allows learners to perform a self-diagnostic of their learning progress. We conducted a mobile application usability evaluation with 36 students who participated in non-F2F Arduino practice learning. To this end, we applied the ADDIE (Analysis, Design, Development, Implementation, and Evaluation) model to develop a self-diagnostic mobile application aligned with the learning content. As the research tool, the question items of a mobile usability evaluation instrument used in a previous study were modified to suit this subject and distributed. Following the ADDIE model, we analyzed the characteristics of the students and the learning content; designed the learning content, database, and menu structure; and developed the learning content and the mobile application. After the application had been used in non-F2F practice learning for 15 weeks, a mobile application usability evaluation was conducted. The usability scores ranged from 3.53 to 4.42, indicating that a learning progress self-diagnostic is essential in non-F2F practice learning and that mobile applications are useful for it. Additionally, 33 out of 36 students responded that actively using the application in other subjects would also be useful for their learning. We found that a self-diagnostic mobile application, as assessed with the mobile application usability evaluation tool, can be useful for non-F2F practice learning. Further research on additional factors such as teaching presence, online learning engagement, and learning flow is expected to make it even more useful for non-F2F practice learning.

1. Introduction

Since 2020, COVID-19 has driven non-face-to-face (non-F2F) learning at educational sites around the world [1,2,3,4,5,6]. Because of the sudden spread of COVID-19, however, many schools were not fully prepared to provide high-quality education as non-face-to-face learning progressed, and in reality they have experienced considerable trial and error [7].
For theoretical subjects, many educational service companies provided a variety of non-face-to-face learning content even before the outbreak of COVID-19, so there was relatively little room for quality decline or trial and error. By contrast, the biggest problem now lies in science and engineering courses, as well as arts and sports courses, which are taught through practical skills [7,8,9].
Although COVID-19 has created a difficult situation, many professors and researchers are actively studying various learning methods in order to provide learners with new challenges and opportunities. Fortunately, the computer programming-related subjects addressed in this research are relatively easy to learn online: various online programming tools are available, and learning that does not require a training kit can be practiced at a level comparable to that of theoretical subjects [10,11,12]. In 2020, when face-to-face classes were difficult due to COVID-19, many studies were conducted to increase the learning efficiency of hands-on classes using methods such as IoT (Internet of Things) practice, and research continues on improving skills in nursing, robot education, and IoT practical education [13,14,15]. However, learning that uses IoT-based training kits such as the Arduino Uno, Raspberry Pi, and ATmega128 remains difficult even for learners who have practiced computer programming [16]. For example, if the instructor presents a circuit diagram but the learner implements it incorrectly, a part malfunctions, or a typo causes compilation to fail, the student can lose motivation. Therefore, students must evaluate and guide themselves in order to maintain their motivation to learn. In particular, in practical classes, regardless of whether a streaming video or VOD (video on demand) format is used, the learner's learning speed rarely exceeds the video playback speed [17]. In Korea, self-diagnostic mobile applications have been used very effectively during the COVID-19 situation. Likewise, in non-face-to-face practice learning, a learning self-diagnostic should be performed to reduce cases of falling behind or omitting steps. Therefore, it is necessary to develop teaching and learning methods or aids that enable learners to self-diagnose their learning progress.
Therefore, this research designs and develops a mobile application that allows students to perform a self-diagnostic of their learning progress in non-face-to-face practice learning. This will help overcome the difficulties that both learners and teachers face in non-face-to-face practice learning, and lay a foundation for the educational direction and environmental design of non-face-to-face learning.

2. Related Works

2.1. Non-Face-to-Face (Non-F2F) Learning

In a situation where face-to-face classes are difficult due to COVID-19, many studies are being conducted to improve learners' practical skills through IoT practice. Difficulties have arisen in particular in practical classes at medical and engineering universities, and also in specialized high schools, where practical classes must likewise be conducted in a non-face-to-face format [13,14,15].
Non-face-to-face learning is also called online learning; it is a learning method realized through information and communication technology in which interactions occur remotely in time and space. Unlike face-to-face learning, non-face-to-face learning has unique characteristics. Learning content can be studied using various Internet materials, interaction can take place in real time or asynchronously across time and space, and learning can proceed in a self-directed manner. In non-real-time learning, the learner can adjust the amount and pace of the lesson at a time and place of his or her choosing, while real-time learning preserves a sense of realism and teaching presence [7,18]. However, these characteristics apply mainly to theoretical subjects. Many practical subjects can be more difficult to teach online, depending on the educational equipment, subject characteristics, and classroom type. When learners study at home, it may be difficult to equip each learner with a training kit, and even if learners have kits, instructors cannot directly guide them as they would in face-to-face learning, so learning may be more difficult [7,11,18]. In programming practice classes in particular, it should be possible to enhance the sense of realism of instruction by effectively engaging students in the class. Therefore, this study used a self-diagnostic mobile application to help make practical classes more effective [19,20,21].

2.2. Mobile Application Usability Evaluation

Mobile devices are characterized as personal, portable, wireless devices. Using such devices for education is called mobile learning (m-learning). These devices can support immersive learning, foster imagination, and provide easy interaction; as a result, they are beginning to be widely used in education and are changing the overall concept and model of learning [22]. Mobile devices are particularly useful in programming practice classes, in classes on producing mobile applications, and in hands-on classes for creating IoT device prototypes with the Arduino Uno or Raspberry Pi [23,24]. Their convenience and versatility also make them valuable as tools that aid learning [23,24]. However, in order to use them for learning, it is necessary to evaluate whether a given tool is appropriate and useful through a usability evaluation. 'Usability' can be defined as the degree of overall satisfaction, effectiveness, and efficiency that a specific user experiences while performing a specific task in a specific environment. In this study, usability was used as an index for evaluating the mobile application, with the greatest weight of measurement placed on the evaluation of mobile content and interface design for self-diagnostics and the delivery of learning content; security-related items were added as a technical element of the evaluation [25].

3. Research Process

3.1. Subjects of Research

The subject of this study is an Arduino course with 36 students, a required basic course for first-year students in IT-related departments at a university. Due to COVID-19, these students began non-face-to-face classes as soon as they entered college. Had school begun as normal, they could have borrowed an Arduino Uno practice kit from the school and learned in the computer labs; instead, learning had to proceed through the instructor's video lectures in a non-face-to-face format. The Arduino Uno practice kit was therefore either picked up by each student at school or delivered to their home. To follow the examples in the school's curriculum, circuit diagrams were presented using Tinkercad, and App Inventor was used for IoT implementation practice. Since the professor could not check the learning status of students during the course, he created a mobile application that students could use to self-diagnose their learning progress and distributed it to them in order to check their learning status. Students were thus able to use the mobile application to self-diagnose their learning progress.

3.2. Method of Research

3.2.1. ADDIE Model

The mobile application required for this research was developed using the ADDIE model, which is commonly used when developing educational materials and tools. The ADDIE model consists of analysis, design, development, implementation, and evaluation, in that order, as shown in Figure 1 [26,27,28,29]. In the analysis stage, the characteristics of the learning content and the learners were analyzed. In the design stage, the learning content, database, and interface were designed. In the development stage, a mobile application was developed along with the various learning contents. In the implementation stage, classes were conducted for 15 weeks using the learning content and the mobile application. Finally, in the evaluation stage, the usability of the mobile application was evaluated.

3.2.2. Research Tools

To evaluate the usability of the mobile application developed and used in this study, a usability evaluation tool originally developed for a mobile application for activities of daily living education was appropriately modified and used [30]. This evaluation tool consists of 9 question items on mobile content, 11 question items on interface design, and 2 question items on technical evaluation. In this tool, a higher score indicates higher satisfaction. Table 1 shows the sub-factors of the questionnaire items. Cronbach's alpha, which measures the reliability of the tool, was 0.90. In addition, a separate question asked whether the mobile application developed in this research could be recommended to others, with the options '1-I would like to recommend it to others.', '2-I don't know.', and '3-Not recommended to others.' In this way, satisfaction with using the app could be measured simply.
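For reference, the reliability coefficient reported above follows the standard Cronbach's alpha formula, alpha = k/(k - 1) * (1 - (sum of item variances)/(variance of total scores)). The following Java sketch is purely illustrative; the class and method names are ours and are not part of the study's software.

// Illustrative only: Cronbach's alpha for a k-item questionnaire, where
// responses[respondent][item] holds each Likert score.
public final class Reliability {

    public static double cronbachAlpha(double[][] responses) {
        int n = responses.length;       // number of respondents
        int k = responses[0].length;    // number of question items

        // Sum of the sample variances of the individual items.
        double sumItemVariance = 0.0;
        for (int item = 0; item < k; item++) {
            double[] column = new double[n];
            for (int r = 0; r < n; r++) {
                column[r] = responses[r][item];
            }
            sumItemVariance += variance(column);
        }

        // Sample variance of each respondent's total score.
        double[] totals = new double[n];
        for (int r = 0; r < n; r++) {
            for (int item = 0; item < k; item++) {
                totals[r] += responses[r][item];
            }
        }
        double totalVariance = variance(totals);

        // alpha = k / (k - 1) * (1 - sum(item variances) / variance(totals))
        return (k / (double) (k - 1)) * (1.0 - sumItemVariance / totalVariance);
    }

    private static double variance(double[] values) {
        double mean = 0.0;
        for (double v : values) mean += v;
        mean /= values.length;
        double squaredSum = 0.0;
        for (double v : values) squaredSum += (v - mean) * (v - mean);
        return squaredSum / (values.length - 1);   // sample variance
    }
}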

4. Results of Research

4.1. Analysis

This study was carried out according to the ADDIE model [26,27,28,29]. In the analysis stage, the characteristics of the learning content and of the learners were analyzed. Educational programs that do not consider learner characteristics have difficulty improving learning achievement; learning content and learning management tailored to learners' characteristics are therefore necessary so that learners can participate fully in non-face-to-face practice learning and obtain satisfactory results [18]. The learning content was organized as shown in Table 2, and the 36 learners consisted of 24 male and 12 female students. Because of the imbalance between male and female students, this study could not compare usability evaluation results by gender, but a sufficient number of samples for the mobile application usability evaluation was secured.
To understand the characteristics of the learners, in the first week each student was asked to provide a student ID number, name, gender, and a brief self-introduction. One reason for analyzing learner characteristics was to determine whether it was possible to practice controlling the Arduino Uno with an Android smartphone. It was decided that using App Inventor would reduce the learners' cognitive burden, because they were still in their first year of university and had never taken a course on producing mobile applications with Android Studio.

4.2. Design

The mobile application used in this research was developed in Android Studio (version 3.5.1), and SQLite (version 3.x), provided by Android Studio, was used to create the database. The interface of the mobile application was designed to display only the necessary information so that it would not be cluttered, and the character size was increased so that information could be recognized at a glance while learning.

4.2.1. Database Design

The tables and fields of the database designed in this study are defined in Table 3. The 'student_Number' field is defined as a primary key in the 'studentTBL' table and as a foreign key in the 'questionTBL' table, and it is used to connect and access the data in the two tables.
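To illustrate this schema, the following Java sketch shows how the two tables in Table 3 could be created with Android's SQLiteOpenHelper. This is a hypothetical reconstruction: the helper class name, database file name, and exact SQL types are our assumptions; only the table and field names come from Table 3.

import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

// Hypothetical helper reflecting the schema in Table 3.
public class SelfCheckDbHelper extends SQLiteOpenHelper {

    private static final String DB_NAME = "selfcheck.db";
    private static final int DB_VERSION = 1;

    public SelfCheckDbHelper(Context context) {
        super(context, DB_NAME, null, DB_VERSION);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // Student account information; student_Number is the primary key.
        db.execSQL("CREATE TABLE studentTBL ("
                + "student_Number INTEGER PRIMARY KEY, "
                + "student_Name TEXT, "
                + "student_Password TEXT)");

        // One row per (student, question); student_Number references studentTBL.
        db.execSQL("CREATE TABLE questionTBL ("
                + "student_Number INTEGER, "
                + "question_Number INTEGER, "
                + "question_Text TEXT, "
                + "answer_YorN TEXT, "
                + "complete_Percent INTEGER, "
                + "yes_Persons INTEGER, "
                + "no_Persons INTEGER, "
                + "FOREIGN KEY(student_Number) REFERENCES studentTBL(student_Number))");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS questionTBL");
        db.execSQL("DROP TABLE IF EXISTS studentTBL");
        onCreate(db);
    }
}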

4.2.2. Menu Structure Design

The menu structure of the mobile application developed in this study is shown in Figure 2; every menu allows access back to the main page. Menus accessible to teachers as administrators and menus accessible to students are separated. 'Learning Self-Check System Main', 'Sign Up This App', and 'Learning Self Check List' can be accessed by both teachers and students. Branching from 'Learning Self Check List', 'Learning Self Check' and 'My Result' can only be accessed by students, while 'Qx's Result' and 'Student List' are accessible only to teachers.
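A minimal sketch of how this role-based branching could be implemented with Android intents is shown below. The class and parameter names are hypothetical, since the paper does not list its actual class names; the target activity classes are passed in as parameters rather than assumed.

import android.app.Activity;
import android.content.Intent;

// Hypothetical role-based navigation for the menu structure in Figure 2.
public final class MenuNavigator {

    public static void openQuestion(Activity from,
                                    int questionNumber,
                                    boolean isTeacher,
                                    Class<? extends Activity> studentScreen,   // 'Learning Self Check'
                                    Class<? extends Activity> teacherScreen) { // 'Qx's Result'
        // Students answer the item themselves; teachers see aggregate Yes/No statistics.
        Intent intent = new Intent(from, isTeacher ? teacherScreen : studentScreen);
        intent.putExtra("question_Number", questionNumber);
        from.startActivity(intent);
    }
}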

4.3. Development

4.3.1. Learning Contents Development

For this research, the learning content organized for each week is shown in Figure 3. After a Tinkercad circuit diagram was presented to learners in a video, as shown in Figure 3a, each student was asked to complete the circuit diagram in Tinkercad, build the Arduino circuit themselves, and program and control it directly, as shown in Figure 3b. Finally, after adding the Bluetooth module as shown in Figure 3c, the learning content concluded with the practice of making an IoT system prototype with App Inventor on an Android smartphone, as shown in Figure 3d. However, since Tinkercad provides no Bluetooth module, the Tinkercad screen and a picture of the Bluetooth module were captured and combined in a presentation tool to complete the circuit diagram.

4.3.2. Implementation of a Mobile Application for Self-Diagnosis of Learning Progress

As shown in Figure 4, the mobile application was developed according to the menu structure presented in Figure 2, and consists of a total of seven screens. Figure 4a shows the menu that appears when the mobile application is first run; from here, users can log in or open the membership registration menu. The remaining six screens are reached from this main screen. Figure 4b shows the membership registration menu, where students register by entering their department, student number, name, and password. Teachers do not need to sign up separately because the administrator account and password are entered into the database in advance. Figure 4c is the list from which a question item is selected: students are taken to Figure 4d, and teachers to Figure 4e. In Figure 4d, the learner reads the question item selected in Figure 4c and clicks 'Yes' if he or she understood the learning content or performed the practice, and 'No' otherwise. In Figure 4e, the teacher can view the number of students who selected 'Yes' or 'No' for the selected question item; in this way, the teacher can check how well each item was taught and consider how to improve the teaching method for that item. When the learner answers the question in Figure 4d, the application moves to the list shown in Figure 4f, where students can view their own learning progress and perform their self-diagnostic based on the results. If the teacher clicks the 'Submit' button after reviewing a question item, as shown in Figure 4e, the application moves to the full student list shown in Figure 4g, where the information of all students can be viewed; the teacher can then click on a specific student to view that student's learning progress. In other words, learners use the application in the order shown in Figure 4a–f, and instructors in the order shown in Figure 4a,c,e,g.
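As a further illustration, the following Java sketch shows how a student's 'Yes'/'No' answer (Figure 4d) might be stored and how the per-question counts shown to the teacher (Figure 4e) might be retrieved from the SQLite database. The class and method names are hypothetical; only the table and field names come from Table 3.

import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

// Hypothetical data-access sketch for the Yes/No self-check flow (Figure 4d,e).
public final class SelfCheckDao {

    // Insert one student's answer for a question item (Figure 4d).
    public static void saveAnswer(SQLiteDatabase db, int studentNumber,
                                  int questionNumber, boolean understood) {
        ContentValues values = new ContentValues();
        values.put("student_Number", studentNumber);
        values.put("question_Number", questionNumber);
        values.put("answer_YorN", understood ? "Y" : "N");
        db.insert("questionTBL", null, values);
    }

    // Count how many students answered 'Yes' for a question (Figure 4e).
    public static int countYes(SQLiteDatabase db, int questionNumber) {
        Cursor cursor = db.rawQuery(
                "SELECT COUNT(*) FROM questionTBL "
                        + "WHERE question_Number = ? AND answer_YorN = 'Y'",
                new String[]{String.valueOf(questionNumber)});
        try {
            cursor.moveToFirst();
            return cursor.getInt(0);
        } finally {
            cursor.close();
        }
    }
}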

4.4. Implement

In the implement stage, the mobile application installation file developed in this study was distributed to students and used. It was distributed during the orientation session in the first week of the lecture and was used consistently in every practice learning session over the full 15 weeks, excluding the midterm and final exam project periods.
Students could immediately check their learning status using the mobile application. Based on the results, they could assess themselves objectively and decide, for example, whether to ask questions or request counseling from the teacher.
Teachers, in turn, could monitor learners' results, check whether learners requested counseling or asked questions based on the results provided by the mobile application, and provide customized guidance. Furthermore, by monitoring the students' learning progress, teachers could use the results as reference material for reflecting on their own teaching methods.

4.5. Evaluation

During the final exam project period, an assessment was conducted via Google Forms using the mobile application usability evaluation tool [18]. A total of 36 students participated in learning with the mobile application in this study, and the data of the 36 students who responded sincerely to the questionnaire were compiled and analyzed as shown in Table 4. The overall evaluation score averaged 4.12 (SD = 0.88) out of 5 points. The design suitability item, 'The characters used in the mobile application are in a size and font that are easy for the viewer to read.', received the highest score, with an average of 4.42 (SD = 0.81). The security item, 'Presented a security policy for learning-related personal information.', received the lowest score, at 3.53 (SD = 1.18). The likely reason is that the notice on personal information protection was given during class time and was not presented on the screen of the mobile application.
As shown by the mobile content items in Table 4, the evaluation of learning-related information was high; thus, the mobile application usability evaluation survey confirmed the necessity of a learning progress self-diagnostic in non-face-to-face practice learning classes.
Separately from the mobile application usability evaluation items, 33 students said that they would recommend this mobile application for use in other learning as well, and 3 students answered that they did not know.
In non-face-to-face practice learning, the instructor cannot know each student's exact situation, and students may worry that they are falling behind other students, or may not realize whether they are keeping up in class. Therefore, in non-face-to-face practice learning it is important to be able to self-diagnose one's learning progress, and this can be expected to lead to effective non-face-to-face practice learning.

5. Conclusions and Discussion

While non-face-to-face learning has become common at educational sites worldwide due to COVID-19, practice learning is its most vulnerable area. Unlike theoretical subjects, practical subjects are not easy to teach through video lectures alone. However, there are as yet no studies on educational methodologies or effective feedback for effective practice learning.
Therefore, in this study, we designed and developed a learning progress self-diagnostic mobile application to support Arduino practice learning. To this end, the ADDIE model was applied: the mobile application was produced through the stages of analysis, design, development, implementation, and evaluation, and was then applied to Arduino practice learning. Using this application, learners could track their own learning situation, and the instructor could monitor each student's situation and use the self-diagnostic results as a reference for reconsidering his or her own teaching method. In addition, a mobile application usability evaluation was conducted to assess the technical performance and UX/UI of the application developed in this study. Through this evaluation, it was found that a self-diagnostic of learning progress is essential in non-face-to-face practice learning. Learners rated the application at 4 points or higher on 17 of the 22 questionnaire items, and utilization of the application in non-face-to-face learning was also high. Of the 5 remaining items, 4 nearly reached 4 points, so a good response can be said to have been obtained overall. However, it should be noted that the final item received only 3.53 points, reflecting the fact that the personal information security policy was only announced during class and was not presented within the application itself.
The usability evaluation also showed that most students considered a method for self-diagnosing their learning progress to be essential in non-face-to-face practice learning, and that most learners actively used the learning progress self-diagnostic mobile application developed in this study.
This study shows that the learning progress self-diagnostic method used here is important when conducting non-face-to-face practice learning in a global pandemic situation such as COVID-19, and implementing such a method is expected to lead to greater satisfaction and confidence in non-face-to-face practice learning.
In future research on non-face-to-face practice learning, the learning progress self-diagnostic mobile application developed in this study will be used to evaluate not only the application's usability but also teaching presence, online learning engagement, and learning flow, and these additional factors will be used to establish a concrete learning strategy for non-face-to-face practice learning. In addition, it will be necessary to extend the mobile application with a function that converts students' spoken feedback into text via voice input during practice learning classes.

Author Contributions

Methodology, S.K.; software, S.K.; validation, S.K. and H.-J.M.; formal analysis, S.K. and H.-J.M.; investigation, S.K.; resources, S.K.; data curation, S.K.; writing—original draft preparation, S.K.; writing—review and editing, H.-J.M.; visualization, S.K. and H.-J.M.; supervision, H.-J.M.; project administration, S.K. and H.-J.M.; funding acquisition, S.K. and H.-J.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Demuyakor, J. Coronavirus (COVID-19) and Online Learning in Higher Institutions of Education: A Survey of the Perceptions of Ghanaian International Students in China. Online J. Commun. Media Technol. 2020, 10, e202018. [Google Scholar] [CrossRef]
  2. Avgerinou, M.D.; Moros, S.E. The 5-Phase Process as a Balancing Act during Times of Disruption: Transitioning to Virtual Teaching at an International JK-5 School; Teaching, Technology, and Teacher Education during the COVID-19 Pandemic: Stories from the Field; Association for the Advancement of Computing in Education (AACE): Waynesville, NC, USA, 2020; pp. 583–589. Available online: https://media-acs.zentech.gr/filesystem/Multimedia/pdf/Pages_from_ebook_online_teaching_and_learning-2_id393.pdf (accessed on 28 September 2020).
  3. Shin, Y.-J.; Lee, H.-J.; Kim, J.-H.; Kwon, D.-Y.; Lee, S.-A.; Choo, Y.-J.; Park, J.-H.; Jung, J.-H.; Lee, H.-S.; Kim, J.-H. Non-face-to-face online home training application study using deep learning-based image processing technique and standard exercise program. J. Converg. Cult. Technol. (JCCT) 2021, 7, 577–582. [Google Scholar] [CrossRef]
  4. Kim, J.S.; Ahn, Y.J.; Kim, K.A. Analysis of Presence Impacting on Learning Satisfaction and Persistence in Non-face-to-face Programming Courses. In Proceedings of the Korean Society of Computer Information Conference, Busan, Korea, 21–23 January 2021; Volume 29, pp. 303–304. [Google Scholar]
  5. Wang, C.; Hsu, H.-C.K.; Bonem, E.M.; Moss, J.D.; Yu, S.; Nelson, D.B.; Levesque-Bristol, C. Need satisfaction and need dissatisfaction: A comparative study of online and face-to-face learning contexts. Comput. Hum. Behav. 2019, 95, 114–125. [Google Scholar] [CrossRef]
  6. Lee, Y. A study on the Correlation of between Online Learning Patterns and Learning Effects in the Non-face-to-face Learning Environment. J. Korea Acad.-Ind. Coop. Soc. 2020, 21, 557–562. [Google Scholar] [CrossRef]
  7. Jeon, S.J.; Yoo, H.H. Relationship between General Characteristics, Learning Flow, Self-Directedness and Learner Satisfaction of Medical Students in Online Learning Environment. J. Korea Contents Assoc. 2020, 20, 65–74. [Google Scholar] [CrossRef]
  8. Wallace, P.E.; Clariana, R.B. Achievement Predictors for a Computer-Applications Module Delivered Online. J. Inf. Syst. Educ. 2000, 11, 13–18. [Google Scholar]
  9. Zhao, Y.; Tang, Y.; Liu, F.; Peng, Z.; Kong, J.; Huang, J.; Tong, Z. Research and practice of online emergency teaching based on electronic information technology under the influence of COVID-19. Int. J. Electr. Eng. Educ. 2021, 0020720920985048. [Google Scholar] [CrossRef]
  10. Gandraß, N.; Hinrichs, T.; Schmolitzky, A. Towards an Online Programming Platform Complementing Software Engineering Education. 2020, pp. 27–35. Available online: http://ceur-ws.org/Vol-2531/paper05.pdf (accessed on 17 September 2021).
  11. Koretsky, M. Work-in-Progress: An Online Journal Tool with Feedback for a Learning Assistant Program in Engineering. In Proceedings of the ASEE’S Virtual Conference, Virtual Online, 22–26 June 2020. [Google Scholar] [CrossRef]
  12. Park, H. A Study on the Online Software Education Model Based on Arduino. Master’s Thesis, Woosong University, Daejeon, Korea, 2020. Available online: http://www.riss.kr/link?id=T15593957 (accessed on 17 September 2021).
  13. Gorchs-Font, N.; Ramon-Aribau, A.; Yildirim, M.; Kroll, T.; Larkin, P.J.; Subirana-Casacuberta, M. Nursing students’ first experience of death: Identifying mechanisms for practice learning. A realist review. Nurse Educ. Today 2021, 96, 104637. [Google Scholar] [CrossRef] [PubMed]
  14. Renfrew, M.J.; Bradshaw, G.; Burnett, A.; Byrom, A.; Entwistle, F.; King, K.; Olayiwola, W.; Thomas, G. Sustaining quality education and practice learning in a pandemic and beyond: “I have never learnt as much in my life, as quickly, ever”. Midwifery 2021, 94, 102915. [Google Scholar] [CrossRef] [PubMed]
  15. Gena, C.; Mattutino, C.; Cellie, D.; Di Ninno, F.; Mosca, E. Teaching and learning educational robotics: An open source robot and its e-learning platform. In Proceedings of the FabLearn Europe/MakeEd 2021—An International Conference on Computing, Design and Making in Education, St. Gallen, Switzerland, 2–3 June 2021; pp. 1–4. [Google Scholar]
  16. Alharbi, F. Integrating internet of things in electrical engineering education. Int. J. Electr. Eng. Educ. 2020, 0020720920903422. [Google Scholar] [CrossRef]
  17. Yang, J. Quiz Management System for Mobile Web-Based Learning Environment. Master’s Thesis, Ewha Womans University, Seoul, Korea, 2018. Available online: http://www.riss.kr/link?id=T14881185 (accessed on 17 September 2021).
  18. Kim, E. Effects of Self-Directedness, Computer Efficacy, and Learning Strategy of High School Students on E-Learning Outcome. Master’s Thesis, Yonsei University, Seoul, Korea, 2014. Available online: http://www.riss.kr/link?id=T13455036 (accessed on 17 September 2021).
  19. Kang, M.H.; Kim, N.Y.; Kim, M.J.; Kim, J.Y.; Lim, H.J. A structural relationship among teaching presence, learning presence and learning outcomes of e-Learning in Cyber University. J. Educ. Inf. Media 2011, 17, 153–176. [Google Scholar]
  20. Lee, J.-M.; Yoon, S.-I. The Effects of Task Value, Perceived Usefulness, and Teaching Presence on Learning Outcomes in Cyber University. J. Korean Assoc. Inf. Educ. 2011, 15, 449–458. [Google Scholar]
  21. Park, J.; Kim, S. Analysis of Influencing Factors of Learning Engagement and Teaching Presence in Online Programming Classes. J. Inf. Commun. Converg. Eng. 2020, 18, 239–244. [Google Scholar] [CrossRef]
  22. Jialiang, H.; Huiying, Z. Mobile-based education design for teaching and learning platform based on virtual reality. Int. J. Electr. Eng. Educ. 2020, 0020720920928547. [Google Scholar] [CrossRef]
  23. Chen, W.; Jeong, S.; Jung, H. WiFi-Based home IoT communication system. J. Inf. Commun. Converg. Eng. 2020, 18, 8–15. [Google Scholar] [CrossRef]
  24. Hong, S. Technology trends and policies for IoT security. Int. J. Emerg. Multidiscip. Res. (IJEMR) 2020, 4, 1–6. [Google Scholar] [CrossRef]
  25. Wang, E. Improving Interface Usability of Mobile E-Commerce Augmented Reality (AR) Contents. Master’s Thesis, Yonsei University, Seoul, Korea, 2021. Available online: http://www.riss.kr/link?id=T15732039 (accessed on 17 September 2021).
  26. Lee, W.W.; Owens, D.L. Multimedia-Based Instructional Design: Computer-Based Training, Web-Based Training, Distance Broadcast Training, Performance-Based Solutions; Pfeiffer: San Francisco, CA, USA, 2004; ISBN 9780787970697. [Google Scholar]
  27. Jones, B.A. ADDIE Model (Instructional Design). 2014. Available online: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.572.4041 (accessed on 17 September 2021).
  28. Davis, A.L. Using instructional design principles to develop effective information literacy instruction: The ADDIE model. Coll. Res. Libr. News 2013, 74, 205–207. [Google Scholar] [CrossRef]
  29. Wang, S.-K.; Hsu, H.-Y. Using ADDIE model to design Second Life activities for online learners. In Proceedings of the Association for the Advancement of Computing in Education (AACE), Las Vegas, NV, USA, 17 November 2008; pp. 2045–2050. Available online: https://www.learntechlib.org/primary/p/29946 (accessed on 17 September 2021).
  30. Jung, C.W. Smartphone App Feasibility Test for Activity Daily Living Education in Stroke Patients. Master’s Thesis, Honam University, Gwangju, Korea, 2018. Available online: http://www.riss.kr/link?id=T14730924 (accessed on 17 September 2021).
Figure 1. ADDIE (Analysis, Design, Development, Implement and Evaluation) model.
Figure 2. Design for menu architecture of mobile application.
Figure 3. Learning contents for the Arduino class: (a) making an Arduino circuit in Tinkercad and with the Arduino kit; (b) programming practice learning in the Arduino IDE; (c) completing the circuit diagram by adding a Bluetooth module; (d) making an IoT system prototype in App Inventor.
Figure 4. Development of the mobile application: (a) the first screen shown when the mobile application is run, which connects to the main page, login, and member registration; (b) the membership registration menu; (c) the menu for selecting a question item; (d) the menu where students answer a question; (e) the menu where the teacher views statistics for a specific question; (f) the menu where learners self-check all items; (g) the menu where the teacher views all students and the progress of a specific student.
Table 1. Question items for the mobile application usability evaluation.

Wide Scope Unit | Detail Scope Unit | Number of Question Items
Contents | Accuracy | 2
Contents | Understanding | 3
Contents | Objectivity | 4
Contents | Total of Contents | 9
Interface's Design | Consistency | 3
Interface's Design | Design suitability | 5
Interface's Design | Vocabulary accuracy | 3
Interface's Design | Total of Interface's Design | 11
Technology | Security | 2
Total | | 22
Table 2. Weekly organization of the learning contents.

Week | Wide Scope Unit | Detail Scope Unit
1 | Orientation and Arduino's elements |
2 | Output parts and circuits | Tinkercad, App Inventor and LED output
3 | Output parts and circuits | Bluetooth connection, aia project file, using components
4 | Output parts and circuits | Common cathode, common anode, RGB LED output and mobile control
5 | Output parts and circuits | Servo motor output and mobile control
6 | Output parts and circuits | Resistance size, piezo speaker output and mobile control
7 | Output parts and circuits | Motor driver, DC motor output and mobile control
8 | Output parts and circuits | Midterm test project making and presentation
9 | Input parts and circuits | FND and LCD output and mobile control
10 | Input parts and circuits | Ultrasonic sensor input and mobile control
11 | Input parts and circuits | Temperature and humidity sensor input and mobile control
12 | Input parts and circuits | Variable resistance and joystick input and mobile control
13 | Input parts and circuits | PIR sensor input and mobile control
14 | Input parts and circuits | Photoresistor sensor input and mobile control
15 | Input parts and circuits | Final test project making and presentation
Table 3. Definition of tables and fields in database.

Table's Name | Field's Name | Data Type | Example
studentTBL | student_Number (primary key) | Integer | 20200001
studentTBL | student_Name | Text | Gildong Hong
studentTBL | student_Password | Text | hgd1234!!
questionTBL | student_Number (foreign key) | Integer | 20200001
questionTBL | question_Number | Integer | 1
questionTBL | question_Text | Text | I understood the teacher's explanation.
questionTBL | answer_YorN | Text | Y/N
questionTBL | complete_Percent | Integer | 100%
questionTBL | yes_Persons | Integer | 100
questionTBL | no_Persons | Integer | 0
Table 4. Students’ mobile application feasibility evaluation.

Wide Scope Unit | Detail Scope Unit | Contents | Mean | SD
Mobile contents | Accuracy | Learning-related information is reliable. | 4.22 | 0.93
Mobile contents | Accuracy | Learning-related information is clear. | 4.28 | 0.74
Mobile contents | Accuracy | Total of accuracy | 4.25 | 0.83
Mobile contents | Understandability | Easy to understand learning-related information. | 4.25 | 0.77
Mobile contents | Understandability | Learning-related terms are familiar to me. | 4.22 | 0.80
Mobile contents | Understandability | Learning-related information level is easy to understand even in the early stages of learning. | 4.11 | 0.78
Mobile contents | Understandability | Total of understandability | 4.19 | 0.78
Mobile contents | Objectivity | Learning-related information has specialty. | 4.06 | 0.95
Mobile contents | Objectivity | Learning-related information is systematic and specific. | 3.97 | 0.97
Mobile contents | Objectivity | Learning-related information is provided by an authoritative institution. | 3.94 | 0.98
Mobile contents | Objectivity | Suitable for providing information with specialized knowledge of learning content. | 4.36 | 0.72
Mobile contents | Objectivity | Total of objectivity | 4.08 | 0.92
Interface design | Consistency | There is consistency in color, arrangement, and expression method. | 4.19 | 0.86
Interface design | Consistency | The arrangement of icons in the mobile application is unified with the overall design. | 4.19 | 0.89
Interface design | Consistency | The icons in the mobile application are grouped together for consistency. | 4.22 | 0.80
Interface design | Consistency | Total of consistency | 4.20 | 0.84
Interface design | Design suitability | Arrange the content for gradually access and make it logically easy to understand. | 4.06 | 0.95
Interface design | Design suitability | The meaning of the icon was clearly expressed. | 4.14 | 0.83
Interface design | Design suitability | The characters used in the mobile application are in a size and font that are easy for the viewer to read. | 4.42 | 0.81
Interface design | Design suitability | The visual elements work comfortably on the user. | 4.03 | 0.74
Interface design | Design suitability | You can grasp the structure of mobile applications at a glance. | 3.97 | 0.94
Interface design | Design suitability | Total of design suitability | 4.12 | 0.86
Interface design | Vocabulary accuracy | The phrases used in the mobile application are concise. | 4.06 | 0.83
Interface design | Vocabulary accuracy | The phrase used in the mobile application is accurate. | 4.22 | 0.76
Interface design | Vocabulary accuracy | The phrase used in the mobile application is correct. | 4.19 | 0.79
Interface design | Vocabulary accuracy | Total of vocabulary accuracy | 4.16 | 0.79
Technology | Security | Information on personal information protection was presented. | 3.92 | 1.00
Technology | Security | Presented a security policy for learning-related personal information. | 3.53 | 1.18
Technology | Security | Total of security | 3.72 | 1.10
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
