Article

Pedagogical and Technical Analyses of Massive Open Online Courses on Artificial Intelligence

by Emilio José Delgado Algarra 1,2,*, César Bernal Bravo 3, María Belén Morales Cevallos 4 and Eloy López Meneses 5

1 Department of Integrated Didactics, Universidad de Huelva, Avenida de la Fuerzas Armadas, S/N, 21007 Huelva, Spain
2 COIDESO Research Center, Universidad de Huelva, Avenida de la Fuerzas Armadas, S/N, 21007 Huelva, Spain
3 Department of Education Sciences, Language, Culture and Arts, Rey Juan Carlos University, Paseo Artilleros s/n, 28032 Madrid, Spain
4 Faculty of Marketing and Communication, Universidad Ecotec, Via Principal Campus Ecotec Km 13.5, Samborondón 092302, Ecuador
5 Department of Education and Social Psychology, Pablo de Olavide University, 41013 Sevilla, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(3), 1051; https://doi.org/10.3390/app14031051
Submission received: 8 December 2023 / Revised: 6 January 2024 / Accepted: 23 January 2024 / Published: 26 January 2024
(This article belongs to the Special Issue Information and Communication Technology (ICT) in Education)

Featured Application

This study offers an overview of the importance of artificial intelligence in education through its reflection in MOOCs. These courses are open to all citizens and involve the participation of public institutions and private companies, so the content of this article is useful as a source for other researchers and for the transfer of knowledge beyond the formal education context.

Abstract

MOOCs (massive open online courses) are popular distance courses for which anyone can sign up online with no limits on the number of participants. Artificial intelligence, in turn, is a combination of algorithms through which machines develop human-like and rational capabilities. This article presents a quantitative study of a sample of 734 MOOCs on artificial intelligence from three important platforms. Through exploratory and factor analyses, and with the support of a category system, it is concluded that, with regard to the technical dimension, there are similarities in terms of access to content, ease of navigation, design, toolbars, consistency, visible hypertexts, browsing support and links, help in content searching, and course development. Regarding the pedagogical dimension, xMOOCs represent the most extensive international trend, and unidirectional resources predominate. In relation to the content dimension, MOOCs that include content on the emerging and current uses of artificial intelligence in learning and training are remarkable, and three main trends in MOOCs on artificial intelligence stand out: machine learning and education, ethics of AI, and human learning and inclusivity.

1. Introduction

Globalization has spread technology, trade, and democracy across the globe [1], and has been expanded with the development of information and communication technologies (ICTs). The development of technologies by itself is not decisive for the transformation of social structures or the change in the teaching–learning processes. In this sense, learning and knowledge technologies (LKTs) recognize the importance of the active nature of users in the learning society. LKTs, unlike ICTs, do not focus attention on technical or instrumental aspects but rather on educational aspects that imply didactic mediation [2,3]. With regard to this, in learning environments, MOOCs (massive open online courses) are distance courses where anyone can sign up online without limits on the number of participants. The acronym comes from massive, referring to the possibility of a huge number of people attending the course; open, the content that can be shared and sometimes modified; online, autonomous access and development via the Internet; and course, which is structured for learning by passing tests [4,5].
Over the years, researchers have considered different trends in educational environments. According to this new situation, “ignoring emerging technologies and not including responsible use of them in the classroom implies a disconnection with social reality and social (and labour) needs in the near future” [3] (p. 286), considering that adaptability to technological environments [6] or critical thinking and creative thinking [7] are essential.
The 2023 EDUCAUSE Horizon Report considers that the potential for AI to become mainstream is growing and that the online versus face-to-face dichotomy is being disrupted. The report highlights the impact of AI on the teaching and learning experience: “It can impact teaching by helping faculty create instructional content and grade assessments. It can impact the student experience by increasing engagement through the use of avatars and the metaverse, in addition to improving learning outcomes via personalizing learning” [8] (p. 10). Focusing on the increasing use of AI in socio-educational contexts, an analysis of the MOOCs on AI from the last 5 years was carried out using the three main platforms: Coursera, EdX, and MiriadaX. From an interdisciplinary perspective, this study focuses on the technical dimension, pedagogical dimension, and AI content of MOOCs. Thus, combining exploratory and factorial approaches, the research objectives were as follows: Objective 1, to understand the technical dimension of MOOCs on artificial intelligence on the Coursera, EdX, and MiriadaX platforms; Objective 2, to understand the pedagogical dimension of MOOCs on artificial intelligence on the Coursera, EdX, and MiriadaX platforms; and Objective 3, to understand the main MOOC trends regarding artificial intelligence.

1.1. Technical and Pedagogical Dimensions of MOOCs

Regarding the technical dimension of MOOCs, this study focuses on access, navigation, and interactivity. To access some programs and browsers, software components that add specific features (plugins) are needed. Academic studies on MOOCs [9,10,11,12] have established dimensions around quality that include items concerning didactics and pedagogy, the trainer’s and the student’s roles, and functionality, among others. Along these lines, and focusing on roles and functionality, the present study examines the navigation (design, browsing, hypertexts, search engine content, etc.) and interactivity (teacher–student, student–student, messaging, chat, etc.) of MOOCs and MOOC platforms. Regarding the pedagogical dimension, MOOCs involve modular learning that is accessible and allows students or professionals to acquire specific competences based on their actual academic or work needs. This includes the partitioning of degrees into smaller units, described by the edX CEO [13] as “Lego-like building blocks of learning”. In general terms, competences are defined as a combination of knowledge (facts, concepts, ideas, theories), skills (the ability to carry out processes), and attitudes (disposition to act/react to ideas, persons, or situations). The 2006 Recommendation on Key Competences for Lifelong Learning encouraged the development of competence-oriented teaching and learning. In general terms, seven key competences can be highlighted [14]: communication; mathematical competence and basic competences in sciences and technology; digital; learning to learn; social and civic; sense of initiative and entrepreneurship; and cultural awareness and expression. According to this approach, the content is established at different levels for the integration of knowledge, skills, and attitudes. A previous study has set out three types of integration processes between knowledge and skills [15]: low-road integration, resulting from practice towards automatic performance; high-road integration, when learners are able to abstract and detach information from its original context and apply it in new contexts; and transformative integration, linked to critical reflection that involves the ability to withstand social pressure.
The popularization of MOOCs was reflected in the article “The Year of the MOOC” in the New York Times [16]. In relation to the methodology, the main international trend considers xMOOCs and cMOOCs as the basic types [17]: xMOOCs are unidirectional, behaviorism-based courses where evaluation is based on questions, tests, and/or work delivery by students, and where dialogue between students is not promoted. cMOOCs are connectivism-based courses oriented under the guidelines of the connective learning of Siemens [18] and Downes [19]. Moreover, xMOOCs are teacher-based and centralized, while cMOOCs are self-organized and networked [17]. According to this classification and in relation to learning type, the following options can be considered: cumulative, comprehensive, integrated, and critical, with a constant or ascending level of complexity.
Regarding resources, MOOCs can use videos, readings, and questionnaires (typical in traditional courses); additionally, they can include forums to create a learning community [20,21]. The evaluation of MOOCs can cover what, when, and how to evaluate students. What to evaluate is related to objectives and content; when to evaluate considers several options (at the end of each module; at the end of the course; or before, during, and at the end); and how to evaluate [17] can include self-assessment, peer assessment, etc. Regarding the MOOC platforms, in 2012, due to the high number of enrolled students and based on the technology developed at Stanford, Daphne Koller and Andrew Ng created Coursera. This platform was supported by universities such as Michigan, Penn, Princeton, and Yale. In 2012, the Massachusetts Institute of Technology (MIT) and Harvard jointly launched the edX project, which grew out of MITx at MIT. Moreover, platforms such as MiriadaX also exist.

1.2. Artificial Intelligence in Educational Learning

According to the EDUCAUSE website, IBM Watson Education focuses on using AI to expand learning outcomes and solutions to support all students, following several steps: Step 1, personalized content for students based on mastery, with lessons, activities, and assignments related to today’s society; Step 2, enhancing early childhood vocabulary development through a tablet-based vocabulary learning app that can recognize areas needing additional focus; and Step 3, 1:1 AI-based tutoring for students, which allows users to track students’ progress, adapts the conversation, and provides insights for instructors. Moreover, computer science experts differentiate several types of AI, including systems that think like humans, systems that act like humans, systems that think rationally, and systems that act rationally [22]. AI allows teachers to personalize learning experiences, reducing workloads and assisting with dataset analysis. At its core, AI is a combination of algorithms designed to give machines capabilities similar to those of humans. In general terms:
AI uses computer systems to accomplish tasks and activities that have historically relied on human cognition. Advances in computer science are creating intelligent machines that functionally approximate human reasoning more than ever before. Harnessing big data, AI uses foundations of algorithmic machine learning to make predictions that allow for human-like task completion and decision-making. As the programming, data, and networks driving AI mature, so does the potential that industries such as education see in its application. However, as AI develops more human-like capability, ethical questions surrounding data use, inclusivity, algorithmic bias, and surveillance become increasingly important to consider. Despite ethical concerns, the higher education sector of AI applications related to teaching and learning is projected to grow significantly [8] (p. 27).
In general terms, AI can be used for education, teaching, and learning; moreover, MOOCs can focus on several dimensions of AI including algorithmic bias, programming, analysis, machine learning, deep learning, human learning, ethical questions in data use, or inclusivity. UNESCO’s principle of inclusion from Sustainable Development Goal 4 in the Education 2030 Framework for Action highlights the importance of using the necessary means that allow people to participate in their environment regardless of their specific needs or social situation [23]. This reduces social exclusion and eliminates forms of discrimination in the learning environment. The development of new technologies, such as AI, means new possibilities for attention to diversity, active participation, and social interaction in educational and social environments. In this sense, some opportunities can be focused on [24]:
  • Teachers’ modelling: AI can help teachers reflect on and improve the effectiveness of their instructional activities in classrooms.
  • Multimodal interactions: Sensing technology, ambient classroom tools, and educational robots introduce alternative dynamics in learning environments by increasing interactivity, engagement, and feedback for students and teachers.
  • Educational robots and empathic systems: Making a machine appear to be empathic through encoding can encourage children to adopt positive behaviors.
  • Ethical Issues: Ethics in AI is an area that is receiving attention. This is especially important due to the influence of machines on students.
Regarding ethical implications, some important aspects could be considered [25]: criteria for defining and updating the ethical boundaries of the collection and use of learners’ data; not being able to easily interrogate how AI makes decisions; the ethical obligations of private organizations (product developers) and public authorities (schools and universities); how students’ interests and emotions, as well as the complexity of the learning process, impact upon the interpretation of the data; and the ethics of AI applied in educational contexts and pedagogical approaches that are ethically warranted. In general terms, the UNESCO Recommendation on the Ethics of AI includes the following principles [26] (p. 61): proportionality and do no harm, safety and security, fairness and non-discrimination, sustainability, right to privacy and data protection, human oversight and determination, transparency and explainability, responsibility and accountability, awareness and literacy, and multi-stakeholder and adaptive governance and collaboration.
Adaptive learning technologies (ALTs) dynamically adjust content and the didactic sequence to improve student learning, through programmed interventions or with teacher intervention according to data provided by the system. One of the main objectives of these kinds of technologies is the orientation of students. This orientation includes adapting their learning itinerary, promoting active learning, and offering significant support to at-risk students [3]. Adaptive systems can organize learning content, with techniques for matching the presentation of content to individual students in a personalized way, and multiple assessment inputs to evaluate students’ skills. Adaptive systems need several features in terms of content, assessment, and competency frameworks. The main features of adaptive learning systems are as follows [27] (a minimal illustrative sketch of the selection logic appears after the list):
  • Content Scaffolding: Depending on the proficiency level of learners, scaffolding provides statistically different questions to various learners. In scaffolding methods, content modules are designed to index concepts.
  • Social Interaction: This refers to content-driven group collaboration related to social skills.
  • Content Inter-operability: Appropriate content is continuously and dynamically identified through interoperable content management systems.
  • Metadata: This method is used for the advanced tagging of content with underlying data on the different content modules (e.g., age, level, subject area identifiers, learning outcomes).
  • Normed- vs. Criterion-referenced Assessments: Criterion-referenced assessments show the performance of learners in relation to a defined set of outcomes. Norm-referenced assessments are designed to compare the performance of individual students with the performance of a representative sample of peers or “norm group”.
  • Predictive Psychometric Design: Adaptive tests make it possible to accurately place a learner on an individualized learning pathway; this is possible because the predictive capabilities are derived from the adaptive assessment design.
  • Diagnostic Classification Modelling: The diagnosis of cognition, competence of a particular skill, or sub-competence of a defined outcome is important in adaptive systems to align teaching, learning, and assessment.
  • Zone of Proximal Development: This refers to the difference between what learners can do without help and what they can do with help.
  • Self-assessment: Learners’ self-assessment is compared to what the adaptive system knows about a completed sequential piece of work.
  • Skill Standards Libraries: Skills are defined by the units of knowledge, skills, and abilities used in assessment. Libraries of skill standards are constructed as a correlative “benchmark” or outcome in modular adaptive content and assessment, informing students of what is expected of them.
  • Competences/Sub-competencies: Identified skills and competencies are delineated by “sub-competencies”.
  • Prerequisite Knowledge and Prior Knowledge Qualifiers: Prior learning as assessment is learner-centered, and places learners at a starting point for the next viable competence to learn to build on existing knowledge.
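To make the content scaffolding and zone-of-proximal-development features above more concrete, the toy sketch below shows one way an adaptive system might pick the next content module just above a learner’s estimated proficiency. All module names, difficulty values, and the “stretch” band are hypothetical illustrations, not drawn from the reviewed courses or platforms.

```python
# Toy illustration (hypothetical names and thresholds, not the article's system):
# pick the next module whose difficulty lies just above the learner's proficiency,
# echoing the "content scaffolding" and "zone of proximal development" features.
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    difficulty: float  # 0.0 (introductory) ... 1.0 (expert)

def next_module(proficiency: float, modules: list, stretch: float = 0.15) -> Module:
    """Return the easiest module that is harder than the learner's current level,
    preferring modules within a small 'stretch' band above that level."""
    harder = [m for m in modules if m.difficulty > proficiency]
    if not harder:                      # learner has outgrown the catalogue
        return max(modules, key=lambda m: m.difficulty)
    in_band = [m for m in harder if m.difficulty <= proficiency + stretch] or harder
    return min(in_band, key=lambda m: m.difficulty)

catalog = [Module("Linear regression", 0.2),
           Module("Neural networks", 0.5),
           Module("Transformers", 0.8)]
print(next_module(0.45, catalog).name)  # -> Neural networks
```

In a real adaptive system, the proficiency estimate would come from assessment data rather than a fixed number, and the catalogue would be indexed with the metadata described above.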
AI can be applied in terms of education in multiple ways and focused on students, teachers, or institutions [28]. Due to the importance of student autonomy in MOOCs, in this study, we were especially interested in the pedagogical aspects related to the learning process.
As the NMC Horizon Report 2018 states, ALT “refers to technologies that monitor student progress and use data to modify instruction at any time” [29] (p. 42), and they are linked to personalized learning approaches and learning analytics. In general terms, ALTs are related to AI in that they use its basic algorithms to personalize learning and to offer content adapted to students. Due to the expansion of MOOCs and the development of artificial intelligence (AI) in daily life as an assistive technology for children, adults, and education, a review of the trends in artificial intelligence in MOOCs from the last 5 years was carried out on the following three platforms: Coursera, EdX, and MiriadaX. Considering the complexity of both conceptualizations, and in order to understand the main relationships between MOOC content, not only central-tendency analyses but also a factorial analysis based on correlations were carried out, given the importance of this technology for education and learning.

2. Materials and Methods

2.1. Research Methodology, Sample, and Data Collection

The methodological approach of this research is quantitative. The sample consists of a random selection of 734 MOOCs on AI from the last 5 years from the following platforms: Coursera (654), Edx (75), and MiriadaX (5). They can be divided into five macro-reference areas (Figure 1): experimental sciences (64/734), health sciences (30/734), technical teachings (526/734), social and legal sciences (84/734), and humanities (30/734).
Sampling was carried out through the search bar of the respective platforms using the following keywords: “artificial intelligence”, “human learning”, “machine learning”, and “deep learning analysis”. The sampling strategy was simple random sampling. In this strategy, every set of items has the same probability of being chosen. Pedagogical and technical dimensions of MOOCs in the data collection tool were adapted from several questionnaires: a questionnaire for the evaluation of the teaching, technical, and educational aspects of teaching sites [12] and a questionnaire to measure the quality of a MOOC [30]. Regarding these dimensions, the ‘Yes’ and ‘No’ scale was maintained, with several exceptions. Items for data collection in the specific “artificial intelligence” block were related to the theoretical background. In terms of artificial intelligence, we focused on “content” and “types”. Exploratory and factorial analyses allowed us to define emerging trends in MOOCs regarding artificial intelligence.
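As a minimal sketch of this sampling step, simple random sampling can be expressed as follows. The search helper, record fields, and pool size below are hypothetical, since the actual course records came from the platforms’ search bars.

```python
# Hedged sketch of the sampling strategy: every course in the retrieved pool has the
# same probability of being drawn. The search helper and record fields are hypothetical.
import random

keywords = ["artificial intelligence", "human learning",
            "machine learning", "deep learning analysis"]

def search_platform(platform: str, keyword: str) -> list:
    """Placeholder for querying a platform's search bar and returning course records."""
    return []  # in practice: scrape or call the platform catalogue

# Suppose `pool` already holds all unique AI-related courses found on the platforms.
pool = [{"platform": "Coursera", "title": f"Course {i}"} for i in range(2000)]

random.seed(42)                       # for reproducibility of the draw
sample = random.sample(pool, k=734)   # simple random sample, n = 734
print(len(sample), sample[0]["title"])
```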

2.2. Categories System and Data Analysis

After the problem was defined and references were reviewed, a system of categories was used, whose specific structure was adapted from a reference source [2]. This reference includes a coding system in which “DID” corresponds to the category of the pedagogical dimension of the MOOCs and “TEC” corresponds to the technical dimension of the MOOCs. In addition, a new “AI” category was added with the support of other specific proposals [3,8,24,31,32,33]. In general terms, the three categories of this system regarding MOOC platforms and MOOCs are the pedagogical dimension (Table 1), the technical dimension (Table 2), and the specific artificial intelligence block (Table 3).
After categorization, the data were entered into the statistical program SPSS v.19 for analysis. The analyses include an exploratory analysis through descriptive methods for all categories and a factor analysis through correlational methods for items in specific categories. In terms of the general structure of the analysis of the technical and pedagogical dimensions of this study, we took as a reference the study of MOOCs on citizenship education [2], focusing, in this case, on MOOCs on artificial intelligence.
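Although the analyses were run in SPSS, the exploratory (descriptive) step can be sketched in a few lines; the fragment below is a hypothetical illustration in which the file name and column codes are assumed from the category system in Tables 1–3.

```python
# Minimal sketch of the exploratory step: frequencies and percentages for the coded
# Yes/No items. The CSV file and exact column names are assumptions for illustration.
import pandas as pd

df = pd.read_csv("mooc_ai_categories.csv")   # one row per MOOC (n = 734)

for col in ["DID.BAS.01", "AI.CON.04", "AI.CON.07"]:
    counts = df[col].value_counts()
    percentages = df[col].value_counts(normalize=True).mul(100).round(1)
    print(col, dict(counts), dict(percentages))
```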

3. Results and Discussion

3.1. Technical Dimension of MOOCs

According to objective 1 and research question 1, the technical dimension refers to the possibilities of interactivity, browsing, and accessibility offered by Coursera, EdX, and MiriadaX. These platforms are very similar in technical terms. In comparison with our previous research [2], there are similarities regarding access to content, ease of navigation, design, toolbars, consistency, visible hypertexts, browsing support and links, help with content search, and the development of the course. Teacher–student interaction is possible, for example, through private messages, whereas student–student interaction and the use of specific tools for oral communication are not. Moreover, social networks and student interaction tools do not have notable relevance in the courses reviewed. Nevertheless, previous educational research considered the importance of social networks and the interaction between students in MOOCs [35]. In other words, xMOOCs represent the most extensive international trend. Consistent with this international MOOC trend, previous research has indicated that interaction is one of their main drawbacks [2,36], including challenges for MOOC developers such as support for interaction, the balance between theory and practical examples, technical and learning strategy support for students, and support that includes communication with other learners and feedback from teachers.

3.2. Pedagogical Dimension of MOOCs

Regarding the pedagogical dimension of MOOCs in objective 2 and research question 2, according to the data collected on INITIAL BASIC INFORMATION, the main language (DID.BAS.01) of artificial intelligence courses is English, with 694/734 courses. Temporalization (DID.BAS.02) is quite varied: 19/734 courses are designed to last 1 week or less, 205/734 courses run from 1 to 4 weeks, 103/734 courses last 5 weeks, 266/734 courses run for 6 weeks, 35/734 courses last 7 weeks, and 106/734 courses run for 8 weeks or more. The list of modules (DID.BAS.03) is presented in 659/734 courses, and the figures for the number of modules (DID.BAS.04) are similar. The difficulty levels (DID.BAS.05) were classified as beginner (240/734 courses), intermediate and mixed (419/734 courses), and expert (75/734 courses). Some 508/734 courses do not require prior knowledge (DID.BAS.06). Courses usually present teaching teams or a person responsible for the training (DID.BAS.07). The general information guide (DID.BAS.08) is presented in almost all courses, and contact information (DID.BAS.09) in 379/734 courses.
In relation to the objectives and competences subcategory, the aims (DID.OYC.01) are implicitly or explicitly presented in all courses. However, a lack of balance is observed between concepts, procedures, and attitudes: the main objectives are defined with a conceptual approach. On the other hand, competences (DID.OYC.02) are not explicit in 711/734 courses, but they can be considered from an implicit point of view. Communication competence (DID.OYC.03) is considered in 112/734 courses; mathematical competence and basic competences in sciences and technology (DID.OYC.04) in 222/734 courses; digital competence (DID.OYC.05) in 628/734 courses; learning to learn competence (DID.OYC.06) in 168/734 courses; social and civic competence (DID.OYC.07) in 426/734 courses; sense of initiative and entrepreneurship competence (DID.OYC.08) in 349/734 courses; and cultural awareness and expression competence (DID.OYC.09) in 256/734 courses. In the courses reviewed, the development of digital competence predominates, followed by social and civic competence and the sense of initiative and entrepreneurship competence. Within the content subcategory, the integration of knowledge, skills, and attitudes (DID.CON.01) is present in an eminently declarative manner in 255/734 courses. Consistent with our research results, digital competence development through MOOCs has been recognized in other research [37]. Moreover, the general integration of AI into the everyday context could explain the significant importance of social and civic competence in most of the MOOCs from our research sample. Research into the pedagogical elements of MOOCs reflected broad cognitive competences that emphasize the importance of critical thinking, evidence-based argumentation, evaluation of evidence, and application of acquired knowledge in problem solving [38]. In general terms, critical thinking has been acknowledged as a trans-disciplinary skill [39], but this skill needs the support of disciplinary content to solve problems and build arguments.
In the methodology subcategory, regarding the description of the teaching activity (DID.MET.01), in only 38/734 of the courses did the teacher actively act as a guide. In other words, there is an almost total predominance of a passive teacher and the dispensing of content (696/734). The details of students’ workload (DID.MET.02) are present in all courses, while participation in the activities (DID.MET.03) is individual in 694/734 courses. Although previous research recognized the benefits of critical thinking in learning [38], 75.6% of the courses from the sample are cumulative or comprehensive and do not include elements that encourage the construction of critical competence. Thus, the type of learning (DID.MET.04) is predominantly cumulative in 275/734 courses, comprehensive in 287/734, critical in 90/734, and integrated in 82/734 courses (Figure 2). Regarding the degree of complexity (DID.MET.05), it is constant in 350/734 courses and ascending in 384/734.
In relation to the resources subcategory, readings (DID.REC.01), videos (DID.REC.02), and quizzes (DID.REC.03) are present in all MOOCs of the sample. There is no use of social networks (DID.REC.04), and the use of websites (DID.REC.05) and other applications and resources (DID.REC.06) is considered in 82/734 and 9/734 courses, respectively. Consistent with our analysis of the technical dimension, there is a lack of variety in the resources used in many MOOCs in the sample. Considering these deficits, several studies that have considered the students’ point of view state that MOOCs should create learning communities to increase student interaction, student feedback, and communication, including both synchronous and asynchronous communication tools, requiring more resources in addition to lecture videos [40,41,42]. One possibility for achieving this is through tools inspired by social networks, such as Facebook [43]. In the schedule subcategory, the temporal detail for content development (DID.CRO.01), including customized temporalization, is present in all courses. Key dates and deadlines (DID.CRO.02) are clearly defined in 663/734 courses. In the evaluation subcategory, the assessment criteria (DID.EVA.01) are defined in all courses. When to evaluate (DID.EVA.02) is mainly considered at the end of each module (628/734), followed by at the end of the course (36/734) and before, during, and at the end (35/734); it is not defined in 35/734 courses. How to evaluate (DID.EVA.03) is explicitly indicated in 689/734 courses. Researchers in several disciplines have reported similar findings regarding assessment [35,36,37,38,39]. Participation in forums (DID.EVA.06) and the drafting of written work (essays, reports, etc.) (DID.EVA.07) are used as evaluation tools in 36 and 29 courses, respectively. Finally, in relation to the bibliography subcategory, a basic bibliography is present in 605/734 courses (including scientific reading such as articles, chapters, etc.), and a basic and complementary bibliography is present in 91/734 courses. A bibliography is not included in 38/734 courses.

3.3. Exploratory Analysis of Artificial Intelligence Content in MOOCs

This specific block is related to objective 3 and responds to a part of research question 3. Frequencies and percentages of the course content related to artificial intelligence are shown in Table 4. Considering both implicit and explicit approaches, ethical questions in data use (AI.CON.07) and inclusivity (AI.CON.08) are the least frequent artificial intelligence content (4.4% and 5.2%, respectively), whereas the most widespread content in current MOOCs is machine learning (AI.CON.04), at 83.7%. This is followed by human learning (AI.CON.06) at 65.1%, analysis (AI.CON.03) at 62.6%, deep learning (AI.CON.05) at 61.3%, education, teaching, and learning (AI.CON.09) at 56.3%, programming (AI.CON.02) at 53.8%, and algorithmic bias (AI.CON.01) at 28.6%.
In general terms, learning issues have an important presence in current MOOCs about artificial intelligence, highlighting the ability of systems to automatically learn and improve from experience without being explicitly programmed (machine learning). This also highlights the human process of acquiring knowledge, including skills and values (human learning). Education, teaching, and learning has a low explicit presence in MOOCs about artificial intelligence; however, considering implicit approaches too, this content has a relatively important presence. In other words, MOOCs that include content on the emerging and current uses of artificial intelligence in education and training are notable, considering its role as an assistive technology for children and adults. On the other hand, AI approaches were considered (Table 5): systems that think like humans, systems that act like humans, systems that think rationally, and systems that act rationally.
In this case, the predominant approach of the courses in the sample has been taken as a reference. Although one type does not always predominate, when the predominance is clear (implicitly or explicitly), AI from the perspective of systems that think like humans is present in 56.1% of the 734 courses. This is followed by courses where AI is studied from the approach of systems that think rationally (16.9%) and systems that act like humans (9.4%); this last approach is normally linked to robotics. The least predominant approach is that of systems that act rationally, at 7.3%. These data imply that, in relation to AI, there are more courses focused on systems that think than on systems that act. On the other hand, courses focused on thinking like humans predominate over courses focused on rational thinking. This is consistent with the interest that researchers currently have in the study, development, and use of AI in relation to the representation and interpretation of human emotions, including the risks of interpreting human emotions with AI and the challenge of building fair and equitable machine learning systems [22,31].

3.4. Factor Analysis of the Artificial Intelligence Content in MOOCs

The factor analysis regarding the content of artificial intelligence in MOOCs is related to objective 3 and responds to research question 3. The first step to consider in factor analysis was to ensure that the sample number (734) was at least five times greater than the number of items. Next, as is shown in Table 6, the Kaiser–Meyer–Olkin (KMO) sampling adequacy index gives an acceptable result (0.693), as it is situated between 0.500 and 0.750. Moreover, Bartlett’s test of sphericity indicates a high statistical significance (p < 0.005) and that the factors are correlated. Thus, the factor model used is adequate to explain the data.
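For readers who wish to reproduce these diagnostics outside SPSS, the sketch below computes Bartlett’s test of sphericity and the overall KMO index directly from the item correlation matrix; the binary matrix X is a random stand-in, not the study’s dataset.

```python
# Hedged sketch: Bartlett's sphericity test and overall KMO computed from scratch.
# `X` is a random stand-in for the 734 x k matrix of 0/1 AI-content indicators.
import numpy as np
from scipy import stats

def bartlett_sphericity(X):
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    dof = p * (p - 1) / 2
    return chi2, stats.chi2.sf(chi2, dof)          # statistic, p-value

def kmo(X):
    R = np.corrcoef(X, rowvar=False)
    inv = np.linalg.inv(R)
    # anti-image (partial) correlations from the inverse correlation matrix
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    off = ~np.eye(R.shape[0], dtype=bool)           # off-diagonal mask
    r2, p2 = (R[off] ** 2).sum(), (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

X = np.random.default_rng(0).integers(0, 2, size=(734, 10)).astype(float)
print(bartlett_sphericity(X), round(kmo(X), 3))
```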
To explain this KMO value, it should be clarified that, according to the communalities and prior to continuing with the factor analysis, items with a communality value lower than 0.500 were excluded (AI.CON.02. Programming, AI.CON.03. Analysis, AI.TYP.04. Systems that act rationally). The exception was the item “AI.CON.07. Ethical questions in data use”, which was retained despite being slightly below the minimum desirable value because it is considered especially relevant by the researchers and by the research on AI [44]. After removing AI.CON.02 and AI.CON.03, beyond the improvement in the KMO value (from 0.679 to 0.693), the first four components explain 64.139% of the variance (previously below 60%). This supports the factor reduction, since an explained variance of 64.139% is generally considered acceptable [45].
Pearson correlations allow researchers to interpret the degree of the relationship between pairs of variables according to their magnitude (absolute value) and direction (sign); these scores can be low (above 0.30), medium (above 0.50), high (above 0.70), or perfect (1) [2]. Figure 3 groups the significant correlations. In addition, the significance of the correlations was considered, with values significant at the 0.01 level denoted by ** (probability of the correlation being due to chance equal to or less than 1%).
Despite the absence of high correlations, there are medium–low correlations that were selected for their statistical significance (correlation significant at the 0.05 level [bilateral]) and for their special research interest, such as the correlations with “AI.CON.07. Ethical questions in data use” and with “AI.CON.08. Inclusivity”. The importance of the ethics of AI for researchers and policies has been considered [14,44], highlighting that the ethics of AI lies in the ethical quality of prediction, the end outcomes, and the impact on humans. The selected correlations have been organized in the correlational structure shown in Figure 3. This correlation structure provides a visual image of the main correlations of the study in relation to the AI block and is complementary to the establishment of the emerging profiles of MOOCs on AI supported by principal component analysis.
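A minimal sketch of how such pairwise correlations and their significance flags could be reproduced (with stand-in data; the column codes follow the category system) is shown below.

```python
# Hedged sketch: pairwise Pearson correlations between 0/1 AI-content indicators,
# flagging those significant at the 0.01 level with ** as in Figure 3. Stand-in data.
from itertools import combinations
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
cols = ["AI.CON.05", "AI.CON.06", "AI.CON.07", "AI.CON.08", "AI.CON.09"]
df = pd.DataFrame(rng.integers(0, 2, size=(734, len(cols))), columns=cols)

for a, b in combinations(cols, 2):
    r, p = pearsonr(df[a], df[b])
    flag = " **" if p <= 0.01 else ""
    print(f"{a} - {b}: r = {r:.3f}{flag}")
```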
In relation to the AI content and perspectives present in MOOCs, “AI.CON.09. Education, teaching, and learning” presents a significant medium–low correlation with “AI.CON.06. Human learning” (0.420**) and “AI.CON.05. Deep learning” (0.458**); the last two indicators correlate at 0.355**. Education includes teaching and learning, and both humans and machines can learn. The relationship between these indicators relates to the importance of finding patterns in observations about a thing, process, or phenomenon: in general terms, whether it is human learning or machine learning, the expression of this pattern is the model that has been learned to define the relationship between the variables involved [46]. Deep learning correlates with “AI.CON.04. Machine learning”, establishing a chain of significant correlations between human learning, systems that think like humans, deep learning, and machine learning. However, although deep learning algorithms try to ease the understanding of how human cognition and learning work, machine learning does not seem to explain how humans decide and learn in real life. The human-like or even superhuman performance of machine learning programs does not explain how humans reason and learn, due to the omission of psychological characteristics such as limitations on information processing capabilities, attention, and short-term memory [47].
“AI.CON.08. Inclusivity” correlates with “AI.CON.06. Human learning” (0.262**) and “AI.CON.09. Education, teaching, and learning” (0.233**). Human learning is a complex process that, according to the content and perspectives of AI in the analyzed MOOCs, establishes a low but significant relationship with inclusivity; moreover, inclusivity establishes a relationship with education, teaching, and learning in similar terms. In line with these results, a Massachusetts Institute of Technology (MIT) research project explored the ways in which AI systems can be designed and deployed to support diversity and inclusiveness in society, with emphasis on the impact of AI on underserved groups and on how these communities think about AI systems. Moreover, the uneven access to AI and related technologies among often-marginalized populations (urban and rural poor communities, women, youth, LGBTQ people, ethnic and racial groups, people with disabilities) contributes to amplifying digital inequalities [48,49]. In other words, the digital divide in the access to and use of AI is a very relevant aspect that affects inclusivity.
“AI.TYP.01. Systems that think like humans” correlates with “AI.TYP.03. Systems that think rationally” (0.391**). “AI.TYP.03. Systems that think rationally” (0.316**) and “AI.TYP.02. Systems that act like humans” (0.307**) correlate with “AI.CON.07. Ethical questions in data use”. Human thinking goes beyond rational thinking and, when making decisions, both the emotions and the ethics of the person play a relevant role. This result converges with the research of Kuo, Tsai, and Wang [50], which establishes a model in which the reconfiguration of the instructional elements of challenging content is recommended so that users engage with the courses cognitively and emotionally. However, most studies on learning engagement in MOOCs have examined behavioral engagement and have ignored the importance of emotional and cognitive engagement [51,52]. In general terms, and focusing on AI content in MOOCs, this study establishes ethics as a relevant aspect.
The European Commission has a High-Level Expert Group on Artificial Intelligence which, on 8 April 2019, published the “Ethics Guidelines for Trustworthy Artificial Intelligence”. These guidelines put forward a set of seven key requirements that AI systems should meet in order to be considered trustworthy [49]: human agency and oversight; technical robustness and safety; privacy and data governance; transparency; diversity, non-discrimination, and fairness; societal and environmental well-being; and accountability. The ethical factors regarding the development and use of AI have generated profound political debates that, in Europe, will be reflected in the artificial intelligence law [53]. On the other hand, important educational debates are being generated around promoting the responsible use of this technology, in the face of a reactionary current that rejects the use of technology in classrooms and contributes to the digital illiteracy of students [54].
In a principal component analysis, each variable (or group of variables) depends on several components, which are in turn composed of all the other variables, so the choice of factors is based on the criterion of the percentage of variance explained [55]. As can be seen in Table 7, retaining factors 1 to 4 explains 64.140% of the variance. In short, by applying the principal component analysis, all variables would be grouped into six factors (Table 7).
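A hedged sketch of this retention criterion (fit a PCA on the standardized indicator matrix and keep the leading components by cumulative explained variance) is given below; the matrix X is a random stand-in, and the 60% cut-off is only an illustrative threshold.

```python
# Hedged sketch of the component-retention criterion: cumulative explained variance.
# `X` is a random stand-in for the 734 x 10 matrix of AI-content indicators.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.default_rng(2).integers(0, 2, size=(734, 10)).astype(float)
pca = PCA().fit(StandardScaler().fit_transform(X))
cumulative = np.cumsum(pca.explained_variance_ratio_) * 100
n_components = int(np.searchsorted(cumulative, 60) + 1)   # smallest k explaining >= 60%
print(cumulative.round(1), n_components)
```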
To interpret the components, the factor matrix was transformed into a rotated factor matrix using a Varimax rotation process with Kaiser normalization (Table 8). This facilitates interpretation by achieving high correlations with one group of variables and low correlations with the others.
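As a simplified sketch of this rotation step (plain Varimax on an illustrative loading matrix; the Kaiser normalization applied by SPSS is omitted here), the procedure can be written in a few lines of NumPy.

```python
# Hedged sketch: plain Varimax rotation of a factor-loading matrix. The article used
# SPSS's Varimax with Kaiser normalization; the normalization step is omitted here.
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)                      # accumulated rotation matrix
    prev = 0.0
    for _ in range(max_iter):
        LR = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (LR ** 3 - (1.0 / p) * LR @ np.diag((LR ** 2).sum(axis=0)))
        )
        R = u @ vt
        if s.sum() - prev < tol:       # the Varimax criterion has stopped improving
            break
        prev = s.sum()
    return L @ R                       # rotated loadings

# toy 5-item, 2-factor loading matrix (illustrative numbers only)
A = np.array([[0.7, 0.3], [0.6, 0.4], [0.2, 0.8], [0.1, 0.7], [0.5, 0.5]])
print(np.round(varimax(A), 2))
```

The rotation redistributes variance among the retained components so that each variable loads strongly on as few components as possible, which is what makes the rotated matrix in Table 8 easier to interpret.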
Principal component analysis allowed us to extract information to identify subgroups of MOOC content regarding AI from a general approach, preserving most of the information in the dataset and focusing attention on how much information each of the components can include. Generative AI has been identified by many higher education experts as one of the most disruptive technologies of recent years because it has the potential to create text, images, or sounds like a human and forces educators to reimagine assessment and educational experiences [8]. According to the main results and based on the principal component analysis extraction method, the 10 items about AI content in MOOCs can be reduced to the following four components. Component 1 explains 26.081% of the variance; we designated it machine learning and education, and it includes machine learning, deep learning, education, teaching and learning, and systems that think like humans. Component 2 explains 17.073% of the variance; we designated it ethics of AI, and it includes ethical questions in data use, systems that act like humans, and systems that think rationally. In this regard, despite concerns about the inappropriate use of AI in education, some experts consider that AI tools can improve student learning with the restructuring of didactic designs and experiences [56], and MOOCs about AI include some of the basic aspects of ethics in the use and management of AI presented by UNESCO [25,26]. Component 3 explains 10.655% of the variance; we designated it human learning and inclusivity, and it includes the human learning and inclusivity indicators. Finally, component 4 explains 10.330% of the variance; we designated it algorithmic bias, and it includes only that indicator. In this fourth case, the factor reduction is not significant.

4. Conclusions

Nowadays, there are numerous technologies that can be used in social and educational environments. AI is an increasingly integrated technology in day-to-day life, while MOOCs offer autonomous training opportunities to users. Regarding the technical dimension, the MOOC platforms in this study (Coursera, EdX, and MiriadaX) have similar technical features in terms of access to content, ease of navigation, design, toolbars, consistency, visible hypertexts, browsing support and links, help with content searches, and the development of the course. Regarding the pedagogical dimension, xMOOCs represent the most extensive international trend, and unidirectional resources predominate. In other words, in line with analogous studies regarding the more widespread MOOC approaches [2,17,18], the main approach of the MOOCs reviewed from the three platforms is the xMOOC, a kind of MOOC where interaction is lacking, and where student autonomy and recognition by prestigious institutions such as universities are some of the main features. Due to the main topic of this study, digital competence has a very significant importance in most MOOCs. Furthermore, due to the extent of xMOOCs and according to their features, a passive teacher approach, content dispensing, and individual participation in learning activities predominate. Moreover, the learning proposed in MOOCs is predominantly cumulative and comprehensive (not critical), where readings, videos, and quizzes are common resources and the usual evaluation process is based upon self-assessment, taking place at the end of each module. MOOCs are expected to be the instrument that defines training through the Internet in the coming years [57]. However, there are disadvantages that must be examined from a critical and situated perspective. This study found that xMOOCs are the most widespread, although, in a connectivist approach such as that of cMOOCs, knowledge shifts from focusing on the teacher to focusing on students’ interactions for learning [58,59].
In general terms, due to the extra effort needed for teachers to directly monitor massive courses, there are difficulties in promoting cMOOCs. In relation to the content dimension, MOOCs that include content on the emerging and current uses of artificial intelligence in education and training are remarkable. Authors such as Arntz, Gregory, and Zierahn [60] state that this digitization and automation of work is one of the most important societal and economic trends worldwide. Nowadays, adaptive systems comprise, at a minimum, methods for organizing the content to be learned, techniques for matching the presentation of content to individual students in a personalized manner, and multiple assessment inputs to evaluate students’ skills. This digitization trend is reflected in the content of MOOCs about AI in such a way that the factor analysis found three very marked trends in MOOCs on AI: machine learning and education, ethics of AI, and human learning and inclusivity. This research into MOOC trends on AI concludes that there is concern about the definition and development of AI ethics, including issues related to the seven key requirements of the “Ethics Guidelines for Trustworthy Artificial Intelligence” [61], and considering its role as an assistive technology for children and adults. There is an emerging trend in the study of AI related to the complexity of human learning and the inclusion of people, including the uneven access to AI and related technologies among often-marginalized populations (urban and rural poor communities, women, youth, LGBTQ people, ethnic and racial groups, people with disabilities) that amplifies digital inequalities [48,49]. Finally, there is a growing interest in the development of AI that learns as humans do, where machine learning and deep learning can interpret people’s learning and actively contribute to their education.

Author Contributions

Conceptualization, methodology, formal analysis, investigation, writing—original draft preparation, E.J.D.A.; writing—review and editing, E.J.D.A., C.B.B., M.B.M.C. and E.L.M. All authors have read and agreed to the published version of the manuscript.

Funding

Project of Educational Innovation and research “Diseño y desarrollo de experiencias educativas de realidad virtual y aumentada en didácticas”, Cátedra institucional “educación en tecnologías emergentes, gamificación e inteligencia artificial” (EduEmer) and Research Center in Contemporary Thinking and Innovation for Social Development (COIDESO).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All the data used in this paper can be obtained by contacting the authors of this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Niemczyk, E.K. Glocal Education in Practice: Teaching, Researcher, and Citizenship. BCES Conf. B 2019, 17, 1–6. [Google Scholar]
  2. Delgado-Algarra, E.J.; Román Sánchez, I.M.; Ordóñez Olmedo, E.; Lorca-Marín, A.A. International MOOC trends in citizenship, participation and sustainability: Analysis of technical, didactic and content dimensions. Sustainability 2019, 11, 5860. [Google Scholar] [CrossRef]
  3. Delgado-Algarra, E.J. ITCs and Innovation for Didactics of Social Sciences; IGI Global: Hershey, PA, USA, 2020. [Google Scholar]
  4. Pedreño, A.; Moreno, L.; Ramón, A.; Pernías, P. UniMOOC: Un trabajo colaborativo e innovación educativa. Campus Virtuales 2013, 2, 10–18. [Google Scholar]
  5. Infante-Moro, A.; Infante-Moro, J.-C.; Torres-Díaz, J.-C.; Martínez-López, F.-J. Los MOOC como sistema de aprendizaje en la Universidad de Huelva (UHU). IJERI Int. J. Educ. Res. Innov. 2017, 8, 163–174. [Google Scholar]
  6. Butler-Adam, J. The fourth industrial revolution and education. S. Afr. J. Sci. 2018, 114, 1. [Google Scholar] [CrossRef]
  7. Coberly-Holt, P.; Elufiede, K. Preparing for the Fourth Industrial Revolution with Creative and Critical Thinking. In Proceedings of the Annual Meeting of the Adult Higher Education Alliance, 43rd, Orlando, FL, USA, 7–8 March 2019. [Google Scholar]
  8. Pelletier, K.; Robert, J.; Muscanell, N.; McCormack, M.; Reeves, J.; Arbino, N.; Grajek, S. 2023 EDUCAUSE Horizon Report. Teaching and Learning Edition; EDUCAUSE: Boulder, CO, USA, 2023. [Google Scholar]
  9. Mengual-Andrés, S.; Roig Vila, R.; Lloret Catalá, C. Validación del cuestionario de evaluación de la calidad de cursos virtuales adaptado a MOOC. Rev. Iberoam. Educ. Distancia 2015, 18, 145–169. [Google Scholar] [CrossRef]
  10. Gallego, G.; Roldán López, N.D.; Torres Velásquez, C.F.; Rendón Ospina, F.; Puerta Gil, C.A.; Toro García, C.A.; Giraldo, J.M.A.; Sánchez, J.P.T.; Álvarez, Y.S.; Velásquez, C.F.T. WPD1.13 Informe Sobre Accesibilidad Aplicada a MOOC. MOOC-Maker Construction of Management Capacities of MOOC in Higher Education (561533-EPP-1-2015-1-ES-EPPKA2-CBHE-JP). 2016. Available online: http://www.mooc-maker.org/?dl_id=34 (accessed on 17 August 2023).
  11. Ortega Ruiz, I.J. Análisis de adecuación de los MOOC al u-Learning: De la Masividad a la Experiencia Personalizada de Aprendizaje. Propuesta uMOOC. 2016. Available online: https://goo.gl/6A5hxZ (accessed on 17 August 2023).
  12. Bournissen, J.M.; Tumino, M.C.; Carrión, F. MOOC: Evaluación de la calidad y medición de la motivación percibida. IJERI Int. J. Educ. Res. Innov. 2018, 11, 18–32. [Google Scholar]
  13. Agarwal, A. How Modular Education Is Revolutionizing the Way We Learn (and Work). Forb. Available online: https://bit.ly/30sPsAt (accessed on 30 June 2023).
  14. European Commission. Proposal for a Council Recommendation on Key Competences for LifeLong Learning. 2018. Available online: https://data.consilium.europa.eu/doc/document/ST-5464-2018-ADD-2/EN/pdf (accessed on 2 March 2023).
  15. Baartman, L.K.; De Bruijn, E. Integrating knowledge, skills and attitudes: Conceptualising learning processes towards vocational competence. Educ. Res. Rev. 2011, 6, 125–134. [Google Scholar] [CrossRef]
  16. Pappano, L. The Year of the MOOC. The New York Times, 2 November 2012. Available online: https://shorturl.at/mnoqy (accessed on 30 June 2023).
  17. Yousef, A.M.F.; Chatti, M.A.; Wosnitza, M.; Schroeder, U. Análisis de clúster de perspectivas de participantes en MOOC. RUSC Univ. Knowl. Soc. J. 2015, 12, 74–91. [Google Scholar] [CrossRef]
  18. Siemens, G. MOOCs for the Win! ElearnSpace. 2012. Available online: http://www.elearnspace.org/blog/2012/03/05/moocs-for-the%20win/ (accessed on 3 April 2023).
  19. Downes, S. The Rise of MOOC. 2012. Available online: http://www.downes.ca/post/57911 (accessed on 3 April 2023).
  20. Cabero, J.; Leiva, J.J.; Moreno, N.M.; Barroso, J.; López Meneses, E. Realidad Aumentada y Educación. Innovación en Contextos Formativos; Octaedro: Barcelona, Spain, 2016. [Google Scholar]
  21. López Meneses, E.; Vázquez-Cano, E.; Román, P. Analysis and implications of the impact of MOOC movement in the scientific community: JCR and Scopus (2010–2013). Comunicar 2015, 44, 73–80. [Google Scholar] [CrossRef]
  22. Russell, S.; Norvig, P. Artificial Intelligence: A Modern Approach, 4th ed.; Pearson: London, UK, 2020. [Google Scholar]
  23. Education 2030 Incheon Declaration and Framework for Action for the Implementation of Sustainable Development Goal 4. Ensure Inclusive and Equitable Quality Education and Promote Lifelong Learning Opportunities for All. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000245656 (accessed on 1 September 2016).
  24. Mohammed, P.; Watson, E.N. Towards Inclusive Education in the Age of Artificial Intelligence: Perspectives, Challenges, and Opportunities. In Artificial Intelligence and Inclusive Education; Knox, J., Wang, Y., Gallager, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2019; pp. 17–37. [Google Scholar]
  25. Miao, F.; Holmes, W.; Huang, R.; Zhang, H. AI and Education. Guidance for Policy Makers; UNESCO: Paris, France, 2021. [Google Scholar]
  26. Lui, B.L.; Morales, D.; Chinchilla, J.F.R.; Sabzalieva, E.; Valentini, A.; Vieira, D.; Yerovi, C. Harnessing the Era of Artificial Intelligence in Higher Education. A Primer for Higher Education Stakeholders; UNESCO: Paris, France, 2023. [Google Scholar]
  27. Pugliese, L. Adaptive Learning Systems: Surviving the Storm. Educ. Rev. 2016, 10, 18–32. [Google Scholar]
  28. Holmes, W.; Tuomi, I. State of the art and practice in AI in education. Eur. J. Educ. 2022, 57, 542–570. [Google Scholar] [CrossRef]
  29. Becker, S.A.; Brown, M.; Dahlstrom, E.; Davis, A.; DePaul, K.; Diaz, V.; Pomerantz, J. NMC Horizon Report: 2018 Higher Education Edition; EDUCAUSE: Louisville, CO, USA, 2018. [Google Scholar]
  30. Guàrdia, L.; Maina, M.; Sangrà, A. MOOC Design Principles. A Pedagogical Approach from the Learner’s Perspective. Elearn. Pap. 2013, 33, 1–6. [Google Scholar]
  31. Delgado-Algarra, E.J.; Estepa-Giménez, J. Ciudadanía y dimensiones de la memoria en el aprendizaje de la historia. Análisis de un caso de educación secundaria. Vínc. Hist. 2018, 7, 366–388. [Google Scholar]
  32. Estepa-Giménez, J.; Ferreras-Listán, M.; Cruz, I.; Morón-Monge, H. Análisis del patrimonio en los libros de texto. Obstáculos, dificultades y propuestas. Rev. Educ. 2011, 335, 573–588. [Google Scholar]
  33. Cuenca, J.M.; Estepa-Giménez, J.; Martín Cáceres, M.J. Patrimonio, educación, identidad y ciudadanía. Profesorado y libros de texto en la enseñanza obligatoria. Rev. Educ. 2017, 375, 136–159. [Google Scholar]
  34. The Council of the European Union. Council Recommendation of 22 May 2018 on Key Competences for Lifelong Learning (Text with EEA Relevance) (2018/C 189/01). 2018. Available online: https://shorturl.at/bqBOT (accessed on 15 April 2023).
  35. Aksela, M.K.; Wu, X.; Halonen, J. Relevancy of the Massive Open Online Course (MOOC) about Sustainable Energy for Adolescents. Educ. Sci. 2016, 6, 40. [Google Scholar] [CrossRef]
  36. Terras, M.M.; Ramsay, J. British Massive Open Online Courses (MOOCs): Insights and Challenges from a Psychological Perspective. J. Educ. Technol. 2015, 46, 472–487. [Google Scholar]
  37. Najafi, H.; Rolheiser, C.; Håklev, S.; Harrison, L. Variations in Pedagogical Design of Massive Open Online Courses (MOOCs) across Disciplines. Teach. Learn. Inq. Issotl J. 2017, 5, 47. [Google Scholar] [CrossRef]
  38. Krause, K.L.D. Challenging perspectives on learning and teaching in the disciplines: The academic voice. Stud. High. Educ. 2014, 39, 2–19. [Google Scholar] [CrossRef]
  39. Wang, M. Designing online courses that effectively engage learners from diverse cultural backgrounds. Br. J. Educ. Technol. 2007, 38, 294–311. [Google Scholar] [CrossRef]
  40. Young, J.R. What professors can learn from ‘hard core’ MOOC students? Chron. High. Educ. 2013, 59, A4. [Google Scholar]
  41. Plangsorn, B.; Na-Songkhla, J.; Luetkehans, L.M. Undergraduate students’ opinions with regard to ubiquitous mooc for enhancing cross–cultural competence. World J. Educ. Technol. Curr. Issues 2016, 8, 210–217. [Google Scholar] [CrossRef]
  42. Jessop, T.; Maleckar, B. The influence of disciplinary assessment patterns on student learning: A comparative study. Stud. High. Educ. 2016, 41, 696–711. [Google Scholar] [CrossRef]
  43. Gómez-Hurtado, I.; García Prieto, F.J.; Delgado-García, M. Uso de la red social Facebook como herramienta de aprendizaje en estudiantes universitarios: Estudio integrado sobre percepciones. Perspect. Educ. 2018, 57, 99–119. [Google Scholar] [CrossRef]
  44. Siau, K.; Wang, W. Artificial Intelligence (AI) Ethics: Ethics of AI and Ethical AI. J. Datab. Manag. 2020, 31, 74–87. [Google Scholar] [CrossRef]
  45. Vázquez Bernal, B.; Aguaded, S. La percepción de los alumnos de Secundaria de la contaminación: Comparación entre un ambiente rural y otro urbano. In Reflexiones Sobre la Didáctica de las Ciencias Experimentales; Martín Sánchez, M.T., Morcillo Ortega, J.G., Eds.; Universidad Complutense: Madrid, Spain, 2001; pp. 517–525. [Google Scholar]
  46. Goel, G. Human Learning vs. Machine Learning. Towards Data Science. 2019. Available online: https://towardsdatascience.com/human-learning-vs-machine-learning-dfa8fe421560 (accessed on 15 April 2023).
  47. Kao, Y.-F.; Venkatachalam, R. Human and Machine Learning. Comput. Econ. 2021, 57, 889–909. [Google Scholar] [CrossRef]
  48. Ethics and Governance of Artificial Intelligence. AI and Inclusion Project. Available online: https://www.media.mit.edu/projects/ai-and-inclusion/overview/ (accessed on 15 August 2023).
  49. Salleb-Aouissi, A. AI and the Building of a More Inclusive Society. In Proceedings of the Global Symposium Artificial Intelligence & Inclusion, Rio de Janeiro, Brazil, 8–10 November 2017. [Google Scholar]
  50. Kuo, T.M.; Tsai, C.C.; Wang, J.C. Linking Web-Based Learning Self-Efficacy and Learning Engagement in MOOCs: The Role of Online Academic Hardiness. Internet High. Educ. 2021, 51, 100819. [Google Scholar] [CrossRef]
  51. Barak, M.; Watted, A.; Haick, H. Motivation to learn in massive open online courses: Examining aspects of language and social engagement. Comput. Educ. 2016, 94, 49–60. [Google Scholar] [CrossRef]
  52. Zhu, M.; Sari, A.; Lee, M.M. A systematic review of research methods and topics of the empirical MOOC literature (2014–2016). Internet High. Educ. 2018, 37, 31–39. [Google Scholar] [CrossRef]
  53. European Parliament. EU AI Act: First Regulation on Artificial Intelligence. 2023. Available online: https://www.europarl.europa.eu/news/en/headlines/society/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence (accessed on 11 November 2023).
  54. Delgado-Algarra, E.J.; Lorca-Marín, A.A. ¿Cómo Debe ser un Maestro de Ciencias en Tiempos de ChatGPT? 2023. Available online: https://theconversation.com/como-debe-ser-un-maestro-de-ciencias-en-tiempos-de-chatgpt-209825 (accessed on 30 September 2023).
  55. Hoaglin, D.C.; Mosteller, F.; Tukey, J.W. Exploring Data Tables, Trends, and Shapes; Wiley: New York, NY, USA, 1985. [Google Scholar]
  56. Rosenblatt, K. New bot ChatGPT Will Force Colleges to Get Creative to Prevent Cheating, Experts Say. NBC News, 8 December 2022. Available online: https://www.nbcnews.com/tech/chatgpt-can-generate-essay-generate-rcna60362 (accessed on 15 April 2023).
  57. González, A.; Carabantes, D. MOOC: Medición de satisfacción, fidelización, éxito y certificación de la educación digital. RIED Rev. Iberoam. Educ. Distancia 2017, 20, 105–123. [Google Scholar]
  58. Siemens, G. Teaching in Social and Technological Networks. 2010. Available online: https://www.slideshare.net/gsiemens/tcconline (accessed on 17 August 2023).
  59. Moya, M. La Educación encierra un tesoro: ¿Los MOOCs/COMA integran los Pilares de la Educación en su modelo de aprendizaje online? In SCOPEO INFORME 2. MOOC: Estado de la Situación Actual, Posibilidades, Retos y Futuro; Academic Press: Cambridge, MA, USA, 2013; pp. 157–172. [Google Scholar]
  60. Arntz, M.T.; Gregory, T.; Zierahn, U. The Risk of Automation for Jobs in OECD Countries: A Comparative Analysis; OECD Social, Employment and Migration Working Papers; OECD Publishing: Paris, France, 2016. [Google Scholar]
  61. European Commission. Ethics Guidelines for Trustworthy AI. Available online: https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai (accessed on 15 August 2023).
Figure 1. Macro-reference areas of MOOCs.
Figure 2. Learning type.
Figure 3. Correlational structure of AI (significant values at 0.01 denoted by **).
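As a companion to Figure 3, the sketch below shows one generic way to compute a pairwise correlation matrix with a 0.01 significance flag over ordinal indicator codings (0 = not present, 1 = implicit, 2 = explicit). It is a minimal illustration under stated assumptions: the DataFrame `mooc_df` and its column names are hypothetical, Spearman correlation is used only as an example coefficient, and this is not the authors' analysis script.

```python
# Minimal sketch (assumptions: `mooc_df` is a hypothetical pandas DataFrame with
# one ordinal column per AI indicator; Spearman is used as an example coefficient).
import pandas as pd
from scipy.stats import spearmanr

def correlation_table(df: pd.DataFrame, alpha: float = 0.01) -> pd.DataFrame:
    """Pairwise correlations; coefficients significant at `alpha` are marked **."""
    cols = df.columns
    out = pd.DataFrame(index=cols, columns=cols, dtype=object)
    for a in cols:
        for b in cols:
            rho, p = spearmanr(df[a], df[b])
            mark = "**" if (a != b and p < alpha) else ""
            out.loc[a, b] = f"{rho:.2f}{mark}"
    return out

# Example usage (hypothetical column names):
# print(correlation_table(mooc_df[["AI.CON.04", "AI.CON.05", "AI.CON.09"]]))
```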
Table 1. Category: Pedagogical Dimension of MOOCs (subcategories and their indicators).
Initial Basic Information:
DID.BAS.01. Language
DID.BAS.02. Temporalization
DID.BAS.03. List of modules
DID.BAS.04. Number of modules
DID.BAS.05. Difficulty level
DID.BAS.06. Previous knowledge required
DID.BAS.07. Teaching team
DID.BAS.08. Guide with general information
DID.BAS.09. Contact
Objectives and Competences:
DID.OYC.01. Objectives
DID.OYC.02. Competencies
DID.OYC.03. Communication competence
DID.OYC.04. Mathematical competence and basic competences in sciences and technology
DID.OYC.05. Digital competence 1
DID.OYC.06. Learning to learn competence
DID.OYC.07. Social and civic competence
DID.OYC.08. Sense of initiative and entrepreneurial competence
DID.OYC.09. Cultural awareness and expression competence
Content:
DID.CON.01. Integration of knowledge, skills and attitudes
Methodology:
DID.MET.01. Description of teaching activity
DID.MET.02. Details of student workload
DID.MET.03. Participation in the activities
DID.MET.04. Type of learning
DID.MET.05. Level of complexity
Means:
DID.REC.01. Readings
DID.REC.02. Videos
DID.REC.03. Quizzes
DID.REC.04. Social networks
DID.REC.05. Webs
DID.REC.06. Other applications and resources
Schedule:
DID.CRO.01. Temporal detail for content development
DID.CRO.02. Key dates and deadlines
Evaluation:
DID.EVA.01. Evaluation criteria
DID.EVA.02. When to evaluate
DID.EVA.03. How to evaluate
DID.EVA.04. Self-assessment
DID.EVA.05. Case Analysis
DID.EVA.06. Participation in forum
DID.EVA.07. Work preparation (essay, report, etc.)
Bibliography:
DID.BIB.01. Bibliography
1 “Digital competence involves the confident, critical, and responsible use of, and engagement with, digital technologies for learning, at work, and for participation in society. It includes information and data literacy, communication and collaboration, media literacy, digital content creation (including programming), safety (including digital well-being and competences related to cybersecurity), intellectual-property-related questions, problem solving, and critical thinking” [34] (p. 9).
Table 2. Category: Technical Dimension of MOOCs (subcategories and their indicators).
Accessibility:
TEC.ACC.01. Plugins
TEC.ACC.02. Content access
Navigation:
TEC.NAV.01. Design
TEC.NAV.02. Ease of browsing
TEC.NAV.03. Browsing support elements
TEC.NAV.04. Toolbar with links
TEC.NAV.05. Visible links and hypertexts
TEC.NAV.06. Help system for course development
TEC.NAV.07. Content search engine
Interactivity:
TEC.INT.01. Facilities or tools for teacher–student interaction
TEC.INT.02. Facilities or tools for student–student interaction (cooperative work)
TEC.INT.03. Allows interaction by private message
TEC.INT.04. Allows interaction by chat
TEC.INT.05. Allows interaction by video conference
TEC.INT.06. Allows interaction through specific communication programs (Adobe Connect, Blackboard, etc.)
Table 3. Specific Block: Artificial Intelligence (subcategories and their indicators).
Course Content:
AI.CON.01. Algorithmic bias
AI.CON.02. Programming
AI.CON.03. Analysis
AI.CON.04. Machine learning
AI.CON.05. Deep Learning
AI.CON.06. Human learning
AI.CON.07. Ethical questions in data use
AI.CON.08. Inclusivity
AI.CON.09. Education, teaching and learning
Types:
AI.TYP.01. Systems that think like humans
AI.TYP.02. Systems that act like humans
AI.TYP.03. Systems that think rationally
AI.TYP.04. Systems that act rationally
Table 4. Frequencies and Percentages of Content Related to AI.
Indicator | Content Not Present: f (%) | Implicit Content: f (%) | Explicit Content: f (%)
AI.CON.01. Algorithmic bias | 524 (71.4%) | 69 (9.4%) | 141 (19.2%)
AI.CON.02. Programming | 339 (46.2%) | 129 (17.6%) | 266 (36.2%)
AI.CON.03. Analysis | 274 (37.3%) | 62 (8.4%) | 398 (54.2%)
AI.CON.04. Machine learning | 119 (16.2%) | 106 (14.4%) | 509 (69.3%)
AI.CON.05. Deep learning | 284 (38.7%) | 253 (34.5%) | 197 (26.8%)
AI.CON.06. Human learning | 256 (34.9%) | 248 (47.4%) | 130 (17.7%)
AI.CON.07. Ethical questions in data use | 702 (96.6%) | 13 (1.8%) | 19 (2.6%)
AI.CON.08. Inclusivity | 696 (94.8%) | 9 (1.2%) | 29 (4.0%)
AI.CON.09. Education, teaching, and learning | 321 (43.7%) | 286 (39.0%) | 127 (17.3%)
(Own elaboration).
Table 5. Frequencies and Percentages of Types of AI.
Indicator | Content Not Present: f (%) | Implicit Content: f (%) | Explicit Content: f (%)
AI.TYP.01. Systems that think like humans | 322 (43.8%) | 280 (38.1%) | 132 (18.0%)
AI.TYP.02. Systems that act like humans | 665 (90.6%) | 60 (8.2%) | 9 (1.2%)
AI.TYP.03. Systems that think rationally | 610 (83.1%) | 63 (8.6%) | 61 (8.3%)
AI.TYP.04. Systems that act rationally | 681 (92.8%) | 43 (5.9%) | 10 (1.4%)
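The frequencies and percentages reported in Tables 4 and 5 follow directly from counting the 0/1/2 codings over the 734 courses. A minimal sketch of that tabulation is shown below; the DataFrame `mooc_df` and its column names are hypothetical, and the code is illustrative rather than the authors' original script.

```python
# Minimal sketch (assumption: `mooc_df` is a hypothetical DataFrame with one row
# per MOOC, n = 734, and one column per indicator coded 0/1/2).
import pandas as pd

LABELS = {0: "Content not present", 1: "Implicit content", 2: "Explicit content"}

def frequency_table(df: pd.DataFrame) -> pd.DataFrame:
    """Return 'f (percent%)' strings per indicator and coding level."""
    rows = {}
    for indicator in df.columns:
        counts = df[indicator].value_counts().reindex(list(LABELS), fill_value=0)
        pct = counts / len(df) * 100
        rows[indicator] = {LABELS[k]: f"{counts[k]} ({pct[k]:.1f}%)" for k in LABELS}
    return pd.DataFrame.from_dict(rows, orient="index")

# Example usage: print(frequency_table(mooc_df))
```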
Table 6. Kaiser–Meyer–Olkin (KMO) Sample Adequacy and Bartlett Sphericity Test.
Kaiser–Meyer–Olkin measure of sampling adequacy: 0.693
Bartlett’s sphericity test: approximate chi-square = 1228.549; df = 45; sig. = 0.000
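Table 6 reports the two standard adequacy checks that precede a factor analysis. The sketch below shows how such values could be reproduced with the Python package factor_analyzer; the matrix `X` (734 courses by 10 indicators) and the variable names are assumptions, and this is an illustration rather than the software actually used in the study.

```python
# Minimal sketch (assumptions: the factor_analyzer package is installed and `X`
# is a hypothetical 734 x 10 DataFrame of indicator codings).
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def sampling_adequacy(X: pd.DataFrame) -> None:
    chi_square, p_value = calculate_bartlett_sphericity(X)
    _, kmo_total = calculate_kmo(X)              # overall KMO measure
    dof = X.shape[1] * (X.shape[1] - 1) // 2     # 10 variables -> 45 degrees of freedom
    print(f"KMO measure of sampling adequacy: {kmo_total:.3f}")
    print(f"Bartlett's test: chi-square = {chi_square:.3f}, df = {dof}, sig. = {p_value:.3f}")

# Example usage: sampling_adequacy(mooc_df)
```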
Table 7. Total Variance Explained.
Component | Initial Eigenvalues (Total / Variance % / Accumulated %) | Extraction Sums of Squared Loadings (Total / Variance % / Accumulated %)
1 | 2.608 / 26.081 / 26.081 | 2.608 / 26.081 / 26.081
2 | 1.707 / 17.073 / 43.154 | 1.707 / 17.073 / 43.154
3 | 1.066 / 10.655 / 53.810 | 1.066 / 10.655 / 53.810
4 | 1.033 / 10.330 / 64.140 | 1.033 / 10.330 / 64.140
5 | 0.782 / 7.820 / 71.960 |
6 | 0.753 / 7.531 / 79.491 |
7 | 0.647 / 6.473 / 85.964 |
8 | 0.563 / 5.635 / 91.599 |
9 | 0.441 / 4.409 / 96.007 |
10 | 0.399 / 3.993 / 100.000 |
Extraction method: Principal component analysis.
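The eigenvalues and explained-variance figures in Table 7 are the usual output of a principal component analysis of the ten indicators. The sketch below shows a generic way to obtain them with scikit-learn on standardized codings; `X` and the preprocessing choices are assumptions, not the authors' pipeline. Note that only the first four components have eigenvalues above 1, which matches the four components retained in Table 8.

```python
# Minimal sketch (assumptions: scikit-learn is available and `X` is a hypothetical
# 734 x 10 matrix of indicator codings; standardizing the columns approximates
# working on the correlation matrix).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def variance_explained(X) -> None:
    Z = StandardScaler().fit_transform(X)
    pca = PCA().fit(Z)
    pct = pca.explained_variance_ratio_ * 100
    cumulative = np.cumsum(pct)
    for i, (ev, p, c) in enumerate(zip(pca.explained_variance_, pct, cumulative), start=1):
        print(f"Component {i}: total = {ev:.3f}, % of variance = {p:.3f}, cumulative % = {c:.3f}")

# Example usage: variance_explained(mooc_df.to_numpy())
```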
Table 8. Rotated Component Matrix (Summarized).
Indicator | Component | Loading
AI.CON.04. Machine learning | 1 | 0.671
AI.CON.05. Deep learning | 1 | 0.827
AI.CON.09. Education, teaching, and learning | 1 | 0.595
AI.TYP.01. Systems that think like humans | 1 | 0.669
AI.CON.07. Ethical questions in data use | 2 | 0.609
AI.TYP.02. Systems that act like humans | 2 | 0.735
AI.TYP.03. Systems that think rationally | 2 | 0.818
AI.CON.06. Human learning | 3 | 0.507
AI.CON.08. Inclusivity | 3 | 0.872
AI.CON.01. Algorithmic bias | 4 | 0.963
Extraction method: Principal component analysis. Rotation method: Varimax with Kaiser normalization.
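Table 8 summarizes the loadings after extracting four components and applying a Varimax rotation. A minimal sketch of an equivalent computation with the factor_analyzer package is given below; the matrix `X`, the 0.40 display threshold, and reliance on the library's defaults for Kaiser normalization are assumptions made for illustration only.

```python
# Minimal sketch (assumptions: factor_analyzer is installed, `X` is a hypothetical
# 734 x 10 DataFrame; loadings below 0.40 are hidden only to mimic a summarized matrix).
import pandas as pd
from factor_analyzer import FactorAnalyzer

def rotated_loadings(X: pd.DataFrame, n_factors: int = 4) -> pd.DataFrame:
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="varimax")
    fa.fit(X)
    loadings = pd.DataFrame(
        fa.loadings_,
        index=X.columns,
        columns=[f"Component {i + 1}" for i in range(n_factors)],
    )
    return loadings.where(loadings.abs() >= 0.40)  # keep only salient loadings

# Example usage: print(rotated_loadings(mooc_df))
```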
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
