Article

Validation of the Italian Version of the Community of Inquiry Survey

by Salvatore Nizzolino 1,2,*, Agustí Canals 3 and Marco Temperini 4

1 Doctoral Programme in Education and ICT (e-Learning), Research Line Challenges for Sustainable Management and Organization in Online Education, Universitat Oberta de Catalunya, 08018 Barcelona, Spain
2 Faculty of Information Engineering, Computer Science and Statistics, Sapienza University of Rome, 00185 Rome, Italy
3 Faculty of Economics and Business, Universitat Oberta de Catalunya, 08018 Barcelona, Spain
4 Department of Computer, Control and Management Engineering, Sapienza University of Rome, 00185 Rome, Italy
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(12), 1200; https://doi.org/10.3390/educsci13121200
Submission received: 19 October 2023 / Revised: 20 November 2023 / Accepted: 22 November 2023 / Published: 29 November 2023
(This article belongs to the Collection Trends and Challenges in Higher Education)

Abstract:
This work presents the validation process of the Italian version of the community of inquiry (CoI) survey. For over two decades, the CoI framework has been used to conceptualize online higher-order teaching/learning experiences as processes of inquiry in which participants collaborate in discourse and critical reflection to cocreate knowledge and achieve meaningful learning. The CoI hinges on the mutual interaction of three dimensions named presences: teaching presence, social presence, and cognitive presence. The official survey to detect the level of presence perceived by learners has been predominantly administered in English. In recent years, a number of scholars have deemed that its original format requires at least a B2 level of English proficiency, and several translations into other languages have been validated. Accordingly, the validation of the Italian version aims to improve the accuracy of the CoI questionnaire conducted among native Italian learners (n = 234). Analyses show satisfactory outputs in terms of validity and reliability of the 34 Likert-scale items, whilst adaptations to other languages open new perspectives grounded in cultural variables.

1. Introduction

In almost two and a half decades, the community of inquiry (CoI) framework has contributed to the monitoring of content analysis—particularly texts from forums and board comments—in a variety of asynchronous text-based environments [1,2,3,4,5,6] as well as in hybrid learning [7,8,9,10], both in university education and lifelong learning perspectives [11,12] and in association with a large diversity of tools and media [13,14,15], including the metaverse [9]. The framework was designed and tested at Athabasca University (Canada) by Garrison, Anderson, and Archer [16], and has been growing in popularity and scope, following leading trends in topics such as online education and e-learning.
CoI is a framework comprising three interconnected dimensions known as “presences,” which drive educational environments emphasizing higher-order thinking through collaborative learning interactions. As per the well-established definitions provided by its founders, the three dimensions may be characterized as follows (Figure 1). Cognitive Presence (CP) is the imperative element of critical thinking, a range of processes and outcomes often presented as the essential result of a successful higher education pathway. In order to manifest, this dimension requires participants to develop a certain degree of learning awareness [16,17]. Social Presence (SP) is the aptitude of participants to project their personality traits into the community in order to be perceived as real human beings, despite the lack of face-to-face interactions [16,17]. Teaching Presence (TP) implies all actions required by the teacher/instructor to monitor and manage the learning environment, coordinate, and encourage the other two presences [16,17]. TP, frequently deemed the “pivotal dimension,” encompasses the design, facilitation, motivation, and guidance of learning activities [16,18].
The limited adoption of the CoI framework within the Italian educational system thus far may be positively influenced by the introduction of this first translation of the official survey. The relatively low awareness of the CoI framework in the Italian context may be attributed, in part, to the EU’s extensive two-decade endeavor to formulate and promote frameworks supporting collective capacity building and the digital transformation of education. This dissemination strategy, designed to establish uniform skills and competences across EU member states, includes several prominent digital competence frameworks:
  • For citizens [19];
  • For educational organizations [20];
  • For educators [21];
  • For opening-up higher education institutions [22];
  • For consumers [23];
  • For entrepreneurship [24];
  • For life skills [25].
As a consequence, an awareness campaign on CoI conducted in higher education institutions and in teacher training may also benefit from a more accessible survey translated and adapted to the local mindset. Compared to other frameworks, CoI is distinguished by its pragmatic foundation, characterized by a high level of intuitiveness and flexibility, as highlighted by experts in the field [26]. This feature garnered particular recognition during the years of the pandemic [27,28,29]. The validation of the Italian survey involved a comprehensive examination of the relevant literature spanning two decades. This body of literature includes seminal works that standardized methodological and conceptual aspects, explored case studies in diverse settings, and conducted significant literature reviews. Additionally, insights from papers detailing similar validation experiences in other languages proved extremely valuable in terms of approach and methodology. So far, the survey has been validated in Portuguese [30], Korean [31], Chinese [32], Turkish [33], Spanish [34], and German [35]. These translations adapted the questionnaire revised and formalized by Arbaugh et al. [36], whereas the Chinese version is based on the extended template of the framework, which includes a controversial fourth dimension: learning presence [37,38]. Thus, the Chinese version is useful for the present study only in the sections that pertain to the relationships between the three original presences examined by the authors. Ma et al. [32] stressed the undeniable need to validate a different language version before formally using it, highlighting the different statistical viewpoints in the studies of translated versions.
The incorporation of the official survey in its original triadic format empowered instructors to discern the manifestation of the three presences based on the experiences documented in an extensive body of literature. In addition, the focus on different digital platforms allows for an instructional design perspective, which stimulates implications to also adopt CoI as a “predictive model” in association with specific e-learning settings and tools.

2. The Theory behind the Framework

A traditional approach to cognitive constructivism marked the personal construction of knowledge as a concept largely inspired by Piaget. In contemporary educational discourse, the foundational principles of the prevailing constructivist model trace back to the contributions of Vygotsky and the pragmatist philosophers Peirce and Dewey (1902/1991). Broadly categorized under the umbrella of social constructivism, the pioneers of this movement proposed theories centered on the construction of knowledge through social interactions and an approach rooted in scientific inquiry. Since the early stages of online education, technology has been adopted worldwide to facilitate both synchronous and asynchronous interactions between students and teachers, rather than sustaining the mere transmission of contents. The educational standpoint of social constructivism posits that knowledge is social in nature and yet takes shape within the individual mind of each learner. Consequently, the process of knowledge acquisition goes beyond the traditional model of teachers imparting knowledge to passive students; instead, each student actively constructs meaningful interpretations, facilitating the integration of new knowledge with existing background knowledge [39,40,41,42]. Moreover, these principles have garnered attention from the educational academic community in recent decades, particularly influencing the design and management of MOOCs [43,44,45]. In a remote learning environment that encourages the cocreation of knowledge through communication, several foundations derived from constructivism support the process:
  • Learners can develop new knowledge by layering it on top of prior knowledge;
  • New knowledge is not the result of passive reception but of an active process;
  • Language becomes the essential medium for building new knowledge in a communicative context;
  • Awareness and consciousness of one’s own learning helps not only to learn but also to assist, evaluate, and guide one’s peers;
  • New knowledge is consolidated through social exchange/negotiation, projected into the real world, and validated.
According to Kreijns et al. [46], the awareness that one’s own learning success is predicated on others’ learning reinforces group cohesion within the community. In a formal structured course, it is the instructor’s duty to continuously trigger this awareness and stimulate peer interactions. The digital tools chosen should promote constant and regular communication as the main strategy for meeting social needs. Accordingly, the texts exchanged are the content being analyzed to detect specific patterns. Each presence is assigned a set of specific subdimensions, in order to detect the level of perception through defined indicators. TP: design and organization; facilitation; direct instruction. SP: affective expression; open communication; group cohesion. CP: triggering event; exploration; integration; resolution (see Figure 2).
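The presence-to-subdimension assignments listed above can be captured in a simple lookup table; a minimal sketch (the helper name and data structure are ours, the labels are those stated in the text):

```python
# Hypothetical mapping of each CoI presence to its subdimensions,
# as listed in the text (TP, SP, CP abbreviations from the framework).
COI_SUBDIMENSIONS = {
    "TP": ["design and organization", "facilitation", "direct instruction"],
    "SP": ["affective expression", "open communication", "group cohesion"],
    "CP": ["triggering event", "exploration", "integration", "resolution"],
}

def presence_of(subdimension: str) -> str:
    """Return the presence a given subdimension indicator belongs to."""
    for presence, subs in COI_SUBDIMENSIONS.items():
        if subdimension in subs:
            return presence
    raise KeyError(subdimension)
```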
An earlier concept that paved the way to the configuration of teaching presence was described by Andersen in 1979 as “teacher immediacy,” that is, the range of nonverbal behaviors that reduce the physical and/or psychological distance between teachers and students [47].
In a remote digital environment, the absence of face-to-face paralinguistic cues, such as hand gestures, eye contact, and tone of voice, presents a significant challenge. In this setting, traditional roles and behaviors rooted in the conventional teacher/student hierarchy cannot be reliably applied to calibrate the learning experience [18]. Given the unique demands of this context, a preliminary assessment becomes essential. In this regard, TP emerges as the dimension with the highest predictive factor, as it shapes the future cognitive load [48,49,50]. However, it is crucial to recognize that instructors should anticipate synergies between SP and CP and, accordingly, structure the learning environment with suitable social affordances [51,52].
While the three presences are interconnected and their effects overlap, the founders of the CoI framework observed that TP serves as the pivotal dimension leveraging SP, which activates and propels the collaborative process of CP [53]. SP often assumes a mediating role between TP and CP, fostering open communication and group cohesion in line with the constructivist perspective. Therefore, within the paradigm of learner-built knowledge, it may be plausible to represent CoI through a hierarchical sequence.
Notably, many attempts have been made to assign the primary mediation role to one of the three presences, often framing the results as “causal relationships” [53,54,55,56,57]. Beyond the differing degrees of mediation, what is relevant is the confirmed connection and interdependence within the triadic framework. The 34-item survey developed by Arbaugh et al. [36] to assess participants’ perceptions of the three dimensions has proven to be a reliable instrument to monitor the CoI model, and the present study is based on the translation of this instrument. The original English survey is available from the official webspace of Athabasca University (https://coi.athabascau.ca/coi-model/coi-survey/, accessed on 17 November 2023). The survey has been extensively validated by the most prolific research groups working on CoI [57,58,59], along with other scholars who examined single case studies. Since the Italian version was not only translated but also required some interpretations and adaptations, its items had to be validated. Justifications for the reinterpretation of specific terms are reported in Appendix A.

3. Samples and Data Collection

The methodology is based on the adoption of the CoI survey in Italian (Appendix A), which mirrors the original English questionnaire of 34 Likert-scale items graded from 1 to 5. The constructs associated with the 34 items are assigned to three blocks related to the three presences: items 1–13 to TP; items 14–22 to SP; items 23–34 to CP (Figure 2). Within these three subgroups, items are designed to detect specific features, patterns, and attitudes characterizing each presence. The translated survey was converted into a Google Form and the link was shared through email messages and the platforms adopted.
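This block structure lends itself to a small helper function; a minimal Python sketch (the function name is ours; the item ranges are those stated above):

```python
def presence_for_item(item: int) -> str:
    """Map a CoI survey item number (1-34) to its presence block:
    items 1-13 -> TP, items 14-22 -> SP, items 23-34 -> CP."""
    if not 1 <= item <= 34:
        raise ValueError(f"item {item} outside the 1-34 range")
    if item <= 13:
        return "TP"
    if item <= 22:
        return "SP"
    return "CP"
```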
To identify suitable respondent profiles and gauge the perception of CoI elements, a set of basic requirements was established to assess participants’ understanding. The total number of participants (n = 234) comprised four groups involved in blended learning experiences and technology-enhanced learning from 2021 to 2023, as outlined in Table 1. According to the existing literature, hybrid contexts have often been linked to significantly high CoI perception scores [10,60,61]. The utilization of synchronous video communication (SVC) was prevalent, aiding in self-representation, content delivery, and social interactions on the screen, enabling participants to immerse themselves in the learning community. SVC, a powerful tool, can have a considerable impact on social, cognitive, or educational presence [62]. Some scholars in the field argued that CoI works have yet to discern the differentiation of learning communities, such as common interest groups, home communities, professional development communities, local working groups, etc. [63]. Therefore, by expanding and diversifying participant profiles beyond the university, this study contributes to testing CoI in the lifelong learning field.
All courses featured face-to-face lessons and additional online activities. The primary distinctions between the two learner profiles were the digital platforms used and the learning objectives. University students engaged through the Moodle platform, while adult learners utilized a collaborative teamwork platform. The Moodle solutions involved social activities conducted through forums, debates, and group work, while adult-oriented courses also included more informal solutions, such as WhatsApp. All platform settings generated automatic notifications whenever new materials or a new post was uploaded by the instructor, ensuring that all learners who subscribed to the platforms received messages in their mailboxes.
Regarding the administration of the CoI survey, university students were instructed to complete it face-to-face during the penultimate lesson of the semester. The teacher recommended that they take the necessary time to connect their personal devices (laptop, tablet, or smartphone) to the online survey and reflect on each item carefully to provide realistic feedback. Students who were not present in the classroom took the survey online within a specified time limit on the platform’s bulletin board. Conversely, adult learners completed the survey in both online and paper formats, as some preferred a more traditional way of answering the questionnaire due to the significant age difference within this group. All feedback was anonymized, and participants were informed of this in the survey instructions.
The criteria for evaluating the sample size are approached from two perspectives: (1) determining the minimum number for a representative sample and (2) establishing a proportion between the number of respondents and the items included in the survey. There are varied opinions on the appropriateness of sample size. Kass and Tinsley [64] suggested 5 to 10 respondents per item, Nunnally [65] recommended at least 10 or more inputs per item, and Comrey and Lee [66] proposed a sample size of 200 for meaningful feedback and 300 for highly reliable results. However, there are recommendations supporting the feasibility of working with a smaller sample size [67], in the range of 1/5 to 1/10, justifying the suitability of our sample (n = 234). Other CoI surveys in different languages, such as the Spanish version, were validated with 5 respondents per item, employing a sample size of n = 162 [34]. Considering these benchmarks and the additional statistical tests detailed in the subsequent sections, the number of participants in the present study was deemed adequate.
Bartlett’s test of sphericity [68] was highly significant at p < 0.001, which shows that the correlation matrix has a significant degree of correlation among some variables. The chi-squared test [69] returned a value of 1140.503 with a high associated number of degrees of freedom (df = 401). Hence, it is possible to reject the hypothesis that the correlation matrix is an identity matrix. Moreover, the high df indicates a greater amount of variability in the data, providing more precision in statistical estimation. This is generally favorable because it allows for a more reliable analysis, particularly with a large sample size or a large number of variables, as in our case. A factor analysis is therefore worthwhile for this dataset.
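For readers who wish to reproduce the test, Bartlett's statistic can be computed directly from a correlation matrix. A minimal, standard-library Python sketch (illustrative only, not the software used in the study; the statistic is the usual −(n − 1 − (2p + 5)/6)·ln|R| with df = p(p − 1)/2, and the determinant uses cofactor expansion, adequate for small matrices):

```python
import math

def det(M):
    """Determinant via cofactor expansion (fine for small matrices)."""
    if len(M) == 1:
        return M[0][0]
    total = 0.0
    for j, pivot in enumerate(M[0]):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += ((-1) ** j) * pivot * det(minor)
    return total

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity: chi-square statistic and degrees
    of freedom for testing whether the p x p correlation matrix R,
    estimated from n observations, is an identity matrix."""
    p = len(R)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * math.log(det(R))
    df = p * (p - 1) // 2
    return chi2, df
```

For instance, two variables correlated at 0.5 with n = 100 give df = 1 and a clearly significant statistic.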

4. Reliability Analysis and Validation

In this section, we look at important aspects of the dependent variables associated with their degree of reliability as a structured questionnaire. This degree can be estimated from the association between observations of the same construct. We also look at the relationships between reliability and construct validity [70]. As a first step, descriptive statistics were computed to assess the adequacy of the 34 items by calculating the means, standard deviations, skewness, and kurtosis of the three CoI dimensions (Table 2). If the mean of an item is too close to 1 or 5, the correlations with the rest of the items may be altered.
The statistics show a symmetrical distribution around a remarkably high mean, even though some respondents rated TP with a score of 1 (negative skewness). Conversely, the internal constructs of this presence reveal the highest alpha values (Table 3).
The second step was computing Cronbach’s α [71] to check the consistency of the 34 items. Cronbach’s α is used under the assumption that multiple items measure the same underlying construct. A satisfaction survey, for instance, typically asks different questions about different things which, in combination, measure overall satisfaction. Cronbach’s α is a well-known measure of internal consistency and is also considered an assessment of scale reliability [72,73]. It can be expressed as follows:
α = [n/(n − 1)] × (1 − Σi Vi/Vt), where n is the number of items, Vi is the variance of item i, and Vt is the variance of the total scores.
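The coefficient can be computed directly from an item-score matrix; a minimal Python sketch (illustrative, not the software used in the study; one list of respondent scores per item, population variances as is conventional for α):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: n/(n-1) * (1 - sum(Vi) / Vt), where n is the
    number of items, Vi the (population) variance of item i, and Vt
    the variance of respondents' total scores. `items` is a list of
    item columns: items[i][j] is respondent j's score on item i."""
    n = len(items)
    k = len(items[0])                     # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(k)]
    return n / (n - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Two perfectly correlated items yield α = 1, while items that covary negatively can drive α below zero, which is why the diagnostics discussed below matter.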
Cronbach’s α is also a key element of confirmatory factor analysis (CFA) and exploratory factor analysis (EFA). In the other translated versions of the survey, the validation process went through CFA and EFA; more specifically, the Portuguese version [30] ascertained the three-factor structure (n = 510). The Kaiser–Meyer–Olkin (KMO) test, used to measure sampling adequacy, reported a value of 0.97, demonstrating sample adequacy with high internal consistencies of 0.89 or higher. The Turkish version [33] showed a KMO of 0.98 and high internal consistencies above 0.95 (n = 575). The Korean version [31] validated the three-factor structure through EFA (n = 498), justifying the suppression of two items due to cross-loadings on multiple factors. The KMO was 0.97, with high internal consistencies above 0.90 [74].
The r.drop (or “r eliminated”) value in the item-statistics section of a Likert-scale analysis output (such as the output of the alpha() function of the psych package in R) is the correlation coefficient of each item with the sum of the scores of the other items on the scale, excluding the item under consideration. This coefficient can be used as a measure of the importance of the item in the scale: if the item is significantly correlated with the other items, excluding itself, then the item is likely important for measuring the construct or latent dimension that the scale intends to capture [75]. So, a significant r.drop value suggests that the item is strongly correlated with the other scale items, while a nonsignificant value suggests that the item is unrelated or only weakly correlated with them.
Concerning the interpretation of r.drop values, in our case, the lowest is 0.29 (item 21) and the highest is 0.80 (item 23).
The evaluation of the importance of an item in a Likert scale by the value of r.drop depends on the specific context of the analysis and the scale in question. In general, however, it can be said that an r.drop value between 0.30 and 0.70 may be considered acceptable, while a value below 0.30 or above 0.70 may require further observations about the validity of the item or scale [75,76]. In addition, the value of r.drop always needs to be interpreted in conjunction with other item statistics, such as the Cronbach’s α coefficient, the variance explained by the item in the scale, its factorial loading, or its saturation [71]. In our case, the lowest r.drop value (0.29) for item 21—associated with SP—might indicate a relatively low correlation between this and the other items in the scale. On the other hand, the highest r.drop value (0.80) for item 23—associated with CP—could denote a strong correlation between this and the other scale items, excluding itself. Nevertheless, as mentioned above, the assessment of the scale items should not be based only on the r.drop value, but also on other statistics [77].
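The corrected item-total correlation described above is straightforward to compute; a minimal Python sketch (our own helper, not the psych implementation):

```python
def r_drop(items, idx):
    """Corrected item-total (item-rest) correlation for item idx:
    Pearson correlation between that item's scores and the sum of
    all OTHER items' scores (the item itself excluded from the total).
    `items` is a list of item columns: items[i][j] is respondent j's
    score on item i."""
    k = len(items[0])                     # number of respondents
    item = items[idx]
    rest = [sum(items[i][j] for i in range(len(items)) if i != idx)
            for j in range(k)]
    mx = sum(item) / k
    my = sum(rest) / k
    cov = sum((item[j] - mx) * (rest[j] - my) for j in range(k))
    sx = sum((x - mx) ** 2 for x in item) ** 0.5
    sy = sum((y - my) ** 2 for y in rest) ** 0.5
    return cov / (sx * sy)
```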
It is also worth considering the “if an item is dropped” value of the items associated with each presence. In analyzing internal consistency through the Cronbach’s α coefficient, the alpha se value indicates the estimated standard error of alpha, i.e., the precision of the Cronbach’s α estimate calculated on the sample data and, hence, the variability of alpha that might be observed across different samples. A low alpha se indicates that the Cronbach’s α estimate is reliable and should not vary significantly if the sampling were repeated, whereas a high alpha se indicates a less reliable estimate that might vary significantly across samples. In general, an alpha se lower than 0.05 is considered acceptable for most analyses. Greater homogeneity among items (i.e., greater correlation among them) is associated with a lower standard error of alpha. As shown in Table 4, the med.r value is the median of the correlations between each questionnaire item and the total score “if the item is eliminated”, i.e., calculated after removing the item in question from the total score. The med.r is often used to assess internal consistency, particularly to determine whether a particular item negatively affects the reliability of the questionnaire. In general, a higher med.r value indicates a higher correlation between the considered item and the total score, suggesting that eliminating that item might decrease the internal consistency of the questionnaire; conversely, a lower med.r value suggests that eliminating the item might improve it.
A med.r value above 0.3 or 0.4 indicates that the considered item is moderately or significantly correlated with the total score of the questionnaire, which, in turn, suggests that the elimination of that item might decrease the internal consistency of the questionnaire. The lower Cronbach’s α of SP, although highly acceptable, is discussed further in association with other values that put this dimension under particular observation, in line with the shared results from the CoI-related body of literature. In order to establish the degree to which the different sets of questions measure the same construct, a further analysis was performed on the 10 constructs assigned to the three presences. The consistency of the 10 constructs is particularly relevant, since they serve as indicators. As each construct is a single segment of the survey’s backbone, the constructs can also be examined as interrelated variables.
The standard guideline indicates that a value exceeding 0.7 is considered good, while lower values suggest that the associated items may require revision or that the scale itself needs redesigning. Alternatively, low values could indicate a limited perception of the associated construct, which, in our context, may be a direct or indirect consequence of insufficiently developed or perceived SP during learning experiences. In various case studies within the literature, it has been observed that the intended learning outcomes often depend more on TP than on SP and CP [78]. Fluctuations in the perception of SP have been cited as a reason to expand the original three presences by introducing elements such as self-efficacy, self-regulation, effort-regulation, and emotional factors [37]. The complexity of asynchronous communication and social interactions through digital media poses challenges to SP, leading to issues such as high dropout rates, poor learning outcomes, and low satisfaction and interaction. Given these necessary remarks on SP, it is worth recalling that this study focuses on validating the internal consistency of the Italian survey and is not intended to assess the level of satisfaction related to the implementation of the CoI framework.

5. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA)

Even though the English survey has been extensively validated, a new validation process is common practice each time a questionnaire is localized and adapted to a different language. The perspective of handling the validation of an “untested survey” justifies a multivariate statistical approach. Exploratory factor analysis (EFA) unveils latent factors or dimensions underlying a set of observed variables—in this case, survey items. It assists in determining the number of factors to retain based on criteria such as eigenvalues, scree plots, and cumulative variance explained. EFA provides factor loadings, indicating the strength and direction of the relationship between each item and each factor. By interpreting these loadings and examining the pattern matrix, one can identify the underlying constructs represented by the factors and assess the consistency of items within each factor. EFA can also identify potential cross-loadings or ambiguous items that may require revision or removal for improved construct validity. Combined with confirmatory factor analysis (CFA), EFA offers an initial exploration of the data, identifying potential components, while CFA delves deeper, confirming the factor structure and providing detailed insights into latent constructs. Moreover, EFA aims to reduce many variables to a smaller number of factors by detecting major similarities [79].
In our case, CFA employs oblique rotation (promax) based on the decomposition of the correlation matrix. Oblique rotation allows for the extraction of correlated factors, which can better capture the underlying structure when relationships or correlations between factors are expected. CFA validates the principal components that capture the maximum variation in the data, and with oblique rotation, these components can be correlated with each other. The analysis of eigenvalues, the scree plot (Figure 3), and the cumulative variance (Table 5) helps determine the number of components to retain. The eigenvalue analysis yields four components above the value of 1, as in other case studies [36,80], and the association between components and survey items confirms the survey structure based on the three presences.
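The eigenvalue-above-1 rule (Kaiser criterion) used to retain components can be sketched as follows (illustrative only; assumes numpy is available):

```python
import numpy as np

def n_components_kaiser(R):
    """Count components retained under the Kaiser criterion:
    eigenvalues of the correlation matrix R strictly greater than 1."""
    eigvals = np.linalg.eigvalsh(np.asarray(R, dtype=float))
    return int((eigvals > 1.0).sum())
```

For example, three variables pairwise correlated at 0.5 produce eigenvalues 2.0, 0.5, 0.5, so a single component is retained.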
By focusing on the magnitude and direction of the loadings, items with absolute loadings above 0.4 indicate a stronger association with the respective factor. Positive loadings are indicative of a positive relationship, while negative loadings indicate a negative one. The identification of a coherent theme or construct associated with each factor helps assign meaningful interpretations to the CoI survey based on the three presences. A revision of the cross-loadings was necessary, namely of items loading on multiple factors or not loading on any factor. Cross-loadings may suggest that an item measures more than one construct or that it is not clearly associated with a single factor. The oblique method applied to assess the strength and pattern of these correlations was promax. It is common to consider loadings above a certain threshold (e.g., 0.3 or 0.4) as meaningful contributors to a factor; in our loading process, we applied the 0.4 value, as shown in Table 6.
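Flagging cross-loading and non-loading items against such a threshold can be expressed as a simple filter (a hypothetical helper; the item names and loadings in the usage example are illustrative, not the study's values):

```python
def salient_loadings(loadings, threshold=0.4):
    """Given a dict {item: [loading on each factor]}, return for each
    item the indices of factors where |loading| >= threshold. Items
    mapped to more than one factor are cross-loading; items mapped to
    none fall below the threshold and are candidates for revision."""
    out = {}
    for item, row in loadings.items():
        out[item] = [f for f, l in enumerate(row) if abs(l) >= threshold]
    return out
```

For instance, an item loading at 0.35 on every factor would be flagged as non-loading at the 0.4 threshold, while one loading at 0.45 and −0.5 on two factors would be flagged as cross-loading.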
Items 1–13, excluding item 12 due to its loading falling below the 0.4 threshold, constitute components 1 and 2. It can be asserted that components 1 and 2 consistently represent TP. Meanwhile, component 5 encompasses all items associated with SP, except for item 22, which exhibits cross-loading under components 1, 2, and 3. This suggests that item 22 shares common variance with multiple latent factors, indicating its potential to capture aspects of different constructs or dimensions simultaneously. Lastly, components 3 and 4 load all items associated with CP. The diagram in Figure 4 shows the visual association of items according to factor loading.

6. Internal Correlations of the 10 Constructs

CFA is applied in the following set of diagrams (Figure 5, Figure 6 and Figure 7). Circles represent the 10 constructs as indicators. Boxes represent the survey items associated with each construct; they are the variables. The value below each box reports the variation in each item measured from 0; we find the lowest variation in item 20 and the highest in item 17 (both in Figure 7).

7. Discussion

Based on statistical tests and measurements, the validation process revealed a satisfactory level of internal validity and reliability. Consequently, the Italian version of the community of inquiry (CoI) survey is recommended for assessing the perception of the CoI framework in teaching/learning contexts among Italian native speakers. However, further insights may be derived from the collected results, making the Italian survey a valuable monitoring tool.
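Internal reliability of the kind reported above is conventionally quantified with Cronbach's alpha [71]. A minimal numpy sketch, using a fabricated Likert response matrix rather than the study data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated 6-respondent x 4-item sample, for illustration only.
demo = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 5],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(demo):.3f}")  # prints alpha = 0.915 for this fabricated sample
```

Values above roughly 0.7 are usually read as acceptable internal consistency, which is the kind of criterion the validation process relies on.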
In accordance with similar results [34], EFA measured a significant correlation between the constructs associated with Teaching Presence (TP) and Cognitive Presence (CP), whilst Social Presence (SP) shows a less marked correlation with the other two presences. In the surveyed contexts, the mediating role of SP does not seem to be of paramount importance for a successful learning outcome. The survey items associated with SP formulate statements closely tied to the individual affective and emotional sphere, so they prompt self-evaluation more than an assessment of the self within the social context. Consequently, the social facilitation promoted by instructors is not detected using SP items alone, and the influence of TP is not perceivable in this section of the survey. This also accords with previous observations on SP from authors supporting possible adjustments to this specific dimension. It is noteworthy that, since its initial validation, the original English version of the questionnaire, comprising 34 items, has undergone optimization attempts for application in specific contexts. In many cases, following EFA and CFA, certain studies excluded particular items [31,80] or expanded the survey by incorporating additional items [81,82]. However, such modifications were not considered for the validation of the Italian version: introducing this instrument to a new cultural and linguistic context prioritized avoiding secondary controversies that might confuse end-users.
Numerous studies addressing the validation of the English version of the CoI questionnaire can be found in various publications [36,83,84,85,86]. A significant aspect of this body of literature is that the founders of CoI also cite the validation in other languages as a valuable experience supporting the robustness of the theory behind the original English version [30,31,32,33,34].
Certain limitations in the present validation process must be taken into account. In the Italian case, the diversification in samples—belonging to both academic and lifelong learning courses—associates learners with unconnected groups. An intraclass correlation approach in future analyses may return additional information, on the assumption that different learning needs and contexts may be treated as different clusters. Lastly, translations of the CoI survey may introduce a new range of reflections related to SP awareness in different cultures. The SP constructs (affective expression, open communication, and group cohesion) may represent significant cultural variables, depending on the learners’ background. Learners’ engagement through digital media, normally activated to enhance a blended learning experience through social affordances, may be considered unsatisfactory in certain cultures despite the massive adoption of technology and smart devices. In the literature on the CoI, the cultural perspective as a variable capable of affecting the implementation of the model is currently uncharted territory.
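An intraclass correlation analysis of the kind suggested above could start from the one-way ICC(1), which compares between-cluster and within-cluster variability. The sketch below uses invented cluster scores (a hypothetical academic cohort vs. a lifelong-learning cohort), not the collected feedback:

```python
import numpy as np

def icc1(groups):
    """One-way random-effects ICC(1) for balanced groups of scores."""
    k = len(groups[0])                  # observations per group (balanced design)
    n = len(groups)                     # number of groups/clusters
    data = np.stack(groups)
    grand = data.mean()
    group_means = data.mean(axis=1)
    # Between-group and within-group mean squares from one-way ANOVA.
    msb = k * ((group_means - grand) ** 2).sum() / (n - 1)
    msw = ((data - group_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical cluster scores, for illustration only.
academic = np.array([4.2, 4.4, 4.3, 4.5])
lifelong = np.array([3.6, 3.8, 3.7, 3.9])
print(f"ICC(1) = {icc1([academic, lifelong]):.3f}")  # -> ICC(1) = 0.913
```

A high ICC, as in this fabricated example, would indicate that cluster membership explains much of the variance and that the two samples should indeed be modeled separately.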

8. Conclusions

Two applications of the Italian community of inquiry (CoI) survey may be defined within two major scopes: (1) monitoring and assessment tool in collaborative training and courses supporting higher-order thinking; (2) reflection on teaching/learning dynamics in a variety of settings, totally online and/or blended. As far as the first application is concerned, it has been confirmed over more than two decades that the CoI framework is not a suitable instrument for assessing learning outputs and individual performances. Conversely, CoI has been successfully implemented to monitor interactions between teaching practices and learners’ communities and as a guide to define the qualities of the educational experience. Indeed, through the CoI questionnaire, we can gauge aspects related to course organization and design, learners’ engagement and satisfaction, students’ perception of the social context [87,88,89], and the teachers’ capacity to foster higher-order thinking and improve planning [13,90,91,92,93].
Concerning the second application, it has been acknowledged that the flexibility of the framework suits a wide range of educational settings, such as adult education [94,95,96], continuing professional training [97,98,99,100,101] and even informal learning [102,103]. For this reason, the Italian version aimed to deliver a versatile questionnaire (see Appendix B), which may also fit professional development and nonformal and informal education.
A third scope, which opens promising scenarios, is employing the CoI for instructional design as well as prediction. Indeed, by applying the CoI principles to adjust future affordances, usability, and user experience, statements such as “TP depends on instructional design and organization of the curriculum and activities” [31] would be reframed in the opposite direction: curriculum and activities depend on how TP predicts and designs students’ level of engagement. As a predictive model, the CoI would allow course designers to infer valid implications on how the three presences interact by assessing the teaching, social, and cognitive affordances provided through digital tools.
The current validation serves as an initial phase, and a subsequent dissemination stage coupled with additional data collection is imperative to initiate discussions regarding potential adjustments. Italian academic institutions, higher education centers, and online learning platforms providing instruction in the Italian language could play a pivotal role in expanding and diversifying samples and case studies in the coming years. Future case studies could use the control-group method as a benchmark to compare a CoI-based experience with a traditional learning experience through this Italian version of the questionnaire. Similar to the experience with the English version, a broader exchange of insights and experiences has the potential to inspire further adaptations of the questionnaire to local conditions, including the consideration of removing or adding items as needed.

Author Contributions

Conceptualization, S.N. and A.C.; methodology, A.C.; software, S.N.; validation, S.N.; formal analysis, S.N.; investigation, S.N. and M.T.; resources and data collection, S.N. and M.T.; data curation, S.N.; writing—original draft preparation, S.N.; writing—review and editing, S.N. and A.C.; visualization, S.N.; supervision, A.C.; project administration, A.C.; funding acquisition, S.N. All authors have read and agreed to the published version of the manuscript.

Funding

The authors gratefully acknowledge the support of the Generalitat de Catalunya to the KIMO research group (2021 SGR 01516).

Institutional Review Board Statement

The data collection did not include personal or sensitive data, and the feedback was fully anonymous. The final dataset comprised Likert-scale scores.

Informed Consent Statement

The questionnaire included an introductory disclaimer explaining that feedback would be used for research purposes and that the final submission served as authorization. Written informed consent from the participant(s) to publish this paper is not applicable.

Data Availability Statement

A partial dataset (n = 149) generated during the current study is available in the corresponding author’s Google repository (Google Forms) at the following link: https://docs.google.com/forms/d/1dWXZtKpfxQkHu-JIdQUbAS3pPbdz34wVozk5DHOsUXY/viewanalytics. The feedback of the remaining 85 participants, collected through paper-based questionnaires, is available upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

CoI	Community of inquiry
TP	Teaching presence
SP	Social presence
CP	Cognitive presence
MOOCs	Massive open online courses
PCA	Principal component analysis
EFA	Exploratory factor analysis
KMO	Kaiser–Meyer–Olkin

Appendix A

Table A1. Italian translation of the official CoI survey based on 34 Likert-scale items.
Frequentist Individual Mean and St. Dev.
Item | Mean | SD
1. Il docente ha comunicato i contenuti del corso in maniera chiara. | 4.448 | 0.700
2. Il docente ha comunicato gli obiettivi del corso in maniera chiara. | 4.478 | 0.691
3. Il docente ha fornito istruzioni chiare per partecipare alle attività. | 4.507 | 0.773
4. Il docente ha comunicato con chiarezza le scadenze e le tempistiche per realizzare le attività. | 4.448 | 0.800
5. Il docente mi ha aiutato/a nell’identificare aspetti che possono generare accordo/disaccordo, facilitando l’apprendimento. | 4.276 | 0.770
6. Il docente ha orientato il gruppo per facilitare la comprensione dei contenuti e mi ha aiutato/a a chiarire le incertezze. | 4.299 | 0.795
7. Il docente ha coinvolto i partecipanti in un dialogo produttivo e ne ha stimolato la partecipazione. | 4.388 | 0.735
8. Il docente ha agevolato l’impegno costante dei partecipanti, e questo ha facilitato il mio percorso. | 4.261 | 0.785
9. Il docente ha incoraggiato i partecipanti ad esplorare nuovi concetti. | 4.373 | 0.645
10. Il docente ha stimolato nei partecipanti il senso di appartenenza al gruppo. | 3.948 | 0.991
11. Il docente ha orientato il dibattito su aspetti pertinenti, aiutandomi ad apprendere. | 4.299 | 0.672
12. Il docente ha fornito un feedback per individuare i miei punti di forza e di debolezza in relazione agli obiettivi del corso. | 4.194 | 0.845
13. Il docente ha fornito un feedback entro tempi di attesa accettabili. | 4.164 | 0.833
14. Conoscere altri partecipanti mi ha fatto/a sentire parte del gruppo. | 4.067 | 0.815
15. Ho potuto farmi un’idea chiara su alcuni partecipanti del corso. | 3.918 | 0.805
16. La comunicazione digitale è un ottimo mezzo per l’interazione sociale. | 3.813 | 0.911
17. Mi sono sentito/a a mio agio conversando attraverso gli strumenti digitali. | 4.030 | 0.775
18. Mi sono sentito/a a mio agio partecipando agli scambi e alle discussioni. | 4.201 | 0.680
19. Mi sono sentito/a a mio agio interagendo con gli altri partecipanti. | 4.216 | 0.653
20. Mi sono sentito/a a mio agio anche quando ho manifestato un disaccordo, in un clima di fiducia reciproca. | 3.948 | 0.697
21. Ho percepito che il mio punto di vista era accettato dagli altri partecipanti. | 3.940 | 0.733
22. Gli scambi online mi hanno aiutato/a a sviluppare uno spirito di collaborazione. | 4.060 | 0.723
23. Le situazioni proposte hanno accresciuto il mio interesse nel corso. | 4.291 | 0.713
24. Le attività proposte hanno stimolato la mia curiosità. | 4.418 | 0.664
25. Sono stato/a motivato/a ad esplorare aspetti legati ai contenuti proposti. | 4.284 | 0.800
26. Ho usato diverse fonti d’informazione per esplorare i contenuti proposti. | 4.418 | 0.664
27. La riflessione condivisa e la ricerca di informazioni mi hanno aiutato/a ad elaborare i contenuti. | 4.351 | 0.640
28. Gli scambi online mi hanno aiutato a considerare prospettive diverse dalle mie. | 4.164 | 0.777
29. Ho gestito le incertezze combinando le nuove informazioni con le mie conoscenze pregresse. | 4.403 | 0.650
30. Le attività mi hanno aiutato/a ad elaborare chiarimenti e soluzioni. | 4.388 | 0.600
31. Le riflessioni sui contenuti e lo scambio di idee mi hanno aiutato/a a capire meglio i concetti fondamentali. | 4.343 | 0.650
32. Sono in grado di spiegare come applicare e verificare le conoscenze acquisite. | 4.075 | 0.690
33. Ho trovato soluzioni ai quesiti proposti che si possono applicare in situazioni concrete. | 4.239 | 0.706
34. Posso applicare quello che ho imparato al mio lavoro o in altri contesti al di fuori del corso che ho frequentato. | 4.291 | 0.703

Appendix B

Appendix B.1. Remarks on Lexicon Interpretation/Adaptation

The translation was performed by the first author, who serves as an instructor of English as a foreign language. The guiding principle was a balance between the original meaning and a high degree of adaptability, suitable for both online and blended education and for a wide range of profiles, from young students to lifelong learning environments, in formal and nonformal contexts.

Appendix B.2. Instructor

Since the original CoI survey is in English, the authors aimed to elaborate a version suitable for the Italian education context. Setting a long-term vision that includes a variety of settings, subjects, and teaching styles, the translation process adapted some English terms that could create ambiguity if translated literally without a pragmatic approach. The term “instructor” was translated as “docente”, a category that includes all kinds of professional educators, in formal and nonformal settings, in both the public and private sector. This Italian term is normally used to indicate whoever transfers knowledge within and outside education institutions. Other terms such as “tutor” (common in Italian) and “insegnante” (teacher) evoke, respectively, workshops and nonformal courses (the former) and only lower grades in public school (the latter). Finally, the term “professore” (professor) is perceived as excessively formal and generally related to higher educational contexts such as secondary school and university, whilst “istruttore” (instructor) refers to a range of specific practical skill courses (e.g., driving school, horseback riding, etc.), so neither term was appropriate to cover all possible variants of online settings, such as MOOCs and other self-paced solutions. It is worth mentioning that the word “docente” is also part of the institutional communication and lexicon adopted by the Ministry of Education, particularly in the construct “classe docente”, which includes all roles and conditions related to the pathway to becoming a qualified professional educator in Italy.

Appendix B.3. Discussion

The terms discussion/discussions characterize items 11, 18, 22, 28, and 31, and represent a highly relevant concept in a constructivist framework. The literal translation in Italian is “discussione/i”, but it may lead to some misunderstandings. Frequently, asynchronous online exchanges are limited to occasional posts, reading and approving opinions in a forum, taking the floor in a debate by publishing a point of view, or simply reading other participants’ content. In Italian culture, the term “discussione” evokes a more prolonged engagement in defending one’s point of view, and the more suitable translation seems to be “dibattito” (debate). On the other hand, a debate implies an involvement in opposing different arguments, and many learners may not perceive an involvement spread over several weeks or months as a “dibattito”. Consequently, the authors opted for the emotionally neutral term “scambio” (exchange), which evokes a wider variety of interactions, not necessarily associated with a prolonged and mutual personal engagement. The term “dibattito” was used only in item 11, where it conveys the concept of “educational debate”, so it evokes the teacher’s method rather than the learners’ engagement in specific and circumscribed discussions. According to our positive results, respondents show a consistent awareness of what an online “scambio” is and implies in a blended course.

Appendix B.4. Participants

The Italian translation kept the same semantic degree by choosing the term “partecipante”. Other solutions were weighed, such as those of the Spanish translation of the survey, which adopted equivalents of the words “classmate” and “student”. The motivation to keep the concept of participants is rooted in the variety of environments and settings the CoI may be adopted in, as previously explained.

References

  1. Akyol, Z.; Garrison, D.R. The Development of a Community of Inquiry over Time in an Online Course: Understanding the Progression and Integration of Social, Cognitive and Teaching Presence. J. Asynchronous Learn. Netw. 2008, 12, 3–22. [Google Scholar]
  2. Gutierrez-Santiuste, E.; Gallego-Arrufat, M.-J. Internal structure of virtual communications in communities of inquiry in higher education: Phases, evolution and participants’ satisfaction. Br. J. Educ. Technol. 2015, 46, 1295–1311. [Google Scholar] [CrossRef]
  3. Junus, K.; Santoso, H.B.; Ahmad, M. Experiencing the community of inquiry framework using asynchronous online role-playing in computer-aided instruction class. Educ. Inf. Technol. 2022, 27, 2283–2309. [Google Scholar] [CrossRef] [PubMed]
  4. Neto, V.; Rolim, V.; Pinheiro, A.; Lins, R.D.; Gašević, D.; Mello, R.F. Automatic Content Analysis of Online Discussions for Cognitive Presence: A Study of the Generalizability across Educational Contexts. IEEE Trans. Learn. Technol. 2021, 14, 299–312. [Google Scholar] [CrossRef]
  5. Parrish, C.W.; Guffey, S.K.; Williams, D.S.; Estis, J.M.; Lewis, D. Fostering Cognitive Presence, Social Presence and Teaching Presence with Integrated Online—Team-Based Learning. TechTrends 2021, 65, 473–484. [Google Scholar] [CrossRef] [PubMed]
  6. Tzelepi, M.; Makri, K.; Petroulis, I.; Moundridou, M.; Papanikolaou, K. Gamification in online discussions: How do game elements affect critical thinking? In Proceedings of the IEEE 20th International Conference on Advanced Learning Technologies, ICALT 2020, Tartu, Estonia, 6–9 July 2020; pp. 92–94. [Google Scholar] [CrossRef]
  7. Garrison, D.R.; Vaughan, N.D. Community of Inquiry and Blended Learning. In Blended Learning in Higher Education; Wiley: Hoboken, NJ, USA, 2012; pp. 13–30. [Google Scholar] [CrossRef]
  8. Martin, F.; Wu, T.; Wan, L.; Xie, K. A Meta-Analysis on the Community of Inquiry Presences and Learning Outcomes in Online and Blended Learning Environments. Online Learn. J. 2022, 26, 325–359. [Google Scholar] [CrossRef]
  9. Ng, D.T.K. What is the metaverse? Definitions, technologies and the community of inquiry. Australas. J. Educ. Technol. 2022, 38, 190–205. [Google Scholar] [CrossRef]
  10. Stenbom, S. A systematic review of the Community of Inquiry survey. Internet High. Educ. 2018, 39, 22–32. [Google Scholar] [CrossRef]
  11. Chen, R.H. Effects of Deliberate Practice on Blended Learning Sustainability: A Community of Inquiry Perspective. Sustainability 2022, 14, 1785. [Google Scholar] [CrossRef]
  12. Eriksen, M. Personal Leadership Conundrum. J. Manag. Educ. 2007, 31, 263–277. [Google Scholar] [CrossRef]
  13. Cheng, G. Using the community of inquiry framework to support and analyse BYOD implementation in the blended EFL classroom. Internet High. Educ. 2022, 54, 100854. [Google Scholar] [CrossRef]
  14. Fiock, H. Designing a Community of Inquiry in Online Courses. Int. Rev. Res. Open Distrib. Learn. 2020, 21, 134–152. [Google Scholar] [CrossRef]
  15. Yang, X.; Zhang, M.; Kong, L.; Wang, Q.; Hong, J.C. The Effects of Scientific Self-efficacy and Cognitive Anxiety on Science Engagement with the “Question-Observation-Doing-Explanation” Model during School Disruption in COVID-19 Pandemic. J. Sci. Educ. Technol. 2021, 30, 380–393. [Google Scholar] [CrossRef] [PubMed]
  16. Garrison, D.R.; Anderson, T.; Archer, W. Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. Internet High. Educ. 1999, 2, 87–105. [Google Scholar] [CrossRef]
  17. Garrison, D.R.; Arbaugh, J.B. Researching the community of inquiry framework: Review, issues, and future directions. Internet High. Educ. 2007, 10, 157–172. [Google Scholar] [CrossRef]
  18. Anderson, T.; Rourke, L.; Garrison, D.R.; Archer, W. Assessing teaching presence in a computer conferencing context. J. Asynchronous Learn. Netw. 2001, 5, 1–17. [Google Scholar] [CrossRef]
  19. Carretero, S.; Vuorikari, R.; Punie, Y. The Digital Competence Framework for Citizens (Issue May); Publication Office of the European Union: Luxembourg, 2017. [Google Scholar] [CrossRef]
  20. Kampylis, P.; Punie, Y.; Devine, J.; Institute for Prospective Technological Studies. Promoting Effective Digital-Age Learning: A European Framework for Digitally-Competent Educational Organisations; Publication Office of the European Union: Luxembourg, 2015. [Google Scholar] [CrossRef]
  21. Redecker, C.; Punie, Y.; European Commission. Joint Research Centre. European Framework for the Digital Competence of Educators: DigCompEdu; Publications Office of the European Union: Luxembourg, 2017; p. 93. [Google Scholar] [CrossRef]
  22. Santos, I.D.; Punie, Y.; Castaño, M.J. Opening up Education: A Support Framework for Higher Education Institutions; Publication Office of the European Union: Luxembourg, 2016; p. 78. [Google Scholar] [CrossRef]
  23. Brečko, B.; Ferrari, A. The Digital Competence Framework for Consumers; Publication Office of the European Union: Luxembourg, 2016. [Google Scholar] [CrossRef]
  24. Bacigalupo, M.; Kampylis, P.; Punie, Y.; van den Brande, G. EntreComp: The Entrepreneurship Competence Framework. EUR Sci. Tech. Res. Ser. 2016, 27939, 35. Available online: http://www.worldcat.org/oclc/954079372 (accessed on 3 May 2023).
  25. Sala, A.; Punie, Y.; Garkov, V.; Cabrera, M.; European Commission. Joint Research Centre. LifeComp: The European Framework for Personal, Social and Learning to Learn Key Competence; Publications Office of the European Union: Luxembourg, 2020. [Google Scholar] [CrossRef]
  26. Redmond, P.; Lock, J.V. A flexible framework for online collaborative learning. Internet High Educ. 2006, 9, 267–276. [Google Scholar] [CrossRef]
  27. Jia, C.; Hew, K.F.; Bai, S.; Huang, W. Adaptation of a conventional flipped course to an online flipped format during the COVID-19 pandemic: Student learning performance and engagement. J. Res. Technol. Educ. 2022, 54, 281–301. [Google Scholar] [CrossRef]
  28. Kabilan, M.K.; Annamalai, N. Online teaching during COVID-19 pandemic: A phenomenological study of university educators’ experiences and challenges. Stud. Educ. Eval. 2022, 74, 101182. [Google Scholar] [CrossRef]
  29. Olson, J.S.; Kenahan, R. “An Overwhelming Cloud of Inertia”: Evaluating the Impact of Course Design Changes Following the COVID-19 Pandemic. Online Learn. 2021, 25, 264–281. [Google Scholar] [CrossRef]
  30. Moreira, J.A.; Ferreira, A.G.; Almeida, A.C. Comparing communities of inquiry of Portuguese higher education students: One for all or one for each? Open Prax. 2013, 5, 165. [Google Scholar] [CrossRef]
  31. Yu, T.; Richardson, J.C. Examining reliability and validity of a Korean version of the Community of Inquiry instrument using exploratory and confirmatory factor analysis. Internet High. Educ. 2015, 25, 45–52. [Google Scholar] [CrossRef]
  32. Ma, Z.; Wang, J.; Wang, Q.; Kong, L.; Wu, Y.; Yang, H. Verifying causal relationships among the presences of the Community of Inquiry framework in the Chinese context. Int. Rev. Res. Open Distance Learn. 2017, 18, 213–230. [Google Scholar] [CrossRef]
  33. Olpak, Y.Z.; Çakmak, E.K. Examining the Reliability and Validity of a Turkish Version of the Community of Inquiry Survey. Online Learn. J. 2018, 22, 147–161. [Google Scholar] [CrossRef]
  34. Velázquez, B.B.; Gil-Jaurena, I.; Encina, J.M. Validation of the Spanish version of the “Community of Inquiry” survey. Rev. Educ. A Distancia 2019, 1, 59. [Google Scholar] [CrossRef]
  35. Norz, L.M.; Hackl, W.O.; Benning, N.; Knaup-Gregori, P.; Ammenwerth, E. Development and Validation of the German Version of the Community of Inquiry Survey. Online Learn. J. 2023, 27, 468–484. [Google Scholar] [CrossRef]
  36. Arbaugh, J.; Cleveland-Innes, M.; Diaz, S.R.; Garrison, D.R.; Ice, P.; Richardson, J.C.; Swan, K.P. Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet High. Educ. 2008, 11, 133–136. [Google Scholar] [CrossRef]
  37. Shea, P.; Bidjerano, T. Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a communities of inquiry in online and blended learning environments. Comput. Educ. 2010, 55, 1721–1731. [Google Scholar] [CrossRef]
  38. Shea, P.; Bidjerano, T. Learning presence as a moderator in the community of inquiry model. Comput. Educ. 2012, 59, 316–326. [Google Scholar] [CrossRef]
  39. Miettinen, R. The concept of experiential learning and john dewey’s theory of reflective thought and action. Int. J. Lifelong Educ. 2000, 19, 54–72. [Google Scholar] [CrossRef]
  40. Shaffer, D.W. Pedagogical praxis: The professions as models for postindustrial education. Teach. Coll. Rec. 2004, 106, 1401–1421. [Google Scholar] [CrossRef]
  41. Swan, K. Social construction of knowledge and the community of inquiry framework. In Open and Distance Education Theory Revisited; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar] [CrossRef]
  42. Swan, K.; Garrison, D.R.; Richardson, J.C. A Constructivist Approach to Online Learning: The Community of Inquiry Framework; IGI Global: Hershey, PA, USA, 2009. [Google Scholar] [CrossRef]
  43. Bordoloi, R.; Das, P.; Das, K. Lifelong learning opportunities through MOOCs in India. Asian Assoc. Open Univ. J. 2020, 15, 83–95. [Google Scholar] [CrossRef]
  44. Gonda, D.; Ďuriš, V.; Pavlovičová, G.; Tirpáková, A. Analysis of factors influencing students’ access to mathematics education in the form of MOOC. Mathematics 2020, 8, 1229. [Google Scholar] [CrossRef]
  45. Abdullah, S.; Abirami, R.M.; Gitwina, A.; Varthana, C. Assessment of academic performance with the e-mental health interventions in virtual learning environment using machine learning techniques: A hybrid approach. J. Eng. Educ. Transform. 2021, 34, 79–85. [Google Scholar] [CrossRef]
  46. Kreijns, K.; Van Acker, F.; Van Buuren, H. Community of Inquiry: Social Presence Revisited. E-Learn. Digit. Media 2014, 11, 5–18. [Google Scholar] [CrossRef]
  47. Rourke, L.; Anderson, T.; Garrison, D.R.; Archer, W. Assessing social presence in asynchronous text-based computer conferencing. J. Distance Educ. 1999, 14, 50–71. [Google Scholar]
  48. Burbage, K.; Jia, Y.; Hoang, T. The impact of community of inquiry and self-efficacy on student attitudes in sustained remote health professions learning environments. BMC Med. Educ. 2023, 23, 481. [Google Scholar] [CrossRef]
  49. Kozan, K. The incremental predictive validity of teaching, cognitive and social presence on cognitive load. Internet High. Educ. 2016, 31, 11–19. [Google Scholar] [CrossRef]
  50. Liao, H.; Zhang, Q.; Yang, L.; Fei, Y. Investigating relationships among regulated learning, teaching presence and student engagement in blended learning: An experience sampling analysis. Educ. Inf. Technol. 2023, 28, 12997–13025. [Google Scholar] [CrossRef]
  51. Weidlich, J.; Bastiaens, T.J. Designing sociable online learning environments and enhancing social presence: An affordance enrichment approach. Comput. Educ. 2019, 142, 103622. [Google Scholar] [CrossRef]
  52. Zhang, R.; Bi, N.C.; Mercado, T. Do zoom meetings really help? A comparative analysis of synchronous and asynchronous online learning during COVID-19 pandemic. J. Comput. Assist. Learn. 2023, 39, 210–217. [Google Scholar] [CrossRef] [PubMed]
  53. Garrison, D.R.; Cleveland-Innes, M.; Fung, T.S. Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. Internet High. Educ. 2010, 13, 31–36. [Google Scholar] [CrossRef]
  54. Assalahi, H. Learning EFL online during a pandemic: Insights into the quality of emergency online education. Int. J. Learn. Teach. Educ. Res. 2020, 19, 203–222. [Google Scholar] [CrossRef]
  55. Dempsey, P.R.; Zhang, J. Re-examining the construct validity and causal relationships of teaching, cognitive, and social presence in community of inquiry framework. Online Learn. J. 2019, 23, 62–79. [Google Scholar] [CrossRef]
  56. Shea, P.; Bidjerano, T. Measures Of Quality in Online Education: An Investigation of the Community of Inquiry Model and The Net Generation. J. Educ. Comput. Res. 2008, 39, 339–361. [Google Scholar] [CrossRef]
  57. Shea, P.; Bidjerano, T. Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education. Comput. Educ. 2009, 52, 543–553. [Google Scholar] [CrossRef]
  58. Garrison, D.R.; Cleveland-Innes, M.; Fung, T. Student role adjustment in online communities of inquiry: Model and instrument validation. J. Asynchronous Learn. Netw. 2004, 8, 61–74. [Google Scholar] [CrossRef]
  59. Swan, K.; Ice, P. The community of inquiry framework ten years later: Introduction to the special issue. Internet High. Educ. 2010, 13, 1–4. [Google Scholar] [CrossRef]
  60. Almasi, M.; Zhu, C. Investigating students’ perceptions of cognitive presence in relation to learner performance in blended learning courses: A mixed-methods approach. Electron. J. E-Learn. 2020, 18, 324–336. [Google Scholar] [CrossRef]
  61. Hu, Y.; Huang, J.; Kong, F. College students’ learning perceptions and outcomes in different classroom environments: A community of inquiry perspective. Front. Psychol. 2022, 13, 1047027. [Google Scholar] [CrossRef] [PubMed]
  62. Themeli, C.; Bougia, A. Tele-proximity: Tele-community of Inquiry Model. Facial Cues for Social, Cognitive, and Teacher Presence in Distance Education. Int. Rev. Res. Open Distrib. Learn. 2016, 17, 145–163. [Google Scholar] [CrossRef]
  63. Öberg, L.-M.; Nyström, C.A.; Littlejohn, A.; Vrieling-Teunter, E. Communities of Inquiry in Crisis Management Exercises. In Networked Professional Learning. Research in Networked Learning; Springer: Berlin/Heidelberg, Germany, 2019; pp. 55–68. [Google Scholar] [CrossRef]
  64. Kass, R.A.; Tinsley, H.E.A. Factor Analysis. J. Leis. Res. 1979, 11, 120–138. [Google Scholar] [CrossRef]
  65. Nunnally, J. Psychometric Theory, 2nd ed.; McGraw-Hill: New York, NY, USA, 1978. [Google Scholar]
  66. Comrey, L.; Lee, H.B. A First Course in Factor Analysis; Psychology Press: New York, NY, USA, 2013. [Google Scholar] [CrossRef]
  67. Gorsuch, R.L. Factor Analysis; Psychology Press: New York, NY, USA, 2013. [Google Scholar] [CrossRef]
  68. Bulinski, A.; Butkovsky, O.; Sadovnichy, V.; Shashkin, A.; Yaskov, P.; Balatskiy, A.; Samokhodskaya, L.; Tkachuk, V. Statistical Methods of SNP Data Analysis and Applications. Open J. Stat. 2012, 2, 73–87. [Google Scholar] [CrossRef]
  69. Plackett, R.L. Karl Pearson and the Chi-Squared Test. Int. Stat. Rev. 1983, 51, 59. [Google Scholar] [CrossRef]
  70. Babbie, E. Practice of Social Research. 2010. Available online: https://search.worldcat.org/title/662584565 (accessed on 17 November 2023).
  71. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef]
  72. Hair, J.F.; Anderson, R.E.; Babin, B.J.; Black, W.C. Multivariate Data Analysis: A Global Perspective; Pearson Education: Hoboken, NJ, USA, 2010; Available online: https://lib.ugent.be/catalog/rug01:001321386 (accessed on 17 November 2023).
  73. Meyers, L.; Gamst, G.; Guarino, A. Applied Multivariate Research: Design and Interpretation; SAGE Publication: London, UK, 2005. [Google Scholar]
  74. Heilporn, G.; Lakhal, S. Investigating the reliability and validity of the community of inquiry framework: An analysis of categories within each presence. Comput. Educ. 2020, 145, 103712. [Google Scholar] [CrossRef]
  75. Moore, D.S.; McCabe, G.P.; Craig, B.A. Introduction to the Practice of Statistics. 2012. Available online: https://search.worldcat.org/title/700406591 (accessed on 17 November 2023).
  76. Watson, J.; Fitzallen, N.; Fielding-Wells, J.; Madden, S. The Practice of Statistics. In International Handbook of Research in Statistics Education; Springer: Berlin/Heidelberg, Germany, 2018; pp. 105–137. [Google Scholar] [CrossRef]
  77. Maravelakis, P. The use of statistics in social sciences. J. Humanit. Appl. Soc. Sci. 2019, 1, 87–97. [Google Scholar] [CrossRef]
  78. Szeto, E. Community of Inquiry as an instructional approach: What effects of teaching, social and cognitive presences are there in blended synchronous learning and teaching? Comput. Educ. 2015, 81, 191–201. [Google Scholar] [CrossRef]
  79. Reyna, J.; Hanham, J.; Vlachopoulos, P.; Meier, P. Using factor analysis to validate a questionnaire to explore self-regulation in learner-generated digital media (LGDM) assignments in science education. Australas. J. Educ. Technol. 2019, 35, 128–132. [Google Scholar] [CrossRef]
  80. Borup, J.; Shin, J.K.; Powell, M.G.; Evmenova, A.S.; Kim, W. Revising and Validating the Community of Inquiry Instrument for MOOCs and other Global Online Courses. Int. Rev. Res. Open Distrib. Learn. 2022, 23, 82–103. [Google Scholar] [CrossRef]
  81. Şen-Akbulut, M.; Umutlu, D.; Arıkan, S. Extending the Community of Inquiry Framework: Development and Validation of Technology Sub-Dimensions. Int. Rev. Res. Open Distrib. Learn. 2022, 23, 61–81. [Google Scholar] [CrossRef]
  82. Wertz, R.E.H. Learning presence within the Community of Inquiry framework: An alternative measurement survey for a four-factor model. Internet High Educ. 2022, 52, 100832. [Google Scholar] [CrossRef]
  83. Abbitt, J.T.; Boone, W.J. Gaining insight from survey data: An analysis of the community of inquiry survey using Rasch measurement techniques. J. Comput. High Educ. 2021, 33, 367–397. [Google Scholar] [CrossRef]
  84. Garrison, D.R. E-Learning in the 21st Century: A Community of Inquiry Framework for Research and Practice, 3rd ed.; Routledge: New York, NY, USA, 2016; pp. 1–202. [Google Scholar] [CrossRef]
  85. Swan, K.; Richardson, J.C.; Ice, P.; Garrison, D.R.; Cleveland-Innes, M.; Arbaugh, J.B. Validating a measurement tool of presence in online communities of inquiry. E-mentor 2008, 2, 24. Available online: https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-70284 (accessed on 17 November 2023).
  86. Wei, L.; Hu, Y.; Zuo, M.; Luo, H. Extending the COI Framework to K-12 Education: Development and Validation of a Learning Experience Questionnaire. In Blended Learning. Education in a Smart Learning Environment; Lecture Notes in Computer Science (including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2020; Volume 12218, pp. 315–325. [Google Scholar] [CrossRef]
  87. Guo, P.; Saab, N.; Wu, L.; Admiraal, W. The Community of Inquiry perspective on students’ social presence, cognitive presence, and academic performance in online project-based learning. J. Comput. Assist. Learn. 2021, 37, 1479–1493. [Google Scholar] [CrossRef]
  88. Tan, C. The impact of COVID-19 pandemic on student learning performance from the perspectives of community of inquiry. Corp. Gov. 2021, 21, 1215–1228. [Google Scholar] [CrossRef]
  89. Yin, B.; Yuan, C.-H. Precision Teaching and Learning Performance in a Blended Learning Environment. Front. Psychol. 2021, 12, 631125. [Google Scholar] [CrossRef]
  90. Blayone, T.J.B.; van Oostveen, R.; Barber, W.; DiGiuseppe, M.; Childs, E. Democratizing digital learning: Theorizing the fully online learning community model. Int. J. Educ. Technol. High. Educ. 2017, 14, 13. [Google Scholar] [CrossRef]
  91. Butler, D.L.; Schnellert, L. Collaborative inquiry in teacher professional development. Teach. Teach. Educ. 2012, 28, 1206–1220. [Google Scholar] [CrossRef]
  92. Kovanović, V.; Joksimović, S.; Poquet, O.; Hennis, T.; Čukić, I.; de Vries, P.; Hatala, M.; Dawson, S.; Siemens, G.; Gašević, D. Exploring communities of inquiry in Massive Open Online Courses. Comput. Educ. 2018, 119, 44–58. [Google Scholar] [CrossRef]
  93. Wang, K.; Zhu, C.; Li, S.; Sang, G. Using revised community of inquiry framework to scaffold MOOC-based flipped learning. Interact. Learn. Environ. 2022. [Google Scholar] [CrossRef]
  94. Ammenwerth, E.; Hackl, W.O.; Felderer, M.; Sauerwein, C.; Hörbst, A. Building a Community of Inquiry within an Online-Based Health Informatics Program: Instructional Design and Lessons Learned; IOS Press: Amsterdam, The Netherlands, 2018; Volume 253. [Google Scholar] [CrossRef]
  95. Delmas, P.M. Using VoiceThread to Create Community in Online Learning. TechTrends 2017, 61, 595–602. [Google Scholar] [CrossRef]
  96. Howell, S.L.; Johnson, M.C.; Hansen, J.C. The Innovative Use of Technological Tools (the ABCs and Ps) to Help Adult Learners Decrease Transactional Distance and Increase Learning Presence. Adult Learn. 2023, 34, 181–187. [Google Scholar] [CrossRef]
  97. Beavan, K. (Un)felt ferments: Limning liminal professional subjectivities with pragmatist–posthuman feminism and intimate scholarship. Manag. Learn. 2022, 53, 460–482. [Google Scholar] [CrossRef]
  98. Devonshire, E.; Dodds, S.; Costa, D.; Denham, R.; Fitzgerald, K.; Schneider, C.R. Educating and engaging a new target audience about the problem of pain for society. Br. J. Pain 2022, 16, 641–650. [Google Scholar] [CrossRef]
  99. Krzyszkowska, K.; Mavrommati, M. Applying the community of inquiry e-learning model to improve the learning design of an online course for in-service teachers in Norway. Electron. J. E-Learn. 2020, 18, 462–475. [Google Scholar] [CrossRef]
  100. Miller, M.G.; Hahs-Vaughn, D.L.; Zygouris-Coe, V. A confirmatory factor analysis of teaching presence within online professional development. J. Asynchronous Learn. Netw. 2014, 18, 1. [Google Scholar] [CrossRef]
  101. Zheng, B.; Ganotice, F.A.; Lin, C.-H.; Tipoe, G.L. From self-regulation to co-regulation: Refining learning presence in a community of inquiry in interprofessional education. Med. Educ. Online 2023, 28, 2217549. [Google Scholar] [CrossRef]
  102. Gandolfi, E.; Ferdig, R.E.; Soyturk, I. Exploring the learning potential of online gaming communities: An application of the Game Communities of Inquiry Scale. New Media Soc. 2023, 25, 1374–1393. [Google Scholar] [CrossRef]
  103. Hersleth, S.A.; Kubberød, E.; Gonera, A. Informal social learning dynamics and entrepreneurial knowledge acquisition in a micro food learning network. Int. J. Entrep. Innov. 2023, 24, 268–280. [Google Scholar] [CrossRef]
Figure 1. CoI framework adapted from Garrison et al. [16].
Figure 2. Ideal sequence of the three presences, where SP is assigned a mediation role.
Figure 3. Eigenvalues scree plot.
Figure 4. EFA diagram showing the factor structure based on rotated components (promax method).
Figure 5. Constructs associated with teaching presence: design and organization; facilitation; direct instructions.
Figure 6. Constructs associated with cognitive presence: triggering event; exploration; integration; resolution.
Figure 7. Constructs associated with social presence: affective expression; open communication; group cohesion.
Table 1. Respondents’ groups.

| Learners’ Profile | Median Age | Training Context | Type/Organization | n | SVC |
|---|---|---|---|---|---|
| University students | 20 | Faculty of Engineering; Faculty of Pharmacy | Sapienza University of Rome | 118 | No |
| Adults | 45 | European Project Management | Sapienza University of Rome | 29 | Yes |
| Adults | 48 | European Project Management | Municipality | 31 | Yes |
| Adults | 46 | Italian L2 teachers | Lifelong learning body | 56 | Yes |
Table 2. Descriptive statistics of the three presences.

| | Mean | Std. Dev. | Skewness | Kurtosis | Min | Max | n |
|---|---|---|---|---|---|---|---|
| Teaching presence | 4.34 | 0.751 | −1.282 | 2.448 | 1 | 5 | 234 |
| Social presence | 4.03 | 0.753 | −0.466 | 0.038 | 1 | 5 | 234 |
| Cognitive presence | 4.31 | 0.686 | −0.864 | 1.188 | 1 | 5 | 234 |
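The skewness and kurtosis values in Table 2 follow the usual moment-based definitions (Fisher skewness and excess kurtosis, both 0 for a normal distribution). The paper does not state which software produced these statistics; as an illustrative sketch under that caveat, they can be reproduced from raw 1–5 Likert scores with Python/NumPy:

```python
import numpy as np

def describe(scores):
    """Mean, sample SD, Fisher skewness, and excess kurtosis of a score vector."""
    x = np.asarray(scores, dtype=float)
    mean = x.mean()
    sd = x.std(ddof=1)                 # sample standard deviation (as in Table 2)
    z = (x - mean) / x.std(ddof=0)     # standardize with the population SD
    skewness = np.mean(z ** 3)         # third standardized moment
    kurtosis = np.mean(z ** 4) - 3.0   # excess kurtosis (normal distribution -> 0)
    return mean, sd, skewness, kurtosis

# Example: a symmetric, flat spread of 1-5 ratings (illustration only)
m, s, g1, g2 = describe([1, 2, 3, 4, 5])
```

For instance, the negative skewness of teaching presence (−1.282) indicates a left tail: most respondents answered near the top of the 1–5 scale, with a minority of low ratings stretching the distribution leftward.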
Table 3. Reliability analysis of each presence.

| | Standard Alpha | Alpha se | Med. r |
|---|---|---|---|
| Teaching presence | 0.94 | 0.0076 | 0.55 |
| Social presence | 0.83 | 0.024 | 0.32 |
| Cognitive presence | 0.93 | 0.0091 | 0.53 |
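The reliability coefficients in Table 3 are Cronbach’s alpha [71], α = k/(k − 1) · (1 − Σσᵢ²/σₜ²), where k is the number of items, σᵢ² the variance of item i, and σₜ² the variance of the summed scale. (The “Alpha se” and “Med. r” columns resemble the output of R’s psych package, but the software used is not stated, so that is an assumption.) A minimal NumPy sketch on synthetic data, not the study data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1)      # variance of each item
    total_variance = x.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Three perfectly consistent items -> alpha = 1.0
perfect = np.tile(np.array([[1], [2], [3], [4], [5]]), (1, 3))
```

Perfectly correlated items yield alpha of exactly 1, while independent items drive it toward 0, which is why high per-presence alphas (0.83–0.94) support treating each item set as a single scale.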
Table 4. Conceptual structure of the survey showing associations between items and components.

| Presence (Cronbach’s α) | Items Set | Items Sub-Set | Construct | Component Cronbach’s α |
|---|---|---|---|---|
| TP—0.94 | 1–13 | 1–4 | Design and organization | 0.93 |
| | | 5–10 | Facilitation | 0.90 |
| | | 11–13 | Direct instruction | 0.76 |
| SP—0.84 | 14–22 | 14–16 | Affective expression | 0.70 |
| | | 17–19 | Open communication | 0.66 |
| | | 20–22 | Group cohesion | 0.68 |
| CP—0.94 | 23–34 | 23–25 | Triggering event | 0.89 |
| | | 26–28 | Exploration | 0.74 |
| | | 29–31 | Integration | 0.83 |
| | | 32–34 | Resolution | 0.85 |
Table 5. Eigenvalues of the five major components (unrotated solution).

| | Eigenvalue | Proportion Var. | Cumulative |
|---|---|---|---|
| Component 1 | 14.441 | 0.425 | 0.425 |
| Component 2 | 3.087 | 0.091 | 0.516 |
| Component 3 | 1.342 | 0.039 | 0.555 |
| Component 4 | 1.142 | 0.034 | 0.589 |
| Component 5 | 0.924 | 0.027 | 0.616 |
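Because the 34 Likert items are standardized before extraction, the total variance of the unrotated solution is 34, so each “Proportion Var.” entry in Table 5 is simply the eigenvalue divided by 34 (e.g., 14.441/34 ≈ 0.425). The quick check below reproduces the table’s proportions and also applies Kaiser’s eigenvalue-greater-than-one criterion:

```python
# Eigenvalues of the five major components (Table 5)
eigenvalues = [14.441, 3.087, 1.342, 1.142, 0.924]
n_items = 34  # 34 standardized items -> total variance = 34

proportions = [round(ev / n_items, 3) for ev in eigenvalues]
cumulative = [round(sum(eigenvalues[:i + 1]) / n_items, 3)
              for i in range(len(eigenvalues))]
# Kaiser's criterion would retain only the components with eigenvalue > 1
retained = sum(ev > 1 for ev in eigenvalues)
```

Note that only four eigenvalues exceed 1, while the rotated solution in Table 6 reports five factors; scree-plot inspection (Figure 3) is the usual complement to the Kaiser rule in such borderline cases.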
Table 6. Factor loading.

| Item | Fact. 1 | Fact. 2 | Fact. 3 | Fact. 4 | Fact. 5 | Uniqueness |
|---|---|---|---|---|---|---|
| v4 | 0.867 | | | | | 0.217 |
| v3 | 0.809 | | | | | 0.153 |
| v1 | 0.805 | | | | | 0.176 |
| v2 | 0.741 | | | | | 0.234 |
| v6 | 0.484 | 0.563 | | | | 0.261 |
| v22 | −0.432 | 0.414 | 0.475 | | | 0.424 |
| v5 | 0.419 | 0.546 | | | | 0.278 |
| v10 | | 0.759 | | | | 0.311 |
| v7 | | 0.745 | | | | 0.351 |
| v8 | | 0.708 | | | | 0.284 |
| v11 | | 0.675 | | | | 0.388 |
| v9 | | 0.669 | | | | 0.419 |
| v13 | | 0.468 | | | | 0.475 |
| v28 | | | 0.825 | | | 0.389 |
| v27 | | | 0.813 | | | 0.321 |
| v29 | | | 0.587 | | | 0.397 |
| v31 | | | 0.522 | | | 0.431 |
| v30 | | | 0.464 | | | 0.383 |
| v24 | | | 0.441 | | | 0.276 |
| v26 | | | 0.438 | | | 0.505 |
| v23 | | | 0.418 | | | 0.292 |
| v34 | | | | 0.848 | | 0.248 |
| v33 | | | | 0.817 | | 0.242 |
| v32 | | | | 0.500 | | 0.463 |
| v25 | | | | 0.446 | | 0.274 |
| v20 | | | | | 0.733 | 0.454 |
| v21 | | | | | 0.732 | 0.516 |
| v15 | | | | | 0.642 | 0.621 |
| v14 | | | | | 0.609 | 0.533 |
| v17 | | | | | 0.581 | 0.687 |
| v16 | | | | | 0.573 | 0.677 |
| v19 | | | | | 0.487 | 0.530 |
| v18 | | | | | 0.472 | 0.579 |
| v12 | | | | | | 0.610 |

Note: Applied rotation method is promax.
Nizzolino, S.; Canals, A.; Temperini, M. Validation of the Italian Version of the Community of Inquiry Survey. Educ. Sci. 2023, 13, 1200. https://doi.org/10.3390/educsci13121200