Article

Development and Validation of a Questionnaire to Measure Digital Skills of Chinese Undergraduates

Department of College English Teaching, Qufu Normal University, Qufu 273165, China
*
Author to whom correspondence should be addressed.
Sustainability 2022, 14(6), 3539; https://doi.org/10.3390/su14063539
Submission received: 9 February 2022 / Revised: 13 March 2022 / Accepted: 14 March 2022 / Published: 17 March 2022

Abstract:
To develop sustainably in a digital society, young people must possess certain digital skills. The digital skill levels of Chinese undergraduates play a role in the process of educational digitalization promoted by the Chinese government and have become an important concern for Chinese universities and policymakers. However, few valid and reliable instruments are available to assess the digital skills of undergraduates in China. It is therefore necessary to develop a questionnaire to measure the digital skills of Chinese undergraduates and to test its reliability and validity. Based on the previous literature and the Chinese educational context, this study developed and validated such a questionnaire. Through factor analysis, the internal factor structure of the questionnaire was explored and its reliability and validity were verified. Six factors were extracted: access to and management of digital content, use of digital means, communication of digital content, creation of digital content, digital empathy, and digital safety. The questionnaire was administered to a first sample of 222 undergraduates and a second sample of 231 undergraduates selected randomly from a university in eastern China. An exploratory factor analysis (EFA) with SPSS 26.0 on the first sample was used to determine the internal factor structure of the whole questionnaire, which showed the expected congruency between items and dimensions. A confirmatory factor analysis (CFA) with Mplus 8.3 on the second sample was then used to check the model-data fit of the questionnaire, which proved good. The convergent validity and discriminant validity of the questionnaire were also verified.
The resulting questionnaire is a useful tool for future nationwide studies on digital skills in Chinese higher education, both in general and in specific disciplines.

1. Introduction

In the past few decades, digital technologies and social networks have developed rapidly all over the world. Undergraduates, often referred to as digital natives, live in a digital society and use digital technologies for many daily activities; they are therefore expected to possess certain crucial digital skills. However, some undergraduates may not have the minimal digital skills needed to use digital devices, the Internet, and digital social networks [1,2,3]. Even those who are digitally literate might not possess the digital skills needed to manage their education efficiently and effectively. There is a substantial difference between what they do with digital technologies and what they know about these technologies [4]. For example, their awareness of information safety is weak, and their skills in coping with changes in digital technologies are clearly insufficient. To analyze the actual situation and the real digital skill levels of Chinese undergraduates, a reliable and valid measuring instrument must first be developed. Only with such an instrument can we measure levels of digital skills, identify deficits, provide the necessary support for improvement, and evaluate the effects of training programs on digital skills.
Nowadays, those who cannot acquire the necessary digital skills in digital environments will find themselves unable to fully participate in the surrounding economic, social, and cultural life. As “survival skills” [5] and “vital assets” [6] in a digital society, digital skills are not only essential abilities for the daily social life of undergraduates but also important social capital for their participation in educational life, since learning and education have become increasingly digitalized [7,8,9].
The Chinese government has put forward various policies to promote the application of digital technologies in higher education, for example, China's Education Modernization 2035, the Education Informatization 2.0 Action Plan, and the Code for the Construction of Digital Campuses in Colleges and Universities (for Trial Implementation) (2021). As the subjects of higher education, undergraduate students play an important role in the effective application of digital technologies in higher education. Hence, the digital skill levels of Chinese undergraduates need exploration. However, research on this topic is far from sufficient, and empirical research is especially scarce [10]. An important issue for this exploration is the measurement tools available to estimate the digital skills of Chinese undergraduate students. Existing questionnaires on digital skills might not function optimally when used to assess Chinese undergraduates, as they were not developed in congruence with the Chinese context. Although a variety of survey instruments have been theoretically and empirically developed to assess the digital skills of young people [11,12], most of these studies were conducted in developed countries such as the USA [13], the UK [14], and Spain [12]. Little empirical research has constructed an instrument to measure the digital skills of undergraduates in developing countries, even though the educational environment and Internet contexts in developing countries differ from those in developed countries [9]. Questionnaires to measure the digital skills of people in developing countries are therefore necessary. Targeting undergraduate students in a developing country, China, this study aims to construct a reliable and valid questionnaire to empirically assess their digital skills.
Furthermore, we searched for studies on the digital skills of Chinese undergraduates on the largest literature retrieval platform in China (www.cnki.net) (accessed on 16 January 2020), using the following combination of keywords in the title or abstract: (“digital competenc*” OR “digital skills” OR “digital literacy”) AND (“undergraduate*” OR “university students” OR “higher education”). Only 21 articles were found, and most focused either on the theoretical framework of digital skills or on the development of digital skills. No article constructed a questionnaire measuring the digital skills of Chinese undergraduates from an empirical perspective. Yet estimating undergraduates’ digital skill levels is essential for universities and policymakers to promote China’s educational digitalization, and for undergraduates to benefit from digitalized education and develop sustainably in a digital society.
Digital skills are of immense importance for sustainable development in present-day society and have been the focus of attention in recent research [15,16]. Especially against the background of the current worldwide pandemic crisis, the digital skills of undergraduate students, and how these skills are shaped in education, demand deeper investigation. Many terms have been used similarly to digital skills in previous studies (e.g., IT skills, ICT skills, computer competence, digital competence, digital literacy) [17,18,19,20]. The term digital skills in this study refers to the abilities to appropriately use different digital means; to access, manage, communicate, and create necessary digital content; to actively, sensitively, empathically, and ethically communicate with others while navigating digital environments; and to be conscious of the safety of digital content [12,21].
In terms of the assessment of digital skills, questionnaires are commonly used. Although an increasing amount of research has investigated the measurement of digital skills, most studies do not include the measurement of digital safety skills and digital empathy skills, which are becoming crucial in a digital society due to the misuse, abuse, and unethical management of information and data [12].
Common methods used in existing research on digital skills questionnaires focus mostly on exploratory factor analysis (EFA) [12]. Nevertheless, EFA is exploratory and data-driven and should not be used as the only means for the final determination of a questionnaire, because EFA-based factor structures may not be reproducible in other data sets [22,23,24]. To verify a reliable and valid questionnaire, studies using confirmatory factor analysis (CFA) are needed as well. CFA allows explicit statistical testing of hypotheses about the relationships between items, dimensions, and the structure of the whole questionnaire using a quantitative approach [22].
Against this background, the present study develops a reliable and valid questionnaire to assess the digital skills of Chinese undergraduates (hereafter DigSki-CUS), using EFA with SPSS (version 26.0) and CFA with Mplus (version 8.3). The findings will contribute to determining the levels of digital skills of Chinese undergraduate students and to improving those skills. We hope this study will also stimulate further research on digital skills in different countries.

2. Literature Review

Digital skills have attracted extensive attention all over the world [25], since possessing necessary digital skills helps people benefit from today’s digital technologies. For undergraduate students, digital skills help them access educational content, deal with rich information, and make use of e-learning as an educational medium effectively and efficiently [16].
Initially, digital skills were mainly associated with the use of computers [26]. With the passage of time and the development of digital technologies, the term digital skills has evolved into a more comprehensive one encompassing several skills. Ng [27] proposed a framework of digital skills with “technological”, “cognitive”, and “social-emotional” dimensions. In this framework, the technological dimension relates to the skills of operating technical tools. The cognitive dimension is associated with the ability to think critically and reflectively when searching, using, and evaluating digital information. The social-emotional dimension concerns the appropriate use of the Internet to socialize, communicate, and learn, and to protect the safety and privacy of digital content. The American Library Association [28] defined a digitally skilled person as someone who owns various technical and cognitive skills, including using digital technologies, interpreting digital information, and collaborating with others in a digital society. The European Commission [29] claimed that a digitally skilled person is someone who uses digital information and communication technologies creatively, critically, and safely, and is able to adapt to the new knowledge and attitudes necessary for a digital society. Vuorikari et al. [30] argued that digital skills are composed of the following dimensions: information and data literacy; communication and collaboration; digital content creation; safety; and problem-solving. Adorjan & Micciardelli [31] mentioned that a digitally literate student should attain technological skills, critical thinking skills, the ability to assess online risks, and the ability to protect their privacy and digital reputation. Johnston [32] summarized digital skills as the ability to use digital technologies to access, evaluate, create, and communicate information in a socially responsible and ethical manner.
Although ideas considering digital skills vary in previous studies, they do share some core constructs of digital skills: abilities to locate, access, use and create information and data, to be safety-conscious about digital content, and to act responsibly and empathetically via the use of digital technologies. All these core constructs of digital skills should be considered in a measurement tool to assess the digital skills of Chinese undergraduates.
In terms of the assessment of digital skills, questionnaires are commonly used. Mengual-Andrés et al. [33] used the Delphi method to design a questionnaire with 5 dimensions to measure the digital skills of Spanish college students: technological literacy, information access and use, communication and collaboration, digital citizenship, and creativity and innovation. Peart et al. [12] developed a questionnaire with 6 dimensions and 12 sub-dimensions to measure the digital skills of 16-35-year-old young people in Spain and the United Kingdom; its dimensions included information, communication and collaboration, digital content creation, digital security, critical thinking, and socio-civic skills. We cannot deny that these questionnaires were effective in measuring digital skills to some extent. However, they ignored the situations of less developed digital competence or digital inequality in developing countries such as China. These questionnaires did not target Chinese undergraduate students, and their validity and reliability for measuring the digital skills of Chinese undergraduates were not confirmed. Moreover, the COVID-19 pandemic has exposed digital deficits and challenges in the school systems of developing countries, and research in this field needs to be boosted.
In the Chinese educational context, Liu et al. [10] theoretically constructed a model to measure the digital skills of Chinese teachers, with 8 dimensions such as digital literacy, digital teaching literacy, and digital assessment ability. Although Liu et al.’s [10] research was carried out in the Chinese educational context, it focused on teachers rather than undergraduate students, and it constructed only a theoretical model that was not empirically verified. Li & Hu [9] empirically developed and validated a digital skills scale for Chinese schoolchildren with 5 dimensions: operational skills, mobile skills, social skills, creative skills, and safety skills. However, the starting point, patterns, and challenges of digital technology use differ considerably between children and undergraduates [9].
Besides measuring digital skills, the aforementioned questionnaires also assessed related dimensions such as creativity and innovation skills, critical thinking skills, and socio-civic skills. These related dimensions lengthen a digital skills questionnaire, bringing response burden and fatigue to respondents. This causes low response rates, low completion rates, and low data quality [34], and is not conducive to measuring the specific digital skills closely related to digital means and digital content.
Based on the frameworks of Li & Hu [9], Mengual-Andrés et al. [33], and Peart et al. [12], we conceptualized a framework consisting of 5 dimensions of digital skills: information skills, communication skills, creation skills, digital safety skills, and digital empathy skills. Information skills referred to the ability to effectively access and manage digital content through digital means. Communication skills meant the ability to communicate with others in digital environments. Creation skills referred to the ability to create new content by using digital media and tools. Digital safety skills meant the ability to use the Internet and digital technologies safely and to protect one’s own privacy and wellbeing. Digital empathy skills referred to the cognitive and emotional ability to be reflective and socially responsible while strategically using digital technologies [35]. Dimensions such as creativity and innovation skills, critical thinking skills, and socio-civic skills [12,33] were not considered, due to the aforementioned response burden and fatigue.
According to the conceptual framework, 38 measuring items were used, adapted, or derived from previous studies across the 5 dimensions of digital skills [12,33], with some items added according to the situation in China [9]. These 38 items are discussed in detail in Section 4.5 (Questionnaire Development).

3. Research Questions

Targeting Chinese undergraduate students, the present study aims to answer two specific research questions:
(1)
What is the internal factor structure of the questionnaire to estimate the digital skills of Chinese undergraduates?
(2)
What are the reliability and validity of the content and construct of the questionnaire to estimate the digital skills of Chinese undergraduates?

4. Methods

4.1. Research Design and Methods

The present study adopted a mixed-methods approach, consisting of qualitative methods in the development of DigSki-CUS (phase 1) and quantitative methods in the validation of DigSki-CUS (phase 2). In phase 1, the conceptual framework and item pool of the questionnaire were initially developed from an extensive literature review. After the questionnaire was compiled, a pre-test was conducted with a small sample of 20 undergraduates, who completed the questionnaire and assessed its applicability and acceptability. A focus-group interview was then conducted to obtain feedback about the questionnaire from the 20 participants. In phase 2, a paper-and-pencil questionnaire was used to collect data from participants randomly sampled from a university in eastern China. After data collection, EFA with SPSS 26.0 was conducted to determine the internal factor structure of the questionnaire, and CFA with Mplus 8.3 was performed to verify model-data fit, convergent validity, and discriminant validity. Figure 1 shows the 2 phases and the different methods.

4.2. Ethical Considerations

To avoid harm to research participants resulting from research misconduct, ethical conduct should be ensured to build trust and confidence in research [36]. Each step of this study was conducted according to good research practice and the requirements of the Ethics Committee and Institutional Review Board. Before participating in any phase of the study, participants received information about their involvement in developing this questionnaire and their rights as participants. They were told that the results of this study may be published in a peer-reviewed journal. Participants were identified with unique identifying numbers, and their anonymity and personal information were protected throughout the whole study.

4.3. Procedure

The procedure of this study can be briefly divided into two phases: the development phase and the validation phase. The development phase involved a literature review and the design and construction of the dimensions and items of the questionnaire. In this phase, we reviewed previous research on the measurement of digital skills and developed a preliminary questionnaire to assess the digital skills of Chinese undergraduates, covering information skills, communication skills, creation skills, digital safety skills, and digital empathy skills.
The validation phase of the questionnaire included assessing the reliability and validity of the questionnaire. Because the properties of an item or dimension can change after being adapted or revised, we conducted factor analyses to investigate the properties of the questionnaire. Approaches of EFA and CFA were adopted to confirm the internal consistency, reliability, and validity of the questionnaire.

4.4. Sample

For conducting EFA and CFA, 500 participants [37] were selected at random from undergraduates at the Chinese university where the researchers worked. Of these 500 participants, 475 answered the paper-and-pencil DigSki-CUS voluntarily and anonymously, a participation rate of 95.0%. Survey responses were kept confidential and deidentified from personal names and other identifying information. Of the 475 respondents, 22 did not complete 80% of the whole questionnaire and their data were deleted; the valid-questionnaire rate was 95.4%, leaving 453 cases for analysis. The 453 cases were randomly split into two groups: 222 cases for EFA and 231 cases for CFA. Demographic characteristics of the two groups of participants are shown in Figure 2.
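The random split of the 453 valid cases can be sketched as follows. This is an illustrative outline, not the authors' actual procedure; the function name and seed are assumptions, and only the group sizes (222 and 231) come from the text.

```python
import numpy as np

def split_sample(n_cases: int, n_first: int, seed: int = 0):
    """Randomly split case indices into two disjoint subsamples."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_cases)
    return idx[:n_first], idx[n_first:]

# 453 valid cases -> 222 for EFA, 231 for CFA (sizes from the text)
efa_idx, cfa_idx = split_sample(453, 222)
print(len(efa_idx), len(cfa_idx))  # 222 231
```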

4.5. Questionnaire Development

The first version of the questionnaire DigSki-CUS was developed based on Li & Hu [9], Liu et al. [10], Mengual-Andrés et al. [33], Peart et al. [12], and so on. These previous studies mentioned some important dimensions of digital skills that are relevant to young people, such as management and use of information and data, communication with others in a digital world, data protection, etc.
The conceptual framework of the questionnaire DigSki-CUs was developed mainly based on Li & Hu [9] and Peart et al. [12]. The initial conceptual framework consists of 5 dimensions of digital skills: information skills, communication skills, creation skills, digital safety skills, and digital empathy skills.
A pool of 38 items was developed based on instruments developed in Li & Hu [9] and Peart et al. [12]. The items identified in the chosen instruments were classified into 5 specified dimensions according to the conceptual framework of DigSki-CUS. Redundant or irrelevant items were excluded.
As the questionnaire was specifically developed for undergraduate students in China, we added some measuring items assessing the specific digital skills required in the Chinese educational context according to Chinese government documents, e.g., the Code for the Construction of Digital Campuses in Colleges and Universities (for Trial Implementation) (2021). For example, items such as “I can complete digital content that meets the minimum requirements of learning tasks” and “I can create and edit digital content with higher standards according to the requirements of work or study” were original items based on this document. The 5 dimensions and 38 items of the initial version of DigSki-CUS are shown in Table 1.
The questionnaire was a paper-and-pencil self-assessment designed with a 7-point Likert scale: the 7 response options for each item ranged from 1 (completely disagree) to 7 (completely agree).

4.6. Questionnaire Testing and Refinement

Before applying the version of DigSki-CUS (38 items) to a large sample, and prior to the commencement of data collection, we ran a pre-test with a small sample of 20 participants to investigate whether the items were understandable, to identify whether any items were misinterpreted, and to judge whether the questionnaire could be completed within an acceptable timeframe of 15–20 min. Feedback was obtained from a focus-group interview with the participants, who were asked to indicate whether they could understand all the questions and respond on the provided scales. Problematic items were revised in line with the interview responses. After the pre-test, two items (items 6 and 24) that were judged confusing were deleted to increase the usability, understandability, and consistency of the questionnaire. The refined questionnaire included 36 items and could be completed in less than 20 min, which was appropriate for the targeted population. Demographic information such as age, gender, and academic major was also collected. Thus, the second version of the DigSki-CUS (36 items) was formed.

4.7. Data Analysis

In order to determine the best factor structure, reliability, and validity of the second version of the DigSki-CUS (36 items), both EFA and CFA were conducted. EFA can be used to see what factors emerge from actual data, while CFA can be used to determine if the factors hold up [26]. The sample of 453 Chinese undergraduate students was randomly split in half. The first half of the sample of 222 students was used for EFA while the second half of the sample of 231 students was used for CFA [38].
Firstly, EFA with the principal axis factoring method was conducted to investigate the factor structure of the second version of the DigSki-CUS (36 items) by analyzing the relationships between items using the first sample of 222 undergraduates. For factor extraction, a decision about the number of factors to retain was based initially on eigenvalues of 1.0 or higher. A scree plot was also examined, looking for a change in the slope of the line connecting the eigenvalues of the factors. For factor rotation, oblimin rotation was utilized because the factors were expected to be correlated [39]. EFA was conducted with SPSS 26.0.
Secondly, CFA was conducted with Mplus 8.3 to determine whether the factor structure obtained in EFA could be confirmed on the second sample of 231 undergraduates. Model goodness-of-fit (GOF) was evaluated using several indices: chi-square (χ²), degrees of freedom (df), the ratio of chi-square to degrees of freedom (χ²/df) (good if χ²/df < 2; acceptable if χ²/df < 5), the Root Mean Square Error of Approximation (RMSEA) (good if RMSEA < 0.06; acceptable if RMSEA < 0.08), the Comparative Fit Index (CFI) (good if CFI > 0.95; acceptable if CFI > 0.90), and the Tucker-Lewis Index (TLI) (good if TLI > 0.95; acceptable if TLI > 0.90) [39,40]. It should be noted that these model-fit thresholds are simply guidelines and should not be interpreted as strict rules [41].
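The cutoffs quoted above can be collected into a small helper for reading fit output. This is a hedged sketch for illustration only; the function name, return format, and the example index values are our own and are not part of Mplus output.

```python
def classify_fit(chi2_df: float, rmsea: float, cfi: float, tli: float) -> dict:
    """Classify each fit index as 'good', 'acceptable', or 'poor'
    using the cutoffs quoted in the text (guidelines, not strict rules)."""
    def grade(value, good, acceptable, higher_is_better=False):
        if higher_is_better:
            if value > good:
                return "good"
            if value > acceptable:
                return "acceptable"
        else:
            if value < good:
                return "good"
            if value < acceptable:
                return "acceptable"
        return "poor"

    return {
        "chi2/df": grade(chi2_df, 2, 5),
        "RMSEA": grade(rmsea, 0.06, 0.08),
        "CFI": grade(cfi, 0.95, 0.90, higher_is_better=True),
        "TLI": grade(tli, 0.95, 0.90, higher_is_better=True),
    }

# Illustrative values, not results from this study
result = classify_fit(1.8, 0.05, 0.96, 0.94)
print(result)
```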
In addition, we calculated Composite Reliability (CR) and Average Variance Extracted (AVE) values for the factors to assess their convergent and discriminant power [42]. Convergent validity tests whether items that are expected to be related are, in fact, related; it can be confirmed by the AVE and CR values. AVE is the mean variance extracted from the item loadings of a factor and is calculated with the formula below.
AVE = (∑_{i=1}^{n} L_i²) / n
In the formula, i refers to the number of items ranging from 1 to n. n stands for the total number of items and Li represents the standardized factor loading of item number i.
CR refers to the composite reliability and can be calculated according to the following formula.
CR = (∑_{i=1}^{n} L_i)² / [(∑_{i=1}^{n} L_i)² + ∑_{i=1}^{n} e_i]
In this formula, i refers to the number of items ranging from 1 to n. n represents the total number of items and Li represents the standardized factor loading of the item number i. ei is the unexplained variance of item number i by the factor.
Discriminant validity tests whether factors that should be unrelated are, in fact, unrelated. Discriminant validity is verified if the square root of the AVE of a factor is much larger than that factor’s correlation coefficients with all other factors.
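The two formulas above can be computed directly from standardized loadings. The following is a minimal sketch assuming a standardized congeneric model, in which each item's unexplained variance is e_i = 1 − L_i²; the loadings are invented for illustration.

```python
def ave(loadings):
    """Average Variance Extracted: mean of squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum L)^2 / [(sum L)^2 + sum e], with e_i = 1 - L_i^2
    under a standardized congeneric measurement model."""
    s = sum(loadings)
    e = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + e)

loads = [0.70, 0.75, 0.80, 0.65]  # illustrative standardized loadings
print(round(ave(loads), 3), round(composite_reliability(loads), 3))
```

For the discriminant-validity check described above, one would compare `ave(loads) ** 0.5` for each factor against that factor's inter-factor correlations.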

5. Results

5.1. Results of EFA

EFA was conducted with the first sample of 222 undergraduate students to investigate the factors underlying the items of the second version of the DigSki-CUS (36 items). A principal axis analysis with oblimin rotation was run using SPSS 26.0. For cases with missing items, the item mean was substituted. The means and standard deviations for each item are available from the corresponding author by email.
To determine whether factor analysis was adequate for the sample, the Kaiser–Meyer–Olkin measure of sampling adequacy (KMO) and Bartlett’s test of sphericity were examined. The KMO score was 0.926 (>0.6) and the p-value of Bartlett’s test was smaller than 0.001, indicating that the sample was acceptable for EFA.
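Bartlett's test of sphericity can be reproduced from a data matrix with a few lines of NumPy/SciPy using the standard chi-square approximation; the sketch below uses simulated data for illustration (the KMO computation is omitted, and the function name is our own).

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test of sphericity: H0 = the correlation matrix
    is an identity matrix. data has shape (n_cases, n_items)."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi2, df)
    return chi2, p_value

# Illustrative: 6 items sharing a common factor, 222 simulated cases
rng = np.random.default_rng(1)
base = rng.normal(size=(222, 1))
items = base + 0.5 * rng.normal(size=(222, 6))
chi2, p = bartlett_sphericity(items)
print(p < 0.001)
```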
An EFA with principal axis analysis and oblimin rotation was conducted in SPSS 26.0 to examine the latent factor structure of the DigSki-CUS. The number of factors was determined by the number of factors with an eigenvalue higher than 1.0 and by the percentage of variance explained, of which at least 50% should be explained [43]. The items belonging to a factor were decided by factor loadings; loadings over 0.3 are considered significant for including an item in a factor [44,45]. The following criteria determined which items would be excluded: items with factor loadings below 0.3 were deleted, as were items with substantial loadings on more than one factor.
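The item-retention rules above can be sketched as a small filter over a loading matrix. Note that the text gives no numeric rule for what counts as a "substantial" cross-loading, so the gap cutoff below is an assumption, and the loading matrix is invented for illustration.

```python
import numpy as np

def prune_items(loadings: np.ndarray, min_load: float = 0.3,
                cross_gap: float = 0.2) -> list:
    """Return indices of items to drop: items whose largest loading is
    below min_load, or whose two largest loadings are both >= min_load
    and within cross_gap of each other (cross-loading; the cross_gap
    cutoff is illustrative, not a rule from the text)."""
    drop = []
    for i, row in enumerate(np.abs(loadings)):
        top2 = np.sort(row)[-2:]          # two largest absolute loadings
        if top2[-1] < min_load:
            drop.append(i)                # low loading on every factor
        elif top2[0] >= min_load and top2[-1] - top2[0] < cross_gap:
            drop.append(i)                # substantial loading on two factors
    return drop

L = np.array([[0.70, 0.10],   # keep: one clear loading
              [0.25, 0.20],   # drop: all loadings below 0.3
              [0.55, 0.50]])  # drop: cross-loads on both factors
print(prune_items(L))  # [1, 2]
```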
The first EFA resulted in a solution of 6 factors with eigenvalues over 1.0, explaining 55.30% of the variance. However, in this solution some items needed to be deleted: items 36 and 10 because of their low loadings, and item 18 because of its substantial loadings on two factors. This yielded the third version of the DigSki-CUS (33 items).
The second EFA was conducted after the deletion of items 36, 10, and 18. This analysis resulted in a solution of 6 factors with eigenvalues over 1.0, accounting for 55.845% (>50%) of the variance. The scree plot in Figure 3 confirmed the 6-factor solution. In a scree plot, it is desirable to find a sharp reduction in the size of the eigenvalues, with the remaining smaller eigenvalues constituting rubble; once the eigenvalues drop dramatically in size, an additional factor adds relatively little to the information already extracted.
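The eigenvalue-greater-than-1.0 rule and the explained-variance check can be illustrated on a toy correlation matrix; the 4-item matrix below is invented for illustration and has nothing to do with the study's data.

```python
import numpy as np

def kaiser_count(corr: np.ndarray):
    """Number of factors with eigenvalue > 1.0 (Kaiser criterion) and
    the share of total variance those eigenvalues explain."""
    eig = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending eigenvalues
    k = int(np.sum(eig > 1.0))
    return k, eig[:k].sum() / eig.sum()

# Toy 4-item correlation matrix with two correlated item pairs
R = np.array([[1.0, 0.6, 0.1, 0.1],
              [0.6, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.6],
              [0.1, 0.1, 0.6, 1.0]])
k, share = kaiser_count(R)
print(k)  # 2
```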
Factor loadings of all items in the second EFA were above 0.3, and there were no items with substantial cross-loadings. To identify which items should be included in each of the 6 factors, oblimin rotation was conducted. Table 2 shows the 6 factors and the items in each factor with their factor loadings.
Factor 1 included item 29 (I am able to put myself in other people’s shoes in digital environments), item 30 (I am willing to help other people in digital environments), item 31 (I am able to use digital technologies to exercise my citizenship), item 32 (I respect other people in digital environments), item 33 (I take into account the opinion of others in digital environments), item 34 (I get informed before commenting on a topic), and item 35 (I am able to restrict my behavior based on the qualities of netizens). Factor 1 was identified as “digital empathy”, measuring a person’s cognitive and emotional ability to be reflective and socially empathic, while strategically using digital content [12]. This factor reflected our previous theoretical conceptualization of one dimension of digital skills: digital empathy skills.
Factor 2 encompassed item 1 (I have apps that keep me up to date with news), item 2 (I am able to search for and access information in digital environments), item 3 (I can use different tools to store and manage information), item 4 (I am able to search for information that I need on the Internet), and item 5 (I can understand the information that I get from the Internet). Factor 2 was named “access to and management of digital content”, assessing the abilities to locate, access, use, and manage digital content correctly.
Factor 3 contained item 7 (I skillfully use digital software to complete learning tasks), item 8 (I can complete digital content that meets the minimum requirements of learning tasks), item 9 (I can create and edit digital content with higher standards according to the requirements of work or study), item 11 (I am able to use digital means to cooperate with others to complete tasks), item 12 (I am able to use digital means to solve problems encountered in my study), and item 13 (I am able to use digital means to detect plagiarism of content that I created). Factor 3 was named “use of digital means”, measuring abilities to use different digital means to achieve certain aims.
We found that the items in our previous dimension of information skills were grouped into two factors: Factor 2 (access to and management of digital content) and Factor 3 (use of digital means). Factor 2 focuses on the information aspect, while Factor 3 emphasizes the technology aspect. Thus, it is reasonable to treat these two factors as separate.
Factor 4 included item 22 (I am careful and try to ensure that my messages do not irritate others), item 23 (I am careful with my personal information), item 25 (I avoid having arguments with others in digital environments), item 26 (I am able to identify harmful behaviors that can affect me on social networks), item 27 (I avoid behaviors that are harmful on social networks), item 28 (Before doing a digital activity (e.g., upload a photo, comment…), I think about the possible consequences), item 37 (When sharing digital information, I am able to protect my privacy and security), and item 38 (I reflect on whether a digital environment is safe). Factor 4 was identified as “digital safety”, assessing abilities to protect personal privacy and digital content in digital environments.
Factor 5 encompassed item 14 (I can communicate with others in digital environments), item 15 (I know how to communicate with others through different digital means), item 16 (I know how to communicate with others in different ways (e.g., images, texts, videos…)), and item 17 (I communicate my ideas to people whom I know in digital environments). Factor 5 was named “communication of digital content”, measuring the needs and abilities to communicate digital content with others.
Factor 6 contained item 19 (I know different ways to create and edit digital content (e.g., videos, photographs, texts, animations…)), item 20 (I am able to accurately present what I want to deliver in digital environments), and item 21 (I can transform information and organize it in different formats). Factor 6 was named “creation of digital content”, assessing abilities to create and edit new digital content.
Factor 4 (digital safety skills), Factor 5 (communication of digital content), and Factor 6 (creation of digital content) verified our previous conceptual framework of dimensions of digital safety skills, communication skills, and creation skills.
Through the procedure described above, we finally established a theoretical and empirical framework of 6 digital skills: digital empathy, access to and management of digital content, use of digital means, digital safety, communication of digital content, and creation of digital content. A total of 33 items out of the original 38 were retained.
The factor structure obtained from the third version of the DigSki-CUS is briefly demonstrated in Figure 4, in which the middle and outer layers show the name and a brief explanation of each factor, respectively.
After obtaining the factor structure of the questionnaire, a reliability analysis was conducted within each of the six factors to verify the internal consistency of the questionnaire. Cronbach’s α reliability coefficients were calculated through SPSS 26.0 to determine the reliability of the internal structure of each factor. Cronbach’s α reliability coefficients ranged from α = 0.784 to α = 0.888 (all above 0.7) (Table 3), indicating that each factor had acceptable internal consistency and that the internal structure of each dimension was reliable.
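Cronbach’s α can be computed directly from the item-score matrix. The following minimal sketch (Python; the data are artificial and not the study’s responses) applies the standard formula:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Artificial 5-item factor: a shared trait plus item-level noise.
rng = np.random.default_rng(1)
trait = rng.normal(size=(300, 1))
scores = trait + 0.7 * rng.normal(size=(300, 5))
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Because all five simulated items share one trait, α lands near 0.9, comfortably above the 0.7 threshold used in the text.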
Based on the results of EFA, CFA was conducted to check the validity of the third version of the DigSki-CUS (33 items) in the following section.

5.2. Results of CFA

CFA was performed to assess how well the data of the second sample of 231 Chinese undergraduate students fitted the factor structure of the questionnaire derived from EFA. The analysis was run with the software package Mplus 8.3. To determine model-data fit, multiple goodness-of-fit indices were examined: (1) χ², df, and χ²/df, (2) RMSEA, (3) CFI, and (4) TLI. Indices such as AVE and CR were utilized to check the convergent validity and the discriminant validity of the third version of the DigSki-CUS (33 items).

5.2.1. Model Goodness-of-Fit

We conducted CFA with Mplus 8.3 on the basis of the factor structure previously obtained from EFA. Model fit indices for each factor of the third version of the DigSki-CUS (33 items) were calculated.
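The cutoff logic applied repeatedly in this section can be sketched as a small check (illustrative Python; the thresholds are the ones stated in the text, and the values below are those reported for the “use of digital means” factor):

```python
def evaluate_fit(chi2: float, df: int, rmsea: float, cfi: float, tli: float) -> dict:
    """Check each fit statistic against the cutoffs used in the text:
    chi2/df < 3, RMSEA < 0.08, CFI > 0.90, TLI > 0.90."""
    return {
        "chi2/df < 3": chi2 / df < 3,
        "RMSEA < 0.08": rmsea < 0.08,
        "CFI > 0.90": cfi > 0.90,
        "TLI > 0.90": tli > 0.90,
    }

# Initial vs. revised "use of digital means" factor, values from the text.
initial = evaluate_fit(chi2=46.636, df=9, rmsea=0.153, cfi=0.930, tli=0.883)
revised = evaluate_fit(chi2=11.142, df=5, rmsea=0.073, cfi=0.981, tli=0.962)
print(initial)
print(revised)
```

The initial model fails three of the four cutoffs, while the revised five-item model passes all of them, matching the conclusion drawn below.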
The CFA results revealed model fit indices of the factor “use of digital means” (with items 7, 8, 9, 11, 12, 13) as follows: χ² = 46.636, df = 9, χ²/df = 5.18, p < 0.001, RMSEA = 0.153, CFI = 0.930, TLI = 0.883. Although CFI = 0.930 (>0.90) suggested a good model-data fit, χ²/df = 5.18 (>3), RMSEA = 0.153 (>0.08), and TLI = 0.883 (<0.90) showed that the model and the data did not fit well. According to the modification indices provided by Mplus 8.3, item 11 had high residual correlations with both item 8 (M.I. = 16.280) and item 12 (M.I. = 18.064), so it was deleted from the analysis, leaving a total of 5 items for the factor “use of digital means”. Subsequent CFA of the revised factor (with items 7, 8, 9, 12, 13) confirmed a good model-data fit (χ² = 11.142, df = 5, χ²/df = 2.228 < 3, p < 0.001, RMSEA = 0.073 < 0.08, CFI = 0.981 > 0.95, TLI = 0.962 > 0.95) (Table 4).
The CFA recorded model goodness-of-fit indices of the factor “access to and management of digital content” (with items 1, 2, 3, 4, 5) as follows: χ² = 10.320, df = 5, χ²/df = 2.064 < 3, p < 0.001, RMSEA = 0.068 < 0.08, CFI = 0.989 > 0.95, TLI = 0.978 > 0.95. All the indices of this factor confirmed an acceptable model-data fit (Table 4).
The CFA results revealed model goodness-of-fit indices of the factor “communication of digital content” (with items 14, 15, 16, 17) as follows: χ² = 7.163, df = 2, χ²/df = 3.581, p < 0.001, RMSEA = 0.106, CFI = 0.976, TLI = 0.928. Although CFI = 0.976 (>0.95) and TLI = 0.928 (>0.90) showed that the model and the data fit well, χ²/df = 3.581 (>3) and RMSEA = 0.106 (>0.08) showed that this factor did not fit the data well. Evaluating the factor loadings revealed that item 17 had a low loading of 0.414 (<0.5), indicating that it captured relatively little of what was shared among the items. Consequently, item 17 was deleted. The revised factor (with items 14, 15, 16) became a just-identified model (Table 4) and fit the data. The factor “creation of digital content” (with items 19, 20, 21) was also a just-identified model (Table 4) and showed a good fit to the data.
The CFA results showed model fit indices of the factor “digital empathy” (with items 29, 30, 31, 32, 33, 34, 35) as follows: χ² = 49.836, df = 14, χ²/df = 3.560, p < 0.001, RMSEA = 0.105, CFI = 0.953, TLI = 0.929. Although CFI = 0.953 (>0.95) and TLI = 0.929 (>0.90) indicated a good fit between the model and the data, χ²/df = 3.560 (>3) and RMSEA = 0.105 (>0.08) showed that this factor did not fit the data well. According to the modification indices provided by Mplus 8.3, item 35 had a high residual correlation with item 34 (M.I. = 15.151) and a lower factor loading than item 34, and item 31 had a high residual correlation with item 33 (M.I. = 10.107) and a lower factor loading than item 33. Furthermore, careful examination of item 35 (I am able to restrict my behavior based on the qualities of netizens) and item 31 (I am able to use digital technologies to exercise my citizenship) showed that these two items were concerned much more with digital responsibility than with digital empathy. Thus, item 35 and item 31 were removed from this factor. Subsequent CFA of the revised factor (with items 29, 30, 32, 33, 34) exhibited an acceptable fit to the data (χ² = 10.914, df = 5, χ²/df = 2.182 < 3, p < 0.001, RMSEA = 0.072 < 0.08, CFI = 0.989 > 0.95, TLI = 0.978 > 0.95) (Table 4).
Model goodness-of-fit indices of the factor “digital safety” (with items 22, 23, 25, 26, 27, 28, 37, 38) from CFA were as follows: χ² = 70.720, df = 20, χ²/df = 3.536, p < 0.001, RMSEA = 0.105, CFI = 0.921, TLI = 0.889. Although CFI = 0.921 (>0.90) suggested an acceptable fit, χ²/df = 3.536 (>3), RMSEA = 0.105 (>0.08), and TLI = 0.889 (<0.90) showed that the factor did not fit the data well. According to the standardized factor loadings, item 22 had a low factor loading of 0.499, which is below 0.5, so it was deleted. Furthermore, item 38 had high residual correlations with both item 37 (M.I. = 34.909) and item 28 (M.I. = 10.899), so item 38 was also eliminated. Subsequent CFA of the revised factor (with items 23, 25, 26, 27, 28, 37) revealed excellent model fit indices (χ² = 13.594, df = 9, χ²/df = 1.510 < 3, p < 0.001, RMSEA = 0.047 < 0.05, CFI = 0.989 > 0.95, TLI = 0.982 > 0.95) (Table 4).
In general, several items were deleted from the third version of the DigSki-CUS (33 items) according to the model goodness-of-fit indices obtained from CFA. Due to low standardized factor loadings or high residual correlations, items 11, 17, 22, 31, 35, and 38 were removed. Thus, the fourth version of the DigSki-CUS, with 6 factors and 27 items, was determined. The model fit indices of this version were within acceptable levels (Table 4).
To confirm the validity of the fourth version of the DigSki-CUS (27 items), the convergent validity and the discriminant validity will be checked in the following sections.

5.2.2. Convergent Validity

Convergent validity tests whether items that are expected to be related are, in fact, related. It can be examined through the factor loadings of the items and the values of AVE (average variance extracted) and CR (composite reliability).
Firstly, the standardized factor loadings of the items were examined. Based on CFA, standardized factor loadings were determined for all 27 items of the fourth version of the DigSki-CUS to decide whether any items should be removed from the questionnaire. All standardized factor loadings exceeded 0.5, which indicates high convergent validity of this version of the DigSki-CUS (27 items).
Secondly, AVE values of the fourth version of the DigSki-CUS (27 items) were calculated following the formula in Section 4.7. AVE is the mean variance extracted by the item loadings of a factor. Most AVE values were above 0.5 (Table 5). Only two AVE values, 0.455 and 0.458, fell below 0.5, but both were above 0.36, the lower bound of the acceptable range (an AVE between 0.36 and 0.5 is considered acceptable). The AVE values showed that the items in each factor of the fourth version of the DigSki-CUS (27 items) belonged to their respective factors and indicated acceptable convergent validity of the factors.
Thirdly, the CR statistics of the factors, calculated according to the formula in Section 4.7, ranged from 0.770 to 0.870. As they were all above the threshold of 0.7, they indicate acceptable construct validity of the fourth version of the DigSki-CUS (27 items) (Table 5).
Both AVE values and CR statistics provide evidence of the convergent validity of the fourth version of the DigSki-CUS (27 items).
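The conventional AVE and CR formulas, which Section 4.7 presumably follows, can be sketched as follows (Python; the loadings below are hypothetical illustration values, not the study’s estimates):

```python
def ave(loadings):
    """Average variance extracted: mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each standardized loading l contributes an error variance of 1 - l^2."""
    s = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return s * s / (s * s + error)

# Hypothetical standardized loadings for a 5-item factor (not the study's values).
lam = [0.72, 0.75, 0.68, 0.80, 0.77]
print(round(ave(lam), 3), round(composite_reliability(lam), 3))
```

For these loadings, AVE ≈ 0.555 and CR ≈ 0.862, both above the respective 0.5 and 0.7 cutoffs discussed above.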

5.2.3. Discriminant Validity

Discriminant validity tests whether items of factors that should be unrelated are, in fact, unrelated [46]. It assumes that items of a factor correlate more strongly among themselves than with items from other factors that are theoretically not supposed to correlate. In the present study, we adopted the Fornell and Larcker criterion [47] to verify the discriminant validity of the fourth version of the DigSki-CUS (27 items): the square root of the AVE value of a factor should be larger than the correlation coefficients of that factor with any other factor. The AVE value measures the explained variance of a factor; comparing the square root of the AVE of a factor with the correlation coefficients of any pair of factors thus checks whether the items of the factor explain more variance than they share with the items of the other factors. In this way, we can determine whether the items of one factor are distinguishable from the items of other factors. Thus, correlations between the 6 factors were calculated through Mplus 8.3. Afterward, we inserted the square root of each AVE value on the diagonal in order to compare it with the other correlation coefficients (Table 6).
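The Fornell-Larcker comparison can be expressed as a simple check (illustrative Python; the two factors and values below are taken from the results reported in Table 6, reduced to a 2-factor toy case):

```python
import numpy as np

def fornell_larcker_ok(ave_values, corr):
    """Fornell-Larcker criterion: for every factor i, sqrt(AVE_i) must exceed
    the absolute correlation of factor i with every other factor."""
    sqrt_ave = np.sqrt(np.asarray(ave_values))
    k = len(sqrt_ave)
    return all(sqrt_ave[i] > abs(corr[i][j])
               for i in range(k) for j in range(k) if i != j)

# Two of the reported quantities: sqrt(AVE) = 0.677 for "use of digital means"
# and 0.758 for "digital empathy", with their correlation 0.640 (Table 6).
ave_vals = [0.677 ** 2, 0.758 ** 2]
corr = [[1.0, 0.640], [0.640, 1.0]]
print(fornell_larcker_ok(ave_vals, corr))
```

Since 0.677 and 0.758 both exceed the inter-factor correlation of 0.640, the criterion is satisfied for this pair, as the paragraphs below report for the full correlation matrix.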
In Table 6, we found that the square root of the AVE value of the factor “use of digital means” is 0.677, which is higher than the correlation coefficients of this factor with other factors: 0.452, 0.503, 0.618, 0.640, and 0.554. This shows that the factor “use of digital means” has sufficient discriminant validity.
The square root of the AVE value of the factor “access to and management of digital content” is 0.744, which is higher than the correlation coefficients of this factor with other factors: 0.452, 0.721, 0.542, 0.466, and 0.560. This shows that the factor “access to and management of digital content” has acceptable discriminant validity.
Similarly, the square root of the AVE value of the factor “communication of digital content” is 0.731, which is higher than the correlation coefficients of this factor with other factors: 0.503, 0.721, 0.630, 0.528, and 0.648. This reveals that the factor “communication of digital content” has acceptable discriminant power.
The square root of the AVE value of the factor “creation of digital content” is 0.792, which is higher than the correlation coefficients of this factor with other factors: 0.618, 0.542, 0.630, 0.463, and 0.627. The square root of the AVE value of the factor “digital empathy” is 0.758, which is higher than the correlation coefficients of this factor with other factors: 0.640, 0.466, 0.528, 0.463, and 0.588. The square root of the AVE value of the factor “digital safety” is 0.675, which is higher than the correlation coefficients of this factor with other factors: 0.554, 0.560, 0.648, 0.627, and 0.588. Therefore, the factors “creation of digital content”, “digital empathy”, and “digital safety” have sufficient discriminant validity. In sum, all six factors of the fourth version of the DigSki-CUS have sufficient discriminant power.
Convergent validity and discriminant validity supported that the fourth version of the DigSki-CUS (27 items) is an effective questionnaire to assess the digital skills of Chinese undergraduate students. This version is also the final version of the questionnaire DigSki-CUS that we developed and validated in this study. Based on this final version of the DigSki-CUS, further studies on the digital skills of Chinese undergraduates could be conducted.

6. Discussion

The increasing interest in digital skills has fueled efforts to assess the levels of digital skills in different populations. The success of tools that measure digital skills largely depends on both the reliability and the validity of the measurement. The present study describes the properties of a questionnaire estimating digital skills in a Chinese population.
In this study, we proposed and consolidated a reliable and valid questionnaire, the DigSki-CUS, using a mixed-methods approach. Qualitative methods in the first phase included a literature review to develop the conceptual framework and items of the questionnaire based on Li & Hu [9] and Peart et al. [12], and a focus-group interview to obtain feedback from participants who completed the questionnaire in a pre-test. We ensured that all proposed items reflected levels of digital skills of Chinese undergraduates. After a pre-test among 20 undergraduates, a focus-group interview was held to discuss the acceptance of the questionnaire and to refine the corresponding items. Quantitative methods in the second phase comprised EFA and CFA. EFA was executed to define the latent factors and the related items in each factor. CFA was conducted to examine the consistency of the factor structure and to refine the items to improve the validity and model fit. Convergent validity and discriminant validity were also verified.
To some extent, the 6 factors reflected our previous theoretical conceptualization of digital skills with 5 dimensions. The dimension of information skills includes the factor “access to and management of digital content” (with items 1, 2, 3, 4, 5), which measures skills to search for and manage digital content correctly, and the factor “use of digital means” (including items 7, 8, 9, 12, 13), which assesses abilities to appropriately use different types of digital means, tools, and websites. The dimension of communication skills is reflected by the factor “communication of digital content” (encompassing items 14, 15, 16), which estimates competence to communicate digital information with others. The dimension of creation skills contains the factor “creation of digital content” (with items 19, 20, 21), which assesses skills to create or edit new digital content. The dimensions of digital safety skills and digital empathy skills are reflected by the factor “digital safety” (containing items 23, 25, 26, 27, 28, 37), which estimates awareness and abilities to protect personal digital information, and the factor “digital empathy” (including items 29, 30, 32, 33, 34), which measures cognitive and emotional ability to strategically use digital media reflectively and responsibly. The result was a valid and reliable DigSki-CUS, consisting of 27 items grouped into 6 factors, reflecting 5 dimensions of digital skills, as shown in Table 7.
EFA and CFA were conducted on two different samples of Chinese undergraduate students to determine the factor structure and validity of the questionnaire. The analyses yielded 6 factors comprising 27 items: “access to and management of digital content”, “use of digital means”, “communication of digital content”, “creation of digital content”, “digital empathy”, and “digital safety”. This is similar to Peart et al.’s [12] findings, although Peart et al. [12] grouped digital empathy into the dimension of socio-civic skills, together with social and digital engagement and democratic attitudes. This study, in contrast, argues that digital empathy is a basic part of digital skills because it has moderate to strong correlations with the other factors of digital skills, ranging from r = 0.463 to r = 0.640.
In our framework, digital safety skills and digital empathy skills were proposed as two independent dimensions. As digital environments become increasingly complex and more people misuse or abuse information, digital safety skills and digital empathy skills are important for undergraduates, helping them realize online risks, avoid being harmed, and attain positive outcomes of digital uses.
To confirm the reliability of the final version of DigSki-CUS (27 items), we examined the reliability coefficients of Cronbach’s α of the whole questionnaire and each factor again. Cronbach’s α of the whole questionnaire is 0.942, indicating the reliability of the whole questionnaire. The reliability coefficients of Cronbach’s α of factors “access to and management of digital content”, “use of digital means”, “communication of digital content”, “creation of digital content”, “digital empathy”, “digital safety” are 0.874, 0.817, 0.778, 0.844, 0.861, and 0.852, showing the reliability of each factor (Table 8).
CFA revealed that several modifications were required to achieve a good fit between the model of the questionnaire obtained from EFA and the collected data. These modifications included the removal of several items due to their low factor loadings or high residual correlations. Direct deletion of these items during CFA might pose a threat to the validity of the questionnaire. However, further analysis of the convergent validity and the discriminant validity suggested that the construct validity of the questionnaire was maintained even though these items were removed. Moreover, the removal of these items satisfied the economy criterion of a questionnaire.
As a consequence, the two research questions in Section 3 were answered. The internal factor structure of the questionnaire is as follows. The factor “access to and management of digital content” contains items 1, 2, 3, 4, and 5, and measures skills to search for and manage digital content correctly. The factor “use of digital means” includes items 7, 8, 9, 12, and 13, and assesses abilities to appropriately use different types of digital means, tools, and websites. The factor “communication of digital content” encompasses items 14, 15, and 16, and estimates competence to communicate digital information with others. The factor “creation of digital content” contains items 19, 20, and 21, and assesses skills to create or edit new digital content. The factor “digital safety” contains items 23, 25, 26, 27, 28, and 37, and estimates awareness and abilities to protect personal digital information. The factor “digital empathy” includes items 29, 30, 32, 33, and 34, and measures cognitive and emotional ability to strategically use digital media reflectively and responsibly.
The reliability and validity of the questionnaire were verified from different perspectives. The reliability of the whole questionnaire and of each factor was confirmed (α = 0.942 for the whole questionnaire and α = 0.874, 0.817, 0.778, 0.844, 0.861, and 0.852 for the six factors) (Table 8). The construct validity was verified from the aspects of model-data fit, convergent validity, and discriminant validity. Indices demonstrated an acceptable model-data fit (Table 4), convergent validity (Table 5), and discriminant validity (Table 6), confirming the internal factor structure of the questionnaire. The final version of the questionnaire DigSki-CUS (27 items) proved to be a reliable and valid tool to assess the digital skills of Chinese undergraduate students.
A point worth mentioning is that the items measuring skills in the use of digital means, adapted from the Code for the Construction of Digital Campuses in Colleges and Universities (for Trial Implementation) (2021), showed good reliability and validity. This suggests that questionnaires should be developed and evaluated in the context of their application, audience, and intended use.

7. Limitations and Future Work

The present study developed and validated a questionnaire to assess the digital skills of Chinese undergraduate students. However, it is not without limitations. For instance, this study selected undergraduates from only one university; a broader range of universities may be considered in the future to confirm the reliability and validity of the questionnaire.
A quantitative approach was adopted in this study to validate the questionnaire, which is well suited to verifying a questionnaire. However, a qualitative or mixed-methods approach may be needed to gain a deeper understanding of how digital skills are developed among Chinese undergraduate students, which may be a direction for future research.
Future research may also explore the associations between digital skills and other elements of education, such as the relationship between digital skills and adoption of blended teaching models, digital skills and the use of digital tools for learning, digital skills and academic performance, and so on.

8. Conclusions

This study adopted a mixed-methods approach with two phases to develop and validate a questionnaire to measure the digital skills of Chinese undergraduates. Combining qualitative methods such as a focus-group interview with quantitative methods such as EFA and CFA, this study constructed a reliable and valid questionnaire to measure the digital skills of Chinese undergraduates theoretically and empirically.
A questionnaire, the DigSki-CUS, based on a conceptual framework with 5 dimensions (information skills, communication skills, creation skills, digital safety skills, and digital empathy skills) and 38 items, was initially developed. The research questions about the internal factor structure, the reliability, and the validity of the questionnaire were answered by analyzing data from Chinese undergraduates.
To analyze the internal factor structure, exploratory factor analysis (EFA) was adopted. The final 27 items were grouped into 6 factors: access to and management of digital content, use of digital means, communication of digital content, creation of digital content, digital safety, and digital empathy. The factor “access to and management of digital content” (5 items) measures skills to search for and manage digital content correctly. The factor “use of digital means” (5 items) estimates abilities to appropriately use different types of digital means. These 2 factors reflect the dimension of information skills. The factor “communication of digital content” (3 items) assesses abilities to communicate digital content with others. The factor “creation of digital content” (3 items) estimates abilities to create and edit new digital content. The factor “digital safety” (6 items) measures abilities to protect personal privacy and digital content in a digital world. The factor “digital empathy” (5 items) measures a person’s cognitive and emotional ability to be reflective and socially empathic in digital environments. These 4 factors reflect the dimensions of communication skills, creation skills, digital safety skills, and digital empathy skills, respectively.
To verify the reliability and validity of the questionnaire, confirmatory factor analysis (CFA) was used. Indices of reliability, model-data fit, convergent validity, and discriminant validity of each factor were examined.
Cronbach’s α of the factor “access to and management of digital content” is 0.874, indicating the reliability of this factor. Indices of model-data fit (χ²/df = 2.064, RMSEA = 0.068, CFI = 0.989, TLI = 0.978) are within acceptable levels, showing a good fit between the model and the data. AVE = 0.553 and CR = 0.859 are above the cutoffs, verifying the convergent validity of the factor. The square root of the AVE value is 0.744, higher than all the other correlation coefficients, confirming the discriminant validity of the factor.
The factor “use of digital means” has a Cronbach’s α of 0.817, showing that this factor is reliable. The acceptable indices χ²/df = 2.228, RMSEA = 0.073, CFI = 0.981, and TLI = 0.962 indicate a good model-data fit. AVE = 0.458 and CR = 0.808 meet the convergent validity thresholds. The square root of the AVE value is 0.677, higher than all the other correlation coefficients, verifying the discriminant validity of this factor.
Cronbach’s α of the factor “communication of digital content” is 0.778, indicating the reliability of this factor. It is a just-identified model. AVE = 0.534 and CR = 0.770 are above the cutoffs, verifying the convergent validity. The square root of the AVE value is 0.731, higher than all the other correlation coefficients, confirming the discriminant validity of this factor.
For the factor “creation of digital content”, Cronbach’s α = 0.844 shows the reliability of this factor. It is a just-identified model. The convergent validity was verified by AVE = 0.628 and CR = 0.835. The square root of the AVE value is 0.792, higher than all the other correlation coefficients, verifying the discriminant validity of the factor.
Cronbach’s α of the factor “digital safety” is 0.852, showing the reliability of this factor. Indices χ²/df = 1.510, RMSEA = 0.047, CFI = 0.989, and TLI = 0.982 are acceptable, indicating a good fit between the model and the data. AVE = 0.455 and CR = 0.832 meet the convergent validity thresholds. The square root of the AVE value is 0.675, higher than all the other correlation coefficients, confirming the discriminant validity of the factor.
For the factor “digital empathy”, Cronbach’s α = 0.861 indicates the reliability of this factor. The acceptable indices χ²/df = 2.182, RMSEA = 0.072, CFI = 0.989, and TLI = 0.978 show a good model-data fit. AVE = 0.575 and CR = 0.870 are above the cutoffs, verifying the convergent validity. The square root of the AVE value is 0.758, higher than all the other correlation coefficients, verifying the discriminant validity of this factor.
All indices demonstrate that the DigSki-CUS questionnaire developed in this study is reliable and valid. The DigSki-CUS promises to be an effective tool for assessing the digital skills of undergraduates in the Chinese educational context.
Considering the complexity of the digital world, digital safety and digital empathy were constructed as two independent dimensions in this study. Concerning the Chinese educational context, the items of the dimension of use of digital means were adapted from a Chinese government document and showed good reliability and validity. These results suggest that developing a questionnaire should consider not only the requirements of the times but also the context in which the questionnaire will be applied.

Author Contributions

Conceptualization, C.F. and J.W.; methodology, C.F. and J.W.; software, C.F.; validation, C.F. and J.W.; formal analysis, C.F.; investigation, C.F. and J.W.; writing—original draft preparation, C.F.; writing—review and editing, C.F. and J.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Education and Teaching Research Project of Shandong Province (2020JXY028) and the Teaching Reform Program of Qufu Normal University in 2020.

Institutional Review Board Statement

This study was approved by the Ethics Committee of Qufu Normal University (protocol code 2021-092 and 13 November 2021).

Informed Consent Statement

Informed consent was obtained from all participants.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hargittai, E.; Shaw, A. Digitally savvy citizenship: The role of internet skills and engagement in young adults’ political participation around the 2008 presidential election. J. Broadcasting Electron. Media 2013, 57, 115–134. [Google Scholar] [CrossRef]
  2. Kahne, J.; Bowyer, B. Can media literacy increase digital engagement in politics? Learn. Media Technol. 2019, 44, 211–224. [Google Scholar] [CrossRef]
  3. Janschitz, G.; Penker, M. How digital are ‘digital natives’ actually? Developing an instrument to measure the degree of digitalisation of university students—The DDS-Index. Tools Instrum. 2022, 153, 127–159. [Google Scholar]
  4. Hernández-Martín, A.; Martín-del-Pozo, M.; Iglesias-Rodríguez, A. Pre-adolescents’ digital competences in the area of safety. Does frequency of social media use mean safer and more knowledgeable digital usage? Educ. Inf. Technol. 2021, 26, 1043–1067. [Google Scholar] [CrossRef]
  5. Eshet, Y. Digital literacy: A conceptual framework for survival skills in the digital era. J. Educ. Multimed. Hypermedia 2004, 13, 93–106. [Google Scholar]
  6. Van Deursen, A.J.A.M.; Van Dijk, J.A.G.M. Internet skill levels increase, but gaps widen: A longitudinal cross-sectional analysis (2010–2013) among the Dutch population. Inf. Commun. Soc. 2015, 18, 782–797. [Google Scholar] [CrossRef]
  7. OECD. Students, Computers and Learning: Making the Connection; PISA; OECD Publishing: Paris, France, 2015. Available online: https://doi.org/10.1787/9789274239555-en (accessed on 20 January 2020).
  8. Radovanovic, D.; Hogan, B.; Lalic, D. Overcoming digital divides in higher education: Digital literacy beyond Facebook. New Media Soc. 2015, 17, 1733–1749. [Google Scholar] [CrossRef]
  9. Li, X.J.; Hu, R.J. Developing and validating the digital skills scale for school children (DSS-SC). Inf. Commun. Soc. 2018. [Google Scholar] [CrossRef]
  10. Liu, Q.T.; Wu, L.X.; Zhang, S.; Mao, G. Research of a model for teachers’ digital ability standard. China Educ. Technol. 2015, 340, 14–19. (In Chinese) [Google Scholar]
  11. Peled, Y. Pre-service teachers’ self-perception of digital literacy: The case of Israel. Educ. Inf. Technol. 2021, 26, 2879–2896. [Google Scholar] [CrossRef]
  12. Peart, M.T.; Gutierrez-Esteban, P.; Cubo-Delgado, S. Development of the digital and socio-civic skills (DIGSOC) questionnaire. Educ. Tech. Res. Dev. 2020, 68, 3327–3351. [Google Scholar] [CrossRef]
  13. Correa, T. Digital skills and social media use: How internet skills are related to different types of Facebook use among ‘digital natives’. Inf. Commun. Soc. 2016, 19, 1095–1107. [Google Scholar] [CrossRef]
  14. Handley, F. Developing digital skills and literacies in UK higher education: Recent developments and a case study of the digital literacies framework at the University of Brighton, UK. Publications 2018, 48, 109–126. [Google Scholar] [CrossRef] [Green Version]
  15. Greene, J.A.; Yu, S.B.; Copeland, D.Z. Measuring critical components of digital literacy and their relationships with learning. Comput. Educ. 2014, 76, 55–69. [Google Scholar] [CrossRef]
  16. Mohammadyari, S.; Singh, H. Understanding the effect of e-learning on individual performance: The role of digital literacy. Comput. Educ. 2015, 82, 11–25. [Google Scholar] [CrossRef]
  17. Burgos-Videla, C.G.; Castillo Rojas, W.A.; López Meneses, E.; Martínez, J. Digital Competence Analysis of University Students Using Latent Classes. Educ. Sci. 2021, 11, 385. [Google Scholar] [CrossRef]
  18. Ferrari, A. DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe; Joint Research Centre Reports; Publications Office of the European Union: Luxembourg, 2013. [Google Scholar]
  19. Janssen, J.; Stoyanov, S.; Ferrari, A.; Punie, Y.; Pannekeet, K.; Sloep, P. Experts’ views on digital competence: Commonalities and differences. Comput. Educ. 2013, 68, 473–481. [Google Scholar] [CrossRef]
  20. List, A. Defining digital literacy development: An examination of pre-service teachers’ beliefs. Comput. Educ. 2019, 138, 146–158. [Google Scholar] [CrossRef]
  21. UNESCO. A Policy Review: Building Digital Citizenship in Asia Pacific through Safe, Effective and Responsible Use of ICT; UNESCO Asia and Pacific Regional Bureau for Education: Bangkok, Thailand, 2016. [Google Scholar]
  22. Gorsuch, R.L. Factor Analysis, 2nd ed.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1983. [Google Scholar]
  23. Henson, R.K.; Capraro, R.M.; Capraro, M.M. Reporting practices and use of exploratory factor analyses in educational research journals: Errors and explanation. Res. Sch. 2004, 11, 61–72. [Google Scholar]
  24. Thompson, B.; Daniel, L. Factor analytic evidence for the construct validity of scores: A historical overview and some guidelines. Educ. Psychol. Meas. 1996, 56, 197–208. [Google Scholar] [CrossRef]
  25. van Laar, E.; van Deursen, A.J.A.M.; van Dijk, J.A.G.M.; de Haan, J. The relation between 21st-century skills and digital skills: A systematic literature review. Comput. Hum. Behav. 2017, 72, 577–588. [Google Scholar] [CrossRef]
  26. Gilster, P. Digital Literacy; Wiley: New York, NY, USA, 1997. [Google Scholar]
  27. Ng, W. Can we teach digital natives’ digital literacy? Comput. Educ. 2012, 59, 1065–1078. [Google Scholar] [CrossRef]
  28. American Library Association. Digital Literacy. Libraries and Public Policy: Report of the Office for Information Technology Policy’s Digital Literacy Taskforce. 2013. Available online: https://www.atalm.org/sites/default/files/Digital%20Literacy,%20Libraries,%20and%20Public%20Policy.pdf (accessed on 8 February 2022).
  29. European Commission. Recommendation 2009/625/CE of the Commission, 20 August 2009, on Media Literacy in Digital Environments for Audio-Visual Industries and Entrepreneur and Knowledge Society; European Commission Press: Brussels, Belgium, 2009. [Google Scholar]
  30. Vuorikari, R.; Punie, Y.; Carretero Gomez, S.; Van den Brande, G. DigComp 2.0: The Digital Competence Framework for Citizens. Update Phase 1: The Conceptual Reference Model; Luxembourg Publication Office of the European Union: Luxembourg, 2016. [Google Scholar]
  31. Adorjan, M.; Ricciardelli, R. Student perspectives towards school responses to cyber-risk and safety: The presumption of the prudent digital citizen. Learn. Media Technol. 2019, 44, 430–442. [Google Scholar] [CrossRef]
  32. Johnston, N. The shift towards digital literacy in Australian university libraries: Developing a digital literacy framework. J. Aust. Libr. Inf. Assoc. 2020, 69, 93–101. [Google Scholar] [CrossRef]
  33. Mengual-Andrés, S.; Roig-Vila, R.; Mira, J.B. Delphi study for the design and validation of a questionnaire about digital competences in higher education. Int. J. Educ. Technol. High. Educ. 2016, 13, 12–23. [Google Scholar] [CrossRef] [Green Version]
  34. Rolstad, S.; Adler, J.; Rydén, A. Response burden and questionnaire length: Is shorter better? A review and meta-analysis. Value Health 2011, 14, 1101–1108. [Google Scholar] [CrossRef] [Green Version]
  35. Chen, C.W. Developing EFL students’ digital empathy through video production. System 2018, 77, 50–57. [Google Scholar] [CrossRef]
  36. Petousi, V.; Sifaki, E. Contextualizing harm in the framework of research misconduct. Findings from discourse analysis of scientific publications. Int. J. Sustain. Dev. 2020, 23, 149–174. [Google Scholar] [CrossRef]
  37. Hair, J.F., Jr.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis; CENGAGE: Boston, MA, USA, 2018. [Google Scholar]
  38. MacCallum, R.C.; Roznowski, M.; Mar, C.; Reith, J.V. Alternative strategies for cross-validation of covariance structure models. Multivar. Behav. Res. 1994, 29, 1–32. [Google Scholar] [CrossRef]
  39. Jomeen, J.; Martin, C.R. Confirmation of an occluded anxiety component within the Edinburgh Postnatal Depression Scale (EPDS) during early pregnancy. J. Reprod. Infant. Psychol. 2005, 23, 143–154. [Google Scholar] [CrossRef]
  40. Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Modeling 1999, 6, 1–55. [Google Scholar] [CrossRef]
  41. Schumacker, R.E.; Lomax, R.G. A Beginner’s Guide to Structural Equation Modeling; Lawrence Erlbaum: Mahwah, NJ, USA, 1996. [Google Scholar]
  42. Prudon, P. Confirmatory factor analysis as a tool in research using questionnaires: A critique. Compr. Psychol. 2015, 4, 10. [Google Scholar] [CrossRef] [Green Version]
  43. Streiner, D.L. Figuring out factors: The use and misuse of factor analysis. Can. J. Psychiatry 1994, 39, 135–140. [Google Scholar] [CrossRef] [PubMed]
  44. Howard, M.C. A review of exploratory factor analysis decisions and overview of current practices: What we are doing and how can we improve? Int. J. Hum. Comput. Interact. 2016, 32, 51–62. [Google Scholar] [CrossRef]
  45. Martin, C.R.; Tweed, A.E.; Metcalfe, M.S. A psychometric evaluation of the hospital anxiety and depression scale in patients diagnosed with end-stage renal disease. Br. J. Clin. Psychol. 2004, 43, 51–64. [Google Scholar] [CrossRef]
  46. Carter, S.R. Using confirmatory factor analysis to manage discriminant validity issues in social pharmacy research. Int. J. Clin. Pharm. 2016, 38, 731–737. [Google Scholar] [CrossRef]
  47. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
Figure 1. Phases and methods of development and validation of the DigSki-CUS.
Figure 2. Demographic characteristics of participants for EFA and CFA of the DigSki-CUS.
Figure 3. Scree plot of the third version of the DigSki-CUS (33 items).
Figure 4. Names and explanations of each factor obtained from EFA of the third version of the DigSki-CUS (33 items).
Table 1. Dimensions and measuring items of the initial version of DigSki-CUS (38 items).
Dimensions | Measuring Items z | Literature Sources

Information skills (items 1–6: Li & Hu [9]; Peart et al. [12]; items 7–13: based on the Code for the Construction of Digital Campuses in Colleges and Universities (for Trial Implementation) (2021))
1. I have apps that keep me up to date with news.
2. I am able to search for and access information in digital environments.
3. I can use different tools to store and manage information.
4. I am able to search for information that I need on the Internet.
5. I can understand the information that I get from the Internet.
6. I consider only the most important information on the Internet.
7. I skillfully use digital software to complete learning tasks.
8. I can complete digital content that meets the minimum requirements of learning tasks.
9. I can create and edit digital content with higher standards according to the requirements of work or study.
10. I am able to use digital means to clearly express my views to others.
11. I am able to use digital means to cooperate with others to complete tasks.
12. I am able to use digital means to solve problems encountered in my study.
13. I am able to use digital means to detect plagiarism of content that I created.

Communication skills (Li & Hu [9]; Peart et al. [12])
14. I can communicate with others in digital environments.
15. I know how to communicate with others through different digital means.
16. I know how to communicate with others in different ways (e.g., images, texts, videos ...).
17. I communicate my ideas to people whom I know in digital environments.
18. I share information and content with other people via digital tools or websites.

Creation skills (Li & Hu [9]; Peart et al. [12])
19. I know different ways to create and edit digital content (e.g., videos, photographs, texts, animations ...).
20. I am able to accurately present what I want to deliver in digital environments.
21. I can transform information and organize it in different formats.

Digital safety skills (Li & Hu [9]; Peart et al. [12])
22. I am careful and try to ensure that my messages do not irritate others.
23. I am careful with my personal information.
24. I am careful with the information of other people.
25. I avoid having arguments with others in digital environments.
26. I am able to identify harmful behaviors that can affect me on social networks.
27. I avoid behaviors that are harmful on social networks.
28. Before doing a digital activity (e.g., upload a photo, comment ...), I think about the possible consequences.
37. When sharing digital information, I am able to protect my privacy and security.
38. I reflect on whether a digital environment is safe.

Digital empathy skills (Li & Hu [9]; Peart et al. [12])
29. I am able to put myself in other people’s shoes in digital environments.
30. I am willing to help other people in digital environments.
31. I am able to use digital technologies to exercise my citizenship.
32. I respect other people in digital environments.
33. I take into account the opinion of others in digital environments.
34. I get informed before commenting on a topic.
35. I am able to restrict my behavior based on the qualities of netizens.
36. When forwarding information, I consider whether the information source is reliable.

z Some items were taken from Li & Hu [9] and Peart et al. [12] without change; some were revised to make them easier for Chinese undergraduate students to understand; others were added to reflect the educational situation in China.
Table 2. EFA-based factor loadings with oblimin rotation in the third version of the DigSki-CUS (33 items).
Item | Factor | Factor Loading z
32 | Factor 1 | 0.833 ***
33 | Factor 1 | 0.713 ***
29 | Factor 1 | 0.593 ***
35 | Factor 1 | 0.466 ***
31 | Factor 1 | 0.397 ***
34 | Factor 1 | 0.383 ***
30 | Factor 1 | 0.328 ***
4 | Factor 2 | 0.883 ***
1 | Factor 2 | 0.823 ***
2 | Factor 2 | 0.724 ***
3 | Factor 2 | 0.621 ***
5 | Factor 2 | 0.592 ***
9 | Factor 3 | 0.658 ***
7 | Factor 3 | 0.643 ***
13 | Factor 3 | 0.514 ***
11 | Factor 3 | 0.511 ***
8 | Factor 3 | 0.442 ***
12 | Factor 3 | 0.441 ***
26 | Factor 4 | 0.774 ***
28 | Factor 4 | 0.697 ***
25 | Factor 4 | 0.587 ***
27 | Factor 4 | 0.587 ***
23 | Factor 4 | 0.434 ***
37 | Factor 4 | 0.397 ***
38 | Factor 4 | 0.388 ***
22 | Factor 4 | 0.313 ***
16 | Factor 5 | 0.716 ***
15 | Factor 5 | 0.569 ***
14 | Factor 5 | 0.510 ***
17 | Factor 5 | 0.502 ***
21 | Factor 6 | 0.615 ***
20 | Factor 6 | 0.562 ***
19 | Factor 6 | 0.350 ***
z Items with factor loadings <0.3 are omitted. *** p < 0.001.
Table 3. Reliability scores of each factor in the third version of the DigSki-CUS (33 items).
Factors | Items z | Cronbach’s α
Use of digital means | 9, 7, 13, 11, 8, 12 | 0.854
Access to and management of digital content | 4, 1, 2, 3, 5 | 0.874
Communication of digital content | 15, 14, 16, 17 | 0.784
Creation of digital content | 20, 19 | 0.844
Digital empathy | 32, 33, 29, 35, 31, 34, 30 | 0.888
Digital safety | 26, 28, 25, 27, 23, 37, 38, 22 | 0.866
z Items in each factor are ordered by factor loading, highest first.
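For readers who wish to reproduce reliability estimates of this kind, Cronbach’s α can be computed directly from raw item responses. The sketch below uses only the Python standard library and illustrative data (not the study’s actual responses); the function name `cronbach_alpha` is our own.

```python
from statistics import pvariance


def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    items: list of item-score lists, one inner list per item,
    each of length n_respondents (population variances are used).
    """
    k = len(items)
    item_var_sum = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent total score
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))


# Illustrative 5-point Likert responses (3 items x 4 respondents)
items = [[1, 2, 3, 4], [2, 2, 3, 5], [1, 3, 3, 4]]
print(round(cronbach_alpha(items), 3))  # prints 0.947
```

Two perfectly correlated items yield α = 1, while uncorrelated items drive α toward 0, which is why values above 0.7 (as in the table) are conventionally read as acceptable internal consistency.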
Table 4. Model fit indices of the fourth version of the DigSki-CUS (27 items).
Factors | Items | χ² | df | χ²/df | RMSEA | CFI | TLI
Use of digital means | 7, 8, 9, 12, 13 | 11.142 | 5 | 2.228 | 0.073 | 0.981 | 0.962
Access to and management of digital content | 1, 2, 3, 4, 5 | 10.320 | 5 | 2.064 | 0.068 | 0.989 | 0.978
Communication of digital content | 14, 15, 16 | 0.000 | 0 | — | 0.000 | 1.000 | 1.000
Creation of digital content | 19, 20, 21 | 0.000 | 0 | — | 0.000 | 1.000 | 1.000
Digital empathy | 29, 30, 32, 33, 34 | 10.914 | 5 | 2.182 | 0.072 | 0.989 | 0.978
Digital safety | 23, 25, 26, 27, 28, 37 | 13.594 | 9 | 1.510 | 0.047 | 0.989 | 0.982
Note: χ²: Satorra–Bentler scaled chi-squared; df: degrees of freedom; χ²/df: ratio of chi-squared to degrees of freedom (good if χ²/df ≤ 2; acceptable if χ²/df < 5); RMSEA: root-mean-square error of approximation (good if RMSEA ≤ 0.06; acceptable if RMSEA < 0.08); CFI: comparative fit index (good if CFI > 0.95; acceptable if CFI > 0.90); TLI: Tucker–Lewis index (good if TLI > 0.95; acceptable if TLI > 0.90). The three-item models are just identified (df = 0), so χ²/df is undefined for them.
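The cutoffs in the note above can be applied mechanically. The sketch below (function names `rate` and `rate_model_fit` are our own) classifies one row of the fit table against those conventional thresholds.

```python
def rate(value, good, acceptable, higher_is_better):
    """Classify a fit index as good/acceptable/poor against two cutoffs."""
    if higher_is_better:
        return "good" if value > good else "acceptable" if value > acceptable else "poor"
    return "good" if value <= good else "acceptable" if value < acceptable else "poor"


def rate_model_fit(chi2, df, rmsea, cfi, tli):
    ratio = chi2 / df if df else None  # undefined for just-identified models (df = 0)
    return {
        "chi2/df": rate(ratio, 2, 5, False) if ratio is not None else "n/a",
        "RMSEA": rate(rmsea, 0.06, 0.08, False),
        "CFI": rate(cfi, 0.95, 0.90, True),
        "TLI": rate(tli, 0.95, 0.90, True),
    }


# "Use of digital means" row of the table above
print(rate_model_fit(11.142, 5, 0.073, 0.981, 0.962))
# prints {'chi2/df': 'acceptable', 'RMSEA': 'acceptable', 'CFI': 'good', 'TLI': 'good'}
```

This matches the paper’s reading: all single-factor models are at least acceptable on every index, and most indices fall in the “good” range.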
Table 5. Indices of convergent validity of the fourth version of the DigSki-CUS (27 items).
Factors | Items | AVE z | CR y
Use of digital means | 7, 8, 9, 12, 13 | 0.458 | 0.808
Access to and management of digital content | 1, 2, 3, 4, 5 | 0.553 | 0.859
Communication of digital content | 14, 15, 16 | 0.534 | 0.770
Creation of digital content | 19, 20, 21 | 0.628 | 0.835
Digital empathy | 29, 30, 32, 33, 34 | 0.575 | 0.870
Digital safety | 23, 25, 26, 27, 28, 37 | 0.455 | 0.832
z AVE: average variance extracted (acceptable if AVE > 0.5; values between 0.36 and 0.5 are marginally acceptable). y CR: composite reliability (acceptable if CR > 0.7).
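AVE and CR are computed from the standardized CFA loadings (not reported in full here). For reference, the standard formulas following Fornell and Larcker [47] can be sketched as below; the loadings shown are hypothetical and the function names are our own.

```python
def ave(loadings):
    """Average variance extracted: mean of squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)


def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each item's error variance is 1 - loading^2 for standardized loadings."""
    s = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return s * s / (s * s + error)


# Hypothetical standardized loadings for a three-item factor
loadings = [0.70, 0.75, 0.80]
print(round(ave(loadings), 3), round(composite_reliability(loadings), 3))
# prints 0.564 0.795
```

Note that AVE penalizes weak items quadratically, which is why a factor can show acceptable CR (> 0.7) while its AVE sits in the marginal 0.36–0.5 band, as happens for two factors in the table above.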
Table 6. Indices of discriminant validity of the fourth version of the DigSki-CUS (27 items).
Factors | AVE | 1 | 2 | 3 | 4 | 5 | 6
1. Use of digital means | 0.458 | 0.677
2. Access to and management of digital content | 0.553 | 0.452 | 0.744
3. Communication of digital content | 0.534 | 0.503 | 0.721 | 0.731
4. Creation of digital content | 0.628 | 0.618 | 0.542 | 0.630 | 0.792
5. Digital empathy | 0.575 | 0.640 | 0.466 | 0.528 | 0.463 | 0.758
6. Digital safety | 0.455 | 0.554 | 0.560 | 0.648 | 0.627 | 0.588 | 0.675
Note: Diagonal elements are the square roots of the AVE values; off-diagonal elements are the correlation coefficients between dimensions.
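The discriminant-validity check in this table is the Fornell–Larcker criterion [47]: each factor’s √AVE must exceed its correlations with every other factor. A sketch of the check using the values reported above (short factor keys and the function name are our own):

```python
import math

ave_values = {"use": 0.458, "access": 0.553, "comm": 0.534,
              "creation": 0.628, "empathy": 0.575, "safety": 0.455}

# Inter-factor correlations (lower triangle of the table above)
corr = {("use", "access"): 0.452, ("use", "comm"): 0.503,
        ("use", "creation"): 0.618, ("use", "empathy"): 0.640,
        ("use", "safety"): 0.554, ("access", "comm"): 0.721,
        ("access", "creation"): 0.542, ("access", "empathy"): 0.466,
        ("access", "safety"): 0.560, ("comm", "creation"): 0.630,
        ("comm", "empathy"): 0.528, ("comm", "safety"): 0.648,
        ("creation", "empathy"): 0.463, ("creation", "safety"): 0.627,
        ("empathy", "safety"): 0.588}


def fornell_larcker_ok(ave_values, corr):
    """True if sqrt(AVE) of both factors in each pair exceeds their correlation."""
    return all(r < min(math.sqrt(ave_values[a]), math.sqrt(ave_values[b]))
               for (a, b), r in corr.items())


print(fornell_larcker_ok(ave_values, corr))  # prints True
```

The tightest margin is the communication–safety pair (correlation 0.648 against √0.455 ≈ 0.675), so the criterion is met for all fifteen pairs, consistent with the paper’s conclusion of adequate discriminant validity.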
Table 7. Factors and items of the final version of DigSki-CUS (27 items).
Dimensions | Factors | Items (factor loading)

Information skills — Access to and management of digital content
1. I have apps that keep me up to date with news. (0.823)
2. I am able to search for and access information in digital environments. (0.724)
3. I can use different tools to store and manage information. (0.621)
4. I am able to search for information that I need on the Internet. (0.883)
5. I can understand the information that I get from the Internet. (0.592)

Information skills — Use of digital means
7. I skillfully use digital software to complete learning tasks. (0.643)
8. I can complete digital content that meets the minimum requirements of learning tasks. (0.442)
9. I can create and edit digital content with higher standards according to the requirements of work or study. (0.658)
12. I am able to use digital means to solve problems encountered in my study. (0.441)
13. I am able to use digital means to detect plagiarism of content that I created. (0.514)

Communication skills — Communication of digital content
14. I can communicate with others in digital environments. (0.510)
15. I know how to communicate with others through different digital means. (0.569)
16. I know how to communicate with others in different ways (e.g., images, texts, videos ...). (0.716)

Creation skills — Creation of digital content
19. I know different ways to create and edit digital content (e.g., videos, photographs, texts, animations ...). (0.350)
20. I am able to accurately present what I want to deliver in digital environments. (0.562)
21. I can transform information and organize it in different formats. (0.615)

Digital safety skills — Digital safety
23. I am careful with my personal information. (0.434)
25. I avoid having arguments with others in digital environments. (0.587)
26. I am able to identify harmful behaviors that can affect me on social networks. (0.774)
27. I avoid behaviors that are harmful on social networks. (0.587)
28. Before doing a digital activity (e.g., upload a photo, comment ...), I think about the possible consequences. (0.697)
37. When sharing digital information, I am able to protect my privacy and security. (0.397)

Digital empathy skills — Digital empathy
29. I am able to put myself in other people’s shoes in digital environments. (0.593)
30. I am willing to help other people in digital environments. (0.328)
32. I respect other people in digital environments. (0.833)
33. I take into account the opinion of others in digital environments. (0.713)
34. I get informed before commenting on a topic. (0.383)
Table 8. Reliability scores of each factor in the final version of DigSki-CUS (27 items).
Factors | Items z | Cronbach’s α
Access to and management of digital content | 4, 1, 2, 3, 5 | 0.874
Use of digital means | 9, 7, 13, 8, 12 | 0.817
Communication of digital content | 15, 14, 16 | 0.778
Creation of digital content | 21, 20, 19 | 0.844
Digital safety | 26, 28, 25, 27, 23, 37 | 0.852
Digital empathy | 32, 33, 29, 34, 30 | 0.861
z Items in each factor are ordered by factor loading, highest first.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
