Article

Unveiling University Students’ Perceptions on Their Teachers’ Digital Competence

1 School of Education, Humanities and Social Sciences, Halmstad University, Kristian IV:s väg 3, 301 18 Halmstad, Sweden
2 School of Business, Innovation and Sustainability, Halmstad University, Kristian IV:s väg 3, 301 18 Halmstad, Sweden
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(8), 891; https://doi.org/10.3390/educsci14080891
Submission received: 8 June 2024 / Revised: 6 August 2024 / Accepted: 12 August 2024 / Published: 15 August 2024

Abstract

In contemporary society, digital competence has become increasingly important for people in everyday life as well as in working life. Hence, it is vital that today’s higher education contains an appropriate degree of digitization, both in terms of content and approaches; this is particularly important given the pronounced expectations of what graduates should be able to handle in terms of digitization when they enter the workplace. Drawing on insights from previous research, this study explores the integration of digital elements in higher education pedagogy via students’ perceptions of digital integration in their education and evaluates their assessments of teachers’ digital competence. Special attention is given to collaborative learning practices facilitated by digital technologies. The research questions posed to guide the aim are: How do university students perceive the integration of digital elements within their education, and what are their assessments of their teachers’ digital competence in utilizing these technologies? The results show that teachers’ digital competence varies across disciplines, which might influence their utilization of digital pedagogical methods and tools. Moreover, both engineering and non-engineering students reported varying levels of usage of digital collaborative learning methods, which might reflect discipline-specific preferences and practices in collaborative learning. Also, despite high confidence levels in using digital tools, students exhibited limited awareness of existing digital functionalities. These results are intended to inform pedagogical practices, institutional policies, and professional development initiatives to cultivate a digitally proficient educator workforce, and they are relevant globally for all involved in teaching and learning in higher education.

1. Introduction

Digital competence and proficiency in various parts of everyday life have become increasingly important in today’s society, not least in working life. For example, the European Union has identified digital competence as one of the most important abilities for contemporary and future citizens [1,2,3]. Hence, it is important that the various kinds of educational programs in higher education keep up to date with the digital skills that are current in their respective fields in order to provide students with relevant education [4]. This is linked both to the actual content of the education when it comes to digitization and to the teachers’ use of digital tools and their overall digital competence [5]. However, defining digital competence in higher education is not straightforward. On the contrary, research has shown that there are different variations of definitions when referring to digital competence in relation to higher education, e.g., [6,7], depending on whether the concepts are defined by policy, research, or both and whether they focus on technical skills or social practices [6].
Overall, research is relatively scarce when it comes to the process through which university teachers’ digital competence can influence or enhance their technology acceptance and use intention, especially in the context of higher education and training [8]. On a more general level, previous research has shown that most university students and teachers have a basic level of digital competence [7,9,10,11] and that higher education teachers have different attitudes towards using digital tools in their teaching, where some are positive and curious about its use while others are more doubtful [12,13]. This somewhat ambivalent posture is usually linked to the fact that integration of technology in teaching is connected to organizational and societal processes beyond the teacher’s direct control, as was often the case before the COVID-19 pandemic [13]. During the pandemic, it became urgent for teachers worldwide to develop their digital skills as the use of digital technologies increased significantly due to the emergency situation [14], and the development of teachers’ digital competence became pivotal [15]. In a recent study, de Obesso et al. [16] focused on students’ perceptions of teachers’ digital competence in the aftermath of COVID-19, using a digital survey as an instrument to measure teachers’ digital competence from the students’ point of view in relation to their self-perception of learning [16]. The study proposes four hypotheses that include elements that affect students’ self-perception of learning: (1) educators’ digital skills; (2) the use of technology for communication, monitoring, and assessment; (3) educators’ engagement in digital ecosystems; and (4) students’ data security in the learning process. Their results show a significant relationship with three of the hypotheses, but no confirmation of the fourth. These findings have implications for higher education institutions and policymakers to ensure that the digitalization of education drives effectiveness and quality.
However, digital proficiency in higher education pedagogy is of a multifaceted nature [17,18]. Beyond mere technical expertise, it encompasses the ability to effectively utilize digital resources, adapt teaching strategies to diverse learning styles, foster collaboration in virtual environments, and critically evaluate online information [19,20]. Furthermore, digital proficiency in university teachers extends beyond the classroom, shaping students’ preparedness for an increasingly digital-centric workforce. With technological advancements shaping the way knowledge is disseminated and absorbed, the role of digital competence among university teachers has garnered significant attention. The acquisition of digital competencies is indispensable for higher education teachers if they are to harness the benefits presented by technological progress and formulate strategies for their professional growth. Additionally, it is essential for them to enhance students’ digital proficiency, thereby contributing to their overall digital literacy [10,21]. Consequently, as universities embrace digital tools and platforms to enhance teaching and learning experiences, it becomes imperative to understand how students perceive and evaluate their teachers’ proficiency in leveraging these technologies [22].
So how do students view the digital elements they encounter in their education, and how do they view their teachers’ digital competence? For some time now, there have been examples in higher education of students claiming that they do not encounter the digital tools or other digital elements in their education that they expected and that they feel are expected of them when they enter professional life [22,23]. In a systematic and bibliographic update on the digital competence of university students, Marrero-Sánchez and Vergara-Romero found that there is a need to “strengthen the training in digital skills of university students…so that they can take advantage of the communication tools available in the knowledge society” [24] (p. 9). This is of course important, but Marrero-Sánchez and Vergara-Romero further found in their review that when students develop digital competence, it also stimulates innovation and creativity, and they develop soft skills such as teamwork. But how far has digitization in higher education really come in practice in terms of digital elements in education and teachers’ use of digital tools? This article focuses on the dynamics between digital proficiency and pedagogy from the perspective of university students. By examining their perceptions of their teachers’ digital competence, we aim to shed some light on the effectiveness of digital integration in higher education. Furthermore, we aim to explore the extent to which educators’ digital skills influence student engagement, learning outcomes, and overall educational experiences in order to provide a nuanced understanding of the strengths, challenges, and opportunities associated with university teachers’ digital competence. Ultimately, our findings seek to inform pedagogical practices, institutional policies, and professional development initiatives aimed at cultivating a digitally proficient educator workforce capable of meeting the evolving needs of 21st-century learners. The research questions posed in this study to support the aim are:
  • RQ1: How do university students perceive the integration of digital elements within their education?
  • RQ2: What are their assessments of their teachers’ digital competence in utilizing these technologies, especially when it comes to collaborative learning practices?
Here, collaborative learning practices refer to the collaborative pedagogical methods used by teachers, including both learning approaches in which students collaborate in small groups and collaboration between students and professors. Examples of collaborative pedagogical methods are the flipped classroom, seminars in smaller groups, lab exercises and laboratories, case studies, simulations, and supervision.
To answer RQ1 and RQ2, the theoretical framework TPACK [25] was utilized. TPACK is an abbreviation of Technological Pedagogical Content Knowledge, and the TPACK model consists of three overlapping areas of competence: subject/content, pedagogy, and technology (see Figure 1 below). The overlaps create new areas of required teacher competence. The space in the middle represents the optimal level of competence, where all areas interact together to provide support for good teaching. The dashed circle surrounding the three overlapping areas represents the context, which is of great relevance in relation to the competence areas [26]. Over the past couple of decades, the TPACK framework, with its focus on technological, pedagogical and content knowledge, has proven to be a useful theoretical model for understanding and enhancing educational practice in higher education learning and teaching [26]. In this article, our focus is on the integration of technology and pedagogy (TPK) from a student perspective.

2. Materials and Methods

The present study is interdisciplinary, with a research group consisting of three researchers and one student from two disciplines (teacher education and engineering). The empirical material consists of data from a digital survey of 175 students at a small Swedish university. The digital survey was initiated by one of the faculties and was performed in collaboration with the Educational Development Center at the university. Initially, the survey was piloted in all four faculties at the university, and in total, 233 respondents answered the survey (including the pilot stage). Of those, 183 were students from the faculty that we focused on. Eight of these responses had missing data and were therefore excluded from the subsequent analysis, leaving 175 complete answers on which our analysis was based. Students who participated in the survey were enrolled in the following educational areas: development engineering, innovation management, construction engineering, business administration, economics, and innovation in the built environment.
Methodologically, our study includes data from a digital survey with closed-ended and open-ended questions. The survey was distributed digitally both via the learning management system (LMS) at the university (Blackboard Learn 3900.91) and in person in the classroom, where the purpose of the study was presented and the students were encouraged to fill out the survey on their digital devices and to ask questions about the study if they wished. As mentioned, the data sample consisted of 175 respondents, and the response rate varied among the 21 programs represented (first-year students were excluded). The highest numbers of responses were obtained from the construction engineering (29%), innovation engineering bachelor (25%), and business administration and economics (13%) programs. The digital survey was carried out during the fall semester of 2023, when it was distributed to the respondents as described above. The respondents were informed that the purpose of the project was to map students’ experience of the extent to which the teachers on their programs use digital tools in the programs/single courses. They were also informed that the results would be used to make meaningful changes to improve the quality of education at the university.
The survey included 22 questions and took an estimated 10–15 min to complete (see Table 1). Our selected constructs, presented in Table 1, are based on the TPACK framework: pedagogical knowledge, technological knowledge, and technological pedagogical knowledge. These constructs are useful for answering RQ1 (technological knowledge and technological pedagogical knowledge) and RQ2 (pedagogical knowledge and technological pedagogical knowledge).
After the in-class presentation, the survey link was uploaded to Blackboard Learn 3900.91 and students were reminded to fill it out. The survey included four background variables (gender, faculty, study program, and/or single subject course). The remaining questions were categorized according to their focus, as shown in Table 1. The majority of the questions were closed-ended (3–5-point Likert scale, see Supplementary Materials), and some had an open-ended alternative. The respondents were asked to rate the following:
- to what extent they feel the teachers are using pedagogical methods in their education (3-point Likert scale);
- what pedagogical methods and tools support their learning (4-point Likert scale);
- to what extent they are using tools/programs/services in their education (3-point Likert scale);
- to what extent teachers are using tools/programs/services in their teaching (3-point Likert scale);
- respondents’ competence, training experience, and needs regarding digital tools (5-point Likert scale).
As described above, response categories varied on a 3–5-point Likert scale:
- For the 3-point Likert scale, the categories were often, sometimes, and rarely.
- For the 4-point Likert scale, the categories were totally agree, partly agree, partly disagree, and disagree.
- For the 5-point Likert scale, the categories were often, sometimes, rarely, never (chosen not to), and never (not aware of).
To obtain more robust analyses, response categories 1 and 2 were combined to represent a low rating, category 3 represented a medium rating, and categories 4 and 5 were combined to represent a high rating.
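As an illustration of this recoding step, the following is a minimal sketch of how numeric Likert codes could be collapsed into the three bands described above using pandas. The column name and the assumption that responses are stored as numeric codes 1–5 are ours for the example only; they are not part of the original survey export.

```python
import pandas as pd

# Hypothetical responses to one 5-point item, stored as numeric codes 1-5.
responses = pd.DataFrame({"q15_item": [1, 2, 3, 4, 5, 2, 4, 3]})

def collapse_likert(code: int) -> str:
    """Collapse a 1-5 Likert code into the bands used in the analysis:
    categories 1-2 -> low, category 3 -> medium, categories 4-5 -> high."""
    if code <= 2:
        return "low"
    if code == 3:
        return "medium"
    return "high"

responses["q15_band"] = responses["q15_item"].map(collapse_likert)
print(responses["q15_band"].value_counts())
```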
In question nine, the participants were asked to express to what extent they agreed with seven statements on a 4-point Likert scale: I feel confident using digital tools in my education; I can solve any technical issues that may occur with the digital tools I use; I can quickly learn new technology; I keep myself informed on new digital tools that are relevant to my field of study; I often try new digital tools; I know a lot about digital tools; I have the necessary basic knowledge to be able to use new digital tools quickly. These statements were inspired by another survey in which the theoretical framework TPACK [25] was used [28].

2.1. Ethical Considerations

Ethical considerations were made in accordance with the Swedish Research Council’s guidelines [29] for informed consent. The respondents were informed of the purpose of the project and the aim of the survey, and it was made clear that participation was voluntary and anonymous, so individual responses could not be tracked, and that the participants could withdraw their consent at any time.

2.2. Data Analysis Procedures

Our data analysis procedures were inspired by exploratory data analysis [30]. We found this approach helpful because this paper presents the initial stages of data analysis, where the focus is on maximizing insight into the dataset and better understanding the data with the help of descriptive statistics. Descriptive statistics provide a meaningful way to summarize and describe the basic features of the data, which makes the data at hand easier to understand and interpret [31]. As Cooksey [32] states, descriptive statistics also allow for easier comparison of different groups (e.g., engineering and non-engineering students in our study) to identify trends and easily communicate the results.
We started our analysis by exporting a data report from the survey tool we used, SUNET survey. The report provided detailed information for each question in the survey, showing both numerically and graphically the number and distribution of responses, the mean, the median, the standard deviation, and the coefficient of variation. The SUNET survey system also provided the opportunity to make different comparisons, for example, based on gender. We then examined the individual questions and their components based on the TPACK framework in relation to our RQs (see Table 1), focusing on identifying specific values, proportions, and distributions and trying to identify high or low values and interpret their meaning (a minimal sketch of this kind of per-group summary is shown after the list below). The structure of the analysis is presented as follows:
- Description of data collection and research sample;
- Theme 1: Pedagogical methods;
- Theme 2: Usage of digital tools;
- Theme 3: Students’ confidence in and awareness of the current digital tools.
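To make the per-group summary step concrete, here is a minimal sketch of the kind of descriptive statistics we relied on, computed with pandas. The file name, column names, and the engineering/non-engineering grouping variable are illustrative assumptions; in the study, the corresponding report came directly from the SUNET survey tool.

```python
import pandas as pd

# Hypothetical export of the survey data (the actual report came from SUNET survey).
df = pd.read_csv("survey_export.csv")  # assumed columns: "group", "q15_item"

def describe_item(series: pd.Series) -> pd.Series:
    """Descriptive statistics used in the exploratory analysis: count, mean,
    median, standard deviation, and coefficient of variation."""
    return pd.Series({
        "n": series.count(),
        "mean": series.mean(),
        "median": series.median(),
        "std": series.std(),
        "cv": series.std() / series.mean(),
    })

# Compare engineering and non-engineering students on one Likert item.
summary = df.groupby("group")["q15_item"].apply(describe_item).unstack()
print(summary)
```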

3. Results

When it comes to the gender distribution, 51% of the respondents were male, 48% were female and 1% did not specify. Generally, both male and female participants showed the same pattern in their responses. The near-equal gender distribution among respondents highlights a balanced representation in the study. The respondents were also clustered into two groups—engineering and non-engineering students. Engineering students represent 53.5% of the research sample, non-engineering students represent 41.5% of the sample, and 5% of the answers were excluded as incomplete. The categorization of respondents into engineering and non-engineering students allows for targeted analysis, revealing distinct patterns within these educational domains.
When it comes to teachers’ usage of pedagogical methods as perceived by students, our analysis indicates that both groups (engineering and non-engineering students) perceive that simulations and field studies are used much less by teachers. The difference between the two groups shows that although pre-recorded lectures were used for engineering students, they were much less used for non-engineering students, whereas flipped classrooms and case studies were used for non-engineering participants but utilized to a lesser extent for engineering participants. Approximately half of the engineering students stated that lab work had been used in their education. Discrepancies between engineering and non-engineering students in the utilization of pedagogical methods shed light on tailored educational approaches within different disciplines. In Table 2 below, we present an overview of the pedagogical methods, grouped into collaborative and non-collaborative pedagogical methods.
Another pattern that emerged in our analysis relates to the learning venue. Teachers of engineering students seem to prefer on-campus pedagogical methods more than online ones. However, teachers of non-engineering students seem to adopt a hybrid approach, mixing on-campus and online education. Furthermore, engineering students’ teachers use seminars in smaller groups, lab exercises, and guest lectures mostly on campus and, to a much lesser extent, online. On the other hand, students in non-engineering fields have seminars and guest lectures in hybrid mode: sometimes on campus and sometimes online. These varied preferences among teachers for on-campus versus online pedagogical methods suggest nuanced approaches to digital integration, influenced by disciplinary norms.
When it comes to the pedagogical methods that support students’ learning as perceived by them, our analysis indicates that both engineering and non-engineering students agree to a certain extent that traditional lectures and guest lectures support their learning, whether held on campus or online. They also agree that field studies support their learning experiences. When it comes to lab exercises and simulations, results for both groups indicate that these methods strongly support student learning when they are held on campus (see Table 3). Agreement among both student groups regarding the supportive nature of traditional lectures and guest lectures underscores the perceived value of conventional teaching methods.
However, we noticed a number of differences between the two groups. The flipped classroom was perceived as more supportive in learning by non-engineering students. Students in engineering programs perceived seminars in smaller groups, case studies, and supervision as supportive only when carried out on campus. Teachers for both participant categories use collaborative and non-collaborative methods equally for their students. When it comes to engineering students, the collaborative methods used that are perceived as supportive of learning are seminars in smaller groups, lab exercises and laboratories, as well as supervision. Less frequently used but appreciated collaborative methods are case studies and field studies. When it comes to non-collaborative methods used, traditional lectures and pre-recorded lectures are appreciated.
When it comes to non-engineering students, collaborative methods used and found to be supportive are seminars in smaller groups and supervision, whether they are on campus or online. Less frequently used but appreciated are labs and case studies on campus, as well as field studies. On the other hand, non-collaborative methods used and appreciated by non-engineering students are traditional lectures, pre-recorded lectures, and guest lectures if they are on campus. Variances between engineering and non-engineering students in their perception of pedagogical support highlight discipline-specific needs and preferences.
Looking at pedagogical methods that support learning and gender distribution, we find that female students, to a higher degree, prefer traditional lectures on campus (97% compared to 92% for male students), seminars in smaller groups on campus (89% compared to 82% for male students) and case studies on campus (70% compared to 60% for male students). Male students, on the other hand, prefer traditional lectures online to a higher degree (61% compared to 54% for female students). The differences are small, but it seems that female students prefer, to a higher degree, to study on campus compared to male students.
When it comes to the teachers’ usage vs. students’ support of digital collaborative methods as perceived by students based on the survey results, it is clear that teachers of engineering students do not use collaborative methods in a digital format to a large extent. Students also reported that collaborative pedagogical methods support their learning but only when they are carried out on campus. In contrast to engineering students, teachers of non-engineering students offer a more flexible teaching mode. Students agree that the digital collaborative methods used by the teachers (seminars in small groups, case studies, and supervision) supported their learning to a certain extent. As for lab exercises and simulations, most of the respondents reported that they are seldom used.
When it comes to the usage of digital tools, programs, and services as perceived by students (see Table 4), both respondent groups often use the following digital tools/programs: computers, word processing software, and cloud services. Tools that were used sometimes are spreadsheets and video streaming, whereas seldom-used tools are social media, vote-counting systems, and games. However, engineering students sometimes use video production services, whereas non-engineering students seldom use them.
When it comes to teachers’ usage of tools/programs/services, it is similar to what students use (e.g., computers and presentation programs). Video conferencing and video production are sometimes used in teaching, whereas vote-counting systems, social media, cloud services and games are rarely used. According to the survey results, teachers and students in engineering programs use more collaborative digital tools than those in non-engineering programs. Consistency between students’ and teachers’ usage of certain digital tools suggests alignment in educational practices, and differences underscore opportunities for enhanced digital integration.
When it comes to the students’ perceived confidence and awareness of digital tools, both respondent groups feel confident in using digital tools in their education and are able to solve technical issues that may occur with these tools. Results also show that respondents affirm that they can quickly learn new technologies and, to a certain extent, keep themselves informed about new digital tools relevant to their educational fields. Moreover, the survey results show that students in both groups confirm, to a certain degree, their knowledge about digital tools and they partly agree that they do have the necessary basic knowledge to be able to use new digital tools quickly. Engineering students are less likely to try new educational digital tools. This may be explained by how satisfied these students are with the current digital tools that they are using in their education. In contrast, non-engineering students seem to be motivated to try new digital tools (see Table 5 below).
Even though both student groups showed strong confidence in their ability to accommodate different digital tools, the majority of students were not aware of around half of the existing digital functionalities of the LMS (Blackboard Learn 3900.91) adopted at the university. To summarize, high confidence levels among students in using digital tools contrast with limited awareness of existing functionalities, signaling potential gaps in digital literacy despite proficiency.
Looking at students’ confidence when using digital tools in their education, there is no difference between female and male students. There is, however, a difference in the perception of the students’ ability to quickly learn new technology, where male students, to a higher degree (55%), perceive that they learn quickly compared to female students (40%). Also, when it comes to the question of keeping oneself informed about new digital tools that are relevant to one’s education, there is a difference: here, 38% of the male students totally agree compared to 21% of the female students. There are also differences when it comes to the questions about trying out new digital tools (32% of male students totally agree compared to 19% of female students), knowing a lot about digital tools (32% of male students totally agree compared to 16% of female students), and having the necessary basic knowledge to be able to learn digital tools quickly (51% of male students totally agree compared to 46% of female students).
We also asked the students the following questions concerning their training and use of digital tools in their education, and here there are differences in perceptions between male and female students, as presented below:
- “Do you think that you have received sufficient training to use digital tools in your education?” 56% of male students totally agree compared to 37% of female students.
- “Do you think that you have received sufficient possibilities to use digital tools in your education?” 66% of male students totally agree compared to 53% of female students.
- “Do you think that you have received sufficient explanation (educational incentives) for using digital tools in your education?” 47% of male students totally agree compared to 32% of female students.
- “Do you think that you have received sufficient possibilities to critically evaluate digital tools in relation to your education and future profession?” 42% of male students totally agree compared to 31% of female students.

4. Discussion

Related to digital usage (RQ1), the results point to some differences in pedagogical methods used. Engineering and non-engineering students showed variations in the usage of digital pedagogical methods by their teachers, indicating discipline-specific preferences and practices in digital integration. Furthermore, teachers of engineering students tended to prefer on-campus pedagogical methods over online ones, suggesting a preference for traditional instructional approaches within this discipline. In contrast, teachers of non-engineering students adopted a hybrid approach, incorporating both on-campus and online pedagogical methods, reflecting a more flexible stance towards digital integration in this discipline. Students generally perceived digital collaborative methods, such as seminars in smaller groups and supervision, as supportive of their learning experiences, particularly when conducted on campus. Both students and teachers demonstrated consistency in the usage of certain digital tools (such as computers, word processing software and cloud services), indicating alignment in educational practices regarding these tools. Engineering students reported less frequent usage of collaborative digital tools by their teachers, suggesting a potential gap in leveraging digital technologies for collaborative learning experiences within this discipline. Despite high confidence levels in using digital tools, students exhibited limited awareness of existing digital functionalities, highlighting the need for enhanced digital literacy initiatives to bridge this gap. These results align well with the TPACK framework, highlighting the point of intersection between subject/content, pedagogy, and technology [25,26] and the variants of (digital) teaching competence needed.
When it comes to the students’ perspectives on their teachers’ digital competence (RQ2), the results indicate that there is some variability in the teachers’ utilization of digital pedagogical methods (with some methods being more commonly used than others), which suggests differing levels of digital competence among teachers. However, the differences in the usage of digital methods between engineering and non-engineering teachers imply discipline-specific preferences and practices, where teachers may adapt their approach to digital integration based on the specific needs and requirements of their discipline. Moreover, the preference for on-campus pedagogical methods among engineering teachers may suggest a reliance on traditional instructional approaches and a potential hesitancy to use digital technologies. This preference could be indicative of varying levels of digital competence among engineering teachers. In contrast, non-engineering teachers’ adoption of a hybrid approach, incorporating both on-campus and online methods, may indicate a higher level of digital competence and adaptability. These teachers appear more willing to explore digital tools and integrate them into their teaching practices. The overall limited usage of collaborative digital tools, particularly among engineering teachers, suggests a potential gap in their digital competence in leveraging technology for collaborative learning experiences. This finding highlights an area where further support and training may be beneficial to enhance teachers’ digital competence.
Overall, the results suggest that teachers’ digital competence varies across disciplines and may influence their utilization of digital pedagogical methods and tools. Addressing these variations and providing targeted support and training can help enhance teachers’ digital competence and promote effective digital integration in higher education. These findings underscore the importance of discipline-specific considerations in digital integration efforts within higher education, emphasizing the need for tailored approaches to digital pedagogy and ongoing support for enhancing digital literacy among students and teachers alike. This aligns with Zhao et al. [22], who point out the need for training related to the use of ICT and digital competencies among university teachers. Furthermore, Kyndt et al. [4] underline that higher education teachers must maintain awareness regarding the practical activities undertaken by their students and the corresponding skill sets cultivated within workplace contexts, including the digital elements. This awareness facilitates the adaptation of classroom pedagogy, ensuring a cohesive integration of theoretical principles with hands-on application [4].
Focusing specifically on collaborative learning methods used (such as seminars in smaller groups, labs, case studies and supervision), both engineering and non-engineering students reported varying levels of usage, which might reflect discipline-specific preferences and practices in collaborative learning. The reported utilization of digital collaborative tools (such as video conferencing programs and collaborative software) provides insights into the extent to which technology is leveraged to facilitate collaborative learning experiences. Differences in usage patterns between disciplines may reflect varying levels of integration of digital technologies into collaborative learning practices. The adoption of hybrid approaches to collaboration, incorporating both on-campus and online components, suggests flexibility in facilitating collaborative learning experiences. This hybrid approach may accommodate diverse learning preferences and enhance accessibility for students. In summary, students’ views on the digital tools used in their education reflect the prevalence, utility, and effectiveness of these tools in supporting their learning experiences. Variations in tool usage between disciplines and students’ reported confidence and awareness highlight opportunities for enhancing digital integration and literacy initiatives in higher education.
When looking at gender differences, we see a minor difference regarding preferences for pedagogical methods, where female students prefer to study on campus to a slightly greater degree than male students. Moreover, male students are slightly more confident than female students in their knowledge about the use of digital tools. Male students also believe, to a higher degree than female students, that they have received sufficient training to use digital tools. These findings regarding gender differences are worth noting for educators when planning their teaching.
A more general result of the study is that students are exposed to a wide range of educational methods, both online and on campus, including traditional lectures, seminars, case studies, and guest lectures, reflecting a diverse educational approach (as is illustrated in Figure 2 below).
Aligned with this, and with the aim of the study, some important insights can be made regarding the students’ perception of the use of digital elements in their education, which also reflects their views on their teachers’ digital competence. Firstly, although there is an evident shift towards digital learning tools, students still strongly prefer traditional learning methods like face-to-face lectures and seminars. This indicates that although digital tools are valued, conventional learning modes still hold significant importance in the students’ educational experience. Secondly, there is a notable split among students regarding whether they have received sufficient training to use digital tools effectively in their education. This indicates a need for more consistent and comprehensive training programs to ensure all students are equally prepared for digital learning. Thirdly, the results also show that students generally feel confident using basic digital tools, but there is, however, a gap in their exposure to and utilization of more advanced features or less common digital tools. This suggests an opportunity for educational institutions to enhance digital literacy beyond the basics. Additionally, although students are using digital tools, there seems to be a lack of depth in their understanding and ability to critically evaluate these tools in relation to their education and future profession. This suggests an educational gap where students are not fully equipped to assess the relevance and effectiveness of digital tools in a broader educational and professional context. Lastly, the results suggest that students regard digital tools more as a complement to traditional learning methods rather than a replacement, which pinpoints the need for a balanced educational approach integrating traditional and digital methods to provide a more comprehensive learning experience. Something that struck us as quite surprising was that the flipped classroom model and online case studies are less frequently used or experienced by students, with 54 and 67 respondents, respectively, never engaging in these methods, indicating either a lack of availability or student reluctance towards these approaches; this is something we would like to explore further.
Even though this study presents interesting insights into students’ perceptions of their teachers’ digital competence, future research could provide a more in-depth analysis by employing different statistical methods and going beyond descriptive statistics alone. Future research could also include the content knowledge dimension of the TPACK framework to provide a comprehensive overview of students’ perceptions.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci14080891/s1, File S1: Survey Questions.

Author Contributions

Conceptualization, J.S., M.H. and J.T.; methodology, J.S., M.H. and J.T.; formal analysis, J.S., M.H., J.T. and W.C.; investigation, J.S., M.H., J.T. and W.C.; writing—original draft preparation, J.S., M.H. and J.T.; writing—review and editing, J.S., M.H. and J.T.; visualization, J.S., M.H. and J.T.; supervision, J.S., M.H. and J.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data can be made available by request.

Acknowledgments

This research was initiated and supported by the Academy for Business, Innovation and Sustainability (FIH), Halmstad University, and is an activity within the LeaDS (Learning in a Digitalised Society) research program. We also acknowledge Matilda Lundström’s involvement and support in the initial phase of data collection. Finally, we thank the anonymous reviewers for their constructive comments during the review process.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. European Commission. Digital Agenda for Europe—Rebooting Europe’s Economy, 8th ed.; European Commission: Brussels, Belgium, 2014. [Google Scholar] [CrossRef]
  2. European Education Area. Digital Education Action Plan (2021–2027); European Education Area: Brussels, Belgium, 2021; Available online: https://education.ec.europa.eu/focus-topics/digital/education-action-plan (accessed on 21 September 2023).
  3. García, R.C.; Buzón-García, O.; de Paz-Lugo, P. Improving future teachers’ digital competence using active methodologies. Sustainability 2020, 12, 7798. [Google Scholar] [CrossRef]
  4. Kyndt, E.; Beausaert, S.; Zitter, I. (Eds.) Developing Connectivity between Education and Work: Principles and Practices; Routledge: Boca Raton, FL, USA, 2022. [Google Scholar] [CrossRef]
  5. Hamalainen, R.; Nissinen, K.; Mannonen, J. Understanding teaching professionals’ digital competence: What do PIAAC and TALIS reveal about technology-related skills, attitudes, and knowledge? Comput. Hum. Behav. 2021, 117, 106672. [Google Scholar] [CrossRef]
  6. Spante, M.; Sofkova Hashemi, S.; Lundin, M.; Algers, A. Digital competence and digital literacy in higher education research: Systematic review of concept use. Cogent Educ. 2018, 5, 1519143. [Google Scholar] [CrossRef]
  7. Zhao, Y.; Sánchez Gómez, M.C.; Pinto Llorente, A.M.; Zhao, L. Digital Competence in Higher Education: Students’ Perception and Personal Factors. Sustainability 2021, 13, 12184. [Google Scholar] [CrossRef]
  8. Antonietti, C.; Cattaneo, A.; Amenduni, F. Can teachers’ digital competence influence technology acceptance in vocational education? Comput. Hum. Behav. 2022, 132, 107266. [Google Scholar] [CrossRef]
  9. Basilotta-Gómez-Pablos, V.; Matarranz, M.; Casado-Aranda, L.-A.; Otto, A. Teachers’ digital competencies in higher education: A systematic literature review. Int. J. Educ. Technol. High. Educ. 2022, 19, 8. [Google Scholar] [CrossRef]
  10. Inamorato dos Santos, A.; Chinkes, E.; Carvalho, M.A.; Solórzano, C.M.; Marroni, L.S. The digital competence of academics in higher education: Is the glass half empty or half full? Int. J. Educ. Technol. High. Educ. 2023, 20, 9. [Google Scholar] [CrossRef] [PubMed]
  11. Saltos-Rivas, R.; Novoa-Hernández, P.; Rodríguez, R.S. Understanding university teachers’ digital competencies: A systematic mapping study. Educ. Inf. Technol. 2023, 28, 16771–16822. [Google Scholar] [CrossRef]
  12. Elm, A.; Nilsson, K.S.; Björkman, A.; Sjöberg, J. Academic teachers’ experiences of technology enhanced learning (TEL) in higher education—A Swedish case. Cogent Educ. 2023, 10, 2237329. [Google Scholar] [CrossRef]
  13. Sjöberg, J.; Lilja, P. University Teachers’ Ambivalence about the Digital Transformation of Higher Education. Int. J. Learn. Teach. Educ. Res. 2019, 18, 133–149. [Google Scholar] [CrossRef]
  14. Núñez-Canal, M.; de Obesso, M.d.L.M.; Pérez-Rivero, C.A. New challenges in higher education: A study of the digital competence of educators in Covid times. Technol. Forecast. Soc. Chang. 2022, 174, 121270. [Google Scholar] [CrossRef]
  15. Sofkova Hashemi, S.; Berbyuk Lindström, N.; Brooks, E.; Hahn, J.; Sjöberg, J. Impact of Emergency Online Teaching on Teachers’ Professional Digital Competence: Experiences from the Nordic Higher Education Institutions. In Proceedings of the International Conference on Information Systems (ICIS): Rising like a Phoenix: Emerging from the Pandemic and Reshaping Human Endeavors with Digital Technologies, Hyderabad, India, 10–13 December 2023; Available online: https://aisel.aisnet.org/icis2023/learnandiscurricula/learnandiscurricula/12 (accessed on 6 August 2024).
  16. de Obesso, M.; Núnez-Canal, M.; Pérez-Rivero, C.A. How do students perceive educators’ digital competence in higher education? Technol. Forecast. Soc. Chang. 2023, 188, 122284. [Google Scholar] [CrossRef]
  17. Price, L.; Kirkwood, A. Using technology for teaching and learning in Higher Education. A critical review of the role of evidence in informing practice. High. Educ. Res. Dev. 2014, 33, 549–564. [Google Scholar] [CrossRef]
  18. Singh, G.; Hardaker, G. Barriers and enablers to adoption and diffusion of eLearning: A systematic review of the literature—A need for an integrative approach. Educ. Train. 2014, 56, 105–121. [Google Scholar] [CrossRef]
  19. Krumsvik, R. Situated learning in the network society and the digitised school. Eur. J. Teach. Educ. 2009, 32, 167–185. [Google Scholar] [CrossRef]
  20. Cabero-Almenara, J.; Gutiérrez-Castillo, J.J.; Palácios-Rodriguez, A.; Barroso-Osuna, J. Development of the teacher digital competence validation of digcompedu check-in questionnaire in the university context of Andalusia (Spain). Sustainability 2020, 15, 6094. [Google Scholar] [CrossRef]
  21. Henderson, M.; Selwyn, N.; Aston, R. What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Stud. High. Educ. 2017, 42, 1567–1579. [Google Scholar] [CrossRef]
  22. Zhao, Y.; Llorente, A.; Gómez, M. Digital competence in higher education research: A systematic literature review. Comput. Educ. 2021, 168, 104212. [Google Scholar] [CrossRef]
  23. Hansson, E.; Sjöberg, J. Making use of students’ digital habits in higher education: What they already know and what they learn. J. Learn. Dev. High. Educ. 2019, 14. [Google Scholar] [CrossRef]
  24. Marrero-Sánchez, O.; Vergara-Romero, A. Digital competence of the university student. A systematic and bibliographic update. Amazon. Investig. 2023, 12, 9–18. [Google Scholar] [CrossRef]
  25. Mishra, P.; Koehler, M.J. Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  26. Herring, M.C.; Koehler, M.J.; Mishra, P. Handbook of Technological Pedagogical Content Knowledge (TPACK) for Educators, 2nd ed.; Routledge: New York, NY, USA, 2016. [Google Scholar]
  27. Mishra, P. Using the TPACK Image. 2011. Available online: http://www.tpack.org (accessed on 15 December 2023).
  28. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Koehler, M.J.; Mishra, P.; Shin, T. Survey of Preservice Teachers’ Knowledge of Teaching and Technology. In Proceedings of the 2009 Annual Meeting of the American Educational Research Association, San Diego, CA, USA, 13–17 April 2009. [Google Scholar]
  29. Vetenskapsrådet (Swedish Research Council). God Forskningssed; Vetenskapsrådet: Stockholm, Sweden, 2017; Available online: http://www.vr.se (accessed on 19 March 2024).
  30. Tukey, J.W. Exploratory Data Analysis; Addison-Wesley: Reading, MA, USA, 1977; Volume 2, pp. 131–160. [Google Scholar]
  31. Mishra, P.; Pandey, C.M.; Singh, U.; Gupta, A.; Sahu, C.; Keshri, A. Descriptive statistics and normality tests for statistical data. Ann. Card. Anaesth. 2019, 22, 67–72. [Google Scholar]
  32. Cooksey, R.W. Descriptive statistics for summarising data. In Illustrating Statistical Procedures: Finding Meaning in Quantitative Data; Springer Nature: Dordrecht, The Netherlands, 2020; pp. 61–139. [Google Scholar]
Figure 1. Technological pedagogical content knowledge framework [27].
Figure 2. Educational methods along the two axes, active/passive learning approaches and internal (university)/external (industry/organisation) resources.
Table 1. Overview of the survey questions and related constructs from the TPACK framework.

Question Category (Q1–22 are visible in Supplementary Materials) | Number of Questions Included | Constructs the Questions Tackle, Contributing to the Study
Background information (Q1–4) | 4 | Overall demographics of the survey population
Pedagogical methods utilized (Q5–6) | 2 | Pedagogical knowledge from the TPACK framework, to map the utilized pedagogical methods
Students’ use of tools/programs/services (Q9, Q10, Q16, Q17, Q18, Q19, Q20) | 7 | Technological knowledge from the TPACK framework, to map the utilized tools/programs/services
Teachers’ use of tools/programs/services (Q11, Q12) | 2 | Technological knowledge from the TPACK framework, to map the utilized tools/programs/services
Tools/programs/services supporting learning (Q7, Q8, Q13, Q14) | 4 | Technological pedagogical knowledge from the TPACK framework, to map how tools/programs/services support learning
Confidence in using digital tools (Q15, Q21) | 2 | TPACK framework, to map confidence in using the digital tools
Open question (Q22) | 1 | Possibility to add any aspects that the survey has not covered
Table 2. Frequency of use of pedagogical methods for engineering and non-engineering respondents.

Category | Pedagogical Method | Engineering Students | Non-Engineering Students
Non-collaborative | Traditional lectures | Often on campus (83%); never online (66%) | Often on campus (66%); sometimes online (68%)
Non-collaborative | Pre-recorded lectures | Sometimes (55%) | Never (72%)
Non-collaborative | Guest lectures | Sometimes on campus (81%); never online (83%) | Sometimes on campus (63%); sometimes online (49%)
Collaborative | Flipped classroom | Never (61%) | Sometimes (45%)
Collaborative | Seminars in smaller groups | Sometimes on campus (75%); never online (86%) | Often on campus (45%); sometimes online (49%)
Collaborative | Labs | Sometimes on campus (53%); never online (88%) | Never on campus (53%); never online (71%)
Collaborative | Case studies | Never on campus (63%); never online (91%) | Sometimes on campus (47%); sometimes online (45%)
Collaborative | Simulations | Never on campus (77%); never online (89%) | Never on campus (47%); never online (46%)
Collaborative | Field studies | Never (70%) | Never (57%)
Collaborative | Supervision | Never on campus (54%); never online (80%) | Never on campus (36%); sometimes online (43%)
Table 3. Support of pedagogical methods in engineering and non-engineering students’ education.

Category | Pedagogical Method | Engineering Students | Non-Engineering Students
Non-collaborative | Traditional lectures | Totally agree on campus (53%); partly agree online (42%) | Totally agree on campus (66%); partly agree online (47%)
Non-collaborative | Pre-recorded lectures | Partly agree (50%) | Partly agree (34%)
Non-collaborative | Guest lectures | Partly agree on campus (51%); partly agree online (38%) | Totally agree on campus (45%); totally agree and partly agree online (40%)
Collaborative | Flipped classroom | Partly disagree (40%) | Partly agree (34%)
Collaborative | Seminars in smaller groups | Partly agree on campus (56%); disagree online (38%) | Totally agree on campus (50%); partly agree online (34%)
Collaborative | Labs | Partly agree on campus (55%); disagree online (42%) | Totally agree (38%); disagree online (38%)
Collaborative | Case studies | Partly agree on campus (36%); disagree online (47%) | Agree on campus (51%); partly agree online (36%)
Collaborative | Simulations | Partly agree on campus (49%); disagree online (43%) | Totally agree on campus (38%); disagree online (30%)
Collaborative | Field studies | Partly agree (50%) | Totally agree (50%)
Collaborative | Supervision | Partly agree on campus (49%); disagree online (38%) | Totally agree on campus (46%); partly agree online (33%)
Table 4. Students’ usage of tools/programs/services in their education.

Tools/Programs/Services | Engineering Students | Non-Engineering Students
Computer | Often (84%) | Often (60%)
Tablets | Never (56%) | Never (59%)
Presentation programs/services/ppt | Sometimes (46%) | Often (61%)
Word processing software, e.g., Word, Pages | Often (71%) | Often (80%)
Spreadsheet programs, e.g., Excel, Numbers | Sometimes (71%) | Sometimes (58%)
Video conference programs, e.g., Skype, Facetime, Hangouts, Adobe Connect, etc. | Sometimes (48%) | Sometimes (47%)
Vote-counting systems/services, e.g., Mentimeter, Klickers, PollEverywhere, etc. | Never (82%) | Never (59%)
Video production, e.g., Youtube, Kultura, Vimeo, etc. | Sometimes (50%) | Never (41%)
Social media, e.g., Facebook, Twitter, Instagram, Snapchat, etc. | Never (47%) | Never (43%)
Cloud services, e.g., Dropbox, BOX, Google Drive, iCloud, etc. | Often (49%) | Often (47%)
Video streaming services, e.g., YouTube, Ted Talks, Netflix, SVT play | Sometimes (58%) | Sometimes (43%)
Games, e.g., game consoles, mobile devices, computers, etc. | Never (71%) | Never (65%)
Other tools/programs/services | Never (66%) | Never (65%)
Table 5. Student perceptions of their behavior with digital tools in engineering and non-engineering programs.

Statement | Engineering Students | Non-Engineering Students
I feel confident using digital tools in my education | Totally agree 65%; partly agree 23%; partly disagree 6%; disagree 6% | Totally agree 68%; partly agree 24%; partly disagree 5%; disagree 3%
I can solve any technical issues that may occur with the digital tools I use | Totally agree 28%; partly agree 45%; partly disagree 18%; disagree 9% | Totally agree 37%; partly agree 43%; partly disagree 17%; disagree 3%
I can quickly learn new technology | Totally agree 41%; partly agree 44%; partly disagree 8%; disagree 7% | Totally agree 55%; partly agree 34%; partly disagree 9%; disagree 2%
I keep myself informed on new digital tools that are relevant to my field of study | Totally agree 20%; partly agree 53%; partly disagree 20%; disagree 7% | Totally agree 40%; partly agree 46%; partly disagree 9%; disagree 5%
I often try new digital tools | Totally agree 15%; partly agree 39%; partly disagree 41%; disagree 11% | Totally agree 34%; partly agree 34%; partly disagree 25%; disagree 7%
I know a lot about digital tools | Totally agree 21%; partly agree 46%; partly disagree 27%; disagree 6% | Totally agree 27%; partly agree 46%; partly disagree 22%; disagree 5%
I have the necessary basic knowledge to be able to use new digital tools quickly | Totally agree 43%; partly agree 43%; partly disagree 12%; disagree 2% | Totally agree 55%; partly agree 41%; partly disagree 3%; disagree 1%
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Citation

Sjöberg, J.; Hoveskog, M.; Tell, J.; Cherni, W. Unveiling University Students’ Perceptions on Their Teachers’ Digital Competence. Educ. Sci. 2024, 14, 891. https://doi.org/10.3390/educsci14080891