*Article* **STEM Faculty Instructional Data-Use Practices: Informing Teaching Practice and Students' Reflection on Students' Learning**

**Cindy Lenhart \* and Jana Bouwma-Gearhart**

College of Education, Oregon State University, Corvallis, OR 97331, USA; Jana.Bouwma-Gearhart@oregonstate.edu
\* Correspondence: lenhartc@oregonstate.edu

**Abstract:** This paper explores the affordances and constraints of STEM faculty members' instructional data-use practices and how they engage students (or not) in reflection around their own learning data. We found faculty used a wide variety of instructional data-use practices. We also found several constraints that influenced their instructional data-use practices, including perceived lack of time, standardized curriculum and assessments predetermined in scope and sequence, and a perceived lack of confidence and competence in their instructional data-use practices. Novel findings include faculty descriptions of instructional technology that afforded them access to immediate and nuanced instructional data. However, faculty described limited use of instructional data that engaged students in reflecting on their own learning data. We consider implications of faculty's instructional data-use practices for departmental and institutional policies and procedures, for professional development experts, and for faculty themselves.

**Keywords:** STEM; undergraduates; instructional data; teaching practices; instructional technology; assessment; student reflection on learning; policy

**Citation:** Lenhart, C.; Bouwma-Gearhart, J. STEM Faculty Instructional Data-Use Practices: Informing Teaching Practice and Students' Reflection on Students' Learning. *Educ. Sci.* **2021**, *11*, 291. https://doi.org/10.3390/educsci11060291

Academic Editors: Maria José Sousa, Fátima Suleman, Pere Mercadé Melé and Jesús Molina Gómez

Received: 21 May 2021; Accepted: 9 June 2021; Published: 12 June 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

### **1. Introduction**

#### *1.1. Calls for Increasing Instructional Data-Use Practices in Postsecondary Education*

In response to calls for increased accountability from policymakers, accreditation agencies, and other stakeholders, higher education institutions are devoting more resources to gathering and analyzing evidence around student learning outcomes to inform strategies promoting student success and persistence [1–3]. Specifically, educators are asked to gather and respond to evidence of student learning to inform their future teaching-related decisions and practices. These calls, which extend to faculty working in the STEM disciplines (science, technology, engineering, and mathematics), demonstrate a growing focus for many concerned with improvements to postsecondary education [4,5]. This push is well founded. Educators' systematic use of instruction-related data has been shown to enhance student learning and achievement via faculty data-driven decisions [6].

The push for faculty to engage in systematic instructional data-use practices goes beyond their summative examination of students, often infrequent and not particularly illuminative [1,5], to include more formative data-use practices, including those that can inform immediate teaching practices. Such practices connote increased repertoires of practice for many faculty, placing additional demands on them and, by association, on those who seek to help develop their teaching-related practices. Emerging research indicates that STEM faculty are not necessarily ready to utilize diverse instructional data effectively or to constructively inform practice generally [5]. Teaching improvement interventions that target STEM faculty development of effective instructional data-use practices are becoming more numerous [1,4]. However, alongside limited knowledge about how postsecondary educators make decisions about their teaching overall [7], we still know little about how instructional data informs faculty teaching-related decisions [5,8]. This lack of knowledge limits our ability to help faculty enhance their instructional data-use practices, to meet these calls, and to respond to faculty members' actual realities and needs.

What we do know from the research around educators' instructional data-use practices largely concerns the degree to which K-12 educators implement policies mandating these practices [9,10]. A growing body of practice-based research investigates K-12 educators' agency and capability to employ data-use interventions in light of their professional contexts, e.g., [10–12]. From this research, we have some insight into why and how instructional data-use interventions are effective (or not). For instance, we know that educators must find instructional data relevant and meaningful to their teaching realities to consider changing their teaching practices in light of it [10]. In addition, we know that K-12 educators appreciate talking with other educators about data, including how to interpret it in light of practice [12]. Furthermore, educators' data use happens in light of larger contextual complexities. Institutional norms and structures, including departmental and social networks, can influence educators' access to and practices around data, including the knowledge and skills they need to analyze and use data to improve instruction [12–14]. Research on K-12 educators also suggests that successful interventions targeting educators' data-use practices can lead to meaningful and more reliable assessments of students' learning [15].

#### *1.2. What We Know About Postsecondary Faculty's Instructional Data-Use Practices*

While the above K-12-focused research provides some important insights, we need empirical evidence that examines how postsecondary educators use instructional data [4]. Limited empirical research indicates that most postsecondary educators may not consider collecting and reflecting on instructional data to be their responsibility [16]. In addition, faculty may not have access to meaningful instructional data that they feel can inform their teaching [16], including those in the STEM disciplines [4,8]. Faculty also may not feel confident or competent in analyzing instructional data [3,17]. They may not have adequate time or appropriate resources or tools to engage in data-driven decision-making concerning instructional practices [5,18].

Even postsecondary educators trained in STEM disciplines, for whom effective data collection and use skills might be assumed, have shown limited instructional data use, even when data were made available to them. In one line of research, Bouwma-Gearhart, Hora, and colleagues [4,5,8] explored the instructional data-use practices of 59 STEM faculty and 20 administrators at three institutions. Faculty noted they collected instructional data due to accreditation requirements and policies related to departmental reviews, but less so as part of their other instructional decision-making processes. Departmental and institutional interventions designed to improve faculty use of instructional data had low impact due to limits on faculty time and faculty's lack of expertise with using teaching-related data. Notably, faculty found the availability of instructional data experts at their institutions, such as other faculty and staff sometimes located in centers for teaching and learning, to be an affordance. Faculty were more likely to implement instructional data-use practices and types of data if these aligned with their overall instructional goals. They generally found institutionally collected data, such as student evaluations, to be unreliable and insufficient, which discouraged their use of these data and motivated them to create and implement other instructional data practices. Although many faculty implemented few instructional data-use practices, those who did, across the wide sample, used various instructional data-use practices, both quantitative and qualitative. Data-informed decisions applied to practice included altering future versions of courses based on analysis of exams and altering the pace of teaching, including time spent on particular topics.

While the above studies are helpful in understanding faculty use of instructional data, we need additional research to confirm or test these limited findings. Much of the scholarship arguing for postsecondary educators' use of instruction-related data is anecdotal, with limited empirical rooting [1]. What we do know about these faculty members comes from one study of STEM faculty from three universities, i.e., [4,5,8]. Specifically, we need additional inquiry into the faculty realities that may drive the design, development, and implementation of interventions and serve as affordances for what happens when faculty interact with data in their work [12]. Additionally, while instructional technologies hold the promise of influencing faculty's collection and use of data and of involving students more in the learning process, we largely lack evidence of this promise for these educators. Further, given the increased recognition of the value of involving students in reflecting on their learning, more research is needed on how STEM faculty can foster and engage students in reflecting on data around their learning [19,20]. Knowledge of these phenomena can assist in designing effective supports for faculty to use instructional data more frequently and more effectively to improve their teaching and student learning, ultimately translating into greater success for students [21].

#### *1.3. Instructional Technology That Influences Instructional Data-Use Practices*

One championed affordance for faculty collection and use of instructional data is instructional technology (e.g., electronic learning platforms and their analytics tools). Behind the push for faculty to use these tools are assumptions around the easy and quick collection and basic analyses of learning-related data they can provide [22]. Studies suggest faculty adoption and use of instructional technologies are slow and inconsistent [23]. In a report, Lester et al. [24] state that much of the research around instructional technologies focuses on the different types of technology tools that faculty and institutions may use (e.g., clickers, learning management systems, adaptive learning). This research includes how technologies are adopted and adapted by institutions, as well as students' use and perceived usefulness of these tools. Lester and colleagues also note a paucity of research on faculty pedagogical changes resulting from using instructional technologies. However, they suggest insights gained from research on individual faculty decisions to incorporate other innovations into their teaching practices can inform faculty use of instructional technology. Specifically, they cited research identifying how faculty identity and beliefs, established through disciplinary socialization and behaviors, can affect decision-making, e.g., [23,25,26].

Although instructional technologies may be appreciated by faculty as a potential means to improve teaching and learning, other studies found that many instructional technologies were not used to their fullest potential. Klein et al. [27], in a study of 6 faculty and 21 advising staff, identified barriers to effective use, including data infrastructure that was unreliable, cumbersome, and misaligned with user needs, which deterred technology use. In another study, Hora and Holden [23] interviewed and observed 40 faculty in STEM departments at 3 universities in the U.S. Not surprisingly, perhaps, they found that faculty use of instructional technologies largely depended on its availability (or not). They also found faculty use hinged on faculty perceptions of relevance to and alignment with their pre-existing beliefs and instructional goals, meaning faculty needed to see the instructional tool as supporting their pedagogical practices. Faculty members' prior experiences with the technology and the perceived affordances of particular tools also influenced their use. These studies suggest that faculty use of instructional technologies, and the promise of such use for instructional data-use practices, may be more complex than mere access to technology. Still, more research is needed in this area to confirm or challenge these limited findings and provide insight into how faculty use instructional technology to inform decisions related to their teaching practices.

#### *1.4. Instructional Data-Use Practices That Involve Students Reflecting on Their Learning*

Calls for research investigating instructional data-use practices that involve students reflecting on their own learning-related data are growing as faculty are encouraged to shift their instructional practices from teacher-centered to more learner-centered approaches [16]. Faculty must shift their thinking from seeing their role primarily as a transmitter of knowledge to one that empowers students in their own learning and provides students more meaningful feedback around their learning, beyond only the results of summative assessments and final grades [21,28,29]. Ryan and Ryan [30] assert that including students in reflecting on their learning data involves more than just sharing grades or exams with them. They state that activities for student reflection on learning have generally included structured and unstructured reflective journaling, formal reflective papers, interviewing, and group memory work. Ryan and Ryan acknowledge the value of these activities but contend that examples of faculty engaging their students in systematic and deliberate reflective learning activities are rare. The researchers noted potential barriers to faculty engagement of students in reflecting on their learning, including factors related to students' socio-cognitive abilities, such as their developmental stage (e.g., whether they were in their first year or a later year of study in the discipline or field of study). Faculty use of reflective practices was also influenced by the context and potential complexity of the discipline in which the learning occurred. A final factor is the diversity of learners, who bring prior knowledge, abilities, and experiences that may add to the challenges faculty encounter when engaging students in reflective learning practices [30].

Indeed, faculty use of instructional activities that have students reflect on their own learning data has been shown to foster student learning of course content and to instill in students a greater sense of accountability for their learning [21]. Some research results seem promising, demonstrating that some STEM faculty are meeting researchers' calls to engage students in reflecting on their learning-related data (e.g., [20]). Nonetheless, more research is needed that investigates how faculty use their instructional data to design activities that further engage students in reflecting on their own data.

#### *1.5. Paper Focus*

This paper details an exploratory study, conducted at one U.S. research university, of STEM faculty members' instructional data-use practices. Specific findings confirm the work of Bouwma-Gearhart, Hora, and colleagues, including constraints on faculty data-use practices such as lack of time needed to implement data-use practices, standardization of course content that constrained some types of data collection, and a perceived lack of confidence and competence in their instructional data-use practices [4,5]. Novel findings include faculty descriptions of instructional technologies that they claimed provided more timely and complete data to which they could respond more immediately in practice. Faculty who used adaptive learning technologies specifically claimed these helped them collect more nuanced data on achievement trends for different groups of students. Although all faculty used summative data (i.e., exams, written assignments), several faculty described less reliance on these typical indicators of student learning, instead using other practices, such as group work, to measure student learning. Faculty were mixed in their practice of engaging students in reflecting on their own learning data. These practices were generally described as activities in which students were asked to reflect on their overall performance in class and on how their use of study techniques may (or may not) be helping them. We discuss the affordances and constraints that faculty perceive in their instructional data-use practices. We also discuss implications for departmental and institutional leaders, faculty leaders, professional developers, and faculty themselves.

#### **2. Methodology**

#### *2.1. Conceptual Frameworks*

We assume faculty work within "complex network[s] of structures, tasks, and traditions that create and facilitate practice" [31], p. 2, which Halverson terms *systems-of-practice*. Faculty encounter in their systems-of-practice structures such as procedural norms and policies, physical objects and tools, and activities that can serve as affordances and impediments to their work (see Figure 1). Perceived affordance theory [32,33] is also relevant to our study, as it helps to explain how and which structures and activities are salient to faculty and impact their practice. *Perceived affordances* are factors in the system that faculty sense and deem relevant around a task, related to their self-perceived ability to attend to the task [33,34]. As examples, affordances can be structures or activities that educators believe will allow (or inspire) them to engage in collecting and using instructional data. For instance, educators may use instructional technologies, such as clickers, to collect certain types of instructional data of their interest, assuming they have some competency to use the tool.


**Figure 1.** Conceptual Framework: Integrating Perceived Affordances [32,33] and System-of-Practice [31] Theories.

Affordance theory has been used by other researchers who recognize postsecondary educators as functioning within complex socio-cultural systems. Hora [35] found that structural and sociocultural factors afforded and constrained teaching practices. Affordances included the high degree of autonomy faculty had in making decisions related to their teaching practices. Constraints included policy implications related to issues such as promotion and tenure requirements. As demonstrated in some of this past research, affordances are not always positive for an educator's actions or insights [33]. Affordances may also be barriers to action [7]. For instance, Bouwma-Gearhart and colleagues found that frequent formative assessments that took up class time could act as barriers if faculty felt pressed to cover large quantities of specific types of content.

We use perceived affordance theory to illuminate the realities of educators operating in complex socio-cultural systems in light of their professional realities, including their pedagogical knowledge, skills, norms, and felt competencies.

#### *2.2. Research Questions*

Our exploratory research is guided by the following questions:


3. To what extent do faculty engage students in reflecting on their own learning data?

#### *2.3. Study Context*

This study took place at one large university in the United States, classified in the Carnegie Classification of Institutions of Higher Education [36] as a "doctoral university with the highest research activity". A comprehensive (campus-wide) STEM education improvement initiative was underway to foster evidence-based instructional improvements in large-enrollment, lower-division STEM courses by leveraging the distributed expertise of faculty to learn from one another. Funded by the National Science Foundation, a project research goal was to investigate changing faculty perceptions of teaching and their teaching-related practices in light of the initiative activities. This paper centers around findings from interview data collected near the end of the initiative, in 2017, specifically around the questions exploring faculty instructional data-use practices.

#### Participant Sample and Data Collection

Prior to collecting the interview data that roots this paper, surveys were sent to 420 faculty across STEM disciplines with 127 faculty responding, a 30% response rate. Table 1 shows the total number of survey respondents in each of the disciplines surveyed.

**Table 1.** Faculty respondents in STEM disciplines. N = 420 (n = respondents); Engineering includes civil, chemical, biological and mechanical.


Quantitative survey data were collected to ascertain the influence of the initiative's various aspects and to inform subsequent interview protocol development. While the survey and interviews focused on various aspects of teaching of interest to the larger NSF-funded project, this paper focuses solely on faculty experiences around instructional data use.

We used descriptive analysis of three survey questions to probe faculty perceptions of the larger group regarding their gathering, analyzing, and responding to data that informed their teaching. Respondents indicated their level of agreement on a 5-point scale anchored at 0 (Not true at all), 2 (Somewhat true), and 4 (Very true) to the following prompts:


When faculty were asked if they regularly gather, analyze, and respond to data that informs their teaching, the results of the 127 faculty surveyed were a mean score of 2.50 ("Somewhat true" on a 5-point scale; SD = 1.04). Faculty indicated a slightly higher mean of 2.74 ("Somewhat true"; SD = 1.02) when asked if they were committed to gathering, analyzing, and responding to data. Faculty indicated a mean score of 2.75 ("Somewhat true"; SD = 0.94) when asked whether they knew how to gather, analyze, and respond to data that informed their teaching (see Table 2). A standard deviation of approximately one showed a relatively uniform view of faculty perceptions regarding instructional data use.

**Table 2.** Faculty perceptions of gathering, analyzing, and responding to instructional data. <sup>1</sup> Variable mean coded on a 5-point scale of 0 = "Not true at all," 2 = "Somewhat true," 4 = "Very true" (n = 127).


These survey results prompted us to explore instructional data use in interviews, to allow richer data around these issues. From those responding to the survey, invitations to interview were sent to faculty across the represented disciplines. Nineteen of twenty-one faculty invitees consented to interviews (90% response rate). (See Appendix A for the full interview protocol.) An external project evaluator conducted the semi-structured interviews, which lasted approximately one hour.

Table 3 shows disciplines, participant pseudonyms, and professional positions for the 19 STEM faculty who participated in the interviews for this study. Faculty disciplines included physics, biology, chemistry, mathematics, and engineering (chemical, biological, environmental, and mechanical engineering). Nine of the faculty were in tenure-track faculty positions (assistant, associate, and full professor), and ten were in fixed-term faculty positions (instructor and senior instructor). Participants had taught at least one lower-division STEM course in the previous year and were involved in the campus initiative. Race/ethnicity and gender data were not collected in this study, and we do not want to make assumptions about participants' identities. We were further concerned with ensuring anonymity given the sample size, identification of disciplines, and professional positions of a group of faculty from just one university. Thus, we use pseudonyms that we perceive as gender-neutral.

**Table 3.** List of Disciplines, Participants' Pseudonyms, and Participants' Professional Positions. N = 19. Tenure-track faculty include assistant, associate, and full professors; fixed-term faculty include instructors and senior instructors.


#### *2.4. Data Analysis*

Data analyzed from the interviews for this paper pertained to the following questions:

	- a. To what extent do you collect data/information about student learning?
	- b. Are your teaching practices informed by data/information about student learning?
	- c. Are there means in the classes/courses that you teach for students to reflect on their own learning data? (If yes), Can you detail these processes?

Interviews were transcribed verbatim and transferred to Dedoose coding software for qualitative analysis. The first author created inductive codes from a first read of the verbatim transcripts, drawing perspectives from interviewees' own words in response to interview questions [37]. We attempted to stay grounded in faculty descriptions, matching these with definitions of the concepts. During both rounds of coding and analysis, the first author created theoretical memos [38] to provide a record of developing ideas and interconnections.

We used several methods put forth by Creswell [39], pp. 201–202, to address our findings' trustworthiness. One method we used was peer debriefing. The second author supported the codes' development and participated in debriefing and data analysis sessions with the first author throughout the coding and analysis of the data. The second author also reviewed 20% of the coding to increase reliability and consistency and to provide ongoing contributions to the emerging codebook. (See Appendix B for codes.) In both phases, the authors discussed emerging concepts and themes based on their critical reflections on the data, and an ongoing discussion of codes and interpretations addressed (dis)agreements within the data [37]. Every claim we report in this paper was voiced by at least two interviewees, and we include exact numbers of participants when conveying claims.

#### *2.5. Limitations*

We acknowledge the multiple limitations of our research. For one, our study took place at one institution, with one improvement initiative targeting select STEM disciplines. Overall, our exploratory study is based on a small sample size, with some claims voiced by only a few and, in some cases, two participants. As well, faculty who agreed to be interviewed may represent a biased sample of faculty engaged in making improvements via some affiliation with a campus-based improvement initiative and, thus, may not fully reflect the larger population of STEM faculty. Disciplinary norms and practices, which may influence faculty practices and perspectives, were also not explored, given the limited sample size. We did not collect observations of actual faculty data practices, including how faculty reacted to data or how types of data linked with any corresponding faculty actions across our sample. Finally, although we discuss our findings in light of past research around the effectiveness of types of data or faculty data-use practices, we did not collect data that would allow comment on effectiveness.

#### **3. Results**

#### *3.1. Types of Instructional Data Faculty Collected*

#### 3.1.1. Summative Data-Use Practices

All faculty indicated that they collected summative data to inform them about their students' learning. Summative instructional data generally included a combination of mid-term and final exams, quizzes, and, to a lesser extent, written assignments. Typically, these assessments were quantitative (e.g., multiple-choice) if class sizes were large and were generally administered two to three times during the term. A majority of faculty perceived summative evaluations as effective measures of student learning and determinants of grades.

Robin (physics) indicated that 70% of their students' grades came from exams, including two exams around "mid-term" and a final course exam. They felt these types of assessments, and the data they generated, gave them the best opportunity to know what an individual student comprehended.

*Let's start by saying roughly 70% of my student grades come from exams. There's two midterms, and a final and those are the best way that I know that student is presenting me the information that they personally know and they're not working with others.*

Alex, a chemistry instructor, described these summative data-use practices as "traditional," utilized in part because of the significant number of students in their classes. They described weekly individual quizzes and mid-term and final exams consisting primarily of multiple-choice questions, out of necessity, although they did include some open-ended questions on the exams.

*We do very traditional assessments in a sense because, in the fall term, we have fourteen hundred students. We have ten weekly quizzes, and those are individual. We have ten small group activities, one per week. We have two midterm exams and a final consisting of a section of multiple-choice, which is out of practicality, and about forty percent of that exam is open-ended, so it's free-response for students.*

#### 3.1.2. Perception of Changes to Summative Data Practices

While acknowledging the need for summative data-use practices, some faculty also signaled their data practices were shifting away from typical exams as the only determinant of student learning toward other means, such as group exams. A few faculty (4) described how relying only on summative data-use practices was problematic, both in terms of student diversity and data quality. Two faculty indicated concern that typical means of gathering summative data (e.g., exams) did not allow all students ample opportunities to show what they had learned and, thus, did not accurately reflect student progress. These faculty also felt typical exams did not provide them with sufficient data to determine students' course grades and sought to minimize the use of summative data-use practices as the primary determinant of these grades.

Tracy, a chemistry instructor, exemplified both of these findings. Tracy described a shift they had made away from exams as the sole determinant of students' grades to a grading structure that allowed over fifty percent of the final grade to be determined by students' work on papers, class presentations, and online work. Some of these activities involved group work; nonetheless, they constituted a significant part of the grade students earned. Tracy discussed how they were deemphasizing formal exams and their desire to engage students more actively in the material. Tracy stated,

*This year was a pretty dramatic change to over fifty percent of the grade, and the assessment was not exam-based. So the students were writing papers, which they got formative feedback on, and they were developing presentations that they gave in class and also published on the website where they also had some feedback and revision steps there. Teaching assistants were assigned to some of those activities, so they hopefully got some fairly frequent feedback. Most people were working in groups rather than individually on some of those assignments, sharing their results, presenting them, and all of that was pretty high stakes because the total for those activities like I said, was over half the grade. So we deemphasized formal exams, there were midterms and final exams, but they were lower stakes. That came out of both the desire to get students more actively engaged in the material. On the assessment end, I think we've recognized, I've seen over many years, that exams are great for many students, but I think they don't measure all student activity and success and learning.*

Tracy perceived that adding different student assessments to their teaching repertoire resulted in more traditional ones feeling like "lower stakes" for students.

#### 3.1.3. Formative Data-Use Practices

Nine faculty (47%) described collecting formative instructional data, which they felt gave them immediate information about student learning. These included qualitative forms of data, either curricular artifacts or verbal information gathered from students. Faculty described these data-collection activities as "submission sheets," "exit points," "muddiest points," or the "Tuesday problem." These activities required students to evaluate or respond to a question or statement related to course content and resulted in data artifacts that faculty stated informed them about their students' learning and interests. Faculty often described these activities as "low stakes" for students, presumably in comparison to more typical assessments of their learning (e.g., formal exams).

Drew, a mathematics instructor, described using "exit cards" at the end of class to assess students' level of understanding or confusion. They allowed students to submit anonymously. They described this activity as giving them a quick opportunity to see what students were learning and what might need to be addressed again. They described it this way, indicating some benefit of the anonymous, low-stakes (not graded) nature of the activity for students.

*In terms of formative assessment, I have used things like exit cards, where at the end of a class, I just have students [write down a question or comment]. There's no grade at all attached to this. It's just for me to get some sense of what did you [student] think was the most significant thing you learned today, what was the muddiest point. These kinds of quick questions that people jot down on a card and can even be anonymous, and then a* *quick look through all of that gives me a sense of, 'oh wow,' I really missed the boat here. I need to re-address that topic again.*

Lee, an engineering instructor, talked about using "muddiest point" activities that gave him information about topics that students wanted to know more about. Lee also perceived these formative assessments as providing him data about students' understanding of the concepts via a low-stress activity. Lee stated that the activity allowed insight into

*...whether they want more coverage on a specific subject. In terms of reflecting on their own performance, I certainly think that when you make an assignment like full credit for participation or sort of the check if they are there working and engaged, it also sends a message to them about how they engage in the material. Both of those were sort of meant to reward them for being there and engaging, but not making it so high stakes, so it wasn't supposed to be a stress out sort of thing.*

Lee also felt giving credit for these kinds of assignments sent a message to students about the importance of attendance and being engaged in the assignments. Like Drew, Lee discussed this being lower stakes for the students.

Lynn, another engineering instructor, described using an activity they called "the Tuesday problem." This activity gave students a problem to work on during a mid-class break. After the break, Lynn walked around to observe and help students who appeared to need extra attention. Like Drew and Lee, Lynn indicated this activity gave them an informal way to engage with students, resulting in a less stressful assessment activity for students. Lynn described it this way,

*The Tuesday problem or something like this, where I take a break in a two-hour lecture, a ten-minute break, and I put up a problem, and I say you guys are welcome to solve it or not, but when we get back from the ten minutes I'll solve it, and then we'll talk about it, and you put it up, and you walk around, and you see if people are trying and you kind of help them, or you give them pointers on what direction to go. So there is a way to create way more informal engagement by doing things that way because there's very little stress because it doesn't count for any points, really.*

As Lynn also alluded to, faculty also detailed gathering verbal forms of data (i.e., talking with others) to provide them with information about their students' learning. Six faculty (32%) spoke of collecting instructional data through verbal interactions with students, usually informally, to gather information.

Jordan, a chemistry instructor, spoke about providing an open environment for students to discuss challenges and fears. They described how talking and interacting with students allowed them to "have a discussion about the growing pains of going through science education." Sidney (chemistry) said that informal interactions with students gave them some of the most valuable instructional data they collected. They described how most of their assessments, depending on the course, were done through these informal, information-gathering interactions with students to determine whether students liked the course or what parts they did not like.

*Most of my assessment, and I think this is true for most people, comes from informal interactions. Of course, it depends on the course, but oftentimes I informally really try to just talk to students as much as I can and see how things are going. I often say, 'Hey, what do you like about the course? What don't you like?'—again, about as informal as could be, but I sometimes find those are most valuable.*

Another faculty, Bailey (engineering), similarly talked about informal interactions with students to gather information about their courses. They indicated using office hours and visits from students to ask questions about how students were feeling about the content being taught. They even probed whether students understood a particular concept the instructor was trying to convey.

*Then students in office hours, if they seem willing, I'll often ask how do you feel about this content area, or even more specific things like I tried to tell you this, did you notice that in class, or do I need to do that differently.*

Another faculty, Peyton (biology), talked about meeting with students frequently to assess what students were saying to understand where they were and where the instructor thought they ought to be. Peyton said,

*I meet with my students a lot, so I hear what they're saying, and I use that to inform where they're at and where I think they need to be.*

Two faculty described how talking with their teaching or learning assistants provided them insight into student learning. In this excerpt, Robin (physics) described a learning assistant program they developed that gave them another way to collect formative data on students. During weekly meetings, the learning assistants provided feedback to the instructor about students, including what they perceived to be working or not working. Robin said,

*Qualitatively, I'm talking to my students constantly. I've developed a learning assistant program, so I have ten learning assistants, and they're constantly giving me feedback about what's working, what's not working, helping me try to guide the students. And I have seven T.A.'s [teaching assistants] at any given moment, and we also have meetings every week.*

Looking across these data, two main goals were apparent: faculty collected instructional data to assign student grades and to assess learning. Some faculty recognized that certain summative data, from exams, did not adequately allow for meaningful assessment and grading of students, thus motivating them to also collect other types of instruction-related data. Some faculty went so far as to comment that data around student learning allowed them to plan for future teaching (i.e., reteaching or teaching in a way that students most appreciated) to secure better student understanding or success. Faculty linked timely impact on teaching practices to formative forms of data.

#### *3.2. Affordances That Influenced Instructional Data-Use Practices*

#### 3.2.1. Faculty and Organizational Student Assessment Norms

We also found several affordances that influenced the instructional data-use practices of faculty. Not surprisingly, faculty often pointed to data types and collection means that were norms for them as practitioners and within their larger organizations. For example, more traditional types of student learning assessments (i.e., exams, written assignments) remained privileged by faculty and their departments; thus, faculty kept using them and claimed to be informed of students' progress via them. Nevertheless, many faculty also claimed a desire for other data to inform them in ways that more traditional assessments did not. Narrative and verbal types of data, for some faculty, provided novel information around students' understanding and more timely feedback for more immediate response to students, via activities like "submission sheets," "exit points," "muddiest points," or the "Tuesday problem" exercise. Generally, faculty relied on "tried and true" methods, like course exams, to gather summative, often quantitative data to inform their teaching. When they sought different or more complete information, or to help students feel more comfortable and relaxed in providing information around their learning, some faculty gathered and utilized other data, namely formative and largely qualitative.

Learning and teaching assistants were another affordance that two faculty used to gather data on students' performance. As described by one faculty, the development of a learning assistant program afforded them feedback on what was working or not related to students' learning.

#### 3.2.2. Instructional Technologies

Eleven faculty (58%) reported that instructional technologies afforded them opportunities to collect and analyze data to inform their teaching. These included audience response systems (i.e., clickers), online platforms for homework or other course materials, and adaptive learning technologies. Overall, faculty described these technologies in contrast to times when they did not have them or had to rely on more traditional, lower-tech means of gathering data; the instructional technologies provided more timely and complete data that faculty could respond to more immediately in practice.

Several faculty used clickers, which they claimed provided them in-the-moment snapshots of students' levels of understanding and the opportunity to correct student (mis)comprehensions. Peyton, a biology instructor, noted that if the class scored below 85% on the clicker question, they knew they needed to add an immediate class discussion, asking students to explain why they chose their answers. Peyton explained,

*Whenever we have a clicker response that's less than 85%, we'll spend time talking about why the right answer is right, why the wrong answer is wrong, and I'm always soliciting their voices for that. I've moved away from me explaining to getting them to explain and then affirming.*

Several faculty discussed the formative nature of the data they were collecting. In this example, Peyton (biology) again described doing formative assessment using clickers, alongside having students complete daily group assessments that they could then read after class to inform subsequent teaching practice. Peyton said,

*I do formative assessment in my classes through clickers, but I also have daily what I call submission sheets, so the groups work together to answer a couple of concept type questions, and they turn those into me, and I read those each day.*

Several faculty talked about having students complete homework and pre-class quizzes on online platforms, which they claimed afforded them a quick, formative assessment of student comprehension, allowing them more immediate adjustment to their instructional practices. For example, Sidney, a chemistry instructor, noted that students took a pre-class quiz after first viewing a video in an online course environment. Before the class session, Sidney would meet with colleagues who were also teaching the class to discuss understanding as reflected by the quizzes across the complete array of students in the course. Sidney noted they had built flexibility into their courses to adjust their instruction to accommodate any changes that such data indicated the need for. Sidney stated this process was an improvement over trying to ascertain student understanding from the few students they could check in with during class.

*Students are supposed to take a pre-class quiz, but the catch is that in order to be able to access it, they have to have first viewed the video. So ideally, it sort of forces them to watch the video and then take the quiz. The nice thing was that I and the colleague I taught with, we would have a discussion before class every time of, 'Hey, what questions on the quiz were they really getting? Which questions didn't they get?'* . . . *we were flexible enough that we could go into class that day and say, 'Hey, you know we realized we should spend a little more time on this.' That was a huge change that we hadn't [done previously]—we might have gotten a feel for it kind of walking around talking to students, but there we had very nice concrete data to inform what we would do and enough flexibility built in that we could say, 'Hey, today we're going to spend some more time going through Topic A quickly because most of you seem to be fine with that and spend more time on Topic B.*

Two faculty described adaptive learning technologies as affording them more immediately actionable data around their students' learning. Alex (chemistry) described adaptive technology as "a real eye-opener" related to their ability to respond and change their curriculum within days.

*Adaptive learning has certainly been a refinement that I made because I went from an adaptive learning model where changes were being made to the curriculum based on student understanding, perhaps term by term, and now I've shortened that gap where feedback is immediate, evaluation is immediate, and then changes could be made for the very next assignment, which would be the next meeting. So I think that has been a real eye-opener in refining the response time, in that a change to the curriculum is not occurring the next term it's occurring within the term, and, as a matter of fact, up to within two days.*

Jodi (biology) found the adaptive learning technologies helpful in understanding trends over time and particularly useful in understanding how underrepresented and first-generation students were doing and where they might need more support.

*I think the predictive analytics things are useful if it's things like underrepresented minorities, first-generation college students, information like that, like more of the demographics of who my students are to figure out if there are pockets of the population that aren't doing really well in the class.*

#### *3.3. Impediments to Instructional Data-Use Practices*

Faculty perceived several impediments to their instructional data-use practices. Constraints included (a) a perceived lack of time needed to implement instructional data-use practices, (b) standardization of course content, and (c) perceived lack of confidence and competence in instructional data-use practices.

#### 3.3.1. Perceived Lack of Time to Engage in Instructional Data-Use Practices

Six faculty described how some instructional data-use practices took a great deal of time to implement and were therefore difficult to utilize effectively. Casey, a chemistry instructor, said that time constraints hindered their ability to try new or innovative practices, and they viewed this as a problem with implementing new practices.

*Time constraints definitely hinder it [data-use practices], and they hinder actually doing anything innovative. That's actually a huge problem.*

Madison, a mathematics instructor, saw instructional data-use practices as necessary, but those practices were often pushed to the side by other instructional activities. They said, "Important things [like collecting instructional data] go to the back burner when the rubber hits the road, even when you know they're important." One faculty talked about the difficulty of fitting the instructional data-use practices into their curriculum. They wanted to do more and thought it was necessary but felt constrained by established practices.

*That's definitely something I've wanted to do more of, just the issue of where do you fit that into the curriculum, but I think that's important, and I wish I was doing more.*

Two faculty stated their instructional data-use practices were constrained by a lack of time, as they often taught large classes. Casey, chemistry, elaborated that large class size can equate with a lack of time to implement instructional data-use practices, at least ones they felt were significant. They described having students complete short writing assignments when students were struggling. Although they acknowledged that students gained from this experience, they also acknowledged that this type of practice was difficult to use to measure student learning with hundreds of students. Casey said,

*A lot of times, I'll do a short writing assignment, especially if I think they're struggling with a concept, I'll have them write about it. But there's hundreds of them, so it's difficult to get a lot out of that, although the students get a lot out of it.*

Madison, mathematics, felt constrained by class sizes, by the number of classes that instructors taught per quarter, and by other responsibilities, all of which influenced data-use practices, such as making exams predominantly multiple-choice.

*Because of the class sizes and some people are teaching, some of the instructors are teaching four courses per quarter. I have another portion of my job is managing the math learning center, so I usually only teach three, but when they have that size classes, a large portion of the exam needs to be multiple choice*.

#### 3.3.2. Constraints Due to the Standardization of Course Content

Several faculty (5) perceived constraints in using different kinds of instructional data-use practices due to the "standardization" of course content taught by various faculty, usually involving requirements for standardized exams and grading policies. In some cases, these constraints were tied to whether the course was a sizable lower-division course taught by several faculty simultaneously at the same institution. In other cases, faculty described being constrained in their practices if the course was also taught at a community college or as a dual credit option in local high schools. While all faculty indicated they had autonomy concerning *how* they taught, several indicated they were constrained by requirements related to student assessments of learning. Kelly, mathematics, said, "Most of the assessment I do is out of my control. Sixty-five to seventy-five percent of the grade for the courses I teach have to come from two midterms and a final." Notably, such realities were most often detailed by mathematics faculty. Still, Kelly talked about moving towards more group exams, a novel practice, and described the reluctance (i.e., constraint) within their department to adopt this practice.

*I'm slowly trying to have conversations with the powers that be in the department to be adjustable [with doing exams] so maybe we'll do group exams, or maybe we'll try some other things other than just those very traditional midterm and final exam structures.*

#### 3.3.3. Perceived Lack of Confidence and Competence in Instructional Data-Use Practices

Several faculty (4) stated a lack of confidence and competence in collecting and using instructional data to inform teaching practices. Bailey, an engineering instructor, described their instructional data-use practices as "terrible" in relation to more effective practices that they knew existed but did not know enough to implement.

*Yeah, they're [instructional data-use practices] terrible. I know enough to reject a lot of common practices, but not enough to replace them with better alternatives. So I am really struggling with that right now. It's not formulated at all.*

Leslie (mathematics) also perceived instructional data-use practices as the weakest part of their teaching. They understood the importance of assessing student learning and formative practices specifically but struggled to respond to data in their teaching practice directly.

*I would say that's probably the weakest part of my teaching practice. I'm not really formal about incorporating results of assessment into teaching, which sounds pretty bad. Yeah. Formative assessment, I read the literature, I drink the kool-aid, but that is the thing I drop the most in terms of my teaching practice. What I do is so informal. I don't know if I can even describe it.*

Another faculty was uncomfortable with their colleagues finding out about their instructional data-use practices. They anticipated pushback, partly because they were already seen as an outsider in their field due to their gender. They did not want to advertise what they did differently because of the possible repercussions they would experience.

*You know, gender-wise, honestly, I've come into this profession, and I've been an outsider. I'm not going to take something I do that's different than what other faculty members do and advertise it. I may be very successful at it, but if I advertise it, there will be repercussions.*

#### *3.4. Engaging Students in Reflecting on Their Own Learning Data*

The overall sample was mixed in their practice of engaging students in reflecting on their own learning data. A slight majority of faculty (n = 10, 53%) indicated that they did not implement any instructional practices for students to engage with or reflect on data around their learning. Of these, five stated they did use formative data to inform their teaching practices; however, they did not explicitly use these data to engage students in reflecting on their learning. Lynn, engineering, described providing students with exam scores, a bell curve showing averages, and the range of grades, but no other data. When asked if students had an opportunity to reflect on their learning, they said,

*Oh, no. Not aside for their own grades. They see averages and things like that. I guess that's really professor-dependent, but for me, whenever I go over the exam, I always put out the bell curve and say this is the average, this is the standard deviation, this is the range of grades.*

Two faculty mentioned the institution's end-of-term student evaluation of teaching survey as a means of students reflecting on their learning. One faculty stated they added a question to the survey related to whether students understood a particular concept covered in the course. Tracy, a chemistry instructor, said they would advocate for other faculty to adopt this kind of practice.

*That's the one where we add the question on their electronic evaluation of teaching, so in the electronic evaluation of teaching for the students, there's a series of standard questions, I think there's ten, how was the course basically, what was the instructor's contribution to the course and there's a few others, and then I add, and I advocate for all other faculty to do this as well, you add at the end of this course [a specific question related to the content].*

Still, around half of the faculty (n = 9, 47%) indicated they implemented instructional practices that engaged students in reflecting on their learning data. These practices were generally described as activities in which students were asked to reflect on their overall performance in class and on how their use of study techniques may (or may not) be helping them. Jordan, a chemistry instructor, stated, "I really try to actively engage them in utilizing critical self-review and then coming together with others once that review has taken place to gather the information necessary to move forward."

Alex (chemistry) described posing to students open-ended questions on worksheets, asking them to describe whether they understood the material and how comfortable they were with their learning. Alex perceived this type of student reflection as allowing students more autonomy with their learning.

*[in the student's voice] "I understood this material, I feel comfortable with this material," and then they [students] produce a little bit of evidence and they will say things like "I am completely lost on buffer systems, I have no idea what is happening in a buffer system. I don't even know what a buffer system is." I think that's part of the empowerment [of students]. I think that's part of their confidence in that this seems to be very meta. So, students are plugged into their empowerment and their own understanding. They're not looking at it as how I did on an exam. They're looking at it as, I think I get this, I'm supposed to be learning these key concepts.*

Drew, a mathematics instructor, had students keep a journal of their progress in understanding course concepts and any related difficulties. Students received credit for their reflections.

*In some courses, I've gone as far as actually having students keep a journal of what they struggled with that they actually turn in with the homework. So there's actually some "credit" awarded for going through that exercise. But I think the bigger value of that is getting the students themselves to reflect on their own learning.*

Jodi, a biology instructor, had students do "real-time" writing in class in response to a question. Students would write on a notecard and hand it in. While this served as formative feedback for Jodi around their practice, they also saw it as an opportunity for students to reflect on their learning data.

*When I'm teaching in the classroom, I also have them do some real-time writing. I think writing is a really good way to start to help them see what they don't understand. So I have them do an individual note card where they write down an answer to a prompt, and then I have someone else, not me, read them, because there are seven hundred of them, and give me some summaries, and then I go back over that with them in the class as sort of a way to see if their thinking is right or what is a good response to these things versus what's not a good response to these things. So those are kind of the way I think that they get to reflect on what they're learning.*

Lee, engineering, used a formative instructional data-use practice, "the muddiest point type thing," to have students reflect on their level of understanding.

*I often will do the muddiest point type thing, which has them reflect not so much on performance but on their level of understanding.*

Two faculty described using peer or small group activities to afford students opportunities to reflect on their learning. Casey, chemistry, had students complete short writing assignments, especially if they struggled with a concept. They described collecting the writing but also giving students time to share in groups with assigned friends.

*Sometimes they give them to me. They are in groups, they have assigned friends in my class, so they do share among their group members also.*

Kelly, a mathematics instructor, had students engage in self-reflective activities during a "recitation class," where students from a large-enrollment main class meet in smaller groups to work. Students discussed their homework assignments and compared answers to those given by the instructor.

*It [student reflection on their own learning data] gets facilitated in smaller groups, in like a recitation situation. So normally my lecture would have a hundred people and then one day a week there's four different classes of twenty-five. When [students] get their written homework back with some sort of marks on it, they're encouraged to look over that and discuss the solutions that have been provided by me. They're asked to compare and contrast between what their answer looks like, what the solution organization looks like. It's the logical thought process of putting things together that I want them to focus on. So it's sort of done in small groups, face-to-face discussions.*

#### **4. Discussion**

#### *4.1. Instructional Data-Use Practices and Motivations*

In this paper, we report on a study of STEM faculty members' instructional data-use practices, including the types of instructional data that faculty collected. In general, faculty claimed to use multiple instructional data-use practices to inform their teaching practices and their students' learning. We found that all faculty described mostly gathering data via summative assessments, such as mid-term and final exams and weekly quizzes. While a few faculty also discussed written assignments they gathered, most data coming from summative assessments were quantitative in nature. Faculty indicated they used these assessments most often in classes they described as large-enrollment and generally administered them two or three times during the term. Most faculty perceived summative assessments as practical in providing measurements of student learning and determining grades. Several faculty indicated these were the only means for individual students to demonstrate what they had learned, as opposed to other types of more formative assessments or group work.

However, a few faculty acknowledged that summative evaluations did not always reflect student progress. For these faculty, their perceptions of the value (i.e., importance) and function (i.e., purpose) of summative assessments had evolved to privilege more formative assessments. Several of these faculty indicated their concern that diverse students did not have ample opportunities to demonstrate their learning via more traditional and common (i.e., summative and quantitative in type) assessments, in essence questioning the quality of the data to inform their teaching. These faculty's practices minimized the impact of more common and traditional assessments on students' overall grades in the course. A small number of faculty were considering experimenting with students working in small groups and giving presentations instead of formal exams, or having students take exams together, practices that faculty perceived would need further discussion with department colleagues or leadership before implementing.

Roughly half of the faculty we interviewed also gathered formative data that they perceived gave them more immediate ways to assess student learning. Much of this data was qualitative in nature, collected via course artifacts or verbal exchanges. Artifacts such as "submission sheets" and "muddiest points" asked students to evaluate or respond to a prompt or question about the course content. The purpose of these novel strategies was to intentionally provide a more collaborative and inclusive process for students to demonstrate their learning, while still offering faculty insight into students' learning and interests. Faculty also described these as less stressful ("low stakes") for students when compared to formal exams.

Some of these findings indicated an awareness on the part of faculty that not all students bring the same background and experience to STEM coursework, which can make more traditional and common assessments less meaningful for faculty as educators. These faculty, thus, knew to implement other assessments that would allow a wider array of students to demonstrate the learning and progress faculty expected of them. Faculty use of activities that provide students with "low stakes" options seemed especially promising in promoting students' success across the diversity of learners in their courses. Through their implementation of these less stressful assessments, faculty also provided students means to ask for more review or coverage of particular topics. As they did this, faculty may have been engaging students in determining the pace and depth of content coverage, all the while sending a message to students about their role and agency in their learning. These faculty practices are promising per research that demonstrates a recognition of student achievement and faculty responsibility in creating more equitable learning environments [40,41].

Some of the formative data that faculty collected were verbal in form, described as some of the most informative data faculty collected about how students were doing in their courses. These included holding open discussions in class, talking to students during office hours, and talking with learning and teaching assistants about student progress. These findings are also promising, as faculty interactions with students are essential in building rapport and positive relationships with students [42] and can ultimately enhance student success [43]. Many of these interactions resulted in verbal data that were described as informal, such as asking students questions during office hours or talking with them one-on-one during class or as groups in work sessions. While we did not explore with faculty the proportion of students they connected with via these informal interactions, we assume that these interactions may not have involved all, or even most, students. There may, in fact, be reason for concern that some students find these interactions with faculty intimidating and are unwilling to share areas where they may be struggling due to their perceptions of potential faculty reactions. Some students, additionally, may feel unable to initiate such interactions (as during office hour visits). This reality may disadvantage students who are not comfortable, agentic enough, or able to make time to talk with their instructors [43,44], including those struggling as well as those succeeding, both groups that can provide faculty important insights into their teaching. A few faculty described using their learning and teaching assistants' assessments of student comfort and understanding of the material as indicators of student progress. This strategy provided indirect data gathering from students that may counteract our last concern, allowing faculty insight from students who do not feel comfortable talking to them directly. Faculty might consider fostering more of these interactions between their students and others who students may see more as peers, to collect meaningful and timely, learning-related data to inform their teaching.

Additionally, we need to consider the potential limitations of the data that faculty reported collecting, including data that may seem more anecdotal. Andrews and Lemons [17] found that postsecondary biology faculty primarily relied on personal perceptions and experiences rather than empirical evidence to inform their teaching practice. They found that using data to convince faculty to change was ineffective. Indeed, this may be one reason that faculty may not be willing to change comfortable, already established practices, especially if departmental climates further stifle the implementation of novel practices. Professional developers and education leaders may need to help faculty recognize this limitation and find a "sweet spot" between the comfort, ease, and meaning faculty feel (for themselves and students) around verbal, informal formative data and data that are also valid and reliable. Faculty in our study indicated a concern for the meaningfulness of more common, traditional assessments, such as multiple-choice exams, that may not demonstrate student learning. Overall, our study points to how we must help faculty evaluate, modify, and create instructional data-use systems that sit at the intersection of being reliable and valid, as well as meaningful and usable. These systems must also be practical, timely, and engaging for a majority of diverse students.

#### *4.2. Impediments to (Meaningful) Faculty Instructional Data-Use Practices*

Many of the summative and formative data types we heard about were consistent with Bouwma-Gearhart's [4] and Hora et al.'s [5] research with postsecondary STEM faculty. These researchers, too, found a diverse repertoire of practices that faculty used to measure student learning to inform their teaching. They, too, found that most faculty relied predominantly on summative and largely quantitative forms of data to inform their teaching, while some faculty also relied heavily on formative and qualitative forms. Also consistent with the findings of Bouwma-Gearhart, Hora, and colleagues were the constraints that faculty identified as limiting their instructional data-use practices. These impediments included a perceived lack of time needed to implement instructional data-use practices, standardization of course content that restricted the types of practices used, and departmental structures that determined some types of practices. Like Bouwma-Gearhart, Hora, and colleagues' research, ours suggested that many STEM faculty may generally not feel prepared or empowered to effectively utilize diverse forms of instructional data to inform their teaching practice. Survey data that helped to motivate this study (detailed in the methodology section of this paper) pointed to a general lack of faculty confidence and competence in collecting and using instructional data. Our study confirms this, mainly around qualitative and formative types of data.

Our faculty participants noted several nuances around the impediments to their instructional data-use practices. Specifically, time constraints revolved around certain types of formative assessments (e.g., journaling, group exercises), especially in large classes. Although faculty acknowledged that these formative assessments were valuable and even necessary, they were not always implemented. In part, faculty felt pressed to cover content, which suggests a perception that certain types of instructional data-use practices (e.g., formative) were somehow outside regular teaching norms and therefore needed to be added or fit into the class.

Notable too were mathematics instructors' laments about the "standardization" of course content. This usually indicated departmental requirements for common exams and grading policies; in some cases, it indicated large introductory courses taught by multiple faculty. These departmental structures suggested that faculty were not empowered to gather, analyze, and use instructional data to meaningfully inform their specific teaching-related questions and practices or to help students reflect on their learning achievements, a reality hinted at elsewhere in the literature (e.g., [5,23]).

Those faculty stating a lack of confidence and competence in collecting and using instructional data admitted it was the weakest part of their teaching. Faculty understood the importance of instructional data-use practices for assessing student learning but struggled with responding to data in their teaching practices. One faculty member, additionally, was uncomfortable with colleagues finding out what they did to assess student learning. They perceived pushback from colleagues, based partially on their perception of being seen as an outsider in the field because of their gender.

Attending to faculty comfort and norms with instructional data-use practices is one step in promoting faculty engagement in this research-confirmed practice to improve teaching. Interestingly, in comparison to Bouwma-Gearhart, Hora, and colleagues' study, faculty in our study were all involved in teaching improvement initiative activities, and most had been involved in multiple such initiatives in the past. Arguably, we might assume these STEM faculty are the most confident and competent (when compared to peers not engaging in such initiatives) in gathering, analyzing, and responding to data. Limited research points to STEM faculty members' participation in teaching professional development opportunities as predicted by their previous participation, as long as past opportunities were meaningful; in essence, those involved in improvement initiatives should be some of the most aware and practiced in implementing research-based teaching practices [45–47]. If this rationale is correct, and arguably we need more research to confirm it, it points to multiple implications, namely that (1) many (most?) initiatives still do not focus on instructional data-use practices, and/or (2) such initiatives are not especially fruitful in expanding faculty practice around instructional data use. Either reality presents implications for department leaders and professional development experts who aspire to support faculty development towards improvements to their teaching and, ultimately, student success.

#### *4.3. Supports of Faculty Data-Use Practices*

Faculty claimed multiple affordances for the instructional data-use practices that informed their teaching-related decisions. Most faculty pointed to data that provided insight into their teaching via data types and collection means that were norms for them as practitioners within their larger departments and organizations. Most faculty and departments privileged traditional types of student learning data, such as exams and written assignments. Nevertheless, faculty also claimed they used data beyond traditional assessments to inform them of students' learning. Several faculty claimed narrative and verbal forms, as discussed earlier, provided them novel and timely data to assess student learning and allowed them opportunities to interact and respond to students. However, most faculty still relied primarily on traditional methods, like exams, to gather quantitative data to inform their teaching. Faculty did seek different and more nuanced information, largely qualitative and formative, and several faculty tried to help students feel more comfortable and relaxed in demonstrating what they had learned. Learning and teaching assistants were another affordance that a few faculty used to gather data on students' achievements. Those faculty described developing a learning assistant program made up of former students who provided feedback on what was or was not working in the courses and on students' understanding of concepts and content.

We found instructional technology afforded most faculty multiple instructional data-use practices by providing opportunities to collect and use data in different ways, such as via clickers, online homework, and adaptive learning tools. Faculty who incorporated instructional technology tools in collecting student learning data were more likely to collect formative types of data. Having relatively easy access to large amounts of formative data encouraged faculty to collect this type of data more often and to make more immediate, or next-class-meeting, changes to what content they would teach and what teaching strategies they would use. Instructional technology eased the burden of data collection, especially in large classes. In some cases, this allowed instructors to share student learning data with their students more readily. Faculty also used instructional technologies to assess trends in different groups of students, suggesting that some faculty intentionally recognize and attend to achievement gaps. In short, faculty who used instructional technology to collect data, largely formative, indicated they modified their instructional practices more often and more quickly than those who collected data in other ways. Furthermore, they were more likely to engage students in reflecting on their learning. We think this finding is important because it adds to our understanding of how faculty perceive the instructional technology often, though not always, at their disposal, technologies that are often pointed to as enabling educators' data-based decision-making. Indeed, the faculty we spoke to claimed instructional technologies afforded them the collection of different kinds of data and the ability to assess trends in student learning in a timely manner. Our research somewhat confirms the exploratory research of Hora and Holden [23] on the role of instructional technology in STEM faculty's practices and backs their assertion that understandings of faculty practice are needed to design more locally tuned interventions, and that faculty must see the technology as salient to their practice. Based on our more nuanced findings regarding faculty's use of instructional technology, we recommend that those responsible for adopting and implementing instructional technology consider that faculty use it for multiple reasons and in multiple ways. Thus, including faculty in the decision-making process can result in more faculty adopting and using it.

#### *4.4. Engaging Students in Reflecting on Their Learning*

Our findings suggest that faculty were mixed in their practice of engaging students in reflecting on their learning data. Those faculty who did indicate they implemented these practices described activities where students reflected on their overall performance in class, such as open-ended questions added to daily in-class or homework assignments. Some faculty asked students, through journaling activities, to discuss their use of study techniques that may or may not be helping them. Again, these findings are promising: other research confirms that structured and unstructured activities, such as journaling and open-ended questions that ask students to write about what they are learning or having difficulty with, are effective strategies for promoting students' reflection on, and understanding of, their learning [21,30].

Still, a majority of faculty indicated they were not intentionally engaging their students in reflecting on their learning. Faculty described doing this least with students in their lower-division courses, primarily based on the perception that these students lacked the ability to reflect as meaningfully as upper-division students could. These claims seem to suggest faculty assumptions that students must achieve a certain level of cognitive development and/or understanding in a discipline before they can engage in reflecting on their learning. This may also suggest a faculty perception that students acquire reflection skills through means other than activities directed by their course faculty. Such assumptions may very well be unfounded [29]. Regardless, such thinking shifts the responsibility for student learning entirely to students' shoulders and suggests that faculty bear little responsibility for helping students be reflective learners. We assert that faculty need to provide opportunities for students to reflect on their learning in order for students to understand the particular aspects of their work they must attend to in order to be more successful. Professional developers and leaders can help by providing examples of practices that can be effectively incorporated into classroom activities and that engage students in meaningful reflection on their learning.

#### *4.5. Further Recommendations*

Throughout our above detailing of findings concerning STEM faculty instructional data-use practices, we have noted some recommendations for faculty leaders, professional development experts, and faculty themselves for designing initiatives that support faculty practices that improve teaching and student learning. Largely, these recommendations serve what we see as the main goal of our study: to build from what we know to be working, around important faculty realities and needs, towards improving the frequency and efficacy of instructional data-use practices, including faculty engaging students in more meaningful reflections on their learning. Here we detail recommendations by main theme and stakeholder group.


Understanding faculty perceptions about instructional data-use practices can further support professional development activities that help faculty understand their use of, and beliefs about, effective instructional data-use practices. We know that faculty perceptions and practices are based on their previous experience, perceptions, attitudes, and practices [48,49]. Any targeted professional development strategies that better support faculty must account for these realities. Indeed, faculty are much more likely to feel competent in using instructional data when they have a say in their experiences with the instructional tools and the teaching practices that effectively incorporate instructional data into daily processes [23]. Professional development activities must invest faculty in data-driven decision-making processes that make sense to them.

#### *4.6. Future Research*

Faculty instructional data-use practices, and the perceptions and realities that root them, can inform future research and interventions towards postsecondary STEM education improvements, ultimately towards enhanced success for diverse student populations. Our study is one step in this direction, yet there is still much to discover. Future research could explore more nuanced faculty perceptions and practices across disciplines, including across STEM. Indeed, STEM is not a monolith [50], and we need to explore differences that may exist for faculty and their organizations that may rely on different ways of knowing, cultures, and structures [51]. For instance, are there instructional data-use practices that faculty find particularly effective in certain disciplines? How can departments and institutions support effective instructional data-use practices and still maintain requirements mandated across their specific stakeholders (e.g., faculty and students, accreditation bodies, industry)?

More research is also needed that further explores the impact of instructional technologies on actual faculty practice. For instance, how do faculty incorporate these tools, and what intervention strategies are most helpful in supporting their actual use? What technologies generate the most meaningful and efficient data for faculty to inform their teaching? How can department leaders and professional development experts support faculty in engaging students in reflecting on their learning? What professional development activities increase faculty confidence and competence and actual use of instructional data to inform practice? What data provide the most reliable and valid read for faculty across numerous problems of practice? Future research could also explore the most effective and efficient ways that faculty can engage students in reflecting on their learning. What else inhibits faculty from engaging students in reflecting on their learning, given the benefits for students? What data are most effective in providing students accurate and discipline-aligned insight into their learning? What practices best inspire students to accept their agency in assessing and ensuring their learning across STEM disciplines?

### **5. Conclusions**

Faculty and student interactions in learning environments are complex. Given that faculty practice is a typical target of postsecondary education improvement interventions, we especially need to understand how and why STEM faculty gather, analyze, and respond to instructional data. This study adds to the limited research that examines STEM faculty's instructional data-use practices. As research has confirmed, innovation and changes to instructional practices can be slow and challenging. However, we contend that this study indicates some research-confirmed instructional data-use practices of STEM faculty that inform their teaching. STEM faculty now, more than ever, may be attempting more effective and inclusive strategies to assess their students' learning, reflecting attention to diverse student populations that have not traditionally experienced sustained success in STEM fields. Faculty are recognizing and incorporating more formative types of data and rethinking how they use summative data to determine student learning and grades. Faculty may be incorporating more instructional technology that provides them with more strategic ways to collect student learning data and to respond in real time while teaching. We suggest that departmental leaders, administrators, and professional development experts are critical in the continued support of faculty in their development of effective data-use practices that make sense per their and their students' realities. We see a need for more research that explores these realities to strengthen and expand support efforts.

**Author Contributions:** Conceptualization, C.L. and J.B.-G.; methodology, C.L.; formal analysis, C.L.; investigation, C.L.; resources, C.L.; data curation, C.L.; writing—original draft preparation, C.L.; writing—review and editing, J.B.-G.; project administration, J.B.-G.; funding acquisition, J.B.-G. All authors have read and agreed to the published version of the manuscript.

**Funding:** This work is based upon work supported, in part, by the National Science Foundation under Grant #1347817. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

**Institutional Review Board Statement:** The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board of Oregon State University (protocol code IRB-2019-0049 and date of approval, 27 March 2019).

**Informed Consent Statement:** Informed consent was obtained from all subjects involved in the study.

**Data Availability Statement:** The data presented in this study are available on request from the corresponding author. The data are not publicly available due to research subject confidentiality concerns.

**Conflicts of Interest:** The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

### **Appendix A**

Interview Protocol: Faculty Interview Questions

2. Do you interact regularly with any others concerning issues of teaching and learning?
	- i. Who? Are these people in your discipline/department/program?
	- ii. How often?
	- iii. Regarding what, specifically?

3. I'd like to hear about your engagement with the (Name of Initiative) project. Specifically:


4. Please describe any evolution in your teaching practices over the last couple of years that you can attribute to improvement initiatives or professional development activities. (If not mentioned, probe for specifics via questions a and b).

5. I'd like to hear about your assessment practices while teaching.

6. Describe a successful student in the courses or programs in which you teach.


7. A goal of the (Name of Initiative) project is widespread improvement to teaching practices and learning outcomes in undergraduate STEM education across (Name of University). Our general strategy is promoting educators' learning about evidence-based instructional practices via interactions with other educators.


8. A specific goal of the (Name of Initiative) project was to promote active learning and cooperative learning, especially in large, introductory, gateway courses. We define active learning and cooperative learning as X (definitions provided to the interviewee on a handout).

a. What do you think about this goal and strategy?




### **References**

4. Bouwma-Gearhart, J. *Bridging the Disconnect between How We Do and Teach Science: Cultivating a Scientific Mindset to Teach in an Era of Data-Driven Education*; IAP—Information Age Publishing Inc.: Charlotte, NC, USA, 2021.
5. Hora, M.; Bouwma-Gearhart, J.; Park, H. Data driven decision-making in the era of accountability: Fostering faculty data cultures for learning. *Rev. High. Educ.* **2017**, *40*, 391–426. [CrossRef]
6. McClenney, K.M.; McClenney, B.N.; Peterson, G.F. A culture of evidence: What is it? Do we have one? *Plan. High. Educ.* **2007**, *35*, 26–33.
7. Bouwma-Gearhart, J.; Ivanovitch, J.; Aster, E.; Bouwma, A. Exploring postsecondary biology educators' planning for teaching to advance meaningful education improvement initiatives. *CBE Life Sci. Educ.* **2018**, *17*, ar37. [CrossRef]
8. Bouwma-Gearhart, J.; Hora, M. Supporting faculty in the era of accountability: How postsecondary leaders can facilitate the meaningful use of instructional data for continuous improvement. *J. High. Educ. Manag.* **2016**, *31*, 44–56.

*Article* **Learner-Centred Learning Tasks in Higher Education: A Study on Perception among Students**

**Junmin Li**

**Citation:** Li, J. Learner-Centred Learning Tasks in Higher Education: A Study on Perception among Students. *Educ. Sci.* **2021**, *11*, 230. https://doi.org/10.3390/educsci 11050230

Academic Editors: Maria José Sousa, Fátima Suleman, Pere Mercadé Melé and Jesús Molina Gómez

Received: 10 April 2021 Accepted: 9 May 2021 Published: 13 May 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https:// creativecommons.org/licenses/by/ 4.0/).

Chair of Economics and Business Education, University of Cologne, 50931 Cologne, Germany; Junmin.li@uni-koeln.de

**Abstract:** Universities face the challenge of constantly improving the quality of higher education and changing the learning behaviour of students from passive, reactive learning to active, self-regulated learning. Learner-centred, constructivist-orientated learning tasks offer a great opportunity here. This paper investigates to what extent the learning process is challenged by these learning tasks, and how these tasks are perceived by students, using a before-and-after survey of students studying at bachelor level in business courses at a German university. The paper starts with a short description of constructivism in the context of task design and the main characteristics of learner-centred, constructivist-orientated learning tasks: openness to problems, situation orientation, openness to solution paths, and degree of difficulty. Then the research method used is outlined before the findings are presented. The before-and-after survey shows that, despite increased complexity and workload, students' motivation to engage with the subject matter remained stable.

**Keywords:** self-directed learning; learning tasks; student surveys; university didactics

### **1. Introduction**

The increasing competition among universities confronts them with the challenge of maintaining the quality of studies and teaching at a competitive level, both nationally and internationally, and of constantly improving teaching [1].

In light of the above, approaches to activate self-directed learning have moved into the focus of current discussion [2,3]. For example, Mandl, Gruber, and Renkl [4] have found that in traditional forms of university teaching, students often acquire "inert" knowledge (knowledge that the learner is unable to apply). The learning behaviour of students should therefore be changed from passive memorisation to active transfer-orientated learning [5,6]. Lecturers, in particular, are thus increasingly faced with the challenge of ensuring the competence of students through learner-centred methods [7]. This form of self-directed learning is particularly problematic in large university courses because of the large number of participants, since learner-centred methods such as discussions, group work, and so forth, are challenging to use within large groups [8–10].

At this point, learning tasks in university teaching offer a great opportunity to promote analytical, learner-centred, and self-directed learning [11], since they encourage the application of knowledge and encourage the use of the content learned to solve real problems [12,13]. However, a shift from a teacher-led learning culture to a self-directed form of learning can also be dangerous. Vermunt and Verloop [14] (p. 270) speak of the danger of "friction" if the teaching strategy and the learning strategy do not fit together; the students are over- or underchallenged and they consequently do not gain a learning effect or, in the worst case, the learning effect is negative.

Currently, the use of learner-centred learning tasks, especially in business management courses, has not been sufficiently researched. With regard to learning tasks in higher education, the studies of van Merriënboer and Kirschner [11], and Hoogveld, Paas, and Jochems [15], which dealt with the construction of learning tasks for university teaching, are significant. Their works deal with a model for the construction of learning tasks within the framework of university didactics. Based on this model, various continuing education programmes for lecturers have been developed and empirically evaluated [11,15]. Furthermore, the abovementioned studies take the perspective of the lecturers and the implementation of learning tasks by lecturers, in other words the teaching style, as their main focus. The question of the extent to which students are able to deal with self-directed learning tasks, and whether these are compatible with their own learning strategies, has hardly been researched. This paper addresses this research gap, examining students' perceptions of the use of complex, learner-centred learning tasks in bachelor-level business courses at a university in Germany. The change in teaching strategy is achieved by adapting the tasks used in exercises and tutorials in the past to constructivist-orientated, self-directed learning tasks. Specifically, it was investigated to what extent the learner's learning process is challenged by these learning tasks, and how these tasks are perceived by the students. Therefore, the research question is:

How do students perceive the increase of cognitive level, complexity, and openness of learner-centred learning tasks?

This paper starts with a short description of constructivism in the context of task design. Then the research method used is outlined, before the findings are presented, followed by a discussion and an outlook on future research.

#### **2. Constructivism as a Basis for the Construction of Learner-Centred Learning Tasks**

On a theoretical level, the learning task concept pursued here is linked to constructivist learning theories. Billet [16] describes how learning is enabled by two important elements: on the one hand, the presentation of knowledge as a specific situation from the field and, on the other hand, the thinking activities that construct, modify, and apply this knowledge in order to deal competently with situations in this field. Specifically, this study is based on the eight principles of the constructivist-orientated problem-based learning environment of Savery and Duffy [9], which cover both knowledge presentation and thinking activity. These principles are [9] (pp. 137–140):


Learning tasks are understood to be tasks that serve the purpose of learning or practising knowledge; they are a material means of steering the learning process [17]. Constructivist-orientated learning tasks should be designed according to these eight principles.

Based on these principles, characteristics of constructivist-orientated learning tasks were derived. The first two principles of Savery and Duffy [9] emphasize the characteristic of "openness of learning tasks", which embeds the learning tasks in a larger context. The learning tasks should be formulated in such a way that learners are confronted with a question or problem, and they must have a stimulating quality that results from a challenging and motivating problem orientation [3]. Working on learning tasks becomes more meaningful and more realistic if the work instruction within the learning task is not too explicit and the learners themselves must identify the problems to be worked on [9] (p. 139). Problems should not be clearly specified but should ideally be discovered by the learners themselves in an open process.

Savery and Duffy's [9] third and fourth principles aim at the characteristic of "situation orientation" of learning tasks. In knowledge transfer, it is of particular importance to show learners in which practical context the learned knowledge is applicable. This situation orientation is understood as a reference to everyday life and the world around us, as well as to authentic application contexts. Learning tasks that are integrated into a typical professional situation by means of a business reference show students the purpose of learning [13,18]. In the context of a university degree programme, the living world can of course also include the area of scientific work. A lack of embedding learning in authentic contexts can lead to a lack of transferability of the content learned [9]. Therefore, it is necessary that learning tasks with real, or at least constructed, application relevance promote the application of the learned skills in real-life situations [19–21].

The authors' fifth principle refers to the "solution process" when dealing with the learning task. Constructivist learning tasks do not provide only one correct solution: the solution process can be designed openly, where learners are given the opportunity to pursue their own action strategies and goals through openness in their solution path. Learning outcomes and solution paths develop heterogeneously and are regarded as fundamentally unpredictable [9,22]. If the solution paths are open, the learners must decide for themselves which strategies, concepts, and procedures they use to solve the problem. The learners need well-developed metacognitive skills, since they have to control their own learning processes in order to successfully complete the tasks. Different ways of solving the task are desirable, as they encourage learners to develop their ability to deal with divergent forms of situated presentation [9]. In this way, unexpected task completion processes can be understood as a learning opportunity. In this sense, the openness of solution paths also fulfils the requirements of principles seven and eight, which refer to the evaluation of ideas and reflection on the learned content.

In order to design the learning environment, as described in the sixth principle, in such a way that it supports and challenges the learners' thinking process, attention must be paid to the degree of difficulty of the learning tasks. The cognitive level of the learning tasks must be carefully considered. This level defines different gradations: whether the content is to be remembered, whether it is to be understood, whether it is to be applied in a similar way or to a new problem at hand, or whether learners are to acquire additional knowledge by themselves [23,24]. Ideally, learning tasks promote at least the application of the acquired knowledge in real situations in order to solve problems at a higher level [11,12]. The degree of difficulty of a task can also be regulated through linguistic complexity [25,26]. The complexity of a task is increased, for example, if the partial aspects relevant for mathematical modelling are presented in a sequence that follows the logic of the situation, which need not match the logic of the solution path, and if the mathematical variables are named in the text. This is appropriate because, even in professional life, not all information is presented in the way it is needed to solve a problem. Furthermore, complex sentence structures or formulations arising from the authenticity of a situation can also increase the linguistic complexity. This complexity encourages students to independently work out the information relevant to the task at hand. Consequently, this characteristic also serves to fulfil the requirements of principles three and four, which refer to the authenticity and complexity of learning tasks.

In summary, the characteristics of learner-centred, constructivist-orientated learning tasks can be divided into the variables of openness to problems, situation orientation, openness to solution paths, and degree of difficulty. The property "degree of difficulty" in turn has the subcategories cognitive level and linguistic complexity.

#### **3. Method**

#### *3.1. Research Design and Data Collection*

In order to answer the research question, a research intervention with both before and after surveys was realised. The intervention covered the change process from traditional small-step tasks, which were part of four bachelor courses in business administration at a prestigious German university, to constructivist-orientated self-directed learning tasks. To measure the effects, a standardised quantitative online survey of the students involved was conducted, in order to reach as many people as possible [27].

The surveys mainly used single- and multiple-choice questions. The main items were assessed on a 4-point scale, on which the students rated statements from "totally agree" to "disagree". A 4-point scale without a "neutral" option was chosen to elicit a response tendency; this procedure is suitable for capturing perceptions. Respondents were also asked to weight statements using percentages. The questions were derived from the above-mentioned theoretical principles and cover the derived variables of openness to problems, situation orientation, openness to solution paths, and degree of difficulty. In order to measure the effect of learning tasks, the motivational disposition plays an important role [28,29]. This disposition can be revealed as a preference for a particular field of knowledge or action [30]; an individual develops extensive knowledge and skills through the motivation to deal with a topic [18,28]. Therefore, the students' interest in the respective subject matter was also surveyed. Additionally, respondents were asked about the time spent on task processing per week, to investigate the students' objective workload before and after the intervention, because the workload and complexity of the tasks define the learning context [31]. The questions were developed for the survey by the research team according to the theoretical framework. When formulating the questions, students were assumed to have no previous pedagogical knowledge; consequently, no pedagogical terminology was used, and the contents were paraphrased. The questions were formulated in such a way that the participants could understand them on a single reading and their motivation to complete the questionnaire was maintained. The questionnaire itself was divided into three parts: demographic data, perception of the learning tasks based on the constructivist dimensions, and additional questions about motivation and time expenditure.
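For analysis, responses on such a 4-point scale are typically coded numerically. The sketch below (in Python, purely for illustration; the two middle labels and all response values are invented, since the text only names the scale endpoints) shows how ordinal responses might be mapped to the 1–4 coding and summarised as an item mean:

```python
from statistics import mean

# Hypothetical coding of the 4-point scale without a neutral option:
# 1 = "totally agree" ... 4 = "disagree" (middle labels are assumptions)
SCALE = {
    "totally agree": 1,
    "rather agree": 2,
    "rather disagree": 3,
    "disagree": 4,
}

# Invented example responses to a single survey item
responses = ["totally agree", "rather agree", "rather agree", "rather disagree"]

coded = [SCALE[r] for r in responses]
item_mean = mean(coded)  # lower values indicate stronger agreement
print(item_mean)
```

With this coding, the per-item means reported in the Results section (e.g., a shift from 2.08 to 2.21) can be read directly: values below 2.5 lie in the agreement half of the scale.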

Table 1 shows an overview of the relevant variables and the asked questions for the research question.

The validity of the questions was checked by a pre-test with five students of economics. Furthermore, an expert in quantitative research reviewed the questionnaire.


#### **Table 1.** Overview of the variables, questions, and scales.



The surveys were conducted before the COVID-19 pandemic began. In the winter semester of 2017, the first student survey was conducted to analyse the perception of the learning tasks used so far in the lectures. This first survey was aimed at students of those courses in which the tasks had not yet been changed.

In the summer semester of 2018, more than 300 assigned tasks were investigated for the characteristics of constructivist-orientated learning tasks, using a structured document analysis. The analysis grid was developed based on the above-mentioned characteristics of the learning tasks. The research team improved the learning tasks according to constructivist criteria in cooperation with the lecturers responsible for the courses.

In the winter semester of 2018, the lecturers used these improved learning tasks in their regular courses. In order to investigate the effects of the redesigned learning tasks on the students, the same survey was conducted with another cohort after the intervention; the two surveys therefore drew on two different samples. The second survey was conducted to investigate whether there were any changes in the students' perceptions. Both surveys took place on one of the last course dates, so students had already completed most of the learning tasks. A member of the research team went into the seminar room at the beginning of the seminar and invited the students to participate in the survey; the internet address of the online survey was then shown on a presentation slide. Afterwards, an invitation to participate was sent to the students by e-mail to also reach those who were not present at the course that day. This procedure ensured a high level of participation in the survey.

The data were evaluated on the basis of mean value calculations, with a two-sample t-test for significance. The open-source statistical software R was used for the calculations.
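The authors performed their two-sample t-tests in R. As an illustration of the underlying calculation, the sketch below implements Welch's two-sample t-statistic (the unequal-variance form suitable for comparing two independent cohorts) in Python; the two samples are invented, not the study's data:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t-statistic and degrees of freedom.

    Suitable for two independent samples (here: a pre- and a
    post-intervention cohort) without assuming equal variances.
    """
    va, vb = variance(a), variance(b)      # sample variances
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb                # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / sqrt(se2)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented example: two small samples of 4-point-scale ratings
pre = [2.0, 2.1, 2.3, 1.9, 2.2]
post = [2.4, 2.5, 2.3, 2.6, 2.45]
t, df = welch_t(pre, post)
print(round(t, 2), round(df, 1))
```

Converting the t-statistic into a p-value additionally requires the t-distribution's CDF (in R, `t.test` does this directly); the sketch stops at the statistic itself.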

The research team was not involved as lecturers in any of the courses. The survey was anonymous and was analysed by the research team, and the results were presented to the lecturers of the courses involved. This separation between the research team and the lecturers minimised power differentials and coercion. The combination of qualitative document analysis and quantitative surveys allowed the investigation of different perspectives and dimensions of the intervention [32].

#### *3.2. The Intervention*

The intervention took place in four courses, which were exercises designed to deepen the content of the lectures. The courses took place weekly during the semester. Each course had more than 100 students and can be described as a large-scale course. There were two lecturers per course, with eight lecturers in total involved in the intervention. The lecturers were junior academics with two to three years of teaching experience and little previous training in higher education didactics, so the improvement of the learning tasks was closely supported by the research team. The research team pointed out potential for improving the old learning tasks and showed possibilities of how to improve them according to the constructivist dimensions. The lecturers then improved the learning tasks by themselves.

Both the discussions with the lecturers and the analysis of the learning tasks demonstrated that the tasks were not formulated in a constructivist manner prior to the intervention. In revising the learning tasks, reference was made to the constructivist-orientated characteristics presented above (openness to problems, situation orientation, openness to solution paths, and degree of difficulty) in order to proceed according to the eight principles of the constructivist-orientated learning environment of Savery and Duffy [9]. In particular, tasks at the cognitive level of remembering were adapted to achieve the higher learning objective of "understanding and applying". Thus, in many learning tasks, the reproduction of definitions was abandoned after the adaptation; instead, the students were asked to explain technical terms and concepts based on self-selected examples.

In line with the situation orientation, individual units of knowledge were linked together in the revised learning tasks and partly enriched with linguistic complexity. The tasks were contextualised in typical business situations.

For example, a mathematical cost accounting task was revised into the following situated task.

Excerpt from the task: "You already suspect that your manager will not be satisfied with the resulting increase in costs. In order to prepare yourself well for the discussion with him/her, you should therefore give it some more thought. Which measures and concepts could be used to reduce production costs?".

Learning tasks where the solution paths were given in small steps through several subtasks were made more open by allowing students to freely choose the solution strategy.

The learning tasks became more extensive through the addition of practical relevance and a higher degree of difficulty. Through this enrichment, the students were required to filter out for themselves the relevant information needed to solve the problem in the task.

In addition to increasing the complexity of the linguistic logic and raising the cognitive level, the problem openness of the tasks was increased. The problem to be solved had to be identified through a situation analysis and was no longer clearly identified by the lecturers.

#### **4. Results**

In the following section the findings are presented according to the research question and the characteristics of the learning tasks already described. The results before and after the intervention are presented and compared according to the characteristics of the learning tasks.

Out of a total of 2495 students, 495 took part in the first survey, a response rate of 19.92%. Of the participants, 51.9% were female, 47.7% male, and 0.4% diverse, with an average age of 22.1 years. The second survey involved 481 out of 2350 students (response rate 20.47%), with an average age of 21.7 years; of these, 58.2% were female, 41.0% male, and 0.8% diverse.

The characteristic of problem "openness" indicates that learners should identify the problem presented in the learning task through their own analytical performance in order to develop action steps by themselves. During the intervention, the learning tasks were revised in such a way that problems and work instructions were no longer identified separately but were integrated in a complex situation; work instructions could only be derived after an intensive investigation of the situation presented. Consequently, students were asked whether they could understand the content of the work instructions without further explanation and understand what was required. After the intervention (survey 2), the students' mean rating, at 2.21, was significantly (*p* < 0.05 \*\*) higher than before the adjustment of the tasks (survey 1), at 2.08 (scale of four: 1 totally agree/4 disagree; see Figure 1). The clarity of the work instructions was thus reduced: the before and after surveys showed that, in comparison to the initial situation, it was now more difficult for the students to independently recognise the specific problem to be solved.

Regarding the characteristic "situation orientation", when revising the tasks it was ensured that the learning tasks did not repeat the theoretical scientific treatises of the lectures, but instead represented complex professional situations. Consequently, a one-to-one transfer of the lecture contents to the learning tasks was no longer possible; students first had to transform lecture content into job-related content. The study showed the associated change in contextualisation with a significant (*p* < 0.05 \*) change in the mean value of the question as to whether the lecture content could be applied to the tasks, from 1.74 (survey 1) to 1.84 (survey 2) (scale of four: 1 totally agree/4 disagree; see Figure 2). The business contextualisation of the learning contents thus required an increased ability to transfer abstract lecture contents into situated learning tasks: in comparison to the first survey, the students found it more difficult in the second survey to apply lecture content to the situational learning tasks.

**Figure 2.** Perception of students pre- and post-intervention for situation orientation.

To increase the openness of solution paths, the narrow specification of solutions, for example through small-step work instructions, was reduced. Students were instead asked to derive and justify their own solutions. Here the question was asked as to whether the students could work on the tasks according to their own solution path. There was a significant change (*p* < 0.05 \*) from 2.34 (survey 1) to 2.44 (survey 2) after the intervention (scale of four: 1 totally agree/4 disagree; see Figure 3). It is striking that, despite the increase in the openness of the solution paths, the students responded that they could not develop their own solution path. This result could be explained by the discussion of the results of the tasks during the courses, especially if lecturers only discussed one solution path.

**Figure 3.** Students' perceptions pre- and post-intervention in terms of openness to solution paths.

In order to challenge the students' thought processes and to increase the degree of difficulty of the learning tasks, the intervention raised the cognitive level of those tasks that were limited to reproducing what had been learned; tasks that required problem solving were used more frequently. According to student perception, the learning tasks in the first survey were distributed as follows: 29.25% of the tasks only required reproducing what had been learned, and 32.19% required an understanding of the contents before the tasks could be solved. Only 27.32% of the tasks were conceived in such a way that one had to apply the contents to an existing problem, and only 12.47% required one to acquire additional knowledge. The survey after the intervention shows that the adjustments of the learning tasks led to a significant change in the perceived cognitive level compared to the first student survey. The proportion of tasks requiring the exclusive reproduction of what was learned fell by 6.16 percentage points, from 29.25% (survey 1) to 23.09% (survey 2) (*p* < 0.001 \*\*\*). On the other hand, the proportion of tasks that required an understanding of the content in order to be solved increased significantly (*p* < 0.001 \*\*\*) by 4.57 percentage points, from 32.19% (survey 1) to 36.76% (survey 2). No significant shift could be measured in the other two areas, "applying content to a given problem" and "acquiring additional knowledge" (see Figure 4); it may be difficult for students to distinguish between these levels. The before and after studies show that students perceived the increase in cognitive level as significant. In summary, the adaptation of tasks led to a shift in the cognitive level from "memorising" to "applying".
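As a quick consistency check, the reported percentage-point shifts can be reproduced from the category shares given above; the following sketch (in Python, purely illustrative) restates that arithmetic:

```python
# Reported category shares (in %) for the cognitive-level question
reproduce = {"survey 1": 29.25, "survey 2": 23.09}   # reproduction of learned content
understand = {"survey 1": 32.19, "survey 2": 36.76}  # understanding required first

# Percentage-point shifts between the two surveys
shift_reproduce = round(reproduce["survey 2"] - reproduce["survey 1"], 2)
shift_understand = round(understand["survey 2"] - understand["survey 1"], 2)
print(shift_reproduce, shift_understand)
```

The result matches the text: the reproduction category falls by 6.16 percentage points while the understanding category rises by 4.57.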

Another way of increasing the difficulty of learning tasks is through the characteristic of linguistic complexity. During the intervention, the presentation of information in the learning tasks was revised: mathematical indicators were transformed into vocational units, and information was presented according to situational logic instead of the logic of the solution path, so that the information relevant for accomplishing the task was not always obvious and could only be identified through analysis. For the investigation, students were asked whether they could extract from the task the information necessary for its processing. The result showed a significant (*p* < 0.05 \*) increase in the mean value from 1.73 (survey 1) to 1.83 (survey 2) (scale of four: 1 totally agree/4 disagree; see Figure 5). Owing to the increased complexity of the linguistic logic, the students were not always able to clearly extract the information necessary for processing from the tasks, compared to the initial survey.

**Figure 4.** Students' perceptions pre- and post-intervention of the cognitive demands of the tasks.

**Figure 5.** Students' perceptions pre- and post-intervention of complexity.

Furthermore, for the investigation of the characteristic "degree of difficulty", respondents were asked about their direct perception of the degree of difficulty. For the question "I find the degree of difficulty of the tasks is appropriate", a significant shift of the mean value (*p* < 0.05 \*) by 0.11 points, from 1.9 (survey 1) to 2.01 (survey 2), was determined (scale of four: 1 totally agree/4 disagree; see Figure 6).

In addition to the questions about how the tasks were handled, the motivation to learn more about the subject area was also surveyed. Despite the additional time required, and the perceived increase in the complexity of the content and the difficulty of the tasks, motivation remained unchanged (see Figure 7).

The increased level of difficulty was also reflected in the time required to solve the tasks. On a scale of four (in which 1 corresponds to less than one hour and 4 to more than four hours), the time required increased significantly (*p* < 0.001 \*\*\*) by 0.24 points, from 1.77 (survey 1) to 2.01 (survey 2). Students required more weekly learning time to solve the tasks than before the adjustment of the learning tasks (see Figure 8).

**Figure 7.** Perception of students pre- and post-intervention for motivation.

**Figure 8.** Time required to process the pre- and post-intervention tasks.

Table 2 gives an overview of the items with ordinal scales in the first and second survey.


**Table 2.** Overview of the items with ordinal scales in the first and second survey without the variable "cognitive demand".

The mean values of the survey results on the learning tasks are mostly in the positive range of the scale, suggesting that the students' handling of the learning tasks remained conducive to learning.

#### **5. Discussion and Conclusions**

The results show that working through learning tasks enhanced by increases in the characteristics of problem openness, situation orientation, openness of solution paths, and degree of difficulty requires more intensive involvement from the students. However, the before and after surveys show that, despite the increased complexity and workload, the motivation to deal with topics of the subject remained stable.

This could be because the learner-centred, constructivist-orientated learning tasks now activate the learners' self-regulation more strongly. The increased complexity initially led to an excessive demand on the learners because they were not used to working on constructivist-orientated tasks. The study by Tremblay, Leppink, Leclerc, Rethans, and Dolmans [33] shows that complex tasks produce a higher cognitive load and require more working time than simple tasks; a lack of experience in problem solving and information research hinders the thinking process and results in a poorer self-assessment of performance. Complex tasks, however, strengthen reflective practice during debriefing, and students indicated that they learned more from the complex tasks [33].

Perkins [34] has already identified three challenges that learners face in learner-centred, constructivist learning environments. The first is the cognitive complexity of the learning environment, which can lead to an initial cognitive overload of learners. The second is the increased demand on task management skills: the learners do not receive predefined solutions and must activate their own task management skills, an ability that can be developed in different ways depending on the learning culture and learning level of the learner. Thirdly, Perkins [34] shows that the constructivist-orientated learning environment contains two learning goals at the same time: on the one hand, students should acquire professional competence, while, on the other hand, they should also independently control the learning process for the acquisition of professional competence. A more recent study by Kyndt, Dochy, and Cascallar on the context of learning tasks and the subjective perception of students confirms this effect [31].

At the same time, the results of the surveys also show that complex learning tasks motivate students to deal intensively with realistic and complex issues in a self-directed manner [35]. In this context, an initial overload can be prevented by an appropriate support system, such as scaffolding and coaching [34].

In general, the transition from traditional tasks to those designed along constructivist lines should be carried out step-by-step by the instructors. The changed expectations of students' learning behaviour must be clearly communicated and accompanied by a support system [11,34]. In this context, the discussion of tasks must also be open, so that the students' self-determination is called upon [9]. According to Jonassen [22], constructivist-orientated university teaching does not necessarily require small seminars with group work and discussions; more important is the development of concrete support tools for the students.

Nevertheless, it must be emphasised that the use of constructivist-orientated learning tasks alone is not sufficient to change the learning culture. For this to happen, the university needs a holistic concept in which the method of performance assessment is also constructivist. This is because learning successes acquired through constructivist methods cannot be fully captured by the traditional performance assessment of knowledge acquisition at the lower-order level, leading to "sham constructivism" [6]. Against this background, the culture of performance assessment must be reconsidered so that constructivist-orientated learning methods can be fully effective. Instead of conducting a traditional written examination at the end of a learning unit, it is more appropriate to use learning portfolios, learning reports, and so forth [6].

Finally, the limitations of the study should be pointed out. Only a single intervention at one university was analysed. The learning process itself was not examined; in other words, no teaching observations were made. Furthermore, no statements can be made about the students' actual learning success resulting from the new task types.

Despite these limitations, the findings show that the introduction of constructivist-orientated learning tasks must be accompanied by the introduction of a constructivist-orientated learning culture into the university landscape. This intervention study is significant because it demonstrates a possibility for enabling constructivist learning through learning tasks even in the context of large-scale courses where project work is not possible. The students' perceptions provide evidence of how the use of learning tasks can be adapted.

The results can be transferred beyond business studies to other academic disciplines where the reconstruction of knowledge and the transfer of knowledge to the professional context is important, such as in the field of teacher education.

Against this background, this study contributes to research on the use of learning tasks from a student's perspective. The implementation of a constructivist-orientated teaching and learning culture must focus both on the people who prepare the knowledge and those who learn it.

**Funding:** This research received no external funding.

**Informed Consent Statement:** Informed consent was obtained from all subjects involved in the study.

**Data Availability Statement:** The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy.

**Acknowledgments:** I would like to express my special thanks to Christian Gronowski and Matthias Pilz as well as to the participating lecturers.

**Conflicts of Interest:** The author declares no conflict of interest.

#### **References**


*Article* **The Soft Skills of Special Education Teachers: Evidence from the Literature**

**Patrícia Raquel da Silva Fernandes 1,\*, Jacinto Jardim 2,\* and Maria Celeste de Sousa Lopes <sup>1</sup>**


**Abstract:** The special education teacher is a key element in the development of the process of inclusive education. In this setting, soft skills have proven to be decisive in teachers' educational action. However, the soft skills that best qualify their profile have not yet been identified. Therefore, this study carries out a review of scientific production between the years 2010 and 2020. To this end, articles were selected using the following databases: ERIC, Scopus, Web of Science, and PsycINFO. Studies were included in the review that point to the following soft skills: resilience, reflexivity, empathy, collaborative work, self-efficacy, creativity, and effective communication. Only studies that presented such criteria were included in the analysis. After the application of the eligibility criteria, seven articles were considered. From the analysis, it emerges that effective communication, collaborative work, and reflexivity stand out, and that there are gaps in this area in the specialized training of these teachers. Thus, it is suggested that there should be investment in this area in the training programs of the schools that certify them, and that, at the research level, instruments should be developed to evaluate the model emerging from this review.

**Citation:** Fernandes, P.R.d.S.; Jardim, J.; Lopes, M.C.d.S. The Soft Skills of Special Education Teachers: Evidence from the Literature. *Educ. Sci.* **2021**, *11*, 125. https://doi.org/10.3390/ educsci11030125

Academic Editor: Pere Mercadé Melé

Received: 15 January 2021 Accepted: 20 February 2021 Published: 16 March 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https:// creativecommons.org/licenses/by/ 4.0/).

**Keywords:** soft skills; inclusion; teachers of special education; teacher profile; effective communication

#### **1. Introduction**

In addition to the major changes that have taken place in education systems around the world, special education has also seen innovation. In terms of the development of policies and practices worldwide, inclusive education is one of the most frequently discussed themes [1–3], and inclusion is now regarded as the best route to a quality, equitable education [2–5]. Worldwide, inclusive education has gone through moments of great change, partly thanks to the actions of UNESCO, proclaimed in the Salamanca Declaration of 10 June 1994 and grounded in the Universal Declaration of Human Rights of 1948, both of which are indispensable for understanding its development [6,7]. According to this organization, it is an "enriched form of general education aimed at improving the lives of those who suffer from various disabilities, enriched in the sense that it uses modern pedagogical methods and technical material to remedy certain types of disability" [8–10]. As Casanova [11] (p. 16) adds, it is "the provision and application of precise educational resources for all pupils, whatever their personal educational difficulties and needs, to achieve optimal individual and social development".

In the Portuguese educational context, and in accordance with the Basic Law of the Education System [12], special education constitutes one of the special modalities of school education "... dedicated to those persons who cannot follow the educational system temporarily or permanently under normal conditions" (article no. 19). In parallel with these changes, the concept of disability has become more comprehensive and less stigmatizing, giving rise to the concept of special educational needs (SEN). The specialist literature shows that "inclusion" currently designates the promotion of ideas, practices, and forms of training that bring the school closer to an institution living by the values of inclusive education [1,13]. Special education is a relatively recent activity: it originated in a systematic way in the second half of the 19th century and, until the 1960s, operated in the field of practical knowledge as a marginal, segregated undertaking [14]. Today, under a new vision, special education has ceased to be a parallel activity and has become an integral part of general education, constituting a special modality of education as inscribed in the "Lei de Bases do Sistema Educativo" (LBSE) of 14 October 1986 in Portugal. In recent decades, especially since the Salamanca Declaration [6], a new paradigm of an inclusive school, capable of welcoming and retaining groups of traditionally excluded children and young people, has been affirmed. This paradigm has evolved as a movement that calls into question policies and practices of exclusion. Inclusive education thus aims at educational equity: the guarantee of equality in access, participation, and learning. Within this framework, the educational system and its practices should manage diversity and adopt different types of strategies to meet students' educational needs [15–19]. Inclusive education is education for all. It aims to reverse the path of exclusion by creating conditions, structures, and spaces for a diversity of learners, as proclaimed in the Salamanca Declaration of 1994, which states that children and young people with special educational needs should have access to regular schools, adapted to them through child-centered pedagogy capable of meeting those needs [6].

Inclusive education thus reinforces everyone's right to attend the same kind of education, guided by the principle of equal opportunities and education for all. It is a process of change that involves, among other things: (i) valuing all pupils equally; (ii) increasing the participation, and reducing the exclusion, of pupils and cultures; (iii) restructuring policies, cultures, and practices in schools so that they respond to the diversity of pupils; and (iv) reducing barriers to the learning and participation of all pupils regardless of their differences. This is an ongoing process of developing learning and participation for all pupils. According to Booth and Ainscow, it is an ideal that all schools can and should aspire to. Participation, according to the same authors, "means learning together with others and collaborating with them in shared educational experiences. This requires active involvement in learning and has implications for how the educational process is lived" [20] (p. 7). However, the inclusion policy for SEN students depends on factors that go beyond legislation. The question is how to reconcile this heterogeneous reality with schemes based on models that are not prepared to work with diversity and difference, built on proposals oriented toward homogenization [16]. The school will be inclusive when it transforms not only its physical network but also the posture, attitudes, and mentality of educators and the school community in general, learning to deal with heterogeneity and to live naturally with differences.

#### *1.1. The Inclusive Teacher*

The concept of inclusion, or inclusive education, is paramount and commonly associated with special needs education (so-called "EE") and special needs students [7]. It emerged in the context of the European Schools and the commitment to educating people with disabilities within the regular education network, seeking to support the development of education systems, create schools that can respond to all children and young people, and combat exclusion [21]. Inclusion involves change. It is a continuous process of developing the learning and participation of all students. It is an ideal to which all schools can aspire but which will never be fully achieved; nonetheless, inclusion occurs as soon as the learning development process begins. An inclusive school is one that is on the move [20]. It requires a restructuring of schools to meet the needs of all children, another pedagogy within the classroom, and another type of teacher. In this sense, an inclusive teacher, besides recognizing difference, adopts a pedagogy that includes everyone, seeks to provide differentiated teaching, and organizes activities and interactions in such a way that each pupil is regularly confronted with enriching situations suited to his or her personal characteristics and needs.

By promoting the development of diversified strategies, the teacher becomes a facilitator, a true builder of learning environments that promote personal, cultural, and social development. He or she will have to develop and manage these environments flexibly enough to deal with the unforeseen, the uncertain, and the feelings, doubts, and fears of those who grow up alongside those who learn. The teacher must keep training, discovering, reflecting, adapting, identifying, and imagining new ways of acting that are more appropriate and closer to the realities confronted daily. Pedagogical differentiation emerges as a path of respect for difference that provides everyone with the same opportunities: to be able to differentiate, one must not be indifferent to differences. Teaching a class assumes that all students can learn, but in their own time and in their own way; that is, each learns certain knowledge according to his or her own characteristics, prior knowledge, and habits of thinking and acting. Attention to individual differences, whatever their origin, therefore requires that an inclusive school have open and flexible curricula capable of responding to the common needs of the entire school population. Differentiation, adaptation, and individualization of curricula are necessary, in line with the needs and characteristics of each student. All pupils should have the same rights and opportunities, including the right to difference and to an education adapted to their needs [16].

In this sense, the European Agency for Development in Special Needs Education (EADSNE) defines the profile of inclusive teachers and identifies four core values, related to teaching and learning, for the work of all teachers in inclusive settings: (i) valuing diversity—difference is considered a resource and a value for education; (ii) supporting all pupils—teachers have high expectations of outcomes for all pupils; (iii) working with others—collaboration and teamwork are essential methodologies for all teachers; and (iv) professional and personal development—teaching is a learning activity, and teachers should take responsibility for their lifelong learning [22]. These fundamental values and their associated areas of competence are each made up of three elements: attitudes (knowing how to be and how to live together), knowledge (knowing how to know), and abilities (knowing how to do). A given attitude or conviction requires a certain knowledge or level of understanding, and then the abilities (know-how) to implement that knowledge in a practical situation. For each area of competence identified, the essential attitudes, knowledge, and skills that underpin it are presented.

#### *1.2. The Profile of the Special Education Teacher*

The special education teacher (so-called "PEE") in the Portuguese educational context constitutes one of the specific educational resources that, within his specialty and in a collaborative, co-responsible manner, supports the student's other teachers in defining strategies of pedagogical differentiation and curricular accommodation, in reinforcing learning, and in identifying multiple means of motivation, representation, and expression [1,23]. Thus, in addition to direct (psycho-pedagogical) support in specific areas within his specialty, he provides indirect support or consultancy/mediation. His skills and profile have evolved: in the recent past, he was a teacher with a set of knowledge and skills that facilitated the integration of pupils in schools; today, with the introduction of a new educational paradigm moving from integration to inclusion, this vision has changed, and he has come to direct his activity to all students with SEN, whether or not they have disabilities [1]. His competencies were first legally organized into five areas: critical analysis, intervention, training, supervision, and evaluation. However, according to some studies, this organization proves to be of little clarity, as each school interprets it in its own way. For better clarification, the Association of Teachers of Special Education therefore recommends that the competence profile of the PEE be thought of as a bridge between the school we have and the school we want. It thus indicates a set of premises that define the PEE as a collegial element of the school: a teacher who should cooperate with his colleagues, learning, teaching, and above all reflecting on the best models, frameworks, and materials to bring quality education to all students [1].

According to the "Associação Nacional dos Docentes de Educação Especial" (ANDEE) [23], the PEE should also be: (i) a pedagogue who, within a pedagogical structure, is responsible for collecting, producing, and sharing information relevant to the education of all students; (ii) a professional in possession of intervention models that allow the school to understand, plan, execute, and evaluate inclusive models of pedagogical intervention; and (iii) a professional capable of articulating the internal and external services of the school into a harmonious and coordinated whole, in order to achieve the best possible results. Under this profile, the PEE will be a professional with specialized training in one of the field's areas of expertise, possessing the scientific and practical knowledge that allows him to intervene directly in that specific domain. So that each student can progress in learning, he will essentially be a consultant, a collaborator, a supervisor, a co-operator, and a facilitator of practices that lead to success and quality in teaching. In short, he will be an inclusive teacher, as everyone should be, but a specialized one, contributing to a quality school where everyone learns according to their characteristics and abilities. In summary, he will be an inclusive specialist teacher who, in addition to his pedagogical and teaching skills, also known as hard skills, needs a set of personal and social skills that make his action effective. These skills are called soft skills.

#### *1.3. The Soft Skills of the Special Education Teacher*

The concept of soft skills has been considered in human resources, management, psychology, education, and the social sciences in general. Some consensus has formed around a definition of competence as a set of technical, methodological, and practical skills that is dynamically activated and manifested in performance [24–27]. Building on this concept of competence, a soft skill assumes the operationalization of a set of knowledge and attitudes in a specific situation in order to achieve specific results [28,29]. The term soft, in turn, suggests the opposite of hard: hard skills refer to technical skills, and soft skills to personal and social skills [30–32]. The concept of transferable skills is intertwined with that of soft skills, defined as "personality traits, goals, motivations and preferences that are valued in the labour market, at school and in many other fields" [33] (p. 451). Personal skills enable individuals to manage their own personal attributes, improve performance, and sustain interpersonal relationships with others [28,33].

Regarding soft skills, both the OECD and the European Union consider the development of these transversal skills relevant, an area to be taken into account among the priorities of national training policies [33–36]; they are valued at school, in the labor market, and in social interaction in general. This relevance is justified by the fact that the degree of development of this type of skill predicts productivity at work, since such skills complement technical skills [37]. Thus, the challenges of the teaching career today can be more easily overcome by teachers with soft skills in addition to the technical skills inherent to the profession. The OECD stresses the importance of teachers developing their transversal competences, placing this area among the priorities of national training policies [34]. The European Commission, for its part, proposes that, besides teachers, pupils should also develop these competences, since their mastery improves the overall teaching and learning process. This proposal is justified by the fact that these skills are acquired mainly through socio-emotional dynamics; hence the relevance of special education teachers developing soft skills during their training as a way of enabling them to have a significant pedagogical presence in the educational community [38].

The empirical evidence also indicates that teachers' pedagogical capacities are related to their transversal competences: those who possess these capacities prove more pedagogically effective than those who possess only theoretical knowledge [39]. Therefore, the challenges of the teaching career today can be more easily overcome by teachers qualified in soft skills, in addition to the technical skills inherent to the profession, who are thus able to manage their daily tasks effectively in challenging contexts such as that faced by special education today [40]. For all these reasons, the following question was defined: what are the soft skills that special education teachers most need to be successful in their professional activity? Accordingly, this study aims to identify and describe the soft skills of special education teachers.

#### **2. Methods**

To answer the above question and achieve the objective of this study, a review of theoretical and empirical studies related to soft skills was carried out in the following databases: ERIC, Scopus, Web of Science, and PsycINFO. Studies were included if they (i) involved special education teachers; (ii) assessed the soft skills of resilience, reflexivity, empathy, collaborative work, self-efficacy, or effective communication; (iii) were written in English, Portuguese, or Spanish; and (iv) were published in a peer-reviewed journal between 2010 and 2020. The search keywords were soft skills and special education. A pair of researchers independently extracted the relevant full papers; discrepancies between the two main reviewers were resolved through discussion with a third co-author, and a final list was obtained. As shown in Table 1, a total of 33 studies were identified. Of these, 26 were excluded because they did not examine soft skills in special education teachers; all studies focusing on students or on other professionals, such as psychologists, were excluded.


**Table 1.** Summary of the initial screening \*.



\* Note: Quartile rankings derived for each journal according to the SJR (Scimago Journal and Country Rank).

#### **3. Results**

The results come from the articles selected in the ERIC, Scopus, Web of Science, and PsycINFO databases, published between 2010 and 2020 and meeting the inclusion and exclusion criteria defined above. All these articles fall within the scope of special education, from the perspective of inclusion and with special attention to professionals in this field. The objectives of the selected articles addressed some soft skills, but in an isolated way, as shown in Table 2. Moreover, the studies' theoretical bases revealed the same dispersion of models and conceptions; hence the relevance of this study, which proves innovative and useful for both intervention and research in this area.



Table 2 lists the seven included studies. The data are presented in order of publication date, highlighting the journal and the skills analyzed by the different authors.

According to Guo et al. [56], in an article published in the journal Early Childhood Research Quarterly, ranked Q1, self-efficacy and effective communication skills are highlighted. Kart [59], in a paper published in the journal Education Sciences, ranked Q2, highlights the skills of resilience and collaborative work. For Demirok et al. [66], in an article published in the International Journal of Emerging Technologies in Learning (IJET), ranked Q2, effective communication skills are highlighted. For Mu et al. [60], in an article published in the journal Teaching and Teacher Education, ranked Q1, the competence of resilience is highlighted. According to Pickl et al. [63], in an article published in the International Journal of Inclusive Education, ranked Q1, the skills of resilience, reflexivity, collaborative work, self-efficacy, and effective communication are highlighted. Finally, according to Irvine [58], in an article published in the Journal of Teacher Education, ranked Q1, the skills of reflexivity, empathy, collaborative work, and effective communication are highlighted.

The article by Irvine et al. [58] addresses the relationship between multicultural education and special education, focusing on its complexity for African American students and students of color, who are often assigned an identity category related to their disability. The authors explored areas of divergence and conflict between the two fields (special education and multicultural education), specifically issues of disproportionate representation, cultural misunderstandings, tensions between home and school, and competition, and they provide recommendations for preparing special education teachers more effectively: culturally responsive pedagogy; training special educators to develop caring relationships with students while maintaining high expectations; engaging and motivating students; selecting and using learning resources effectively; and promoting learning with family and community involvement [58]. This article thus distinguishes itself by referring explicitly to reflexivity and, implicitly, to empathy, collaborative work, and effective communication.

The article by Peltier et al. [62] argues that the literacy process is complex for all children, especially those with learning difficulties, and requires teachers to have deep, extensive, and flexible knowledge about teaching these skills: phonological, phonetic, and orthographic awareness. The study addresses teachers' fundamental knowledge, perceptions, and skills in this subject, as well as their reflexive capacity, focusing on a group of 12 teachers from general and special education preparation courses. The knowledge scores of both initial and special education teachers rose significantly from pre- to post-test and differed significantly from those obtained in a general education literacy course. Reflective ability was not a significant predictor of primary school pupils' growth and declined over time. The authors thus explicitly present the soft skill of reflexivity and implicitly discuss collaborative work and effective communication.

In turn, the article by Pickl et al. [63] argues that special needs teachers today need, in addition to general pedagogical skills, skills to manage highly heterogeneous groups in inclusive environments. This qualitative study aims to identify the knowledge, skills, actions, and attitudes teachers need in order to succeed. In-service training focused on reflection and on the evaluation of individual and team work, as well as on the reactions of students in initial special education training, can increase readiness to model a reflective attitude as a crucial prerequisite for teaching success. The results of this study therefore show that teachers need to improve their skills in reflexivity, resilience, collaborative work, self-efficacy, and effective communication.

The article by Guo et al. [56] is a quantitative study with a sample of 73 early childhood special education teachers and 837 preschool children. It aimed to verify differences in teachers' self-efficacy when teaching children with and without disabilities, as well as differences when teaching children with different types of disabilities. The findings indicate that teacher self-efficacy is a significant predictor of the knowledge children acquire. In addition, the authors argue that poor teacher self-efficacy in relation to children with disabilities may constitute an additional risk factor for these children's school maladjustment. Guo et al. [56] thus focus their paper on issues of effectiveness, including self-efficacy and effective communication.

Demirok et al. [66], in a qualitative study, used a sample of special education teachers to examine their opinions regarding the use of technology to assist students with reading difficulties. Although the study focuses on the use of technologies in teaching reading and writing, the results show that good communication is necessary in this process and that the technologies, in addition to saving time, foster persistence and better motivate and focus students' attention. The soft skill evident in this study is thus effective communication.

The article by Mu et al. [60] focuses on resilience. This quantitative study, based on an ecological perspective, investigates the role of Chinese inclusive education teachers in the resilience process of students with disabilities. The study shows that students with disabilities suffer from multiple stress factors, which demands of teachers a great ability to find adequate resources and minimize students' difficulties; the study sums all of this up in the competence of resilience.

In turn, Kart and Kart [59], in a literature review, highlight among the skills investigated the relevance of collaborative work in promoting inclusion. They state that it is one of the factors that most influence student outcomes in an inclusive school and that negative impacts can be mitigated by policies and by active collaboration among all stakeholders in the educational process. The soft skill evident in this study is thus collaborative work.

The main results of these articles will be discussed below.

#### **4. Discussion**

The small number of publications in the ERIC, Scopus, Web of Science, and PsycINFO databases that address the soft skills of special education teachers in the inclusion process reveals the scarcity of studies in the area and the consequent need for further research. In our view, the profile of the special education teacher should include, besides the technical skills inherent to this specialty group, soft skills. In this sense, our study has identified a model of six soft skills, described and discussed below. In accordance with the above results, and in answer to the research question, we found that some soft skills occupy a relevant place in the teaching performance of special education teachers, with effective communication, collaborative work, and reflexivity standing out.

These results are in line with those of Allala and Abusukkar [42], who affirm the importance of soft skills, the need to possess them in order to succeed in professional life, and the need for teachers to pay more attention to them, particularly in their initial training and throughout their careers. We can thus conclude that soft skills are decisive both in access to and in the performance of special education teachers' functions.

All education naturally presupposes competence in effective communication to lead teaching-learning processes; communicating consists of making common, sharing ideas, exchanging information, and interacting [51,74,75]. Most of the articles reviewed make explicit reference to effective communication [56,58,62,63,66]. For example, Irvine et al. [58] state that teacher training institutions need to find strategies to empower all initial teachers to be effective educators: persistent, open-minded, reflective, and therefore good communicators. This soft skill involves a varied set of factors, as communication is a complex phenomenon: human beings communicate at various levels, expressing what they think, feel, and desire and choosing attitudes appropriate to each situation, place, and moment [26]. All of this is fundamental in the context of inclusive education, where the special education teacher acts. It can thus be concluded that, in educational interaction, effective communication is essential to achieving the objectives of special education.

In addition to effective communication, special education presupposes collaborative work, which consists of planning, acting, and evaluating as a team. Some of the articles analyzed make explicit reference to collaborative work [58,59,62,63], in line with those who argue that teamwork is essential in inclusive education [76–79]. For this reason, teachers need to improve their capacity for cooperation, which consists in the ability to "operationalize knowledge, attitudes and skills in order to act together, with a view to achieving a common goal by maximizing the potential of each individual in a durable and balanced way" [26] (p. 135). Promoting collaborative work in schools means making explicit each person's intention to add value to working together by contributing something different. Naturally, this type of action has been increasingly implemented, as it is duly defined in the educational projects of educational institutions. We therefore consider it an essential soft skill in the training and work of special education teachers.

Reflexivity is also essential in special education, since it is necessary to analyze, plan activities, and deal constructively with uncertainty and unpredictability in order to reformulate action. The articles that explicitly refer to this capacity are Peltier et al. [62], Pickl et al. [63], and Irvine et al. [58]. Success in inclusive education also presupposes reflexive thinking, as described in the literature [80–82]. In this sense, Peltier et al. [62] state that reflective activities are widely used in teacher training programs and are continuously developed in order to plan for the unpredictable circumstances of daily teaching.

This ability, according to the above-mentioned authors, manifests itself in the capacity to question and doubt, to dialogue, and to criticize. In this sense, educational action requires systematic, rigorous, and strategic reflection on emerging problems and on appropriate plans for their sustainable resolution. It is the very unpredictability of educational situations that demands the promotion of these reflective habits as a way of providing quality education to children, young people, and adults with special needs. Thus, anyone joining an education team also needs practical knowledge of the techniques and methods required to be creative and reflective within the teaching-learning process itself.

Special education presupposes the competence of resilience. The articles that focus on this theme are Mu et al. [60] and Pickl et al. [63]. The special education teacher needs resilience to deal with the adversities that the profession inevitably raises [60,63]. Resilience can be defined as "the ability to operationalize knowledge, attitudes and skills in order to prevent, minimize or overcome the harmful effects of crises and adversities" [26] (p. 167). Thus, a resilient teacher facing a stressful or adverse situation is able to mobilize his or her personal resources and adopt the behaviors that lead to success in that circumstance.

Successful inclusive teachers also display behavior characterized by self-efficacy [83,84]. This soft skill was referenced in the articles by Guo et al. [56] and Pickl et al. [63]; others do not mention it explicitly but refer to it implicitly. According to Bandura [85], self-efficacy concerns beliefs about the ability to exert self-control over individual behavior and over events affecting one's life. It is this competence that facilitates decision-making in difficult situations, since it allows one to think through and evaluate circumstances and to act with self-determination and flexibility in order to effectively achieve the objectives outlined. These data are in line with the study by Guo et al. [56], which states that there is strong evidence that teacher self-efficacy in relation to each child is an important factor in the context of inclusive education. This soft skill will therefore be necessary in the professional performance of the special education teacher.

Special education also presupposes the competence of empathy. Among the articles analyzed in this review, the one that focuses on this theme is that of Irvine [58]. The ability to listen actively to students is also essential in inclusive education [85]. Empathy consists of the ability to "listen in order to perceive the thoughts, feelings and intentions of the interlocutor, providing an adequate understanding of the situation expressed and encouragement for similar future situations" [26] (p. 80). Thus teachers, especially those in special education, need to develop this communication skill by improving not only their verbal but also their non-verbal communication, which accompanies the information exchanged through looks, gestures, and smiles, and leads interlocutors to feel understood, accepted, and encouraged.

Thus, according to the above, we can state that we have answered our research question, having identified six soft skills necessary for special education teachers to be successful in their activity.

Some limitations of this review should be noted, in particular the scarcity of scientific production on this topic: most of the available articles report empirical work focused on students rather than on teachers. Moreover, the analysis, though exhaustive, was mainly descriptive and could have been complemented by a meta-analysis.

#### **5. Conclusions**

The aim of this research was to contribute to the improvement of the field by systematically reviewing the scientific production published between 2010 and 2020 in order to identify the soft skills most evident in this area of research. From the analysis of the articles and according to the selected soft skills (resilience, reflexivity, empathy, collaborative work, self-efficacy, and effective communication), we concluded that, although each emerges in isolation, effective communication, collaborative work, and reflexivity predominate. The literature in this field is scarce, and no theoretical models were identified in the few articles found. There are thus gaps in this area, not only at the level of scientific production but also at the level of the specialized training of these teachers. As described in the literature, properly preparing professionals for these new roles and responsibilities requires the implementation of a new training model, since the challenges of the teaching career today can be more easily overcome with soft skills, in addition to the technical capacities inherent to it.

Within the framework of quality educational equity, education systems must not only manage diversity but also adopt a set of appropriate practices and strategies. In this context, special education teachers have a leading role, requiring not only innovative teaching and didactic practices and the scientific knowledge inherent to their specialty, but also a set of soft skills that can contribute to more effective, high-quality inclusive education. In this sense, in a truly inclusive school, its actors act with the development of all students in mind; without the above-mentioned skills, their performance is limited and the entire educational system is impoverished. In the current context, renewing the profile of special education teachers increasingly requires the acquisition of transversal competencies that allow them to respond effectively to the challenges of schools that, by definition, must be inclusive, promoting equity and valuing diversity, teamwork, reflexivity, and resilience. We therefore suggest investment in this area in the training programs of the schools that certify these teachers and, at the research level, the development of tools to evaluate the model emerging from this review.

**Funding:** This research received no external funding.

**Institutional Review Board Statement:** Not applicable.

**Informed Consent Statement:** Not applicable.

**Data Availability Statement:** The data supporting the findings of this study are available from the corresponding authors, upon reasonable request.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**

