Article

Digital Developmental Advising Systems for Engineering Students Based on Accreditation Board of Engineering and Technology Student Outcome Evaluations

1 IN4OBE LLC, St. Petersburg, FL 33702, USA
2 Department of Electrical & Computer Engineering, Gannon University, Erie, PA 16541, USA
* Author to whom correspondence should be addressed.
Information 2024, 15(9), 520; https://doi.org/10.3390/info15090520
Submission received: 2 July 2024 / Revised: 10 August 2024 / Accepted: 22 August 2024 / Published: 26 August 2024

Abstract:
The purpose of this research is to examine the benefits and limitations of the implementation of novel digital academic advising systems based on the principles of authentic outcome-based education (OBE) using automated collection and reporting processes for Accreditation Board for Engineering and Technology (ABET) student outcomes data for effective developmental advising. We examine digital developmental advising models of undergraduate engineering programs in two universities that employ customized features of the web-based software EvalTools® 6.0, including an advising module based on assessment methodology incorporating the faculty course assessment report, performance indicators, and hybrid rubrics classified according to the affective, cognitive, and psychomotor domains of Bloom's learning model. A case study approach over a six-year period is adopted for this research. The two case studies present results of samples of developmental advising activity employing sequential explanatory mixed methods models using a combination of quantitative and qualitative analyses of (a) detailed students' outcomes and performance indicator information and (b) self-evaluation of their professional development and lifelong learning skills. The findings of this study show that digital advising systems employing the faculty course assessment report using performance indicators and hybrid rubrics can provide comprehensive and realistic outcome data to help both developmental advisors and students easily identify the specific cause of performance failures, implement practical recommendations for remedial actions, and track improvements. Inherent strengths can also be identified in academically weak students by observing patterns or trends of relatively better-performing outcomes; advisors can then reinforce these students' natural affinity for learning specialized competencies and guide them toward related, successful career paths.

1. Introduction

Outcome-based education (OBE) is an educational theory that bases every component of an educational system around essential outcomes. At the conclusion of the educational experience, every student should have achieved the essential or culminating outcomes. Classes, learning activities, assessments, evaluations, feedback, and advising should all help students attain the targeted outcomes [1,2,3,4,5]. OBE models have been adopted in educational systems at many levels around the world today [5,6,7,8]. However, the tight race for ranking and accreditation has forced many institutions to pursue minimal requirements for the fulfillment of accreditation standards [9,10,11]. As a direct result, several aspects of established educational processes in many institutions may not truly reflect the paradigm and principles of authentic OBE [5,6,9,12,13,14,15,16,17,18,19].
Academic advising forms a fundamental aspect of OBE systems and is driven by outcome information. Established engineering institutions have implemented advanced academic advising systems to guide students in curricular or career matters. However, an extensive web search of the research literature for developmental advising based on outcomes in online databases of popular advising and engineering education journals produced no tangible information for advising systems based on skill data for individual students. The partly relevant literature that covered some form of assessment of advising either found a dearth of advising systems based on outcomes or presented samples of assessments of advising systems that do not incorporate outcome data collected from direct assessments [5,20,21,22,23,24,25,26,27,28,29,30,31,32]. Quality assurance agencies mention the importance of improving outcomes using academic advising but do not list the assessment of individual student skills as a requirement for accreditation. This is due to the staggering amount of work and resources that the manual assessment, collection, and reporting of outcome data for each student would otherwise impose on institutions. Most academic advising today is not based on accurate and realistic outcome data that provide qualitative and quantitative analysis of every student's skills but rather on summative transcript scores and abstract derivations of student–advisor communication. In this research, we present a case study of two engineering campuses implementing digital developmental advising systems based on a sequential explanatory mixed methods approach for the evaluation of accurate outcome data collected for every individual student by employing the faculty course assessment report (FCAR) embedded assessment methodology together with specific performance indicators (PIs) and their hybrid rubrics classified per Bloom's three domains and learning levels.

2. Purpose of Study

The driving force behind this research is to demonstrate the benefits of the application of the essential theory of the authentic OBE model for the implementation of a holistic and comprehensive educational process that maximizes opportunities for the attainment of successful student learning. The objective is to study the implementation of a state-of-the-art academic advising system that employs best assessment practices, such as FCAR, specific PIs, and hybrid rubrics, using digital technology to tap the maximum potential and benefits of the authentic OBE model and overcome the limitations of contemporary advising mechanisms.
In particular, the researchers sought to answer the following research questions:
  • To what extent should engineering programs shift from program- to student-centered models that incorporate learning outcomes for the evaluation of individual student performance besides program evaluations for accreditation requirements?
  • To what extent can manual assessment processes collect, store, and utilize detailed outcome data for providing effective developmental academic advising to every student on an engineering campus where several hundred are enrolled?
  • To what extent can the assessment process be automated using digital technology so that detailed outcome information for every student on campus can be effectively utilized for developmental advising?
  • What specific benefits can digital automated advising systems provide to developmental advisors and their students?

3. Research Framework

3.1. Methodology

This research involves a case study approach over a six-year period from 2014 to 2020 of two engineering campuses at the Islamic University (IU) and Gannon University (GU). A qualitative analysis of developmental advising based on outcome direct assessment data was obtained using a selective literature review covering academic advising topics with a focus on outcome assessment. An extensive web search of the last 20 years of the research literature was conducted using keywords including "outcomes", "assessment", "engineering", and "advising" in online databases of popular advising and engineering education journals, such as the National Academic Advising Association (NACADA) Journal, The Mentor, the American Society for Engineering Education's (ASEE) Journal of Engineering Education (JEE), and the Institute of Electrical and Electronics Engineers (IEEE) Transactions on Education. The partly relevant literature that covered some form of assessment of advising either employed some national or regional study and found a dearth of advising systems based on outcomes or presented some samples of assessments of advising systems that do not incorporate outcome data collected from direct assessments [5,20,21,22,23,24,25,26,27,28,29,30,31,32].
We provide an in-depth description of the theoretical, conceptual, and practical frameworks that helped establish authentic OBE pedagogy at the two engineering campuses of IU and GU and supported the implementation of state-of-the-art digital developmental advising systems based on valid and reliable direct assessment outcome data over a period of six years. We also discuss the essential elements of an authentic OBE assessment methodology utilizing the web-based digital platform EvalTools®, which employs the FCAR together with specific/generic PIs and corresponding rubrics classified per Bloom's three domains and their learning levels, to ensure the quality of direct assessment outcome data at the two campuses.
Finally, we present a sequential explanatory mixed methods approach for the examination of direct assessment outcome data and self-evaluation information using electronic diagnostic tools enabling feedback for knowledge and skill improvement in developmental advising. The sequential explanatory mixed methods approach adopted at the Islamic University’s Civil, Electrical, and Mechanical engineering programs first involved a quantitative analysis of every individual student’s skills based on SO and PI scores for a given term conveniently presented in organized digital performance reports. The quantitative analysis is then followed by a qualitative semantic analysis of the language of SOs and specific PI statements, course names, and assessment types to accurately identify the specific cause of failures or exceptional performance. Any observed trend or pattern of failures or exceptional performance for specific or related skills was easily identified using single or multi-term outcomes data conveniently presented by electronic diagnostic tools. In some cases, if the semantic analyses were inconclusive, advisors also qualitatively analyzed course instructor feedback to zone in and verify specific details of students’ performances. Once core student learning deficiency or strength was identified, developmental advisors could identify specific curricular learning activities to either recommend practical remedial actions for the improvement of required student knowledge and skills or develop a suitable plan of study to target specific career paths.
The sequential explanatory mixed methods approach adopted by the electrical engineering program at Gannon University first involved a quantitative analysis of every individual student’s skill performance in a given term, followed by a qualitative analysis of survey responses for each student’s self-evaluation of lifelong learning skills. Advisors then applied rubrics to score individual students’ survey responses corresponding to various PIs for assessing lifelong learning skills. A customized advanced feature of EvalTools® called “SOs Evaluation by Alternatives” was developed with the help of Makteam Inc. (EvalTools® by Makteam Inc., Erie, PA, USA) to aggregate the various PI results corresponding with students’ lifelong learning skills to compute the program-level SO value for a given term [33]. Finally, the single- or multi-term quantitative SO results were reviewed to estimate the attainment of program-level performance of students’ lifelong learning skills.
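To illustrate the kind of aggregation described above, the following minimal Python sketch averages advisor-assigned rubric scores for a student's survey responses into a single program-level SO estimate. The PI codes, scores, and the simple-average rule are assumptions made for illustration only; the actual computation is performed by the "SOs Evaluation by Alternatives" feature of EvalTools® [33].

# Minimal sketch (not EvalTools(R) code): aggregating advisor-assigned rubric
# scores for a student's self-evaluation survey responses into a program-level
# SO estimate. PI codes, scores, and the simple-average rule are hypothetical.

# Rubric scores (percent) keyed by the lifelong-learning PIs they correspond to.
pi_scores = {"PI_LL_1": 85.0, "PI_LL_2": 72.0, "PI_LL_3": 90.0}

def so_value(scores):
    """Average PI-level results to estimate the program-level SO value for a term."""
    return sum(scores.values()) / len(scores)

print(round(so_value(pi_scores), 1))  # single-term estimate; multi-term values are reviewed for trends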
The findings of this research highlight essential elements of authentic OBE assessment methodology that need to be incorporated into educational practice to ensure the collection of accurate student learning outcome data for establishing effective developmental advising based on outcome information. This study specifically highlights novel mixed methods approaches in digital developmental advising systems to systematically examine accurate student knowledge and skill information gathered by streamlining instructors' sequential collection and reporting of course outcome data from direct assessments for all enrolled students using the FCAR + specific PI embedded assessment methodology. The process flowcharts indicated for the qualitative and quantitative analyses of digital outcome data provide clear guidelines to developmental advisors on how to effectively conduct quick and accurate evaluations to identify patterns of failure for remedial actions or strengths for the alignment of education with successful future career paths. The process flowcharts are explained in detail in Section 5 (Results). The case study presents samples of novel investigative models exhibited by advisors' ingenious use of electronic reporting features to easily track the achievement and progress of student learning outcomes. The models can therefore act as viable prospects for the design and use of practical and effective technology-based pedagogical solutions for accurate evaluations and comprehensive advisor feedback to attain holistic developmental academic advising. The scope of this research, as outlined by the research questions, involves detailed analyses of empirical data related to sample cases of developmental advising at the two campuses, since each sample requires careful examination of an individual student's single and multi-term outcomes data and corresponding advising actions. As per authentic OBE principles, the overall impact of developmental advising models is directly measured by evaluating the attainment of SOs based on multi-year cumulative results of SO summary and trend analysis reports. If engineering programs achieve positive trend analysis results for a majority of the SOs, a "meets expectations" decision is thereby attained for the implemented digital developmental advising models. As a summary of findings, we present a detailed qualitative comparison of digital developmental and prevalent traditional advising models by using 22 pedagogical aspects in six broad areas of education that are extracted from the literature review; the theoretical, conceptual, and practical frameworks; and the results of this study.

3.2. Participants

In this sequential mixed methods research conducted from 2014 to 2020, we present samples of digital academic advising systems that employ the developmental advising model based on authentic OBE theory at the Faculty of Engineering at the Islamic University (IU) and the electrical engineering (EE) program at Gannon University (GU). The engineering programs at the two institutions were selected for this study since programs in both institutions employed the advising module of EvalTools®. The research conducted in the Faculty of Engineering at IU's Departments of Civil, Electrical, and Mechanical Engineering involved 43 faculty members and 823 students from multiple cohorts of the 4-year bachelor of science programs. This study implemented developmental advising based on accurate diagnostics and mechanisms of failure analysis extracted from observations of specific trends or patterns of deficient ABET SOs/PIs performances to enhance overall teaching effectiveness. Two random developmental advising samples of students with differing academic performance were considered for this study at the Faculty of Engineering at IU. One sample consisted of a student with above-average academic performance (3.5 < GPA < 5.0), and the other was an underperforming student with a GPA of less than 3.0. The research study conducted at the Electrical Engineering Bachelor of Science program at Gannon University involved 8 faculty members and 272 students from multiple cohorts. The EvalTools® advising module implemented at the electrical engineering program at GU involved the use of an additional feature called "SOs Evaluation by Alternatives". This research implemented self-evaluation mechanisms in advising systems that empower students with the capability of measuring their own performances in program-level learning outcomes using digital reporting features corroborated by advisor inputs to enhance lifelong learning skills. For this case, one random student advising sample with above-average academic performance (3.5 < GPA < 5.0) was considered. Students' outcome data used in this study are available online, were collected using routine direct assessments, and are presented in an anonymous format.

3.3. Developmental Advising and Its Assessment—A Qualitative Review

3.3.1. Developmental Advising

Since several approaches to advising are practiced worldwide, each with its own purposes and goals, the assessment of current advising systems or programs is a complicated affair. There are two popular approaches, traditional and developmental advising. The traditional or prescriptive approach is highly structured, with the advisor assuming an authoritative role, controlling the amount of information given and the way it is presented. Jeschke, Johnson, and Williams (2001) described traditional, or prescriptive, advising as a “quick and efficient” method, in which the advisor explains the sequence of courses that the advisee should take and makes sure the student understands the course registration process [34]. As cited by Kadar (2001), Raushi defined developmental advising as “a process that enhances student growth by providing information and an orientation that views students through a human development framework [35]”. Gordon (2019) explained that the developmental approach to advising entails focusing on the individual student’s concerns, needs, and aspirations; it is accepted as an ideal by many writers and practitioners in the field of advising [26]. Developmental advisors consider the advisee as a student who matures throughout their educational career. The developmental advisor, while assisting the student in choosing appropriate course plans, also attempts to address the needs of the transitioning student by using student development theory and providing the required information about the academic environment. The advisor is therefore required to evaluate the student’s current developmental stage and use this information to work with the student and concerned instructors to design an appropriate plan of study. This approach adopted by developmental advising can significantly enhance the effectiveness of the teaching process.
According to Appleby (2002, 2008), “Well-delivered developmental advising helps students understand why they are required to take certain classes, why they should take their classes in a certain sequence … what knowledge and skills they can develop in each of their classes … and the connection between student learning outcomes of their department’s curriculum and the knowledge and skills they will be required to demonstrate in graduate school and/or their future careers” [21,22].
Banta, Hansen, Black, and Jackson (2002) summarize the following two main schools of thought: one for prescriptive advising (the most important aspect of advising is the assurance that students register for correct courses) and the other (the developmental approach, in which knowledge, skills, academic environment, and other aspects of students’ lives must be considered) for the proper administration of student advising [36]. These two approaches have different goals and may require different approaches to assessment.
According to Campbell and Nutt (2008), the assessment of advising based on a learning-centered paradigm that focuses on outcomes must be used to understand whether student learning outcomes have been achieved [25]. Campbell (2005a, 2005b) clearly states that advising programs and administrators need systematically gathered and specific outcomes assessment data for achieving academic improvement [23,24]. Since our focus in this research is on the OBE model of education, we will concentrate on the evaluation of learning outcomes in developmental advising for the enhancement of student knowledge and skills and the effectiveness of teaching.

3.3.2. National Academic Advising Association and ABET Standards

The National Academic Advising Association (NACADA, 2023) guidelines for academic advising also state that each institution must develop its own set of student learning outcomes and the methods to assess them [37]. NACADA states that student learning outcomes for academic advising are “an articulation of the knowledge and skills expected of students as well as the values they should appreciate as a result of their involvement in the academic advising experience”. These learning outcomes answer the question, “What do we want students to learn as a result of participating in academic advising?” The assessment of student learning should be an integral part of every advising program [37].
ABET criterion 1 for accreditation specifically states “Student performance must be evaluated. Student progress must be monitored to foster success in attaining student outcomes (SOs), thereby enabling graduates to attain program educational objectives. Students must be advised regarding curriculum and career matters” (ABET, 2023) [12]. Therefore, individual student skills data or results would be both a fundamental requirement and a pivotal base for the entire academic advising process to initiate and continue successfully. In fact, the ongoing and continual assessment of individual student skills would be the litmus test for a successful academic advising process.

3.3.3. Assessing Advising

Unfortunately, even though the importance of student performance is touted far and wide throughout academia, student learning outcome information is rarely implemented and evaluated in academic advising systems. An extensive web search of the last 20 years of the research literature using keywords including “outcomes”, “assessment”, “engineering”, and “advising” in online databases of popular advising and engineering education journals, such as the National Academic Advising Association (NACADA) Journal, The Mentor, the American Society of Engineering Education’s (ASEE) Journal of Engineering Education (JEE), and the Institute of Electrical and Electronics Engineers (IEEE) Transactions on Education, produced no tangible information for advising systems based on skill data for individual students. The partly relevant literature that covered some form of assessment of advising either employed some national or regional study and found a dearth of advising systems based on outcomes or presented some samples of the assessment of advising systems that do not incorporate outcome data collected from direct assessments.
In a scholarly work on advising assessment, Lynch (2000) observed the following: one might expect that academic advising would be evaluated with somewhat the same regularity and thoroughness as classroom instruction [30]. Such is not the case. In its fifth national survey of academic advising based on eleven criteria [27], American College Testing (ACT) found that the evaluation of advising programs and academic advisors received the ninth and tenth lowest effectiveness ratings. Lynch (2000) concluded that designing comprehensive assessments for advising is a complex affair because advising systems can differ based on various theoretical models and are also implemented at various levels of complexity [30]. Swing (2001), in his work for the Policy Center on the First Year of College, noted that only 63% of academic advising programs are regularly evaluated [32]. Aiken-Wisniewski, Smith, and Troxel (2010) suggested that the literature lacks evidence of advisor access to information related to program assessment and evaluation, specifically citing a need for advising units to design curricula with intentionality, i.e., curriculum planning for the assessment of SOs [20].
Recently, Powers, Carlstrom, and Hughey (2014) stated that best practices of academic advising assessment involve the identification of student learning outcomes, the development and use of multiple measures of student learning, and sound professional judgment to understand the information gathered and improve student learning [31]. In their exhaustive national study, 499 individuals were invited from US NACADA regions (NACADA, 2019), and data were collected from 291 people, resulting in a 58% response rate. Out of this number, 230 (46% of the invited participants) offered complete data. The highest percentage of participants by institution type came from public and private, nonprofit, doctoral degree-granting institutions (37.8%, n = 87). A total of 53.0% (n = 122) of participants reported job responsibilities associated with institution-wide undergraduate advising. Collected demographic data indicated that most held the title of advising director/coordinator (45.7%, n = 105), and 21.7% (n = 50) said they work as an academic advisor. The title of assistant/associate dean described 9.6% (n = 22) of the respondents, while 5.2% (n = 12) identified themselves as dean. The fewest self-reported being a faculty advisor (1.7%, n = 4). A total of 87% (n = 200) of the participants indicated having some direct advising responsibilities, with 32.6% (n = 75) representing situations exclusive to professional advisors and 20.0% (n = 46) from situations in which only faculty advisors were employed [31]. The national study was based on a survey of academic advising assessment; the researchers noted that the assessment results often come from minimal, narrow, and inconsistent evaluation practices, often based on student satisfaction surveys. To generate a better picture of the current state of assessment, they surveyed those conducting or deemed responsible for academic advising assessment [31]. Although 80% of survey participants identified academic advising student learning outcomes, one-half assessed the achievement of those outcomes, with most using student surveys [31].
It is evident from the research literature that most advising systems use student surveys and do not use actual knowledge and skill information collected from direct assessments to verify the progress of academic advising. He and Hutson (2017) suggested that advisors need to incorporate direct assessment into advising to demonstrate value to the institution and contribute to the scholarship of advising [28]. A recent research study by Kraft-Terry and Cheri (2019) attempts to fill the gap in advising integrated with the assessment of SOs [29]. However, unfortunately, they defined seven generic SOs targeting student knowledge of university admission standards, graduation policies, the development of academic plans, and the identification and utilization of appropriate campus resources to achieve academic success. They state the importance of student-centered advising systems but refer to GPA as a benchmark for identifying academic success. In summary, their work emphasizes the importance of integrating SOs into the curriculum with a mechanism called backward design, but the SOs they propose were not related to curricular content and therefore could not be integrated into the direct assessments that comprise curriculum delivery. Their work did not provide a detailed mechanism or institutional resource to identify performance failures related to SO data aligned with curricular assessments [29]. At best, most attempts to integrate SO assessment and evaluation with academic advising did not apply the essential principles of OBE by overlooking the fact that SOs should align tightly with teaching and curriculum. This oversight is actually intentional due to a lack of information in the literature related to digital advising systems that are based on authentic OBE methodology and target the attainment of SOs fully integrated and aligned with the delivery and assessment of curricular course content.
Accreditation assessment models exacerbate the situation by suggesting manual processes for the data collection of SO information using generic and vague performance criteria and rubrics, which are based on the selection of small samples of students in select courses [12,13,17,18,38,39,40,41,42]. It is generally observed, as in the Gloria Rogers (GR) model [5,12,43,44,45], that most program evaluation models do not incorporate a comprehensive and accurate assessment of all students using specific performance criteria and corresponding rubrics. Contrary to OBE systems, the GR program evaluation model is program-centered, collects “relevant” pieces of information supposedly sufficient for evaluating a program, and does not implement a comprehensive assessment of all students’ performance. This renders the collected SO information inaccurate and insufficient to evaluate individual student performance for academic advising [5,43,44,45]. OBE-based developmental academic advising systems should employ student learning outcome information collected sequentially for all enrolled students using direct assessments in various phases of the educational process to evaluate the progression of advising. To resolve this dilemma, engineering institutions, programs, and quality assurance agencies should promote comprehensive assessment models that employ specific performance criteria and corresponding rubrics to implement authentic OBE using web-based digital technology [5,13,38,39,40,41,42,43,44,45,46].
In this research, we examine digital advising models that are based on authentic OBE frameworks and employ the FCAR embedded assessment methodology and specific PIs classified per Bloom's three domains and learning levels to track student knowledge and skills for effective developmental advising. We elaborate on several theoretical, conceptual, and practical frameworks that have been established over a 6-year time period, with intensive team efforts to support holistic pedagogy and multi-dimensional benefits for curriculum design and delivery, strategies of teaching and learning, and assessment and evaluation (Hussain et al., 2020) [47]. The frameworks show how the digital developmental advising models presented in this research specifically address major deficiencies of prevalent advising by utilizing pedagogical solutions that (a) support the automated collection and reporting of valid and reliable outcomes data for every individual enrolled student, (b) collect accurate outcome data using specific PIs and hybrid rubrics that are accurately aligned with intended course topics and their learning activity, (c) provide high-precision qualification for student attainment of holistic learning by assessing specific PIs classified per Bloom's three domains and their learning levels, (d) enable novel mixed methods approaches for the quick and accurate evaluation of student failure and/or strength based on detailed objective assessment data for achieving effective developmental advising, and (e) enable students to easily access detailed multi-term outcome data, reinforce remediation efforts with close collaboration and follow-up with advisors, and use outcome-based self-evaluations to enhance their metacognition and lifelong learning skills.

4. Theoretical, Conceptual, and Practical Frameworks

The philosophy, paradigm, premises, and principles of authentic OBE form the basis for theoretical frameworks that lead to the development of crucial models that act as the foundation of the Integrated Quality Management Systems implemented at the Faculty of Engineering. Several essential concepts are then induced from OBE theory, assessment best practices, and ABET criterion 4, CR4, on continuous improvement [12]. Essential techniques and methods based on this conceptual framework are then constructed as a practical framework of automation tools, modules, and digital features of a state-of-the-art web-based software EvalTools® [47].

4.1. Theoretical Framework

OBE Model

Educational institutions following the OBE model should ensure that all learning activities, assessments, evaluations, feedback, and advising help students attain the targeted outcomes. International and regional quality assurance (QA) agencies and academic advising organizations strongly recommend that educational institutions implement academic advising based on learning outcomes. However, many engineering programs' advising systems follow the traditional or prescriptive approach based on summative transcript scores and do not utilize individual students' learning outcome information for enhancing students' knowledge and skills. Most developmental advising systems implement a scaled-down, limited model that does not employ the Accreditation Board for Engineering and Technology (ABET) learning outcomes and detailed performance indicator information for every enrolled student [12]. To better understand the scope of this research and the limitations of current advising systems for outcome-based approaches, we begin with a brief introduction to some essential elements of OBE that were developed by the High Success Network [3,4,6,48].
The keys to having an outcomes-based system are as follows:
  • Developing a clear set of learning outcomes, around which all of the educational system’s components can be focused;
  • Establishing the conditions and opportunities within the educational system that enable and encourage all students to achieve those essential outcomes.
OBE’s two key purposes that reflect its “Success for all students and staff” philosophy are as follows:
  • Ensuring that all students are equipped with the knowledge, competence, and qualities needed to be successful after they exit the educational system;
  • Structuring and operating schools so that those outcomes can be achieved and maximized for all students.
In this research, we specifically concentrate on the following two major aspects advocated by an authentic OBE model:
  • All components of the education system, including academic advising, should be based on, achieve, and maximize a clear and detailed set of learning outcomes for each student;
  • All students should be provided with detailed real-time and historical records of their performance based on learning outcomes to make informed decisions for improvement actions.
Therefore, all components of educational systems that implement an OBE model should focus on aiding all students to successfully attain the targeted outcomes for achieving the intended learning aimed by the curriculum.

4.2. Conceptual Framework

FCAR + Specific/Generic Performance Indicator Assessment Model

Figure 1 shows a comprehensive continual quality improvement (CQI) process flow for an FCAR + specific/generic performance indicator (PI) model classified per Bloom’s three domains and a three-level skill grouping methodology [43,44] adopted by institutions implementing the developmental advising model in this research. ABET criteria for CQI [12] have been implemented in the assessment model, which requires course faculty and academic advisors to make decisions using assessment data collected from students and other program constituencies, ensuring a comprehensive CQI process. This requires the development of quantitative/qualitative measures to ensure that students have satisfied the course outcomes (COs), which are measured using a set of specific or generic PIs/assessments and, consequently, the program-level ABET SOs [12,15,40,41,44,47]. Course faculty are directly involved in the teaching and learning process, using detailed outcomes results to interact closely with advisors and providing students with on-time feedback for performance improvement. On the other hand, models that involve assessment teams that are not directly involved with the students will not support real-time comprehensive CQI, which is an essential element of an authentic OBE system [1,3,4,5,48,49,50]. Such CQI processes do not involve on-time course faculty and advisor interactions based on real-time, relevant, and detailed outcome information for improving performance failures and severely limit comprehensive quality improvement efforts. An ideal CQI cycle would therefore include the course faculty in most levels of its process to generate and execute action items that can directly target real-time improvement in student performance for ongoing courses. The noteworthy aspect of this model is that course faculty work closely with academic advisors and are involved directly with students in most CQI processes, whether at the course or program level.
A "design down" [3,5,6,48,49,50] mapping model was developed as shown in Figure 2, exhibiting the authentic OBE design-down flow from goals, program educational objectives (PEOs), SOs, course objectives, and COs to PIs. This figure illustrates trends in the levels of breadth, depth, specificity, and details of technical language related to the development and measurement of the various components of a typical OBE "design down" process [3,5,6,48,49,50]. Goals and objectives are futuristic in tense and use generic language for broad application. The term "w/o" (without) in the figure highlights the essential characteristics of goals and objectives. Goals and objectives do not contain operational action verbs, field-specific nominal subject content, or performance scales. Student and course outcomes do not contain performance scales. Performance scales should be implemented with the required descriptors in rubrics [44,45,47]. The FCAR + PI model uses the excellent, adequate, minimal, and unsatisfactory (EAMU) performance levels in rubrics. The reliability and validity of outcome assessment are ensured using an elaborate set of generic and specific PIs and their corresponding hybrid rubrics [44]. The hybrid rubric is a combination of the holistic and analytic rubrics, developed to address the following issues related to validity and reliability: precision; the accuracy of assessment alignment with outcomes and PIs; inter- and intra-rater reliability; and the specificity of acceptable student performance when assessing complex and highly specialized engineering activities. The hybrid rubric is an analytic rubric embedded with a holistic rubric to cater to the assessment of several descriptors that represent all the required major steps of a specific student learning activity for each PI/dimension listed [44].
The research reported in [44] provides procedures for developing and implementing hybrid rubrics for the accurate assessment of PIs related to each CO. These rubrics developed by groups of course specialists in each program are stored in a digital database and provide both faculty and students with clear and accurate details of expected performance in various student learning activities based on the “high expectations” principle of authentic OBE [3,5,6,47,48,49]. Figure 3 shows a sample portion of the database listing the hybrid rubrics, EAMU scales, descriptors, and percentages of score allocations for the electrical engineering program’s PIs 55, 56, and 57 associated with ABET SO “e” (SO_5), as follows: “An ability to identify, formulate and solve engineering problems”. Performance criteria as defined by instructors in the descriptors for the various scales of the hybrid rubrics override the general performance criteria shown in Table 1 to provide academic freedom in assessment.
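As a rough illustration of how a hybrid rubric entry might be organized, the short Python sketch below models one PI with weighted descriptors for its major steps and generic EAMU-level descriptors. This is not the EvalTools® database schema; the field names, PI statement, step weights, and descriptor text are hypothetical placeholders consistent with the description above.

# Minimal data-model sketch of a hybrid rubric entry, as described in the text.
# Not the EvalTools(R) schema; all names, weights, and descriptors are hypothetical.

from dataclasses import dataclass, field

@dataclass
class HybridRubric:
    pi_code: str                                 # PI linked to an ABET SO
    pi_statement: str                            # specific PI language
    steps: dict = field(default_factory=dict)    # major step -> share of PI score
    eamu_descriptors: dict = field(default_factory=dict)  # scale -> descriptor

rubric = HybridRubric(
    pi_code="PI_5_55",
    pi_statement="Formulate an engineering problem from a word description.",
    steps={
        "Identify knowns and unknowns": 0.25,
        "Select governing equations": 0.25,
        "Set up a solvable formulation": 0.50,
    },
    eamu_descriptors={
        "E": "All major steps complete and correct.",
        "A": "Minor omissions in one step.",
        "M": "One major step missing or incorrect.",
        "U": "Formulation not attempted or fundamentally flawed.",
    },
)
print(sum(rubric.steps.values()))  # step shares sum to 1.0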

4.3. Practical Framework—Digital Platform EvalTools®

EvalTools® (Information on EvalTools®) is chosen as the platform for outcome assessment over Blackboard® (http://www.blackboard.com) [51] since it is the only tool that employs the faculty course assessment report (FCAR) and EAMU performance vector methodology [43,45,46,52,53,54,55,56,57]. This embedded assessment methodology employing specific PIs facilitates the effective use of routine course assessments for outcome measurement to achieve a high level of automation of the data collection process. The EvalTools® FCAR module provides summative/formative options and consists of the following components: course description, CO indirect assessment, grade distribution, course reflections, old action items and new action items, CO direct assessment, PI assessment, SO assessment, assignment list, and learning domain and skill level assessment distribution [45,46,55,56,57]. The FCAR uses a performance vector conceptually based on a performance assessment scoring rubric developed by Miller and Olds (1999) [58]. Course instructors collect PI data from a set of course assignments, which are presented in the form of an “EAMU performance vector” categorizing aggregate student performance. The EAMU performance vector counts the number of students who passed the course whose proficiency for that outcome was rated excellent, adequate, minimal, or unsatisfactory as defined by the following: excellent: scores ≥ 90%; adequate: scores ≥ 75% and <90%; minimal: scores ≥ 60% and <75%; and unsatisfactory: scores < 60%. The EAMU performance vector constitutes a direct measure of aggregate student performance that neatly encapsulates information into categories, which can then be quickly reviewed for indicators of non-standard performance. In addition to the performance vector, the instructor reports details regarding assignments used for acquiring the data, along with any relevant observations. Heuristic rules and indicator levels for performance vectors called EAMU have been explained in research related to the FCAR [45,46,55,56,57]. To study the application of this methodology in actual course examples, the scales, indicator levels for the EAMU, and heuristic rules for the performance vector have been listed in Table 1 below. As mentioned earlier, the descriptors for EAMU scales shown in Table 1 are generic and applied to all PIs unless instructors opt to apply topic-specific descriptors of hybrid rubrics for assessing certain PIs of interest.
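To make the EAMU categorization concrete, the following minimal Python sketch counts how many students fall into each performance level for a single performance indicator, using the score thresholds quoted above. It is an illustration only, not EvalTools® code, and the sample scores are hypothetical.

# Minimal sketch of the EAMU categorization described above.
# Thresholds follow the text: E >= 90, 75 <= A < 90, 60 <= M < 75, U < 60.
# The student scores below are hypothetical.

def eamu_vector(scores):
    """Count Excellent/Adequate/Minimal/Unsatisfactory ratings for one PI."""
    vector = {"E": 0, "A": 0, "M": 0, "U": 0}
    for score in scores:
        if score >= 90:
            vector["E"] += 1
        elif score >= 75:
            vector["A"] += 1
        elif score >= 60:
            vector["M"] += 1
        else:
            vector["U"] += 1
    return vector

# Hypothetical PI scores (percent) for students who passed the course.
pi_scores = [95, 88, 73, 91, 59, 78, 66, 84]
print(eamu_vector(pi_scores))  # {'E': 2, 'A': 3, 'M': 2, 'U': 1}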
In Figure 4, we see the performance vector for a mechanical engineering course, THERMODYNAMICS 1, showing the performance of 11 students for several course outcomes (COs). In this clipped portion of the entire table generated by EvalTools®, we see COs 1, 2, and 3 assessed for all 11 students in the class using multiple assessments. The aggregation of different types of assessments aligned to a specific learning outcome at the course level is achieved using a scientific weighted averaging scheme. This scheme gives priority to certain types of assessments over others based on their coverage of learning domains, the percentage of course grading scales, and the maturity of student learning at the time of the assessment. Hussain, Mak, and Addas (2016) provided details of this weighted averaging approach at the 123rd annual conference and exposition of the American Society for Engineering Education (ASEE), Columbus, Ohio, in 2016 [43]. The CO1, “Explain fundamental concepts of thermodynamics and Analyze systems that use pressure measurement devices”, is assessed for every student in the class using relevant multiple assignments, such as homework 1 (HW_1), quiz 1 (QZ_1), and midterm-1 question 1 (Mid-Term-1 Q-1), which are aligned to specific performance indicators and are aggregated together using this scientific weighted averaging scheme [43]. The performance vector provides details of each student’s performance in multiple assessments aligned to performance indicators that correspond to all the COs in the course. EvalTools®, employing the FCAR assessment model, facilitates the electronic storage of the outcome and assessment information for each student collected from several courses during every term. The FCARs from each course are further processed into a performance vector table (PVT) for each SO (Information on EvalTools®) [33].
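The weighted aggregation of course assessments into a CO score, described above for CO1, can be sketched roughly as follows. The weights and scores shown are hypothetical placeholders, not the published scheme of Hussain, Mak, and Addas [43]; the sketch only shows how assessments of different types mapped to one CO might be combined into a single CO score for one student.

# Illustrative sketch of combining several assessments aligned to one CO using
# type-based weights. Weights and scores are hypothetical; the actual weighting
# scheme is described in [43].

# Hypothetical relative weights by assessment type.
TYPE_WEIGHTS = {"homework": 1.0, "quiz": 1.5, "midterm": 2.5}

# One student's scores (percent) on assessments aligned to CO1.
co1_assessments = [
    {"name": "HW_1",           "type": "homework", "score": 82.0},
    {"name": "QZ_1",           "type": "quiz",     "score": 74.0},
    {"name": "Mid-Term-1 Q-1", "type": "midterm",  "score": 68.0},
]

def weighted_co_score(assessments, weights):
    """Weighted average of assessment scores for a single course outcome."""
    total_weight = sum(weights[a["type"]] for a in assessments)
    return sum(weights[a["type"]] * a["score"] for a in assessments) / total_weight

print(round(weighted_co_score(co1_assessments, TYPE_WEIGHTS), 1))  # 72.6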
Figure 5 shows the PVT with information of all assessed PIs for ABET SO “h” (SO8), as follows: “Broad education necessary to understand the impact of engineering solutions in a global, economic, environmental and societal context”. The PIs assessed for each student corresponding to a specific SO are then averaged with weights based on a three-level skill grouping methodology [5,43,52,53,54]. This aggregation methodology ensures that PI information corresponding to various skill levels collected from multiple course levels for a specific SO is averaged by weights that consider both the level of the course (mastery, reinforced, or introductory) and skill (elementary, intermediate, or advanced). This gives the highest precedence to an advanced skill measured in a mastery-level course. The EvalTools® FCAR methodology deployed for the assessment of student learning outcomes has facilitated the effective integration of outcome data in advising systems to help individual students fulfill expected knowledge and skill requirements for their plan of study. The methodology implemented by EvalTools® supports the developmental advising process based on learning outcomes information. A YouTube video also presents some details of the features of this module and how individual student skill data are collected by using specific PIs and course assessments and then integrated by faculty into academic advising [59]. A digital database of essential and accurate outcomes information for all students provided by the EvalTools® advising module effectively supports developmental advising based on the principles of authentic OBE.
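A simplified sketch of the three-level weighting idea is shown below. The numeric weights are hypothetical placeholders chosen only so that an advanced skill measured in a mastery-level course receives the highest weight; the actual weights and averaging rules belong to the published methodology [5,43,52,53,54].

# Sketch of rolling PI results up to an SO with course-level x skill-level weights.
# The weights and PI results below are hypothetical placeholders.

COURSE_LEVEL_WEIGHT = {"introductory": 1.0, "reinforced": 2.0, "mastery": 3.0}
SKILL_LEVEL_WEIGHT  = {"elementary": 1.0, "intermediate": 2.0, "advanced": 3.0}

# Hypothetical PI results for one student contributing to one SO.
pi_results = [
    {"pi": "PI_8_12", "score": 71.0, "course_level": "introductory", "skill_level": "elementary"},
    {"pi": "PI_8_20", "score": 64.0, "course_level": "reinforced",   "skill_level": "intermediate"},
    {"pi": "PI_8_33", "score": 58.0, "course_level": "mastery",      "skill_level": "advanced"},
]

def so_score(results):
    """Weight each PI by course level x skill level, then average."""
    weights = [COURSE_LEVEL_WEIGHT[r["course_level"]] * SKILL_LEVEL_WEIGHT[r["skill_level"]]
               for r in results]
    return sum(w * r["score"] for w, r in zip(weights, results)) / sum(weights)

print(round(so_score(pi_results), 1))  # the mastery/advanced PI dominates the average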

4.4. Practical Framework—Summary of Digital Technology and Assessment Methodology

In summary, several essential elements were implemented by the institutions involved in this research to ensure the outcome data collected for every student represent realistic and accurate information for academic advising, as follows:
  • Measurement of outcome information in all course levels of a program curriculum, as follows: introductory (100-/200-level course), reinforced (300-level course), and mastery (400-level course). Engineering fundamentals and concepts are introduced in 100-/200-level courses, then they are reinforced in 300-level courses through application and analysis problems, and finally, in 400-level courses, students attain mastery in skills with activities such as synthesis and evaluation [43,52,53,54];
  • The faculty course assessment report (FCAR) utilizing the excellent, adequate, minimal, and unsatisfactory (EAMU) performance vector methodology [45,55,56,57];
  • Well-defined performance criteria for course and program levels;
  • A digital database of specific PIs [43,44,47,52,53,54] and their hybrid rubrics classified as per Bloom’s revised three domains of learning and their associated levels (according to the three-level skill grouping methodology);
  • Unique assessment mapping to one specific PI [43,44,52,53,54];
  • Scientific constructive alignment for designing assessments to obtain realistic outcome data representing information for one specific PI per assessment [1,2,4,5,43,47,52,53,54,60,61];
  • Integration of direct, indirect, formative, and summative outcomes assessments for course and program evaluations;
  • Calculation of program- and course-level ABET SOs, as well as CO data based upon weights assigned to the type of assessments, PIs, and course levels [43,52];
  • Program, as well as student performance, evaluations considering their respective measured ABET SOs and associated PIs as a relevant indicator scheme;
  • Six comprehensive plan-do-check-act (PDCA) quality cycles to ensure the quality standards, monitoring, and control of the education process, instruction, assessment, evaluation, CQI, and data collection and reporting [47];
  • Customized web-based software EvalTools® facilitating all of the above (Information on EvalTools®) [33].

5. Results

Transcript grades are composite performance results derived from an aggregation of several hundred specific student learning activities in any given discipline. It is impossible to extract student performance related to specific curricular content, skill levels, or learning domains from composite transcript grades. As shown in another YouTube video presentation [62], digital transcripts with a detailed list of specific student performance metrics corresponding to ABET SOs would provide an excellent source of information for academic advising, career counseling, and recruitment. Digital transcripts would help academic advisors easily identify deficient skills in students with excellent GPAs or patterns of high-performing activity in academically weak students. In both cases, focused advising in specific areas would significantly help students identify their weaknesses or strengths for appropriate on-time corrective action or career path selection. EvalTools® advising module’s diagnostic tools are used by developmental advisors for a qualitative and quantitative review of detailed reports of single and multi-term ABET SO, PI, and assessment information.

5.1. Mixed Methods Approach for Student Evaluations Using Automated ABET SO Data

In this study, the researchers establish the attainment of effective developmental advising by employing a sequential mixed methods approach involving a combination of quantitative and qualitative analyses of advising information, such as individual student’s SOs, PIs, assessments, instructor feedback, and self-evaluation data. Quantitative analyses involved identifying failing performance for SOs/PIs based on observing red and yellow flags, corresponding scores, and any patterns or trends of serial failures or exceptional performance. Qualitative analyses of specific performance for SOs/PIs included the semantic analysis of the language of failing SO/PI statements, type of assessments, instructor feedback, and/or responses for students’ self-evaluation of their lifelong learning skills. Figure 6 provides a detailed process flow for the mixed methods approach adopted by the Faculty of Engineering programs for implementing effective developmental advising. The objective of the evaluation is to identify patterns or trends of failure or exceptional performance so that developmental advisors can target either the development of remedial action or career path plans, respectively. As mentioned earlier, authentic OBE advising models contribute to providing holistic education for students by either helping them achieve mastery by identifying and overcoming deficiencies or promoting exceptional talent by identifying and developing inherent skills for successful career paths. Therefore, locating patterns of exceptional performance helps advisors identify core learning strengths, and recognizing patterns of deficient performance helps them identify core student learning deficiencies.
For the benefit of effective training in the field of developmental advising, we consider two types of students with differing academic performance, one with an above-average GPA (greater than 3.5 on a 5.0 scale) and another with a below-average GPA (less than 3.0 on a 5.0 scale). The approach for failure analysis involves three levels of evaluation depending upon the nature and complexity of student performance. Level 1 is straightforward, involving quantitative and qualitative analyses of SO data. Level 3 is more complex and involves a mixed methods evaluation of SOs, PIs, assessments, and FCAR instructor reflections. All three levels involve locating patterns or trends of failure or exceptional performance for identifying core learning deficiency or strength, subsequently leading to the development of precision plans for remedial action or career path selection, as follows:
  • Level 1: Quantitative review of single- or multi-term SO data, followed by a qualitative semantic analysis of the language of SO statements coupled with a qualitative review of curriculum and course delivery information;
  • Level 2: Quantitative review of single- or multi-term SO and PI data, followed by a qualitative semantic analysis of the language of SO and PI statements, course titles, and assessment types coupled with a qualitative review of curriculum and course delivery information;
  • Level 3: Quantitative review of single- or multi-term SO and PI data, followed by a qualitative semantic analysis of the language of SO and PI statements, course titles, and assessment types coupled with a qualitative review of curriculum, course delivery, and FCAR instructor reflection information.
After identifying the core learning deficiencies, advisors qualitatively review the curriculum and course delivery information to map the learning deficiency to key learning activities that students can target in specific phases of the curriculum or course delivery for precision remedial action and subsequent improvement. When core learning strengths are identified, experienced advisors use this information to recommend a focused plan of study for suitable career paths aligned with students’ inherent skills.

5.2. Automated ABET SO Data for Every Enrolled Student

Specific performance indicator information collected for every enrolled student using digital technology and appropriate methodology, such as the FCAR PVT, if scientifically applied to academic advising and adequately popularized in academia and industry, would revolutionize how institutions can provide learning improvement and career opportunities. A much broader spectrum of learning improvement and career opportunities can then be generated for all students with any range of academic performance. To substantiate this statement, let us look at an electrical engineering (EE) student evaluation in Figure 7.
A consolidated view of ABET SO information calculated from PI measurements is shown for three consecutive terms. The student skill SO data are realistic and correspond closely with actual student performance since 10 essential elements of the assessment model [43,53,54,55] have been implemented to ensure that outcome data are as accurate as possible.

5.3. Quantitative and Qualitative Analyses of Each Student’s ABET SO Data

As discussed earlier, academically high-performing students with above-average transcript grades may have failing or underperforming skills. The EvalTools® advising module provides detailed skill information, as shown in Figure 8, for such a case of an above-average student of the EE program. A logical, structured, and deductive sequential mixed methods approach for failure analysis provides quick and accurate results due to the utilization of elaborate web-based quantitative analytical tools and scientific diagnostics based on outcome information that seamlessly streamlines the feedback and performance improvement processes [44,52]. Developmental advisors first review quantitative single- or multi-term SO data for a student of interest. The identification of failures is a straightforward process that involves locating red flags in the SO results. The red flags indicate EAMU-based aggregate averages below 60%, as mentioned in the performance criteria listed in Figure 3. Once advisors identify single or multiple SO failures, they are required to qualitatively review the semantics of failing outcomes statements to deduce any patterns of failure that are based on a common and core deficiency in student learning. For the case shown in Figure 8, the red flags clearly highlight a pattern of failures related to ABET SOs “h”, “i”, and “j”, corresponding to the study of the impact of engineering solutions, lifelong learning, and contemporary issues, respectively. Based on a qualitative semantic analysis of the language of SO statements, the pattern of failure observed in this case refers to a core deficiency in developing a good comprehension of issues related to contemporary engineering solutions. The last step dealing with the development of remedial actions involves mapping the core deficiency to student learning activities in a specific phase of the curriculum. Obviously, developmental advisors would require comprehensive knowledge of the curriculum and mechanism of course delivery to complete this step and create accurate remedial action for targeted improvement. As mentioned earlier, if needed, advisors can also view course instructor reflections on failures by using an advanced feature of the advising module called FCAR activation mode. Coming back to the case under discussion, developmental advisors can accurately target certain skills for improving the deduced core student learning deficiency by prescribing specific student learning activities that are based on self-motivated research, coupled with an ability to elaborate and compare the benefits and limitations of contemporary engineering solutions based on their impact on societal, environmental, and economic aspects. To implement effective CQI for achieving holistic learning, it is necessary for advising systems to provide easy access to detailed outcome information for every individual student and to represent data in a convenient format, resulting in the quick identification of failures. Such advising systems will promote the early identification of areas of weakness in performance for otherwise “successful” students to better prepare them for the challenges of leading career roles.
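The quantitative screening step described above can be approximated with a short script that flags failing SOs in a student's multi-term record and surfaces repeated failures for the advisor's qualitative semantic analysis. The red-flag threshold (below 60%) follows the text; the yellow-flag band and the sample SO averages are assumptions for illustration only.

# Sketch of the quantitative screening step: flag failing SOs across terms and
# surface repeated failures for qualitative review. The red-flag threshold (< 60%)
# follows the text; the yellow band and the SO averages below are hypothetical.

RED_MAX = 60.0      # stated in the text: aggregate averages below 60%
YELLOW_MAX = 75.0   # assumed "at risk" band for illustration only

# Hypothetical multi-term SO averages (percent) for one student.
so_by_term = {
    "Term 1": {"SO_h": 55.0, "SO_i": 58.0, "SO_j": 52.0, "SO_a": 81.0},
    "Term 2": {"SO_h": 57.0, "SO_i": 61.0, "SO_j": 54.0, "SO_a": 84.0},
}

def flag(score):
    """Classify an SO average as red, yellow, or ok."""
    if score < RED_MAX:
        return "red"
    if score < YELLOW_MAX:
        return "yellow"
    return "ok"

# Collect SOs that are red-flagged in more than one term (a candidate pattern).
red_counts = {}
for term, scores in so_by_term.items():
    for so, score in scores.items():
        if flag(score) == "red":
            red_counts[so] = red_counts.get(so, 0) + 1

pattern = sorted(so for so, n in red_counts.items() if n > 1)
print(pattern)  # e.g., ['SO_h', 'SO_j'] -> handed to the advisor for semantic analysis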

5.4. Quantitative and Qualitative Analyses of Each Student’s PI and Assessment Data

Let us proceed further and review the EvalTools® advising module's color-coded representation of PIs in the comprehensive evaluation of each SO for an individual student. Figure 8 shows, for an individual EE student, a detailed list of the PIs, assessments, weighting factors, and course information utilized for the multi-term quantitative evaluation of ABET SO "a" (SO_1): "an ability to apply knowledge of mathematics, science, and engineering". It also indicates aspects of the assessment model that directly contribute to the high level of accuracy required for aggregating SO and PI data for academic advising. The outcome information is computed using weighting factors based on the three-level skill methodology for aggregating multiple skills measured with various types of assessments by multiple raters in several courses over multiple terms [43,52,53,54]. Advisors can use these comprehensive, color-coded SO evaluations to easily identify patterns or trends in failures related to specific types of skills. A detailed examination of academically weak students' performance can also reveal areas of strength in learning that stem from the students' natural affinity for and interest in certain topics of the curriculum.
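As a simplified illustration of this aggregation step, the Python fragment below computes a weighted SO value from PI measurements gathered across courses and terms; the data structures, course codes, and weights are assumptions for illustration and do not reproduce the EvalTools® three-level skill weighting scheme.

# Simplified sketch of weighted SO aggregation; class names, courses, and weights
# are assumptions and do not reproduce the EvalTools(R) three-level methodology.

from dataclasses import dataclass

@dataclass
class PIMeasurement:
    pi_code: str      # e.g., "PI_1_41"
    course: str       # course in which the assessment was given (hypothetical code)
    term: str         # academic term code, e.g., "351"
    score_pct: float  # student score (0-100) on the assessment mapped to this PI
    weight: float     # weighting factor (WF) reflecting assessment type/skill level

def aggregate_so(measurements: list[PIMeasurement]) -> float:
    """Weighted average of all PI measurements contributing to one SO."""
    total_weight = sum(m.weight for m in measurements)
    if total_weight == 0:
        raise ValueError("no weighted measurements for this SO")
    return sum(m.score_pct * m.weight for m in measurements) / total_weight

# Hypothetical measurements contributing to SO "a" over two terms:
so_a = [
    PIMeasurement("PI_1_41", "EE_201", "351", 86.0, 1.0),
    PIMeasurement("PI_1_45", "EE_212", "351", 74.0, 1.5),
    PIMeasurement("PI_1_12", "EE_331", "352", 58.0, 2.0),
]
print(f"SO 'a' aggregate: {aggregate_so(so_a):.1f}%")  # 69.6% for these hypothetical inputs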
In Figure 9, the quantitative multi-term evaluation of ABET SO “a” (SO_1) for a typical underperforming EE student shows certain areas of comparatively better patterns of learning that are highlighted in green. We observe that PI_1_12 (“Employ basic electrical power formulations and quantities, such as complex vectors, delta star transformation, network flow matrices (network topology and incidence matrices) and symmetrical components”), PI_1_41 (“Convert a given number from one system to an equivalent number in another system”), and PI_1_45 (“Explain basic semiconductors theory concepts such as applied electrical field, junction capacitance, drift/diffusion currents, semiconductor conductivity, doping, electron, hole concentrations, N-type, P-type semiconductors”) show better performance and are in stark contrast to the majority of the other PIs measured for the two terms, 351 and 352. Based on a qualitative semantic analysis of the language of PI statements and types of course assessments, one significant observation is that these three PIs measure elementary math skills and engineering concepts and cover relatively easier topics, such as Boolean algebra. The other failing PIs, which deal with topics such as the operating principles of various electronic devices and components, the application of Gauss’s Law, Maxwell’s equations, etc., require several advanced engineering concepts coupled with a basic understanding of differential and integral calculus.
Upon further analysis of the FCAR instructor reflections, it was confirmed that many students exhibited a minimal understanding of differential and integral calculus. Therefore, the core learning deficiency for this advising case was identified as the poor understanding and application of differential and integral calculus. A deficiency in such fundamental knowledge would require students to refresh their calculus basics through additional math coursework or tutorial sessions. The failing PI information also strongly suggests that students had initiated learning with the required level of interest but, at later stages of the course, needed other mechanisms of course delivery, such as active learning, to retain focus and interest. On the other hand, developmental advisors can also use performance information related to the student's core learning strength in digital electronics and the application of Boolean algebra for solving circuits to suggest concentration on specific learning activities in core and elective courses, such as Microprocessors and Digital System Design, to enhance career prospects as a digital design or test engineer. Student advising based on such a mixed methods approach would help faculty identify potential areas of strength or weakness in student performance through the observation of patterns of relatively high or low scores for certain ABET SOs and their corresponding PIs.
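The pattern-spotting step described above can also be mimicked programmatically. The short sketch below (hypothetical PI codes and scores, and an arbitrarily chosen strength margin) separates a student's PI results for one SO into failing PIs and relative strengths, which is essentially what an advisor reads off the red and green color coding in Figure 9.

# Sketch with hypothetical PI scores: split one SO's PI results into failures and
# relative strengths, mirroring the red/green color coding an advisor reads in Figure 9.

PASS_THRESHOLD = 60.0   # EAMU "unsatisfactory" cutoff (%)
STRENGTH_MARGIN = 10.0  # points above the student's own mean marking a relative strength (arbitrary choice)

def classify_pis(pi_scores: dict[str, float]) -> tuple[list[str], list[str]]:
    mean = sum(pi_scores.values()) / len(pi_scores)
    failing = [pi for pi, s in pi_scores.items() if s < PASS_THRESHOLD]
    strengths = [pi for pi, s in pi_scores.items()
                 if s >= PASS_THRESHOLD and s >= mean + STRENGTH_MARGIN]
    return failing, strengths

pi_scores = {"PI_1_12": 82.0, "PI_1_41": 88.0, "PI_1_45": 79.0,   # relative strengths
             "PI_1_22": 48.0, "PI_1_30": 52.0, "PI_1_37": 44.0}   # failing PIs (hypothetical codes)
failing, strengths = classify_pis(pi_scores)
print("Failing PIs (candidates for remediation):", failing)
print("Relative strengths (candidates for career-oriented guidance):", strengths)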
Specific/generic PIs corresponding to the ABET Engineering Accreditation Commission (EAC) SOs for a given program, targeting a variety of skill sets and measured using specific assessments in multiple courses for each student, form the main source of diagnostic information on which effective failure analyses depend. Remediation based on the identification of strong or deficient performance is a vast and complex topic that deserves a dedicated research article elaborating on the detailed steps implemented by academic advisors for CQI. In general, advisors trained in degree plans and course and PI requirements propose to their advisees specific areas of requisite knowledge, learning strategies, and relevant course activities for improvement. Advisors also communicate with the concerned faculty members to provide performance information on students and to highlight course content for concentration and preferable teaching strategies. The EvalTools® advising module maintains a digital repository of notes containing advisor and student meeting information. Advisors electronically report specific information related to deficiencies in outcomes and suggestions for improvement for each student. Concerned faculty and program administrators regularly access the recorded digital advising information for follow-up and the implementation of student-specific remediation efforts. Advisors also provide effective career counseling by aligning students' intended career paths with specializations that match their top-performing skills. Institutions or programs that do not employ appropriate digital technology and assessment methodology to implement automation and the principles of authentic OBE have no option for outcome-based advising and instead rely on traditional mechanisms based on transcript grades.

5.5. An Outcome-Based Advising Example

The initial format for academic advising based on outcome data in the Faculty of Engineering was implemented in the spring term of 2017. The first iteration of the outcome advising format required advisor input for the advisee's consolidated 11-SO summary of results. A sample of three terms' summarized 11-SO evaluation data for a typical EE student is shown in Figure 7. The summary of SO results was categorized in the notes to the advisees as excellent, adequate, minimal, or unsatisfactory, according to the performance criteria presented in Figure 4. The advisor would then specifically focus on the SOs marked as unsatisfactory to provide guidance on areas of improvement and corrective actions. Figure 10 shows a typical outcome-based advising sample for an EE student with consistent failure in SO_8, which relates to understanding the impact of engineering solutions on economic, societal, and environmental aspects. In this case, the advisor identified learning activities in capstone courses to target improvement actions specific to SO_8. Students are required to periodically review the notes documented by the advisor for improvement actions; the green check mark in the top left-hand corner of Figure 10 indicates that the advisee viewed the advisor's electronic notes. Based on the positive feedback received from advisees regarding performance enhancement under the initial format of outcome-based advising, the Faculty of Engineering continued to employ this advising format to improve student performance through the close of the 2019 academic year. The second planned iteration would expand the advising format to cover a review of performance indicator and assessment information, producing specific guidance for advisees focused on course areas and involving the concerned faculty members.

5.6. Students as Active Participants

Student empowerment is an integral component of achieving successful learning in authentic OBE systems. The electrical engineering program at Gannon University used the digital reporting features provided by EvalTools® to implement student self-evaluation processes related to the ABET EAC student learning outcomes. These student self-reviews are then corroborated by academic advisor input to further guide advisees toward effective approaches to self-motivated improvement. As a case study, the program selected soft skills mandated by student outcomes, such as lifelong learning, that are not easily measured with the assignments given in course activities and factored them into the advising process. By doing so, academic advisors attempted to achieve both goals of empowering students as active participants in their learning and measuring the soft skills critical to student outcomes. For SO9, the recognition of the need for, and an ability to engage in, lifelong learning, the three corresponding PIs that the electrical engineering program targeted are as follows:
(1) PI_9_1: demonstrate self-managing ability to articulate the student's own learning goals;
(2) PI_9_2: demonstrate self-monitoring ability to assess the student's own achievements;
(3) PI_9_3: demonstrate self-modifying ability to make mid-course corrections.

Process for Measuring Soft Skills in Student Advising Activities

For each of the PIs to be assessed, Figure 11 shows the questionnaires and instructions that assist students in addressing the relevant issues. The questionnaires are based on the principles of metacognition. They guide students to examine their own learning progress and achievements against the intended performance skills and student outcomes. The advisees first identify areas of strength and weakness in specific skills and knowledge areas. They are then required to track their performance and check whether they are meeting the required standards of the program's student outcomes.
Finally, the advisees develop a plan of remedial action for the overall improvement of performance related to the deficient outcomes. Students are asked to address each of the questionnaires before advising day and submit their responses electronically to EvalTools® for review by their advisors.
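For illustration only, the record below sketches one way such an electronic self-evaluation submission could be structured; the field names, identifiers, and sample text are assumptions and do not reflect the internal EvalTools® schema.

# Illustrative structure for an advisee's SO9 self-evaluation submission
# (assumed field names and identifiers; not the EvalTools(R) schema).

from dataclasses import dataclass, field

@dataclass
class SO9SelfEvaluation:
    student_id: str
    term: str
    pi_9_1_goals: str        # self-managing: the student's own learning goals
    pi_9_2_assessment: str   # self-monitoring: the student's assessment of achievements
    pi_9_3_corrections: str  # self-modifying: planned mid-course corrections
    concerned_sos: list[str] = field(default_factory=list)  # SOs the student flags as weak
    remedial_plan: str = ""

submission = SO9SelfEvaluation(
    student_id="EE-1023",  # hypothetical identifier
    term="Fall 2017",
    pi_9_1_goals="Strengthen the math fundamentals needed for circuits and electronics courses.",
    pi_9_2_assessment="SO3 and SO7 improved over spring 2017; SO4 and SO11 remain areas of concern.",
    pi_9_3_corrections="Visit the tutoring center weekly to review calculus and track key assignments.",
    concerned_sos=["SO4", "SO11"],
    remedial_plan="Complete all key assignments on time and meet the advisor at mid-term.",
)
print(submission.concerned_sos)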

5.7. Quantitative and Qualitative Analyses of Student Responses and Overall SO Results

Figure 12 provides a detailed process flow for the mixed methods approach adopted by the electrical engineering program at GU for implementing effective developmental advising. The objective of this approach is to achieve both goals of empowering students as active participants in their learning and measuring soft skills, such as lifelong learning, corresponding with ABET EAC SO "i" (SO "9") [12]. Advisees are required to respond every term to a questionnaire that targets the lifelong learning skills measured by ABET SO "9". As mentioned in the earlier section, advisees make a qualitative self-evaluation based on their SO results for the previous term and submit their responses addressing the various lifelong learning aspects of self-management, self-monitoring, and self-modification.
Advisors then qualitatively evaluate the student responses to PI_9_1, PI_9_2, and PI_9_3 for validity and provide supplemental feedback that corroborates consistent student observations or refines the recommended remedial actions for improvement. Developmental advisors also use scoring rubrics to assess the student responses for PI_9_1, PI_9_2, and PI_9_3 and categorize them as E, A, M, or U performance. The SO "9" results for a complete set of cohorts are then aggregated for a given term to qualitatively evaluate the overall program-level attainment of lifelong learning skills. Finally, developmental advisors quantitatively review the detailed multi-term ABET SO (a–k) data and provide comments and feedback to advisees regarding crucial observations and remedial actions for improvement.
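A minimal sketch of that categorization step is shown below, assuming advisor scores expressed as percentages and the EAMU bands listed in Table 1; the function name and sample scores are illustrative.

# Minimal sketch: map a 0-100 rubric score onto the EAMU bands from Table 1.
# The function name and the sample scores are illustrative.

def eamu_category(score_pct: float) -> str:
    if score_pct >= 90:
        return "E"  # Excellent (90-100)
    if score_pct >= 75:
        return "A"  # Adequate (75-90)
    if score_pct >= 60:
        return "M"  # Minimal (60-75)
    return "U"      # Unsatisfactory (0-60)

# Hypothetical advisor scores for one advisee's SO9 self-evaluation responses:
responses = {"PI_9_1": 92.0, "PI_9_2": 78.0, "PI_9_3": 55.0}
print({pi: eamu_category(score) for pi, score in responses.items()})
# -> {'PI_9_1': 'E', 'PI_9_2': 'A', 'PI_9_3': 'U'}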
Figure 13 illustrates a sample of the overall attainment of student performance measured against all the SOs from fall 2016 to fall 2017. The questionnaires were developed in the spring of 2017, and students were asked to submit their responses from the fall of 2017 onward. Based on the EAMU performance criteria shown in Figure 4, the red or "U" (unsatisfactory) flags (aggregate SO values < 60%) and the yellow or "M" (minimal) flags (aggregate SO values between 60% and 75%) indicate areas of concern. The advisor qualitatively examined this student's responses concerning his results for spring 2017 and fall 2017. In this case, the student provided comments related to the self-evaluation of his overall performance in the program SOs, identified areas of strength and weakness, and suggested possible remedial actions for improvement. In addition, the three key aspects of the questionnaire (manage, monitor, and modify) help students focus on and significantly improve the SO9 lifelong learning skills.
Figure 14 shows sample responses from this student. The student noted areas of concern in four SOs for spring 2017, namely SO3, SO4, SO7, and SO9, and in SO4 and SO11 for his fall 2017 entries. He identified and articulated areas for improvement and suggested visiting the tutoring center to improve his math skills. For his spring 2018 entries, he noted his performance in these areas and identified other knowledge and/or skills for improvement. On advising day, the faculty advisor electronically documented the necessary observations after reviewing the student's submission. Since the SO9 lifelong learning skills are based on how students manage, monitor, and modify their own learning progress, advisors are required first to qualitatively assess the student's self-assessment, make the student aware of specific areas of improvement, especially where the student's self-evaluation is consistent with the advisor's observations, and then make general comments regarding the overall status of SO attainment as measured by the direct quantitative assessment of student performance.
In this developmental advising case, Figure 15 shows an overall improvement in the SOs of concern from fall 2016 to fall 2017, except for the obvious red flag for SO4, which is related to teamwork. At first glance, the red flag highlights a problematic area indicating student failure in SO4. However, upon further investigation using detailed diagnostics with the drill-down menu that lists the key assignments contributing to the final SO performance in the interface shown in Figure 15, the advisor quickly identified the specific cause of the failing score for SO4. This student received a "0" score for not submitting a specific key assignment in the ECE327 Senior Design course. Therefore, the final SO4 aggregate value resulted in a low score of 43.50%, below 60%, and was marked with a red flag. This was not a "real" performance failure. The unsubmitted assignment was also noted by the student in his own self-assessment.
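The arithmetic behind this diagnosis can be illustrated with a small, purely hypothetical example (the actual number, names, and weights of the key assignments for this student are not reported here): a single zero recorded for an unsubmitted key assignment pulls an otherwise acceptable, equally weighted aggregate below the 60% cutoff, while recomputing over the submitted work alone shows acceptable performance.

# Purely hypothetical scores: how one missing key assignment recorded as 0 can drag
# an equally weighted SO aggregate below the 60% cutoff.

submitted = {"Design report": 82.0, "Team evaluation": 76.0, "Peer review": 71.0}
missing = {"Final team presentation": 0.0}  # not submitted, recorded as 0

all_scores = list(submitted.values()) + list(missing.values())
print(f"Aggregate including the zero: {sum(all_scores) / len(all_scores):.2f}%")        # 57.25% -> red flag
print(f"Aggregate of submitted work:  {sum(submitted.values()) / len(submitted):.2f}%")  # 76.33% -> acceptable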
Since this advisee's other key assignments pertaining to the measurement of SO4 had acceptable scores, his developmental advisor concluded that the student had achieved the required performance levels for the SO4 skill. In addition to the failing or "U" (unsatisfactory) red flag specific to SO4 in Figure 13, the yellow flags indicate SOs with "M" (minimal) performance, that is, aggregate values ranging from 60% to 75%. However, upon close observation, the results for the final year (fall 2017) showed an overall improvement in all SOs. Figure 16 illustrates in detail the observations and recommendations reported by this student's developmental advisor in the digital advising records for both the spring 2017 and fall 2017 terms. Based on a comprehensive evaluation of overall student performance, the developmental advisor therefore concluded that this student adequately met the required performance standards for the SOs of concern.
On advising day, developmental advisors not only deliver a comprehensive evaluation of student progress/performance related to the program’s student outcomes but also directly assess the student’s lifelong learning skills, as reported by SO9. This advising process repeats each semester so that both advisors and advisees can monitor performance for the program SOs, specifically those related to SO9 for lifelong learning skills. To reiterate, one of the key components of OBE is to establish the conditions and opportunities within the educational system that enable and encourage all students to achieve those essential outcomes. In this case, students are given multiple opportunities to achieve the essential outcomes throughout the curriculum in different courses and are also made aware of their progress in meeting them.

5.8. Added Advantage for Evaluating Advising at the Program Level

In addition to the advisor's feedback on student responses for the SO9 performance skills, faculty advisors also directly measure each student's attainment of the PIs for SO9 by rating them using a score-based rubric, the excellent, adequate, minimal, unsatisfactory (EAMU) performance vector, as shown in Figure 17.
The individual results for SO9 are then automatically rolled up, as shown in Figure 18, which indicates the advising effectiveness for a given term and the achievement of SO9. Consider ABET_PI_9_1 in Figure 18: of the 44 students assessed for PI_9_1, 19 were rated E, 20 A, 4 M, and 1 U.
Hence, the U category accounts for 2.27% of the ratings, and the PI has an overall average of 3.83 out of a maximum rating of 5, which is interpreted as an attainment of 76.6%. The corresponding PI_9_1 is therefore color-coded with no flag (a "white" flag) to indicate that the attainment target was met; refer to Figure 4 for a detailed classification of the EAMU performance criteria. These roll-up data are reviewed, along with other SO data, for program evaluation. Although program evaluation is not the focus of this paper, the direct involvement of students in self-evaluation as part of their advising activities has also provided a constructive means to gauge the effectiveness of advising at the program level. The trend observed in our advising systems shows an overall enhancement of student learning in achieving the desired student outcomes. We believe this is due to improved metacognition, especially once students are aware of the specific causes of their failures and align their remedial actions with their advisors' recommendations targeting specific knowledge and skill areas for improvement. In general, our experience with advising systems that directly involve advisees in self-evaluating their attainment of SOs has been profoundly beneficial, resulting in a significant enhancement of students' lifelong learning skills.
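The heuristic flag rules from Table 1 and the roll-up figures quoted above can be checked with a few lines of code (a sketch only; the 3.83 average is taken as reported by EvalTools® rather than recomputed from the E/A/M/U counts):

# Sketch of the PVT heuristic flag rules in Table 1, applied to the PI_9_1 roll-up.
# The 0-5 average (3.83) is taken as reported; it is not recomputed from the counts.

def pvt_flag(average: float, pct_unsatisfactory: float) -> str:
    low_avg = average < 3.3
    high_u = pct_unsatisfactory > 10.0
    if low_avg and high_u:
        return "Red"
    if low_avg or high_u:
        return "Yellow"
    if average > 4.6 and pct_unsatisfactory == 0.0:
        return "Green"
    return "No flag"

counts = {"E": 19, "A": 20, "M": 4, "U": 1}   # ratings reported for ABET_PI_9_1 in Figure 18
n = sum(counts.values())                      # 44 students
pct_u = 100.0 * counts["U"] / n               # 2.27%
attainment = 100.0 * 3.83 / 5.0               # 76.6%
print(f"U proportion: {pct_u:.2f}%, attainment: {attainment:.1f}%, flag: {pvt_flag(3.83, pct_u)}")
# -> U proportion: 2.27%, attainment: 76.6%, flag: No flag (the "white" flag in the text)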

6. Discussion

6.1. Quality Standards of Digital Developmental Advising Systems

OBE models advocate student-centered impact evaluations to qualify education systems; these primarily include the monitoring of overall improvement in SO performance over time [5,6,47]. The multi-term ABET SO (a–k) summary and detailed trend analysis reports (2014–2018) for the engineering programs at both institutions indicated positive trends for the majority of the SOs, thereby providing objective evidence that holistic student learning was attained through a comprehensive education process integrating both curriculum delivery and developmental advising systems. From the OBE perspective, the positive multi-term SO trends indicate sustainable systems in which students and staff collaborate successfully to achieve CQI [5,6,47]. The lengthy multi-term SO reports for the programs at both institutions were also reviewed by an external advisory committee or industrial advisory board, as an institutional and accreditation requirement for the approval of major components of education delivery, CQI processes, and subsequent improvements, and they have been reported in detail in recent publications [47].
Additionally, the engineering programs at the GU and IU campuses attained six full years of ABET accreditation in 2018 and 2020, respectively, with exceptional results, through the fulfillment of the nine mandatory ABET EAC criteria, with auditors reporting several program strengths and no documented weaknesses or concerns [47]. The digital academic systems of the programs at GU and IU were qualified through the fulfillment of ABET Criterion 1 on students, which deals with feasibility analyses of accredited programs' student enrollment, training, academic advising, and graduation details. Positive and credible internal and external reviews of and feedback on the digital advising systems implemented at both institutions confirm compliance with international quality standards by comprehensively including student and staff perspectives on the sustainability of the advising systems, their attainment of academic goals, and overall quality improvement.

6.2. Qualitative Comparison of Digital Developmental Advising with Prevalent Traditional Advising

Most engineering programs use vague and generic language for outcomes that does not follow a consistent format based on authentic OBE frameworks. Usually, generic and holistic rubrics without detailed topic-specific descriptors are applied to assess program outcomes. Manual assessment models do not assess all students but rather use sampling methods to fulfill minimal accreditation requirements. Generally, advisors do not have access to outcome assessment or evaluation information. The quality of direct outcome assessment data and their availability for each individual student are therefore the two key factors that drive the initiative for outcome-based developmental advising. The literature review of this study reveals a dearth of advising systems based on valid and reliable outcome data collected from direct assessments. Digital developmental advising models employing the FCAR + specific PI methodology offer solutions to the major deficiencies observed in prevalent advising. Table 2 summarizes several important pedagogical aspects extracted from the literature review, frameworks, and results of this study, which are used as key quality criteria for qualitatively comparing the benefits of digital developmental advising over prevalent traditional advising systems. The 22 pedagogical aspects act as overarching multi-dimensional quality standards in six broad areas of education: authentic OBE and conceptual frameworks, assessment practices, quality of outcomes data, staff, students, and process.

6.3. Research Questions

The purpose of this study was to present a state-of-the-art academic advising system that employs best assessment practices and digital technology to tap the maximum potential and benefits of the authentic OBE model and overcome the limitations of contemporary advising mechanisms.

6.3.1. Research Question 1: To What Extent Should Engineering Programs Shift from Program- to Student-Centered Models That Incorporate Learning Outcomes for the Evaluation of Individual Student Performances besides Program Evaluations for Accreditation Requirements?

Based on the literature review of this research, engineering programs following the OBE model should implement a student-centered approach and provide all students with accurate and detailed outcome evaluation information to achieve the two major aspects advocated by an authentic OBE model, as referenced in Section 4.1 of this paper.

6.3.2. Research Question 2: To What Extent Can Manual Assessment Processes Collect, Store, and Utilize Detailed Outcome Data for Providing Effective Developmental Academic Advising to Every Student on an Engineering Campus Where Several Hundred Are Enrolled?

As per the numerous sources cited in Section 3 and Section 4 of this paper, it is practically impossible for manual assessment processes to collect, store, and utilize the staggering amounts of outcome data required for the effective advising of several hundred students on any engineering campus.

6.3.3. Research Question 3: To What Extent Can the Assessment Process Be Automated Using Digital Technology So That Detailed Outcomes Information for Every Student on Campus Can Be Effectively Utilized for Developmental Advising?

After an exhaustive study of the research material related to assessment and evaluation referenced in Section 3 and Section 4, the authors conclude that digital systems implementing assessment methodologies such as the FCAR + specific/generic PIs with embedded assessments and PVTs can collect, store, and utilize detailed and accurate outcome information for developmental advising, regardless of the size of the enrolled student population.

6.3.4. Research Question 4: What Specific Benefits Can Digital Automated Advising Systems Provide to Developmental Advisors and Their Students?

The advantages of digital advising systems with guided student self-evaluation in meeting outcomes are summarized as follows:
  • Detailed and accurate digital advising records for advisors and advisees showing trends and summaries of performance results for SOs, PIs, and their corresponding assessments;
  • A consistent and structured mechanism, based on mixed methods approaches, for advisors to assess student performance in meeting outcomes and to focus on specific, relevant, and constructive advice for quality improvement;
  • Enhanced student metacognition through accurate indications of strong and weak skills and/or knowledge areas, supporting corrective actions toward meeting student outcomes each semester;
  • Alignment of students' self-directed remedial actions with the developmental advisor's recommendations to reinforce overall performance improvement.

6.4. Limitations

Popular LMS tools such as Blackboard® and Moodle® do not offer embedded assessment technology or the FCAR with PI classification per Bloom's taxonomy. Therefore, the EvalTools® advising module is an option for schools interested in implementing automated advising systems based on outcomes. However, such advising modules cannot operate independently and must be integrated with outcome assessment systems. The efficacy of an outcome-based advising module depends upon the accurate alignment of course assessments with specific learning outcomes and PI information. Advising based on any form of unreliable or inaccurate outcome data would be counterproductive, if not damaging. Engineering programs cannot rely on minimal accreditation standards alone to ensure the quality of their outcome assessment processes. Therefore, an apparent limitation of implementing digital advising systems, such as the advising module offered by EvalTools®, is that several measures, such as the ten essential elements for establishing quality in assessment and evaluation processes mentioned in Section 4.4 of this paper, would have to be mandated by schools to ensure accurate and reliable outcome data for advising. Lastly, since EvalTools® maintains student data in a Google Cloud-based environment, schools requiring the local storage of advising and outcome information would have to provide additional resources, technical support, and the required technology for managing student data on their own local servers.

6.5. Future Work

To minimize accreditation effort, most engineering programs limit the full scope of their program outcomes by implementing a strict alignment with the ABET SOs, despite clear instructions from ABET itself to treat its SOs as a non-restrictive, fundamental quality standard. Programs may enhance the list of ABET SOs and their associated PIs to include additional relevant technical and transversal skills. Programs can also track socioeconomic, health, or other factors by introducing related SOs and corresponding PIs. This would require the development and implementation of specific direct and/or indirect assessment instruments in the quality process to evaluate the SOs and PIs for such skills or factors. Advisors can then evaluate socioeconomic, health, or other factors using the measured SO and PI data. Advisors can also choose to further enhance their evaluation by requesting additional information from students through relevant questionnaires incorporated into the advisee's self-evaluation feature.
In this study, we focused on developmental advising systems based on ABET SO data collected from direct assessments; the ABET SOs are aligned with the International Engineering Alliance's Washington Accord graduate attributes. Studying socioeconomic, health, or other factors impacting student performance would be an interesting prospect for future work, requiring the development and implementation of additional SOs and PIs and their specific assessment instruments.

7. Conclusions

The demand for higher education is ever increasing, with student achievement and accountability posing the biggest challenges to improving the quality of higher education. To meet these challenges, an OBE model for student learning, along with several quality standards in higher education, has been adopted by accreditation agencies and educational institutions over the past two decades. With thousands of institutions and programs in a tight race for ranking and accreditation, the prevalent understanding and implementation of authentic OBE and CQI need clarification. Referring to the paradigm, purpose, premises, and principles of authentic OBE, every component of an educational process must be based on achieving essential outcomes. Academic advising is a core component of the educational process. It helps students properly align with degree and curriculum requirements and provides them with the necessary guidance for achieving outcomes, graduation, and successful career prospects. Therefore, in the OBE model, advising should be based on and driven by outcome information. NACADA has also clearly stated the importance of student outcomes in defining and implementing academic advising [37]. Academic advising is, moreover, a major criterion for the fulfillment of regional and international accreditation standards.
In our extensive experience with teaching and accreditation, we have yet to come across academic advising systems that are based on accurate and detailed outcome, PI, and assessment information collected for every individual student. Although NACADA has stated the importance of student learning outcomes for academic advising [37], several institutions, both within the US and abroad, that have adopted manual assessment models have been unable to provide advising systems based upon accurate and detailed outcome, PI, and assessment information for each student. Education systems that rely on deficient manual assessment processes lead to misinformed decisions for students because accurate and detailed learning outcome information is either delayed or unavailable. Consequently, poor choices of field of study or professional career path can lead to a wide spectrum of academic or career-related failures.
This research presents an in-depth description of the theoretical, conceptual, and practical frameworks that helped to establish authentic OBE pedagogy at the two engineering campuses of IU and GU. The pedagogical models support the implementation of state-of-the-art digital developmental advising systems based on valid and reliable outcome data from direct assessments. The accuracy of these data is ensured by applying the essential elements of an authentic OBE assessment methodology on a digital platform, the web-based software EvalTools®, which employs the FCAR, specific/generic PIs, and corresponding rubrics classified per Bloom's three domains and their learning levels. EvalTools®, coupled with the FCAR + specific PI methodology, streamlines pedagogical processes for the effective collection and evaluation of detailed outcome and assessment information for every student in a higher education institution with thousands of enrolled students. The findings of this study indicate that digital developmental advising models based on authentic OBE frameworks specifically address the major deficiencies of prevalent advising since they utilize pedagogical solutions that (a) support the automated collection and reporting of valid and reliable outcome data for every individual enrolled student, (b) collect accurate outcome data using specific PIs and hybrid rubrics that are accurately aligned with intended course topics and their learning activities, (c) provide high-precision qualification of student attainment of holistic learning by assessing specific PIs classified per Bloom's three domains and their learning levels, (d) enable novel mixed methods approaches for the quick and accurate evaluation of student failures and/or strengths based on detailed objective assessment data for effective developmental advising, and (e) enable students to easily access detailed multi-term outcome data, reinforce remediation efforts through close collaboration and follow-up with advisors, and use outcome-based self-evaluation to enhance their metacognition and lifelong learning skills.
The novel mixed methods investigative models, using analytical tools that facilitate comprehensive diagnostics, as shown in several examples in this paper, enable the accurate and early identification of learning deficiencies for prompt remediation. Likewise, the early recognition of strong skills in specific engineering activities, observed through distinct patterns in diagnostic reports of student performance, can be followed by precise academic guidance toward gaining knowledge and skills in associated areas for the overall attainment of holistic expertise. This approach results in the on-time, precise developmental advising that students need to make well-informed decisions when selecting relevant areas of specialization in education, research, training, or industry, helping them evolve into outstanding performers who apply the highest standards to better shape the future of the world we live in today.

Author Contributions

Conceptualization, W.H., W.G.S., and M.F.; methodology, W.H., W.G.S., and M.F.; software, W.H. and M.F.; validation, W.H. and M.F.; formal analysis, W.H., W.G.S., and M.F.; investigation, W.H. and M.F.; resources, W.H. and M.F.; data curation, W.H. and M.F.; writing—original draft preparation, W.H. and M.F.; writing—review and editing, W.H., W.G.S., and M.F.; supervision, W.G.S.; project administration, W.H. and M.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

There are no publicly available data.

Acknowledgments

This research is based on the results of a rigorous 5-year program for the implementation of a comprehensive outcome-based education model involving the curriculum, teaching, learning, advising, and other academic and quality assurance processes for the CE, ME, and EE departments in the Faculty of Engineering at the Islamic University in Madinah and the EE department at Gannon University. The program efforts were directly led by Mak Fong. The authors thank the faculty members for their cooperation and support in completing the necessary quality assurance and academic teaching processes that enabled the collection of the necessary results.

Conflicts of Interest

Author Mak Fong was employed by the company Makteam Inc. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Killen, R. Teaching Strategies for Outcome Based Education, 2nd ed.; Juta & Co.: Cape Town, South Africa, 2007. [Google Scholar]
  2. Moon, J. Linking Levels, Learning Outcomes and Assessment Criteria. Bologna Process. European Higher Education Area. 2000. Available online: http://aic.lv/ace/ace_disk/Bologna/Bol_semin/Edinburgh/J_Moon_backgrP.pdf (accessed on 19 March 2022).
  3. Spady, W. Choosing outcomes of significance. Educ. Leadersh. 1994, 51, 18–23. [Google Scholar]
  4. Spady, W. Outcome-Based Education: Critical Issues and Answers; American Association of School Administrators: Arlington, VA, USA, 1994. [Google Scholar]
  5. Spady, W.; Hussain, W.; Largo, J.; Uy, F. Beyond Outcomes Accreditation; Rex Publishers: Manila, Philippines, 2018; Available online: https://www.rexestore.com/home/1880-beyond-outcomes-accredidationpaper-bound.html (accessed on 19 March 2022).
  6. Spady, W. Outcome-Based Education’s Empowering Essence; Mason Works Press: Boulder, CO, USA, 2020; Available online: http://williamspady.com/index.php/products/ (accessed on 21 March 2022).
  7. Harden, R.M. Developments in outcome-based education. Med. Teach. 2002, 24, 117–120. [Google Scholar] [CrossRef] [PubMed]
  8. Harden, R.M. Outcome-based education: The future is today. Med. Teach. 2007, 29, 625–629. [Google Scholar] [CrossRef] [PubMed]
  9. Adelman, C.; National Institute of Learning Outcomes Assessment (NILOA). To imagine a Verb: The Language and Syntax of Learning Outcomes Statements. 2015. Available online: http://learningoutcomesassessment.org/documents/Occasional_Paper_24.pdf (accessed on 21 March 2022).
  10. Provezis, S. Regional Accreditation and Student Learning Outcomes: Mapping the Territory; National Institute of Learning Outcomes Assessment (NILOA): Urbana, IL, USA, 2010; Available online: www.learningoutcomeassessment.org/documents/Provezis.pdf (accessed on 21 March 2022).
  11. Gannon-Slater, N.; Ikenberry, S.; Jankowski, N.; Kuh, G. Institutional Assessment Practices across Accreditation Regions; National Institute of Learning Outcomes Assessment (NILOA): Urbana, IL, USA, 2014; Available online: www.learningoutcomeassessment.org/documents/Accreditation%20report.pdf (accessed on 21 March 2022).
  12. Accreditation Board of Engineering & Technology (ABET) USA. Accreditation Criteria. 2023. Available online: http://www.abet.org/accreditation/accreditation-criteria/ (accessed on 11 December 2023).
  13. Dew, S.K.; Lavoie, M.; Snelgrove, A. An engineering accreditation management system. In Proceedings of the 2nd Conference Canadian Engineering Education Association, St. John’s, NL, Canada, 6–8 June 2011. [Google Scholar] [CrossRef]
  14. Essa, E.; Dittrich, A.; Dascalu, S.; Harris, F.C., Jr. ACAT: A Web-Based Software Tool to Facilitate Course Assessment for ABET Accreditation. Department of Computer Science and Engineering, University of Nevada. 2010. Available online: http://www.cse.unr.edu/~fredh/papers/conf/092-aawbsttfcafaa/paper.pdf (accessed on 12 April 2022).
  15. Kalaani, Y.; Haddad, R.J. Continuous improvement in the assessment process of engineering programs. In Proceedings of the 2014 ASEE South East Section Conference, Macon, GA, USA, 30 March–1 April 2014. [Google Scholar]
  16. International Engineering Alliance (IEA). Washington Accord Signatories. 2023. Available online: https://www.ieagreements.org/accords/washington/signatories/ (accessed on 11 December 2023).
  17. Middle States Commission of Higher Education. Standards for Accreditation, PA, USA. 2023. Available online: https://www.msche.org/ (accessed on 11 December 2023).
  18. Mohammad, A.W.; Zaharim, A. Programme outcomes assessment models in engineering faculties. Asian Soc. Sci. 2012, 8. [Google Scholar] [CrossRef]
  19. Wergin, J.F. Higher education: Waking up to the importance of accreditation. Change 2005, 37, 35–41. [Google Scholar]
  20. Aiken-Wisniewski, S.A.; Smith, J.S.; Troxel, W.G. Expanding Research in Academic Advising: Methodological Strategies to Engage Advisors in Research. NACADA J. 2010, 30, 4–13. [Google Scholar] [CrossRef]
  21. Appleby, D.C. The teaching-advising connection. In The Teaching of Psychology: Essays in Honor of Wilbert J. McKeachie and Charles L. Braver; Davis, S.F., Buskist, W., Eds.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2002. [Google Scholar]
  22. Appleby, D.C. Advising as teaching and learning. In Academic Advising: A Comprehensive Handbook, 2nd ed.; Gordon, V.N., Habley, W.R., Grites, T.J., Eds.; Jossey-Bass: San Francisco, CA, USA, 2008; pp. 85–102. [Google Scholar]
  23. Campbell, S. Why do assessment of academic advising? Part I. Acad. Advis. Today 2005, 28, 8. [Google Scholar]
  24. Campbell, S. Why do assessment of academic advising? Part II. Acad. Advis. Today 2005, 28, 13–14. [Google Scholar]
  25. Campbell, S.M.; Nutt, C.L. Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Rev. 2008, 10, 4–7. [Google Scholar]
  26. Gordon, V.N. Developmental Advising: The Elusive Ideal. NACADA J. 2019, 39, 72–76. [Google Scholar] [CrossRef]
  27. Habley, W.R.; Morales, R.H. Advising Models: Goal Achievement and Program Effectiveness. NACADA J. 1998, 18, 35–41. [Google Scholar] [CrossRef]
  28. He, Y.; Hutson, B. Assessment for faculty advising: Beyond the service component. NACADA J. 2017, 37, 66–75. [Google Scholar] [CrossRef]
  29. Kraft-Terry, S.; Cheri, K. Direct Measure Assessment of Learning Outcome–Driven Proactive Advising for Academically At-Risk Students. NACADA J. 2019, 39, 60–76. [Google Scholar] [CrossRef]
  30. Lynch, M. Assessing the effectiveness of the advising program. In Academic Advising: A Comprehensive Handbook; Gordon, V.N., Habley, W.R., Eds.; Jossey-Bass: San Francisco, CA, USA, 2000. [Google Scholar]
  31. Powers, K.L.; Carlstrom, A.H.; Hughey, K.F. Academic advising assessment practices: Results of a national study. NACADA J. 2014, 34, 64–77. [Google Scholar] [CrossRef]
  32. Swing, R.L. (Ed.) Proving and Improving: Strategies for Assessing the First Year of College; (Monograph Series No. 33); University of South Carolina, National Resource Center for the First-Year Experience and Students in Transition: Columbia, SC, USA, 2001. [Google Scholar]
  33. Information on EvalTools®. Available online: http://www.makteam.com (accessed on 11 December 2023).
  34. Jeschke, M.P.; Johnson, K.E.; Williams, J.R. A comparison of intrusive and prescriptive advising of psychology majors at an urban comprehensive university. NACADA J. 2001, 21, 46–58. [Google Scholar] [CrossRef]
  35. Kadar, R.S. A counseling liaison model of academic advising. J. Coll. Couns. 2001, 4, 174–178. [Google Scholar] [CrossRef]
  36. Banta, T.W.; Hansen, M.J.; Black, K.E.; Jackson, J.E. Assessing advising outcomes. NACADA J. 2002, 22, 5–14. [Google Scholar] [CrossRef]
  37. National Academic Advising Association (NACADA). Kansas State University, KS, USA. 2023. Available online: https://nacada.ksu.edu/ (accessed on 11 December 2023).
  38. Ibrahim, W.; Atif, Y.; Shuaib, K.; Sampson, D. A Web-Based Course Assessment Tool with Direct Mapping to Student Outcomes. Educ. Technol. Soc. 2015, 18, 46–59. [Google Scholar]
  39. Kumaran, V.S.; Lindquist, T.E. Web-based course information system supporting accreditation. In Proceedings of the 2007 Frontiers in Education Conference, San Diego, CA, USA, 10–13 October 2007; Available online: https://asu.elsevierpure.com/en/publications/web-based-course-information-system-supporting-accreditation (accessed on 11 March 2022).
  40. McGourty, J.; Sebastian, C.; Swart, W. Performance measurement and continuous improvement of undergraduate engineering education systems. In Proceedings of the 1997 Frontiers in Education Conference, Pittsburgh, PA, USA, 5–8 November 1997; IEEE Catalog no. 97CH36099. pp. 1294–1301. [Google Scholar]
  41. McGourty, J.; Sebastian, C.; Swart, W. Developing a comprehensive assessment program for engineering education. J. Eng. Educ. 1998, 87, 355–361. [Google Scholar] [CrossRef]
  42. Pallapu, S.K. Automating Outcomes Based Assessment. 2005. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.199.4160&rep=rep1&type=pdf (accessed on 12 April 2022).
  43. Hussain, W.; Mak, F.; Addas, M.F. Engineering Program Evaluations Based on Automated Measurement of Performance Indicators Data Classified into Cognitive, Affective, and Psychomotor Learning Domains of the Revised Bloom’s Taxonomy. In Proceedings of the ASEE 123rd Annual Conference and Exposition, New Orleans, LA, USA, 26–29 June 2016; Available online: https://peer.asee.org/engineering-program-evaluations-based-on-automated-measurement-of-performance-indicators-data-classified-into-cognitive-affective-and-psychomotor-learning-domains-of-the-revised-bloom-s-taxonomy (accessed on 21 May 2022).
  44. Hussain, W.; Spady, W. Specific, Generic Performance Indicators and Their Rubrics for the Comprehensive Measurement of ABET Student Outcomes. In Proceedings of the ASEE 124th Annual Conference and Exposition, Columbus, OH, USA, 25–28 June 2017. [Google Scholar]
  45. Mak, F.; Sundaram, R. Integrated FCAR Model with Traditional Rubric-Based Model to Enhance Automation of Student Outcomes Evaluation Process. In Proceedings of the ASEE 123rd Annual Conference and Exposition, New Orleans, LA, USA, 26–29 June 2016. [Google Scholar]
  46. Eltayeb, M.; Mak, F.; Soysal, O. Work in progress: Engaging faculty for program improvement via EvalTools®: A new software model. In Proceedings of the 2013 Frontiers in Education Conference FIE, Oklahoma City, OK, USA, 23–26 October 2013; pp. 1–6. [Google Scholar] [CrossRef]
  47. Hussain, W.; Spady, W.G.; Naqash, M.T.; Khan, S.Z.; Khawaja, B.A.; Conner, L. ABET Accreditation During and After COVID19—Navigating the Digital Age. IEEE Access 2020, 8, 218997–219046. [Google Scholar] [CrossRef] [PubMed]
  48. Spady, W.; Marshall, K.J. Beyond traditional outcome-based education. Educ. Leadersh. 1991, 49, 71. [Google Scholar]
  49. Spady, W. Organizing for results: The basis of authentic restructuring and reform. Educ. Leadersh. 1988, 46, 7. [Google Scholar]
  50. Spady, W. It’s time to take a close look at outcome-based education. Outcomes 1992, 7, 6–13. [Google Scholar]
  51. Information on Blackboard®. Available online: https://www.blackboard.com/teaching-learning/learning-management/blackboard-learn (accessed on 19 May 2022).
  52. Hussain, W.; Addas, M.F.; Mak, F. Quality improvement with automated engineering program evaluations using performance indicators based on Bloom’s 3 domains. In Proceedings of the Frontiers in Education Conference (FIE), Erie, PA, USA, 12–15 October 2016; pp. 1–9. [Google Scholar]
  53. Hussain, W.; Addas, M.F. Digitally Automated Assessment of Outcomes Classified per Bloom’s Three Domains and Based on Frequency and Types of Assessments; University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA): Urbana, IL, USA, 2016; Available online: http://www.learningoutcomesassessment.org/documents/Hussain_Addas_Assessment_in_Practice.pdf (accessed on 21 May 2022).
  54. Hussain, W.; Addas, M.F. A Digital Integrated Quality Management System for Automated Assessment of QIYAS Standardized Learning Outcomes. In Proceedings of the 2nd International Conference on Outcomes Assessment (ICA), QIYAS, Riyadh, Saudi Arabia, 6–8 December 2015. [Google Scholar]
  55. Estell, J.K.; Yoder, J.-D.S.; Morrison, B.B.; Mak, F.K. Improving upon best practices: FCAR 2.0. In Proceedings of the ASEE 2012 Annual Conference, San Antonio, TX, USA, 10–13 June 2012. [Google Scholar]
  56. Liu, C.; Chen, L. Selective and objective assessment calculation and automation. In Proceedings of the ACMSE’12, Tuscaloosa, AL, USA, 29–31 March 2012. [Google Scholar]
  57. Mak, F.; Kelly, J. Systematic means for identifying and justifying key assignments for effective rules-based program evaluation. In Proceedings of the 40th ASEE/IEEE Frontiers in Education Conference, Washington, DC, USA, 27–30 October 2010. [Google Scholar]
  58. Miller, R.L.; Olds, B.M. Performance assessment of EC-2000 student outcomes in the unit operations laboratory. In Proceedings of the ASEE Annual Conference Proceedings, Charlotte, NC, USA, 20–23 June 1999. [Google Scholar]
  59. Hussain, W. Engineering Programs Bloom’s Learning Domain Evaluations CQI. 2016. Available online: https://www.youtube.com/watch?v=VR4fsD97KD0 (accessed on 21 May 2022).
  60. Mead, P.F.; Bennet, M.M. Practical framework for Bloom’s based teaching and assessment of engineering outcomes. In Education and Training in Optics and Photonics; Optical Society of America: Washington, DC, USA, 2009; paper ETB3. [Google Scholar] [CrossRef]
  61. Mead, P.F.; Turnquest, T.T.; Wallace, S.D. Work in progress: Practical framework for engineering outcomes-based teaching assessment—A catalyst for the creation of faculty learning communities. In Proceedings of the 36th Annual Frontiers in Education Conference, San Diego, CA, USA, 28–31 October 2006; pp. 19–20. [Google Scholar] [CrossRef]
  62. Hussain, W. Specific Performance Indicators. 2017. Available online: https://www.youtube.com/watch?v=T9aKfJcJkNk (accessed on 21 May 2022).
  63. Gosselin, K.R.; Okamoto, N. Improving Instruction and Assessment via Bloom’s Taxonomy and Descriptive Rubrics. In Proceedings of the ASEE 125th Annual Conference and Exposition, Salt Lake City, UT, USA, 25–28 June 2018. [Google Scholar]
  64. Jonsson, A.; Svingby, G. The use of scoring rubrics: Reliability, validity and educational consequences. Educ. Res. Rev. 2007, 2, 130–144. [Google Scholar] [CrossRef]
  65. Hussain, W.; Spady, W.; Khan, S.Z.; Khawaja, B.; Naqash, T.; Conner, L. Impact evaluations of engineering programs using ABET student outcomes. IEEE Access 2021, 9, 46166–46190. [Google Scholar] [CrossRef]
Figure 1. FCAR + PI assessment model process flow indicating course faculty involvement in almost all phases of CQI cycles.
Figure 2. OBE design down mapping from goals, PEOS, SOs, and COs to PIs [26].
Figure 3. Portion of lists in a digital database showing PIs and their corresponding hybrid rubrics for ABET SO “e” on problem-solving.
Figure 4. Clipped portion of performance vector for course ME_262 THERMODYNAMICS 1.
Figure 5. SO8-PIs, Performance Vector Table (PVT) for term 372 mechanical engineering program.
Figure 6. Mixed methods approach to developmental advising at the Faculty of Engineering at IU.
Figure 7. EE program, consolidated student evaluation for terms 361, 362, and 371 of an above-average student showing a pattern of weakness in skills related to SOs “h”, “i”, and “j”.
Figure 8. SO_1 “a”, an individual underperforming student’s skill data measured by multiple raters using several PIs in multiple courses, types of assessments, terms, and applying weighting factors, WF.
Figure 9. Patterns of comparatively better learning for a typical underperforming EE student observed in a two-term student evaluation report.
Figure 10. A typical outcome-based advising sample for an EE student showing consistent failure in SO_8 related to understanding the impact of engineering solutions on economic, societal, and environmental aspects.
Figure 11. Questionnaires and instructions for students’ self-assessment of SO9.
Figure 12. Mixed methods approach to developmental advising in the electrical engineering program at GU.
Figure 13. Case study—a sample student’s overall SO attainment.
Figure 14. A sample student’s self-assessment of his learning status against the student outcomes.
Figure 15. A drill-down menu showing key assignments for SO4.
Figure 16. Advisor’s input to student progress in attaining SOs.
Figure 17. EAMU rubric performance vector for PI_9_1.
Figure 18. Effectiveness of advising through assessing SO9.
Table 1. Heuristic rules for performance criteria.
Specification of EAMU performance indicator levels (category, scale %, description):
Excellent (E), 90–100: apply knowledge with virtually no conceptual or procedural errors.
Adequate (A), 75–90: apply knowledge without significant conceptual and only minor procedural errors.
Minimal (M), 60–75: apply knowledge with occasional conceptual and only minor procedural errors.
Unsatisfactory (U), 0–60: significant conceptual and/or procedural errors when applying knowledge.
Heuristic rules for Performance Vector Tables (PVT) (category, general description):
Red flag: any performance vector with an average below 3.3 and a level of unsatisfactory performance (U) that exceeds 10%.
Yellow flag: any performance vector with an average below 3.3 or a level of unsatisfactory performance (U) that exceeds 10%, but not both.
Green flag: any performance vector with an average greater than 4.6 and no indication of unsatisfactory performance (U).
No flag: any performance vector that does not fall into one of the above categories.
Table 2. Qualitative comparison of digital developmental and prevalent traditional advising. Each row lists a pedagogical aspect, how digital developmental advising (Digital) addresses it, how prevalent traditional advising (Traditional) addresses it, and the sectional/research references.

Area: Authentic OBE and conceptual frameworks
Based on authentic OBE frameworks. Digital: maximum fulfillment of authentic OBE frameworks. Traditional: partial or minimal fulfillment of authentic OBE frameworks. (Section 1, Section 3.3 and Section 4.1; [21,22,23,24,25,34,35])
Standards of language of outcomes. Digital: maximum fulfillment of consistent OBE frameworks. Traditional: partial or minimal fulfillment and lack of any consistent frameworks. (Section 1, Section 3.3, Section 4.1 and Section 4.2; [21,22,23,24,25,34,35])
Assessment of students. Digital: all students assessed. Traditional: random or select sampling. (Section 1, Section 3.3, Section 4.1 and Section 4.2; [12,13,17,18,38,39,40,41,42])
Specificity of outcomes. Digital: mostly specific, resulting in valid and reliable outcome data. Traditional: mostly generic, resulting in vague and inaccurate results. (Section 1, Section 3.3, Section 4.1 and Section 4.2; [5,12,43,44,45])
Coverage of Bloom's three learning domains and learning levels. Digital: specific PIs classified according to Bloom's three learning domains and their learning levels. Traditional: generic PIs with no classification. (Section 1 and Section 4.2; [5,12,43,44,45])
'Design down' implementation. Digital: the OBE power principle of design down is fully implemented, with specific PIs used to assess the course outcomes. Traditional: the design down principle is partially implemented, with generic PIs used to assess the program outcomes. (Section 4.2; [5,12,43,44,45])

Area: Assessment practices
Description of rubrics. Digital: hybrid rubrics that combine analytic and holistic formats, are topic-specific, and provide detailed steps, scoring information, and descriptors. Traditional: mostly holistic generic rubrics, some analytic, rarely topic-specific or providing detailed steps, without scoring information and detailed descriptors. (Section 4.2; [5,12,43,44,45,63,64])
Application of rubrics. Digital: applied to most course learning activities with tight alignment. Traditional: applied only to major learning activities at the program level with minimal alignment. (Section 4.2; [12,13,17,18,38,39,40,41,42,63,64])
Embedded assessments. Digital: course outcomes and PIs follow consistent frameworks and are designed to enable embedded assessment methodology. Traditional: course outcomes and PIs do not follow consistent frameworks and are not designed to enable embedded assessment methodology. (Section 4.2 and Section 4.3; [12,13,17,18,38,39,40,41,42])

Area: Quality of outcomes data
Validity and reliability of outcome data. Digital: specific outcomes and PIs, consistent frameworks, and hybrid rubrics produce comprehensive and accurate assessment data for all students; therefore, outcome data can be used for advising purposes. Traditional: generic outcomes and PIs, a lack of consistent frameworks, and generic rubrics produce vague and inaccurate assessment data for small samples of students; therefore, outcome data cannot be used for advising purposes. (Section 1, Section 3.3, Section 4.2, Section 4.3 and Sections 5.1–5.8; [5,12,43,44,45])
Statistical power. Digital: heterogeneous and accurate data; all students, all courses, and all major assessments sampled. Traditional: random or selective sampling of students, courses, and assessments. (Section 3.3, Section 4.1, Section 4.2 and Section 4.3; [12,13,17,18,38,39,40,41,42])
Quality of multi-term SO data. Digital: valid and reliable data; all data are collected from direct assessments by implementing several essential elements of a comprehensive assessment methodology, specific PIs, the wide application of hybrid rubrics, and stringent QA processes and monitoring ensuring tight alignment with student learning. Traditional: usually not available and unreliable, owing to the lack of a comprehensive assessment process, specific PIs, wide usage of rubrics, stringent QA processes, and appropriate technology. (Section 1, Section 3.3, Section 4.1, Section 4.2 and Section 4.3; [12,13,17,18,38,39,40,41,42])

Area: Staff
Access to students' skills and knowledge information. Digital: advisors can easily access student outcomes, assessments, and objective evidence besides academic transcript information. Traditional: advisors cannot access student outcomes, assessments, or objective evidence; advising is based entirely on academic transcript information. (Section 1, Section 3.3, Section 4.2, Section 4.3 and Sections 5.1–5.8; [12,13,17,18,38,39,40,41,42])
Advisor interactions. Digital: advisors have full access to detailed past and present course performance of students, providing accurate informational resources to facilitate productive advisor–course instructor dialogue. Traditional: advisors have no access to details of students' past or present course performance and therefore lack the information resources for productive advisor–course instructor dialogue. (Section 1, Section 3.3, Section 5, Section 5.2 and Section 5.3)
Advisor InteractionsAdvisors have full access to detailed student past and present course performance, thereby providing accurate informational resources to facilitate productive advisor–course instructor dialogueAdvisors do not have access to any detail related to students’ past or present course performance, thereby lacking any information resources for productive advisor–course instructor dialogueSection 1, Section 3.3, Section 5, Section 5.2 and Section 5.3
[12,13,17,18,38,39,40,41,42]
Performance CriteriaAdvisors apply detailed performance criteria and heuristics rules based on a scientific color-coded flagging scheme to evaluate the attainment of student outcomesAdvisors do not refer to or apply any such performance criteria or heuristic rules due to a lack of detailed direct assessment data and associated digital reporting technology Section 4.3, Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7
[12,13,17,18,38,39,40,41,42]
Access to Multi-term SOs DataAdvisors can easily access and use multi-term SO data reports and identify performance trends and patterns for accurate developmental feedbackAdvisors cannot access any type of multi-term SO data reports and identify performance trends and patterns for accurate developmental feedbackSection 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7
[40,42,43,47,50,51,52,55]
Mixed Methods Approaches to InvestigationAdvisors can easily apply mixed methods approaches to investigation and feedback for effective developmental advising due to the availability of accurate outcome data, specific PIs, and assessment information presented in organized formats using state-of-the-art digital diagnostic reportsAdvisors cannot apply mixed methods approaches to investigation and feedback for effective developmental advising due to the lack of availability of accurate outcome data, specific PIs, and assessment information presented in organized formats using state-of-the-art digital diagnostic reportsSection 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7
[40,42,43,47,50,51,54,62]
StudentsStudent Accessibility of outcomes dataAll students can review their detailed outcome-based performance and assessment information for multiple terms and examine trends in improvement or any failuresStudents cannot review any form of outcome-based performance or assessments information for multiple terms and cannot examine trends in improvement or any failuresSection 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7
[40,42,43,47,50,51,54,62]
Student Follow Up Actions for ImprovementBoth students and advisors can track outcome-based performance and systematically follow up on recommended remedial actions using digital reporting featuresNeither students nor advisors can track outcome-based performance, and therefore, they cannot systematically follow up on any recommended remedial actionsSection 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7
[40,42,43,47,50,51,54,62]
Student Attainment of Lifelong Learning SkillsStudents can use self-evaluation forms and reinforce their remediation efforts with guidance from advisors to enhance metacognition capabilities and eventually attain lifelong learning skillsStudents do not have any access to outcome data and therefore cannot conduct any form of self-evaluation for outcome performance and therefore cannot collaborate with any advisor guidance on outcomesSection 5.5, Section 5.6 and Section 5.7
[40,42,43,47,50,51,54,62]
ProcessIntegration with Digital TechnologyPedagogy and assessment methodology fully support integration with digital technology that employs embedded assessmentsLanguage of outcomes, alignment issues, and a lack of rubrics make it difficult to integrate with digital technology employing embedded assessmentsSection 1, Section 4.3, Section 5.1, Section 5.2, Section 5.3, Section 5.4, Section 5.5, Section 5.6 and Section 5.7
[40,42,43,47,50,51,54,62]
PDCA Quality ProcessesSix comprehensive PDCA quality cycles for stringent quality standards, monitoring, and control of the education processLack of well-organized and stringent QA cycles or measures and technology for implementing the education processSection 1, Section 3.3 and Section 4.3
[40,42,43,47]
Impact Evaluation of AdvisingCredible impact evaluations of developmental advising can be conducted by applying qualifying rubrics to multi-year SO direct assessment trend analysis information. There is no need for control or focus groups and credibility issues related to student survey feedback.Impact evaluations are usually based on indirect assessments collected using student surveys. Several issues related to use of control or focus groups and credibility of feedback have to be accordingly dealt with.Section 5.8
[47,65]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
