Article

Dominant Factors for an Effective Selection System: An Australian Education Sector Perspective

1 La Trobe Business School, La Trobe University, Melbourne VIC 3086, Australia
2 Department of Information Technology, Melbourne Polytechnic, Melbourne 3072, Australia
3 Holmesglen Institute, Southbank VIC 3006, Australia
* Author to whom correspondence should be addressed.
Systems 2019, 7(4), 50; https://doi.org/10.3390/systems7040050
Submission received: 22 September 2019 / Revised: 28 October 2019 / Accepted: 30 October 2019 / Published: 1 November 2019

Abstract

With the latest advancements in information technologies, many organisations expect systems to provide effective support for recruitment and decision making. However, there is little clarity on the dominant factors required for an effective recruitment system that can influence business outcomes. This paper aimed to identify the predominant factors in the employee selection process and the use of a management system for decision support. The empirical study consisted of a qualitative study of 74 interviews and a quantitative survey of 204 individual participants from 17 organisations in the technical and further education (TAFE)/dual-education (higher education and vocational education) sector, drawn from the members of the Victorian TAFE Association in Australia. Using a commonly adopted exploratory factor analysis (EFA) of 38 variables, data triangulation of the qualitative and quantitative analyses converged on five dominant factors under three themes. We believe the results of the study offer actionable suggestions for developing an effective recruitment system and further the research in this field of study.

1. Introduction

Recruitment systems increasingly require a focus on talent acquisition (TA): the process of advertising and attracting potential job applicants from internal or external sources, assessing their suitability for the advertised job, and ultimately selecting the applicant who best fits the job description [1]. Huffcutt and Culbertson [2] note, “it is rare, even unthinkable, for someone to be hired without some type of interview”. Given that TA is of prime importance to an organisation, Wyatt and Jamieson [3] observe that employee selection processes are notorious for unreliable and often invalid decision making; this motivates an expert system based on an objective approach to reduce the biases and prejudices, largely rooted in predispositions, that their literature review documents. The process has two key stakeholders. The first are the interviewers, representatives of the organisation who play a vital role in conducting interviews to identify the candidate best suited for the position [4]. The second are the applicants or candidates who express interest in becoming future employees of the organisation by applying for the advertised job [5].
The selection methods currently used by interviewers to predict the applicant best matching the job requirements are diverse and imperfect [6]. To increase the accuracy of this prediction, different organisations construct and implement different types of applicant testing [7]. Personality tests, integrity tests, and psychological and psychometric tests are most commonly used to achieve good recruitment outcomes [8]. Information technology (IT) has been used to facilitate the execution of these tests, to calculate applicant scores, and to report test performance with explanations and recommendations. These tests have also been incorporated into the various recruitment management and applicant management expert systems introduced over the years [6]. Wyatt and Jamieson [3] went to great lengths to analyse and build an expert system called “CHAOS” (Computerised Helpful Advice on Selection) to support an objective hiring process. Through their research, they demonstrated that an expert system-based approach to decision making provides more reliable and consistent decisions. They also note that recruitment and selection are inherently contentious, making them an ideal domain for an expert system that assists managers’ decision processes and enables objectivity and fairness. In this context, another vital dimension is the validity of the selection procedure. Robertson and Smith [9] outline the essential elements in the design and validation of the employee selection process.
More recently, with advancements in IT, several applications of technology in the selection process are being upgraded with artificial intelligence. This is necessary because of the uncertainty and increased risk, coupled with the limited time available, in selecting the best applicant for the organisation [10]. Communication robots focusing on social innovation are being tested for non-verbal behaviour recognition to predict social interaction outcomes [11]. The architecture of these robots shows that they are emotionally aware and tuned for face detection, speech recognition, emotion reading, and gesturing, along with interview data processing [12]. Depending on the desired level of sophistication of such expert systems, job interviews could be automated with either structured or unstructured interviews. However, many organisations are unable to adopt such sophisticated recruitment systems, because several influencing factors guide the recruitment process in making the system effective, and there is a lack of research in this direction. Beyond whether interviews are structured or unstructured, personality aspects are critical because they influence work performance and, ultimately, employee retention. Such factors with long-term impact are essential to a successful recruitment system. To unearth these key dimensions, this research study focused on identifying the dominant factors in the interview and selection process for developing an effective recruitment system.
In this paper, all types of job interviews were broadly categorised under two headings: structured/high-structured interviews (HSI) or unstructured/low-structured interviews (LSI) [13]. There have been many definitions of the term “structured interview” in the literature [14]. One in particular provides a crisp and clear understanding: “the degree of discretion that an interviewer is allowed in conducting the interview” [15]. In an unstructured interview, by contrast, the interviewer can ask different questions of each applicant in any order and pattern the interviewer deems fit [16]. Depending on whether the job interview is structured or unstructured, objective and subjective elements play a greater or lesser role in the hiring decision. Research has established that structured interviews have higher criterion-related validity and improved dependability than unstructured interviews [13]. However, there is evidence supporting the counter-argument as well: an empirical examination of hiring decisions for auditors highlighted that increased subjective elements had the highest explanatory power in influencing the hiring decision [17]. This paper intends to uncover further benefits and drawbacks of these two types of interviews by understanding the dominant factors in the employee selection decision process. The research was based on an empirical study focused specifically on the recruitment systems currently in practice within an Australian technical and further education (TAFE)/dual-sector setting. In addition, the findings are presented from the perspectives of the stakeholders who experienced the same process in different contexts based on their roles (hiring member or applicant) and outcomes, which were further classified into successful and unsuccessful applicants.
The purpose of this paper was to discuss the dominant factors of the selection process, specifically studying the current status of 17 organisations from an Australian TAFE/dual-sector in the use of technology and management systems within their TA process. This study is a continuation of ongoing research, and the objective of this empirical study was to identify and understand the dominant factors of the employee selection process. This research carries forward from discussions on the quantitative analysis of the critical aspects of the selection process outlined in our previous study [18]. The focus of this paper was to consider the factors that can influence the decision based on different perspectives of the participants, such as hiring members, successful applicants, as well as unsuccessful applicants. Here, we adopted a mixed-methods approach using statistical techniques for quantitative analysis and thematic analysis for the qualitative analysis to meet the research objective.
This paper is organised around the predominant factors in the employee selection process and the use of a management system for decision support. Section 2 provides the research context and method along with introducing the sample participants of this investigation, namely the TAFE/dual education (higher education and vocational education) sector members of the Victorian TAFE Association (VTA). This section also presents a breakdown of the characteristics of the sample collected. Section 3 describes the pragmatic study design to map this paper’s research question with the qualitative and quantitative questions and sets the stage for the analysis to be conducted in the next section. Section 4 breaks the examination into two parts with the quantitative analysis findings presented first followed by the qualitative data presented next. Section 5 summarises the results from these investigations and compares the findings from both the quantitative and qualitative methods as part of data triangulation. Lastly, Section 6 concludes with recommendations for future research.

2. Literature Review of Theoretical Framework and Empirical Study Background

2.1. A Review of the Theoretical Framework

Research on improving human resource management (HRM) processes with the help of expert systems is an ongoing endeavour. In these studies, the key areas of improvement in HRM systems and applications call for further investigation by taking a multi-level approach to the analysis [19], accounting for selection bias [20], and reviewing the associations among all HRM systems in an organisation to better understand HRM expert systems and their performance outcomes [21]. The focus on using expert systems in HRM has been steadily increasing over time. This is evidenced in the recent shift of discussion from individual HR practices to employing HR systems across the whole process [22], and in the review by Sackett and Lievens [23] of possible platforms to consider for improving the selection process. Some of the seminal and fundamental theoretical frameworks that researchers have used over the last few decades are reviewed, and a summary is presented in Table 1 below. Overall, these researchers do not endorse any single framework as an all-encompassing theoretical framework for this topic.
This research adopted the Applicant Attribution-Reaction Theory (AART) framework of Ployhart and Harold [24], a model that integrates attribution theory into applicant reactions. In line with Lederman and Lederman’s [25] explanation of the purpose of theoretical frameworks in research, AART served as the guiding framework for this empirical investigation, as illustrated in Figure 1 below. They observe that qualitative researchers tend to analyse the data they have collected by invoking a theory that helps situate their findings in the context of the existing literature.
As this study is the second part of an ongoing research project outlined earlier by Rozario and Venkatraman [18], this paper focused on identifying the dominant factors of the selection decision based on the existing processes with the current use of systems and technology by the various organisations operating in the Australian TAFE/dual education sectors within urban and regional Victoria.

2.2. Empirical Study Background

In this section, we present background information on the empirical study, which drew on data collected from 17 TAFE/dual-sector organisations listed with the VTA through face-to-face interviews and an anonymous online survey instrument. A research method can be entirely qualitative, entirely quantitative, or mixed-methods, combining both approaches. We identified the mixed-methods approach as appropriate for this study because of the properties described below. It involves collecting qualitative data in the form of open-ended questions and quantitative data from closed-ended questions in response to the research questions, and the procedure for collecting and analysing both forms of data is rigorous [26]. In this study, the methodology was a mixed-methods approach with convergent parallel and concurrent design techniques. Bell and Bryman [27] note that the convergent parallel design, in which the quantitative and qualitative datasets are interpreted concurrently, provides a more comprehensive, richer, multi-dimensional understanding of and response to the research questions. Doing so enables concurrent data triangulation, wherein the qualitative and quantitative data obtained during each research phase are compared and contrasted. In this way, the datasets support one another, producing a complete picture of the research question posed [28]. Figure 2 below illustrates an overview of this research design. Following this, the background information on the data collected from the 17 organisations in this research is tabulated.

3. Empirical Study Design

Previous research established the critical aspects of a selection process from the perspectives of both hiring members and applicants [18]. Building on those results, this study set out to explore further and identify the dominant factors to consider in improving the selection process of recruitment systems. Specifically, this paper investigated the empirical evidence supporting the dominant factors for the employee interview and selection process, with the potential to enhance recruitment systems.
Based on the research design for this study, as outlined in Section 2.2 above, both probability and non-probability sampling techniques were used for this research. The stratified sampling method, which uses known characteristics of the participants, was applied by identifying human resources professionals and executives who represent their institutes in the VTA forums, since their subject knowledge and practical exposure have a direct bearing on this study. Additionally, under the non-probability techniques, snowball sampling and homogeneous purposive sampling were used. One of the main criteria for shortlisting participants was their capacity to contribute to the research findings, which is possible only if they have undergone the employee selection process of a TAFE. By implementing this judgemental step and using the homogeneous purposive sampling technique, we ensured that only current and past employees of a TAFE, in any position, were included in the study. Even with these sampling techniques, the overall sample would have been challenging to manage because of its size; therefore, quota sampling was implemented. The maximum number of participants was set at five per organisation in the urban area and two per institution in the regional area. This resulted in 50 participants from the urban area and 14 from the regional area, for 64 targeted participants. However, during the fieldwork, a few more interested participants volunteered to undergo face-to-face qualitative interviews. Likewise, for the survey, the target responses were fixed at a minimum of 10 members per institution for the urban area and 5 responses per institution for the regional area, totalling at least 105 participants from all TAFEs to enable representativeness and generalisability to the population. Table 2 below lists the 17 VTA members that were identified as part of the Victorian TAFE sector and operating in either urban or regional Victoria. Both current and previous employees of these institutes were invited to participate in this empirical study.
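For illustration only, the small sketch below reproduces the interview quota arithmetic described above. The split of the 17 organisations into 10 urban and 7 regional is an assumption implied by the stated totals (50 urban plus 14 regional interview targets).

```python
# Hedged sketch of the quota sampling targets for the face-to-face interviews.
# The 10/7 urban/regional split is an assumption inferred from the stated totals.
URBAN_ORGS, REGIONAL_ORGS = 10, 7
URBAN_QUOTA, REGIONAL_QUOTA = 5, 2            # interview participants per organisation

urban_target = URBAN_ORGS * URBAN_QUOTA        # 50
regional_target = REGIONAL_ORGS * REGIONAL_QUOTA  # 14
print(f"Targeted interview participants: {urban_target + regional_target}")  # 64
```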
There were 74 interviews for the qualitative data and 204 individual survey participants for the quantitative data from these 17 organisations. The total numbers of participants from the qualitative and quantitative methods, with their individual personal experiences, are tabulated in Table 3, which outlines the number of participants per organisation (with the names coded) for the face-to-face semi-structured interviews conducted as part of the qualitative data collection. Likewise, for the online survey, participants were asked to recollect their experiences in instances when they took the role of a successful applicant, an unsuccessful applicant, or a hiring member, which resulted in 605 unique instances. As the focus of this study was on experiences in the TAFE/dual-sector alone, the 52 instances reflecting non-TAFE-based experience were categorised as “other” and removed from the analysis. However, 27 responses in this category, collected from unsuccessful applicants, were retained after a preliminary examination confirmed whether their organisations related to the service or the education sector in Australia. The final set of valid data prepared for this analysis from the survey consisted of 553 total experiences combining the successful applicant, unsuccessful applicant, and hiring member perspectives. This information (N) on the quantitative and qualitative data collected is tabulated in Table 3 below.
A conscious effort was undertaken to encourage participation and to ensure equal representation from both genders in this study. Table 4 below gives the gender distribution of the survey participants.
The age distribution of the participants in this study was spread across a wide range from the ages of 25 to 74. Most of the participants (39%) belonged to the age group 55–64 and closely following that age range was the age group 45–54 (31%). Table 5 below presents the frequency distribution across the age groups of all the participants in this study.
Table 6 below provides the distribution of the citizenship status of the participants. It shows that most of the participants (90%) were Australian citizens, a smaller number were permanent residents of Australia (8%), and a minimal number (2%) were categorised as “other”. This information helped establish the homogeneous nature of the participants with respect to their working rights in Australia, so their responses are unlikely to be affected by changes in immigration rules related to work permits. This supports unbiased responses from participants, focused only on the nature of the employment and the selection process for their employment, without much influence from work permits or other external immigration-related considerations.
Previous work [18] analysed a few critical aspects using hypothesis testing from the perspectives of a hiring member and a successful or unsuccessful interview applicant. Continuing from that, this section explores all the variables used in this study in association with the hiring process. This assists in answering the research question, which relates to identifying the dominant factors in the hiring process, as outlined in Table 7 below. These questions were carefully selected to align with the current literature on using expert systems in HRM. This section presents the findings in two parts: the first part uses exploratory factor analysis as a quantitative technique, and the second part uses thematic analysis as a qualitative technique to establish the dominant factors. A comparative analysis of both sets of findings was then completed as part of the data triangulation to establish the reliability and validity of the reported results. Finally, we present how both techniques, taken together, provide evidence to address the research question.

4. Analysis of Dominant Factors to Consider for Improvements to the Interview Selection Process

4.1. Quantitative Analysis: Exploratory Factor Analysis

The findings from the previous study [18] show that the results were significantly similar across all the critical aspects shortlisted for this study, irrespective of whether the applicant was successful or unsuccessful. In this study, we further performed an exploratory factor analysis (EFA) that takes into account the successful applicant’s experience and the hiring member’s experience, along with their responses to the proposed improvements provided in the survey. Table A1 in Appendix A presents the questions related to the hiring process from the perspectives of the hiring member (HM) and the hired successful applicant (HS), along with improvements to the hiring process (HP) and the frequency value. Responses were provided on a 7-point Likert scale from 1 to 7, where 1 = strongly agree, 2 = agree, 3 = somewhat agree, 4 = neither agree nor disagree, 5 = somewhat disagree, 6 = disagree, and 7 = strongly disagree. The tabulated means need to be understood with reference to this 7-point Likert scale.
The 38 variables listed in Table A1 in Appendix A were processed using the exploratory factor analysis technique to find dominant factors. This technique was used to arrive at meaningful factors that can be listed as dominant factors, which could then be considered in improving the effectiveness of the employee selection interview process. An EFA assesses the number of factors common to a survey instrument that impact its measures and examines the association between each common factor and the equivalent measure, as well as the strength of that factor [29]. Researchers use exploratory factor analysis for various purposes: (i) to detect the nature of the constructs that prompt responses in a survey; (ii) to decide on interconnected sets of items; (iii) to establish the breadth and depth of measurement scales; (iv) to organise the most significant features in a group of items; and (v) to produce factor scores that signify fundamental ideas [29]. Further, EFA is considered an appropriate multivariate statistical approach to assist with data reduction, to explore associations among categories, and to estimate the construct validity of measurement scales [30]. Exploratory factor analysis consists of a sequence of statistical steps. The first is the planning step, where the suitability of the data for an EFA is determined by checking the sample size to establish reasonable factorability, generating a correlation matrix, and conducting a test of sampling adequacy. The second step involves extracting the factors using principal axis factoring (PAF), the conventional extraction method for EFA, or principal component analysis (PCA), another extraction method; both were used in this paper. The third step requires determining a fixed number of factors to retain. The fourth step involves factor rotation, with varimax as the most commonly used rotation method. The fifth step consists of interpreting the factor structure and assigning new labels [31].
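To make these steps concrete, the sketch below re-expresses the same pipeline in Python using the open-source factor_analyzer package rather than SPSS, the tool actually used in this study; the file name, column handling, and data are hypothetical placeholders.

```python
# Hedged sketch of the EFA steps described above (not the study's SPSS workflow).
# "survey_items.csv" with 38 Likert-scored columns is a hypothetical input.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

items = pd.read_csv("survey_items.csv").dropna()   # 38 variables scored 1-7

# Step 1: suitability checks -- Bartlett's test of sphericity and the
# Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (should exceed 0.5).
chi_square, p_value = calculate_bartlett_sphericity(items)
_, kmo_overall = calculate_kmo(items)
print(f"Bartlett p = {p_value:.3f}, KMO = {kmo_overall:.3f}")

# Steps 2-4: extract factors (principal-factor extraction) and apply a varimax
# rotation; 11 factors are retained here, following the study's
# eigenvalue-greater-than-one criterion.
efa = FactorAnalyzer(n_factors=11, method="principal", rotation="varimax")
efa.fit(items)

# Step 5: inspect eigenvalues and rotated loadings to interpret and label factors.
original_eigenvalues, _ = efa.get_eigenvalues()
loadings = pd.DataFrame(efa.loadings_, index=items.columns)
print(loadings.round(2))
```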
As outlined above, an examination to determine the factorability of the 38 variables listed in Table A1 in Appendix A was carried out first. Initially, 36 of the 38 items were found to correlate at a value greater than 0.25 with at least one other item, establishing reasonable factorability. Furthermore, the Kaiser–Meyer–Olkin measure of sampling adequacy was 0.807, as shown in Table 8 below, which is higher than the generally suggested value of 0.5, and Bartlett’s test of sphericity was significant at 0.000 for the 38 variables processed, confirming the adequacy of the sampling. Additionally, in the communalities table from SPSS, outlined in Table A2 in Appendix A and obtained using the PAF extraction method, the initial communalities were all above 0.3, further confirming that each variable shared some common variance with the other variables. With these overall positive indicators, EFA was considered a suitable technique for this set of 38 variables.
After establishing that the EFA technique was an appropriate statistical tool for this dataset, the results obtained by conducting EFA on the 38 variables were summarised (Table A3 in Appendix A). Table A3 tabulates the variables against the total initial eigenvalues (highlighted in blue in the original table), which indicate that the first five factors individually explained about 26%, 8%, 7%, 5%, and 5% of the variance (percentage values highlighted in green). The sixth to the eleventh factors had initial eigenvalues just over one and individually explained close to 3% of the variance. The remaining factors, 12–38, had total initial eigenvalues less than 1.0 and were therefore removed from further analysis. This resulted in 11 shortlisted factors retained for further processing. The results for all eleven factors were studied individually using oblimin and varimax rotations of the factor loading matrix. The eleven identified factors explain close to 70% of the variance in the dataset, as illustrated by the scree plot in Figure 3 below, which shows the “levelling off” of eigenvalues after eleven factors, together with the inadequate frequency of primary loadings and the difficulty of interpreting the twelfth and subsequent factors mentioned earlier. There was little difference between the oblimin and varimax solutions; both rotations were examined before deciding to use a varimax rotation for the final solution, as it is a popular rotation method for EFA.
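As an illustration of the retention decision, the short sketch below applies the eigenvalue-greater-than-one (Kaiser) criterion and draws a scree plot; it reuses the hypothetical `original_eigenvalues` array from the EFA sketch above.

```python
# Hedged sketch: Kaiser criterion and scree plot for deciding how many factors
# to retain. `original_eigenvalues` comes from the EFA sketch above.
import numpy as np
import matplotlib.pyplot as plt

eigenvalues = np.asarray(original_eigenvalues)
n_retained = int((eigenvalues > 1.0).sum())   # 11 factors in this study

plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.axhline(1.0, linestyle="--", label="eigenvalue = 1 (Kaiser criterion)")
plt.xlabel("Factor number")
plt.ylabel("Initial eigenvalue")
plt.title(f"Scree plot: {n_retained} factors retained")
plt.legend()
plt.show()
```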
Overall, using the EFA techniques, we reduced the 38 variables to 11 components with eigenvalues above 1.0, as illustrated in the scree plot in Figure 3 above and the variance in Table A3 in Appendix A. In other words, 27 items were disregarded because they did not contribute to the simple factor structure and failed to meet the minimum criteria of no cross-loading above 0.3 and a primary factor loading of at least 0.4. The items related to improvements in the hiring process, such as “HR can collect feedback/suggestions on interview experience from applicants” and “constructive interview performance feedback should be provided”, did not load above 0.3 on any factor. In the same table in which the total variance was explained with initial eigenvalues, the items “promote an objective and standard model for the hiring process across all TAFE” and “always have an HR representative during interviews to ensure standard/consistency” had factor loadings of around 0.3 on only one other factor.
The 11 factors identified in the scree plot above are listed with the variables grouped under each factor using varimax rotation with Kaiser normalisation, which provided the best-defined factor structure. All items in this analysis had primary loadings above 0.4. To confirm that the 11 factors retained from PAF were correct, PCA was also conducted, and it extracted the same 11 factors. All 11 components, grouped in the case processing summary and holding variables similar to the PCA output, were verified to match the output from PAF and were then processed further with the Cronbach’s alpha test. The results are tabulated in Table 9 below. From this table, it is evident that six groupings reported a reliability coefficient below 0.7, a negative value, or a single-variable grouping and were therefore disregarded from further analyses. Using the results from the reliability statistics in Table 9, five exploratory factors with a reliability coefficient above 0.7 were identified as reliable (shown in bold in Table 9 below).
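For reference, a minimal sketch of the Cronbach’s alpha check applied to each grouping is shown below; it reuses the hypothetical `items` DataFrame from the EFA sketch above, and the three-item example grouping follows the bias-related variables listed in Appendix A.

```python
# Hedged sketch of the Cronbach's alpha reliability test used to screen the
# 11 factor groupings (groupings with alpha < 0.7 were disregarded).
def cronbach_alpha(scale):
    """Cronbach's alpha for a DataFrame whose columns are the items of one scale."""
    k = scale.shape[1]
    item_variances = scale.var(axis=0, ddof=1).sum()
    total_variance = scale.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Example grouping: the three bias items named in Table A1/A2 (hypothetical data).
bias_items = items[["HS_Gender_Bias", "HS_Religion_Bias", "HS_Ethnicity_Bias"]].dropna()
print(f"Cronbach's alpha = {cronbach_alpha(bias_items):.2f}")   # retained if > 0.7
```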
The EFA components grouping now required five new labels for identification. The resulting five components from the EFA techniques were labelled based on the variables contained in each case-processing summary. For this purpose, the rotated component matrix in Table A4 in Appendix A was used to understand the contributing nature of the survey questions towards the 11 components identified. The resulting five reliable components from Table 9 above are labelled as outlined in Table 10 below.

4.2. Qualitative Analysis—Thematic Analysis

Following the quantitative analysis in the previous section, this section examines the data collected from a qualitative perspective using thematic analysis. In particular, it provides evidence for the data triangulation, given the similarity between the data in Table 10 above and Table 11 below. The semi-structured interviews with 74 participants, of an average duration of 45 min, involved participants answering approximately 25 questions. Using NVivo software, the transcribed data were assigned preliminary codes, developed from keywords in the interview questions and descriptive of the content. A systematic analysis of patterns and themes across all interviews followed, which identified 35 relevant codes for this study. These were then grouped into categories based on the similarity and association of the content, resulting in six categories, as evidenced in Table A5 in Appendix B. Using thematic analysis in NVivo, these were further analysed and reviewed for broad themes based on the features discussed, establishing a dimension reduction that resulted in three themes. The list of codes, categories, and themes, with their associations to each other, is presented in Table A5 in Appendix B for reference. Overall, these manual, in-depth analyses resulted in 31 codes, 5 categories, and 3 themes. The five categories extracted using the thematic analysis are summarised in Table 11 below and correspond to the factors extracted in the EFA, thereby supporting the data triangulation discussed earlier.
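To illustrate the code-to-category-to-theme reduction, the sketch below re-expresses a few rows of the coding structure from Table A5 in Appendix B with pandas; the rows shown are a small subset of the full table, and only the roll-up logic (not NVivo itself) is being illustrated.

```python
# Hedged sketch of the qualitative dimension reduction: codes roll up into
# categories, and categories into themes (rows taken from Table A5, Appendix B).
import pandas as pd

coding = pd.DataFrame(
    [("Constructive Feedback", "Applicant Feedback",
      "Enhancing applicant feedback process with enriched information"),
     ("Interviewer Bias", "Variation of Bias in Interviews",
      "Ensuring integrity of the interview selection process"),
     ("Interview Training", "Interview Process Problems",
      "Ensuring integrity of the interview selection process"),
     ("Use of Scores and Ranks", "Interview Process Enhancements",
      "Contributory elements towards the overall satisfaction of the interview selection process")],
    columns=["code", "category", "theme"])

# Count how many codes roll up into each category and theme.
print(coding.groupby("category")["code"].count())
print(coding.groupby("theme")["code"].nunique())
```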
The quantitative and qualitative analyses above converged: the five distinct factors listed from each method, based on the responses of participants in the selection process, were internally consistent, and the similarity of the two sets of results provides clear evidence of data triangulation, establishing that the results are satisfactory and acceptable. The five dominant factors to be considered in improving and ensuring proper standards in the employee selection interview process were:
  • Training the hiring members for the interview process;
  • Planning and preparing for the interview process;
  • Removing the bias of the hiring members during the interview process;
  • Providing feedback to applicants to ensure a transparent process; and
  • Ensuring the hiring decisions are process-driven instead of driven by the interviewer’s personality.
Despite the sector operating in a developed country, its use of technology to support decision making in the selection interview process was limited or non-existent. Because of this lack of training and technology, hiring members suggested some potentially useful methods. On the one hand, these suggestions could enhance the recruitment process; on the other hand, given the variation in responses across participants, a further in-depth study is required to determine whether a consistent bias exists.
In response to the interview question “Do you have any recruitment management systems that you used or was it just emails and paper-based?”, most of the 74 participants responded negatively; participants 12, 22, 28, 30, 31, 38, and 39 were some of the many who had commented that the process relied only on “email- and paper-based” methods. However, there were a few of the 17 organisations that seemed to have this worked out, which was appreciated by some hiring members, such as P11 who stated:
“At [withheld], we use [withheld] on our website, and people can see our positions. So, they apply for our position online, and then our interview process is managed through that recruitment module through the back end. We know which people are shortlisted. We can see where people are at through the stages. That way we can see if they are unsuccessful quite early or we can see if they progress through to the interview stage, etc.”
Using NVivo, a word frequency analysis of the responses about the use of recruitment systems in each participant’s HR department was performed. The results obtained from NVivo are given in Figure 4 below. There appears to be good acceptance of, and eagerness among participants for, the implementation of HR expert systems, as highlighted in red in Figure 4. Participants contrasted the current practice of using paper and emails only with the following reasons for adopting recruitment systems (a word-frequency sketch follows the list below):
  • Transparent outcome;
  • Updating services;
  • Improved process;
  • Understanding the requirements better;
  • Data stored as a database and managed online with objective rating systems;
  • Better position to provide relevant feedback and to defend the decision taken; and
  • Finally making the organization appear professional.
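As a rough illustration of the word-frequency analysis NVivo performed, the sketch below counts terms across interview responses in Python; the two example transcript strings and the stop-word list are hypothetical.

```python
# Hedged sketch of a word-frequency count over interview responses, analogous
# to the NVivo output in Figure 4; `transcripts` is hypothetical example data.
import re
from collections import Counter

transcripts = [
    "We only use email and paper-based forms for recruitment at the moment.",
    "Applicants apply online and the recruitment module tracks each stage.",
]

stopwords = {"the", "and", "for", "we", "a", "of", "to", "at", "each", "only", "use"}
words = (w for text in transcripts
           for w in re.findall(r"[a-z'-]+", text.lower())
           if w not in stopwords)
print(Counter(words).most_common(10))     # most frequent terms across responses
```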

5. Summary of Findings and Discussion

This section aims to summarise how the research question of this ongoing study was addressed by providing an analysis of the dominant factors involved in the consideration for improving the selection process. We analysed the results and the key empirical findings of this research by including the statistical data analysis (quantitative) as well as the narrative data analysis (qualitative). The five dominant factors that were identified for improving the employee selection process are discussed. Prior to discussing the specific dominant factors for improvement of the selection process, this section begins with a review of the findings based on all the factors associated with the selection process and provides more in-depth insights and the reasoning behind the grouping and clustering of some factors in order to form the dominant factors.
An overall review of this study’s findings covers the 38 individual elements used in the analysis of the employee selection process. The results clearly indicate strong correlations among multiple elements that assess the different stages of the selection process. The EFA technique was used to process the data further and perform data reduction by associating the correlated elements that cluster together into groups. The data were reduced to 10 possible groups, with the other elements listed within each associated group, and each group was renamed according to the elements it held so as to best indicate its content. Interviewer Training was the name given to a group of seven elements: the length of the interview; having relevant interview questions; being prepared for the interview; ensuring equal panel participation; using an appropriate interview method; having qualified interviewers; and interviewers being organised in the conduct of the proceedings. The results for this group of seven elements collectively report 90% reliability. This finding aligns with the existing literature and the well-documented need for training of both the interviewer and the applicant [32,33,34,35]. The next group was named Planned and Structured Interview Process and consisted of six elements: providing interview questions for a structured interview; allocating sufficient preparation and planning for the process; appointing qualified and relevant interviewers; the hiring member’s confidence in the overall process; the total duration of the selection process; and the promptness of arriving at the final selection decision. These elements collectively formed a new group found to be 82% reliable. The third group was called Bias in the Interview and consisted of three elements, namely the interviewers’ bias on gender, religion, and ethnicity, which collectively reported a 91% reliability of belonging to this group. The fourth group was named Interviewer’s Personality and Fit, consisting of three elements: the interviewer’s temperament impacting the panel’s decision; the interviewer’s personality and attitude impacting the selection decision; and the interviewer’s preference for unstructured interviews. This group reports a reliability score of 71% for these three elements. The final group was called Interview and Transparency, as it consists of two elements associated with the formation of the panel interview and seeking transparency; these elements reported a 78% reliability of being combined in the same group. In summary, these were the top five groups with a reliability score above 70% and are therefore listed as the dominant factors that need to be considered in improving the selection process.
The findings in this empirical study also support previous research that found these elements to be important factors in establishing a robust talent acquisition process in an employee selection system [36]. What further emerges from the findings is that, as part of the data triangulation, a qualitative analysis was carried out simultaneously, and its results were consistent with those reported using the quantitative approach, confirming the validity and reliability of the key findings for the research question in this study. In the qualitative analysis, 31 individual codes were identified and grouped into six categories based on the similarity of their nature: the hiring members’ involvement; interview process enhancements; variation of bias in interviews; interview process problems; applicant feedback; and external association to the interview process. These six categories were further classified into three emerging themes:
  • Ensuring the integrity of the interview selection process;
  • Enhancing the applicant feedback process with enriched information; and
  • Contributory elements towards the overall satisfaction of the interview selection process.
The first five categories from the qualitative analysis aligned very closely with the five factors resulting from the quantitative analysis that demonstrated the close similarity among both findings. This information is summarised and illustrated in Figure 5 below.
Responses to all questions from all 74 participants were processed with NVivo software to find Pearson’s correlation for word similarity. This exposed the current lack of expert systems in this sector and the positive reception among participants toward adopting a more system-based decision process for employee selection. From this study, we arrive at the following concrete improvements to the recruitment system, which form evidence-based recommendations for change in the selection processes (a correlation sketch follows the list below):
  • Ongoing training should reinforce that discriminatory questions cannot be asked;
  • Include an HR/neutral representative on the committee; and
  • Rather than necessarily providing feedback, perhaps the HR role should be to obtain feedback on the process from unsuccessful applicants for each position.
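As a rough, hedged illustration of Pearson’s correlation applied to word similarity, the sketch below converts two hypothetical responses into word-count vectors and correlates them; this mirrors the idea behind NVivo’s word-similarity clustering rather than its exact implementation.

```python
# Hedged sketch: Pearson correlation between word-count vectors as a measure of
# word similarity between two interview responses (example texts are hypothetical).
from collections import Counter
import numpy as np

def pearson_word_similarity(text_a: str, text_b: str) -> float:
    counts_a, counts_b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    vocabulary = sorted(set(counts_a) | set(counts_b))
    vec_a = np.array([counts_a[w] for w in vocabulary], dtype=float)
    vec_b = np.array([counts_b[w] for w in vocabulary], dtype=float)
    return float(np.corrcoef(vec_a, vec_b)[0, 1])

print(pearson_word_similarity(
    "the process was email and paper based only",
    "recruitment was managed by email and paper forms only"))
```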
In summary, this paper carried out further research based on our previous study, which uncovered the critical aspects of the employee selection process [18]. Technological assistance was one of the key critical aspects brought into the limelight by the hiring members. The findings reported in the previous work are in tandem with the discussions in this paper, so the results are well justified. The five dominant factors identified in this study, namely training, planning of the interview approach, avoiding bias, the influence of the interviewer’s personality, and establishing panel interviews for transparency, align closely with our previous related research investigations.
Some shortcomings in the process observed in this study are summarised here. The relatively low standard of the hiring process was evident where the quality of the shortlisted candidates could not be quantified. With the help of customised expert systems, some organisations managed to substantiate the quality of the candidates in the hiring process. However, since such a system must be heavily customised, a limitation is that its terminology may be understood only within that organisation. Introducing a standard model that could be used across all 17 organisations in this sector would establish a common language. This measure would also help candidates understand their individual and non-subjective performance when they ask the hiring team for feedback. These findings are in line with arguments presented by researchers in associated fields, such as those advocating open communication between an organisation and its stakeholders [37]. A systematic approach that indicates how much better or worse a candidate was relative to the hired person would counter subjective personal interpretation by hiring managers. It is imperative to undertake a systematic analysis of the process to determine the type of person to be hired, and the outcome of such an in-depth analysis could be used to demonstrate the reliability and validity of the hiring process. The cost to organisations of using poor recruitment techniques, of selecting someone who does not last long with the organisation, or of selecting an unsuitable person who stays with the organisation can have an extraordinarily significant negative impact in terms of morale, cost, and potential loss of revenue [3]. In this context, Cook and Cripps [38] present the various psychological assessment techniques used for selection and identify only those that are worth using.

6. Conclusions and Future Research

This paper presented the findings of an exploratory study conducted to determine the dominant factors of the employee selection process for achieving a robust recruitment system. The empirical study covered 17 organisations from the Australian TAFE/dual-sector, and a mixed-methods approach combining quantitative and qualitative survey analysis was adopted to study the use of technology and management systems for the employee selection process from the various perspectives of participants, including hiring members and successful and unsuccessful candidates. The findings from the EFA and NVivo analyses revealed five dominant factors of the selection process, namely training, planning of the interview approach, avoiding bias, the influence of the interviewer’s personality, and establishing panel interviews for transparency. These factors not only confirm the critical factors revealed in our previous investigation but also withstood data triangulation, supporting the reliability and validity of the results. These findings align well with the existing literature in promoting the need to develop a selection decision tool for organisations that takes into consideration the fundamental factors outlined in this study.
This study acknowledges some limitations of the research that point to future work on this topic. The study followed a cross-sectional timeframe, with data limited to that period, rather than a longitudinal study over a broader, ongoing timeframe for analysing the selection process in this sector. The study may also reflect partial perspectives, as inferences were drawn only from those who participated, while non-participants may hold completely different opinions on the organisations’ selection processes; including the entire population of the sector would enhance the reliability and validity of the findings. Bearing these limitations in mind, future research can conceptualise a consistent selection decision tool that addresses these dominant factors through an expert system. This would enable a decision-making tool to assist hiring members while providing room for structured and unstructured interviews, in line with the emerging mixed-methods interview process across all positions in an organisation. Future research will explore the possibility of building a conceptual model to assist hiring members in the selection process, taking into consideration the critical aspects outlined in the previous paper and the dominant factors presented in this paper.

Author Contributions

Conceptualization, S.D.R., S.V., M.-T.C. and A.A.; methodology, S.D.R. and S.V.; software, S.D.R.; validation, S.D.R., S.V., M.-T.C., and A.A.; formal analysis, S.D.R.; investigation, S.D.R.; writing—original draft preparation, S.D.R.; writing—review and editing, S.V.; visualization, S.D.R.; supervision, M.-T.C., S.V. and A.A.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

This appendix contains the SPSS-generated tables supporting the exploratory factor analysis results.
Table A1. Survey questions with SPSS labels.
# | SPSS Label | N | Mean | Survey Question
1 | Total Duration (HM) | 138 | 3 | As a hiring member—The duration of the total hiring process we follow is reasonable
2 | Interview Bias (HM) | 138 | 6 | As a hiring member—There was a bias of some sort in the hiring decision (gender, race, religion, etc.)
3 | Qualified Interviewer (HM) | 138 | 3 | As a hiring member—Every interviewer selected is qualified to be part of the hiring team
4 | Interview Planning and Prep (HM) | 138 | 3 | As a hiring member—Sufficient time is set aside for planning and preparing the hiring process
5 | Prompt Final Decision (HM) | 138 | 3 | As a hiring member—Interviewers are prompt with providing their final hiring decision
6 | Interviewer Questions Provided (HM) | 138 | 2 | As a hiring member—Interviewers are provided with a set of questions for structured interviews
7 | Prefer Unstructured Interviewers (HM) | 138 | 5 | As a hiring member—Interviewers prefer unstructured interviews
8 | Interviewer Temperament (HM) | 138 | 5 | As a hiring member—Interviewers’ temperament affected the hiring decision
9 | Process Taken Seriously (HM) | 138 | 2 | As a hiring member—Interviewers take the hiring process seriously
10 | Process Needs Improvements (HM) | 138 | 3 | As a hiring member—The existing hiring process we follow requires improvements
11 | Overall Good Process (HM) | 138 | 3 | As a hiring member—Overall, I feel we have a good hiring process set for this organisation
12 | Reasonable Total Duration (HS) | 350 | 2 | As a successful applicant—The duration of the total hiring process was reasonable
13 | Appropriate Methods (HS) | 350 | 2 | As a successful applicant—The interview method used was appropriate (phone/face-to-face/panel, etc.)
14 | Bias on Religion (HS) | 350 | 6 | As a successful applicant—There was bias in the hiring decision based on religion
15 | Bias on Gender (HS) | 350 | 6 | As a successful applicant—There was bias in the hiring decision based on gender
16 | Bias on Ethnicity (HS) | 350 | 6 | As a successful applicant—There was bias in the hiring decision based on ethnicity
17 | No Bias (HS) | 350 | 3 | As a successful applicant—There was no bias of any sort in the hiring decision
18 | Temperament Impacted (HS) | 350 | 5 | As a successful applicant—The interviewer’s temperament impacted on the hiring decisions
19 | Relevant Questions (HS) | 350 | 2 | As a successful applicant—All interview questions were relevant to the job
20 | Organised Process (HS) | 350 | 2 | As a successful applicant—The interview process was well organised
21 | Prepared Interviewers (HS) | 350 | 2 | As a successful applicant—The interviewers were well prepared for the interview
22 | Interview Length (HS) | 350 | 2 | As a successful applicant—The length of the interviews was reasonable
23 | Equal Panel Participation (HS) | 350 | 2 | As a successful applicant—All interviewers in the panel participated equally in the interview
24 | Qualified Interviewer (HS) | 350 | 2 | As a successful applicant—I felt the interviewers had the necessary qualifications to interview
25 | Internal Employee Preference (HS) | 350 | 4 | As a successful applicant—I feel that internal employees are preferred to external applicants for interviews
26 | Provided Feedback (HS) | 350 | 4 | As a successful applicant—Constructive interview feedback was provided after the interview
27 | Process Improvements Required (HS) | 350 | 4 | As a successful applicant—The hiring process requires many improvements
28 | Regard Based on Process (HS) | 350 | 3 | As a successful applicant—I have high regard for this organisation based on its hiring process
29 | Overall Satisfaction (HS) | 350 | 3 | As a successful applicant—Overall, I was satisfied with the entire hiring process
30 | Scoring System (HP) | 203 | 2 | In general, an interview scoring sheet can be used to assist in hiring decisions
31 | Panel Structure (HP) | 203 | 4 | In general, one-to-one interviews are better than panel interviews
32 | Multiple Interviews (HP) | 203 | 5 | In general, multiple one-to-one interviews can replace a panel interview
33 | Applicant Suggestions (HP) | 203 | 3 | In general, HR can collect feedback/suggestions on interview experience from applicants
34 | Interview Feedback (HP) | 203 | 2 | In general, constructive interview performance feedback should be provided
35 | Objective Consistent System (HP) | 203 | 3 | In general, we promote an objective and standard model for hiring process across all TAFE
36 | Transparency (HP) | 203 | 3 | In general, the hiring process needs to have more transparency in the hiring decision
37 | HR Representative (HP) | 203 | 3 | In general, we always have an HR representative during interviews to ensure standard/consistency
38 | Successful Applicant Summary (HP) | 200 | 4 | In general, we share a summary of a successful candidate with other interview applicants
Table A2. Communalities using the principal axis factoring extraction method for EFA.
Communalities
Variable | Initial | Extraction
HS_Total_hiring_process | 0.535 | 0.438
HS_Appropriate_Int_Method | 0.690 | 0.740
HS_Religion_Bias | 0.598 | 0.517
HS_Gender_Bias | 0.848 | 0.926
HS_Ethnicity_Bias | 0.818 | 0.799
HS_No_Bias | 0.430 | 0.305
HS_Intrwr_Temp_Impact | 0.616 | 0.633
HS_Relvnt_IntQ | 0.610 | 0.601
HS_Int_Process_Organised | 0.788 | 0.761
HS_Prepared_Intrwr | 0.809 | 0.766
HS_Int_length_Reasonale | 0.710 | 0.765
HS_Equal_Panel_Particp | 0.663 | 0.591
HS_Intwr_Quals_Suffcnt | 0.756 | 0.801
HS_InternalEmp_Prf_ExtEmp | 0.455 | 0.430
HS_IntFdbk_Provided | 0.504 | 0.424
HS_Hir_Process_Imprv_Req | 0.683 | 0.705
HS_HighReg_HiringProcess | 0.729 | 0.817
HS_Overall_Satisfied | 0.713 | 0.778
HP_Use_Interview_Scoring | 0.428 | 0.354
HP_ReplacePanel_1to1 | 0.511 | 0.644
HP_Multiple_1to1 | 0.545 | 0.570
HP_HR_collect_suggestions | 0.342 | 0.286
HP_Constructive_interview_feedback | 0.421 | 0.423
HP_Objective_standard_model | 0.316 | 0.310
HP_More_transparency | 0.466 | 0.468
HP_HR_representative_Int | 0.311 | 0.269
HP_summary_successful_candidate | 0.391 | 0.312
HM_Resonable_Total_Dur | 0.592 | 0.697
HM_Bias_Present | 0.522 | 0.468
HM_Qualfd_Intwr | 0.566 | 0.550
HM_Sufficient_PlanPrep | 0.547 | 0.479
HM_Prompt_FinalDesn | 0.540 | 0.445
HM_Intrwr_ProvidedwithQuestion | 0.556 | 0.590
HM_Intrwr_Prfer_UnstructuredInt | 0.531 | 0.457
HM_Intvwrs_Temprmt_ImpactedDecsn | 0.601 | 0.704
HM_Intwrs_take_Hpserious | 0.452 | 0.451
HM_HP_requires_Imp | 0.616 | 0.564
HM_Overall_HP_goodinOrg | 0.751 | 0.714
Table A3. Total variance explained with initial eigenvalues (extraction method: principal axis factoring).
Total Variance Explained
Factor | Initial Eigenvalues (Total, % of Variance, Cumulative %) | Extraction Sums of Squared Loadings (Total, % of Variance, Cumulative %) | Rotation Sums of Squared Loadings (Total, % of Variance, Cumulative %)
1 | 9.781, 25.740, 25.740 | 9.435, 24.829, 24.829 | 3.910, 10.288, 10.288
2 | 2.919, 7.680, 33.420 | 2.580, 6.788, 31.617 | 2.990, 7.869, 18.157
3 | 2.527, 6.650, 40.070 | 2.027, 5.334, 36.951 | 2.800, 7.369, 25.526
4 | 1.949, 5.128, 45.198 | 1.450, 3.816, 40.767 | 2.075, 5.459, 30.986
5 | 1.810, 4.763, 49.961 | 1.433, 3.772, 44.539 | 1.740, 4.580, 35.565
6 | 1.448, 3.810, 53.772 | 1.055, 2.776, 47.315 | 1.729, 4.550, 40.115
7 | 1.353, 3.559, 57.331 | 0.914, 2.405, 49.720 | 1.615, 4.249, 44.365
8 | 1.245, 3.276, 60.607 | 0.783, 2.061, 51.781 | 1.612, 4.243, 48.608
9 | 1.150, 3.025, 63.632 | 0.691, 1.818, 53.599 | 1.238, 3.258, 51.866
10 | 1.097, 2.887, 66.519 | 0.648, 1.705, 55.304 | 0.933, 2.455, 54.321
11 | 1.021, 2.686, 69.205 | 0.535, 1.409, 56.713 | 0.909, 2.392, 56.713
12 | 0.937, 2.467, 71.672 | – | –
13 | 0.916, 2.410, 74.082 | – | –
14 | 0.808, 2.126, 76.208 | – | –
15 | 0.775, 2.039, 78.247 | – | –
16 | 0.721, 1.896, 80.143 | – | –
17 | 0.686, 1.806, 81.949 | – | –
18 | 0.632, 1.664, 83.614 | – | –
19 | 0.618, 1.627, 85.241 | – | –
20 | 0.579, 1.525, 86.766 | – | –
21 | 0.546, 1.438, 88.203 | – | –
22 | 0.485, 1.276, 89.479 | – | –
23 | 0.453, 1.192, 90.672 | – | –
24 | 0.421, 1.107, 91.778 | – | –
25 | 0.377, 0.991, 92.769 | – | –
26 | 0.353, 0.929, 93.698 | – | –
27 | 0.313, 0.824, 94.522 | – | –
28 | 0.289, 0.759, 95.281 | – | –
29 | 0.273, 0.718, 95.999 | – | –
30 | 0.260, 0.684, 96.683 | – | –
31 | 0.231, 0.607, 97.290 | – | –
32 | 0.202, 0.531, 97.821 | – | –
33 | 0.182, 0.479, 98.300 | – | –
34 | 0.162, 0.426, 98.726 | – | –
35 | 0.154, 0.405, 99.131 | – | –
36 | 0.136, 0.359, 99.490 | – | –
37 | 0.115, 0.303, 99.794 | – | –
38 | 0.078, 0.206, 100.000 | – | –
Table A4. Rotated component matrix (extraction method: principal component analysis; rotation method: varimax with Kaiser normalization).
Variable | Rotated loading(s) on components 1–11
HS_Int_length_Reasonale | 0.824
HS_Relvnt_IntQ | 0.787
HS_Int_Process_Organised | 0.715
HS_Prepared_Intrwr | 0.691
HS_Equal_Panel_Particp | 0.687
HS_Appropriate_Int_Method | 0.680
HS_Intwr_Quals_Suffcnt | 0.542, 0.400
HM_Intrwr_ProvidedwithQuestion | 0.783
HM_Sufficient_PlanPrep | 0.699
HM_Qualfd_Intwr | 0.689
HM_Overall_HP_goodinOrg | 0.585, 0.464
HM_Resonable_Total_Dur | 0.551
HM_Prompt_FinalDesn | 0.514
HS_HighReg_HiringProcess | 0.723
HS_Hir_Process_Imprv_Req | −0.717
HM_HP_requires_Imp | −0.602
HM_Bias_Present | −0.530
HS_Gender_Bias | 0.861
HS_Religion_Bias | 0.831
HS_Ethnicity_Bias | 0.828
HM_Intvwrs_Temprmt_ImpactedDecsn | 0.760
HM_Intrwr_Prfer_UnstructuredInt | 0.653
HS_Intrwr_Temp_Impact | 0.527, −0.403
HP_ReplacePanel_1to1 | 0.805
HP_Multiple_1to1 | 0.759
HP_HR_representative_Int | 0.699
HP_Objective_standard_model | 0.606
HP_More_transparency | 0.576
HP_Constructive_interview_feedback | 0.521
HP_summary_successful_candidate | (no loading shown)
HS_IntFdbk_Provided | 0.684
HS_Overall_Satisfied | 0.468, 0.555
HP_Use_Interview_Scoring | 0.687
HS_Total_hiring_process | −0.519
HP_HR_collect_suggestions | −0.417, 0.441
HS_No_Bias | −0.756
HS_InternalEmp_Prf_ExtEmp | 0.730
HM_Intwrs_take_Hpserious | 0.535

Appendix B

Table A5. Qualitative analysis: coding process and list.
# | Code | Category | Theme
1 | Constructive Feedback | Applicant Feedback | Enhancing applicant feedback process with enriched information
2 | Feedback Utility | Applicant Feedback | Enhancing applicant feedback process with enriched information
3 | Organisational Change Impacts | External Association to the Interview Process | Ensuring integrity of the interview selection process
4 | Feeling of Being Unsuccessful | External Association to the Interview Process | Enhancing applicant feedback process with enriched information
5 | Feeling of Being Successful | External Association to the Interview Process | Enhancing applicant feedback process with enriched information
6 | Interviewers Seriousness of the Process | Hiring Members Involvement | Contributory elements towards the overall satisfaction of the interview selection process
7 | Participation of Panel Members | Hiring Members Involvement | Contributory elements towards the overall satisfaction of the interview selection process
8 | Applicant Database Like Seek | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
9 | Best Interview: Elements | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
10 | Northern Territory state’s Feedback Process Replication in Victoria state | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
11 | Practical Improvements to HP | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
12 | Request for Feedback | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
13 | Share Interview Questions Prior Interview | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
14 | Use of Technology: Recruitment Management System | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
15 | Common Selection Process for all TAFEs | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
16 | Use of Scores and Ranks | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
17 | Shortlisting Strategies for Interview | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
18 | Shortlisting Strategies After Interview | Interview Process Enhancements | Contributory elements towards the overall satisfaction of the interview selection process
19 | Appropriate Interview Method | Interview Process Problems | Ensuring integrity of the interview selection process
20 | Duration of Interview | Interview Process Problems | Ensuring integrity of the interview selection process
21 | Interview Planning and Prep Process | Interview Process Problems | Ensuring integrity of the interview selection process
22 | Interview Outcome Conveyed Duration | Interview Process Problems | Ensuring integrity of the interview selection process
23 | Interview Training | Interview Process Problems | Ensuring integrity of the interview selection process
24 | Interviewer Qualified | Interview Process Problems | Ensuring integrity of the interview selection process
25 | Organised Interview Process | Interview Process Problems | Ensuring integrity of the interview selection process
26 | Relevant Interview Questions | Interview Process Problems | Ensuring integrity of the interview selection process
27 | Cert IV implemented | Interview Process Problems | Ensuring integrity of the interview selection process
28 | Key Selection Criteria - Enhancement | Interview Process Problems | Ensuring integrity of the interview selection process
29 | Worst Interview: Elements | Interview Process Problems | Ensuring integrity of the interview selection process
30 | Interviewer Preference Between Structured and Unstructured Interviews | Interview Process Problems | Ensuring integrity of the interview selection process
31 | Training on How to Use the Scoring and Ranking | Interview Process Problems | Ensuring integrity of the interview selection process
32 | Hiring Decision Overridden | Variation of Bias in Interviews | Ensuring integrity of the interview selection process
33 | Internal Employees Preferred | Variation of Bias in Interviews | Ensuring integrity of the interview selection process
34 | Interviewer Bias | Variation of Bias in Interviews | Ensuring integrity of the interview selection process
35 | Underpin Fairness, Equality, and Transparency | Variation of Bias in Interviews | Ensuring integrity of the interview selection process

References

  1. Ekwoaba, J.O.; Ikeije, U.U.; Ufoma, N. The impact of recruitment and selection criteria on organizational performance. Glob. J. Hum. Resour. Manag. 2015, 3, 22–23.
  2. Huffcutt, A.I.; Culbertson, S.S. APA Handbook of Industrial and Organizational Psychology, Vol. 2: Selecting and Developing Members for the Organization; Zedeck, S., Ed.; American Psychological Association: Washington, DC, USA, 2010; pp. 185–203.
  3. Wyatt, D.; Jamieson, R. Improving Recruitment and Selection Decision Processes with an Expert System. In Proceedings of the Second Americas Conference on Information Systems, Phoenix, AZ, USA, 16–18 August 1996.
  4. Sudheshna, B.; Sudhir, B. Employee Perception and Interview Process in IT Companies. Splint Int. J. Prof. 2016, 3, 89–91.
  5. Ababneh, K.I.; Al-Waqfi, M.A. The role of privacy invasion and fairness in understanding job applicant reactions to potentially inappropriate/discriminatory interview questions. Pers. Rev. 2016, 45, 392–418.
  6. Saidi Mehrabad, M.; Brojeny, M.F. The development of an expert system for effective selection and appointment of the jobs applicants in human resource management. Comput. Ind. Eng. 2007, 53, 306–312.
  7. Ones, D.S.; Viswesvaran, C. Integrity tests and other criterion-focused occupational personality scales (COPS) used in personnel selection. Int. J. Sel. Assess. 2001, 9, 31–39.
  8. Gierlasinski, N.J.; Nixon, D.R. A Comparison of Interviewing Techniques: HR versus Fraud Examination. Oxf. J. Int. J. Bus. Econ. 2014, 5, 1.
  9. Robertson, I.T.; Smith, M. Personnel selection. J. Occup. Organ. Psychol. 2001, 74, 441–472.
  10. Petrović, D.; Puharić, M.; Kastratović, E. Defining of necessary number of employees in airline by using artificial intelligence tools. Int. Rev. 2018, 3–4, 77–89.
  11. Naim, I.; Tanveer, M.I.; Gildea, D.; Hoque, M.E. Automated analysis and prediction of job interview performance. IEEE Trans. Affect. Comput. 2016, 9, 191–204.
  12. Khosla, R.; Chu, M.-T.; Yamada, K.G.; Kuneida, K.; Oga, S. Innovative embodiment of job interview in emotionally aware communication robot. In Proceedings of the 2011 International Joint Conference on Neural Networks, San Jose, CA, USA, 31 July–5 August 2011; pp. 1546–1552.
  13. Tsai, W.C.; Chen, F.H.; Chen, H.-Y.; Tseng, K.-Y. When Will Interviewers Be Willing to Use High-structured Job Interviews? The role of personality. Int. J. Sel. Assess. 2016, 24, 92–105.
  14. Levashina, J.; Hartwell, C.; Morgeson, F.; Campion, M.A. The structured employment interview: Narrative and quantitative review of the research literature. Pers. Psychol. 2014, 67, 241–293.
  15. Huffcutt, A.I.; Arthur, W., Jr. Hunter and Hunter (1984) revisited: Interview validity for entry-level jobs. J. Appl. Psychol. 1994, 79, 184–190.
  16. Dana, J.; Dawes, R.; Peterson, N. Belief in the unstructured interview: The persistence of an illusion. Judgm. Decis. Mak. 2013, 8, 512.
  17. Law, P.K.; Yuen, D.C. An empirical examination of hiring decisions of experienced auditors in public accounting: Evidence from Hong Kong. Manag. Audit. J. 2011, 26, 760–777.
  18. Rozario, S.D.; Venkatraman, S.; Abbas, A. Challenges in Recruitment and Selection Process: An Empirical Study. Challenges 2019, 10, 35.
  19. Peccei, R.; Van de Voorde, K. The application of the multilevel paradigm in human resource management–outcomes research: Taking stock and going forward. J. Manag. 2019, 45, 786–818.
  20. Schmidt, J.A.; Pohler, D.M. Making stronger causal inferences: Accounting for selection bias in associations between high performance work systems, leadership, and employee and customer satisfaction. J. Appl. Psychol. 2018, 103, 1001.
  21. Jiang, K.; Messersmith, J. On the shoulders of giants: A meta-review of strategic human resource management. Int. J. Hum. Resour. Manag. 2018, 29, 6–33.
  22. Boon, C.; den Hartog, D.N.; Lepak, D.P. A systematic review of human resource management systems and their measurement. J. Manag. 2019.
  23. Sackett, P.R.; Lievens, F. Personnel selection. Annu. Rev. Psychol. 2008, 59, 419–450.
  24. Ployhart, R.; Harold, C. The Applicant Attribution-Reaction Theory (AART): An Integrative Theory of Applicant Attributional Processing. Int. J. Sel. Assess. 2004, 12, 84–98.
  25. Lederman, N.G.; Lederman, J.S. What Is a Theoretical Framework? A Practical Answer. J. Sci. Teach. Educ. 2015, 26, 593–597.
  26. Saunders, M. Research Methods for Business Students, 6th ed.; Lewis, P., Thornhill, A., Eds.; Pearson Education: Harlow, UK, 2012.
  27. Bell, E.; Bryman, A.; Harley, B. Business Research Methods; Oxford University Press: Oxford, UK, 2018.
  28. Saunders, M. Research Methods for Business Students, 7th ed.; Lewis, P., Thornhill, A., Eds.; Pearson Education: Harlow, UK, 2016.
  29. DeCoster, J. Overview of Factor Analysis. 1998. Available online: www.stat-help.com/notes.html (accessed on 23 May 2019).
  30. Williams, B.; Onsman, A.; Brown, T. Exploratory factor analysis: A five-step guide for novices. Australas. J. Paramed. 2010, 8, 3.
  31. Conway, J.M.; Huffcutt, A.I. A review and evaluation of exploratory factor analysis practices in organizational research. Organ. Res. Methods 2003, 6, 147–168.
  32. Chamorro-Premuzic, T. Psychology of Personnel Selection; Furnham, A., Ed.; Cambridge University Press: Cambridge, UK, 2010.
  33. Bradley-Adams, K. Face to face with success: Keith Bradley-Adams offers advice on how to behave in interviews and how to answer tricky questions. (Career Development) (Brief article). Nurs. Stand. 2011, 25, 63.
  34. Entwistle, F. How to prepare for your first job interview. Nurs. Times 2013, 109, 14–15.
  35. Emett, S.; Wood, D.A. Tactics: Common questions-evasion: Identify ploys and learn methods to get the specific answers you need. J. Account. 2010, 210, 36.
  36. Macan, T. The employment interview: A review of current studies and directions for future research. Hum. Resour. Manag. Rev. 2009, 19, 203–218.
  37. Miller, V.D.; Gordon, M.E. Meeting the Challenge of Human Resource Management: A Communication Perspective; Routledge: New York, NY, USA, 2014.
  38. Cook, M.; Cripps, B. Psychological Assessment in the Workplace; John Wiley & Sons Ltd.: West Sussex, UK, 2005.
Figure 1. Theoretical framework using Applicant Attribution-Reaction Theory (AART); adapted from Ployhart and Harold [24].
Figure 2. Research design overview.
Figure 3. Scree plot for 38 variables processed.
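A scree plot such as Figure 3 can be redrawn from the eigenvalues of the items' correlation matrix. A minimal matplotlib sketch, again assuming a hypothetical survey_items.csv holding the 38 items (not the authors' code):

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical survey data: one row per respondent, one column per item (38 items)
    responses = pd.read_csv("survey_items.csv")

    # Eigenvalues of the correlation matrix, sorted from largest to smallest
    eigenvalues = np.linalg.eigvalsh(responses.corr().to_numpy())[::-1]

    plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
    plt.axhline(1.0, linestyle="--")   # Kaiser criterion: retain components with eigenvalue > 1
    plt.xlabel("Component number")
    plt.ylabel("Eigenvalue")
    plt.title("Scree plot")
    plt.show()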
Figure 4. Word frequency for the use of recruitment systems in HR departments.
Figure 5. Mapping the qualitative and quantitative findings.
Table 1. Summary of the literature review.
Background | Related Theory | Introduced By | Synopsis of the Theory
Personality | Cognitive-Affective System Theory | Walter Mischel and Yuichi Shoda | Personality tendencies may be stable in a specific context but may vary significantly across other domains, owing to the psychological cues and demands unique to each context.
Work performance | Trait Activation Theory | Robert Tett and Dawn Burnett | Employees seek, and derive fundamental satisfaction from, a work environment that allows the easy expression of their unique personality traits.
Job interview | Interpersonal Deception Theory | David B. Buller and Judee K. Burgoon | Describes how people handle actual or perceived deception, knowingly or unknowingly, during face-to-face communication.
Job interview | Signalling Theory | Michael Spence | One party (the agent) credibly conveys some information about itself to another party (the principal).
Job interview | The Theory of Planned Behaviour | Icek Ajzen | Explains human behaviour by adding perceived behavioural control, connecting behaviour with beliefs to improve the predictive power of the theory of reasoned action.
Job interview | Theory of Reasoned Action | Martin Fishbein and Icek Ajzen | Predicts how a person will behave based on their pre-existing attitudes and behavioural intentions.
Job interview | Social Cognitive Theory | Albert Bandura | People remember the consequences and sequence of others' experiences and use this information to guide their own subsequent behaviour, even when they have not experienced it themselves.
Retention | Expectancy-Value Theory | Lynd-Stevenson | There must be a balanced relationship between the candidate's expectations and the value the company can deliver.
Table 2. List of Victorian TAFE association members.
# | Urban Institutes (Melbourne) | # | Regional Institutes | Regional Location
1 | Box Hill Institute | 1 | Federation Training | Chadstone
2 | Chisholm | 2 | Federation University | Ballarat
3 | Holmesglen | 3 | Gordon Institute of TAFE | Geelong
4 | Kangan Institute | 4 | South West TAFE | Warrnambool
5 | Melbourne Polytechnic | 5 | Wodonga TAFE | Wodonga
6 | William Angliss Institute | 6 | Sunraysia Institute | Mildura
7 | AMES Australia | 7 | GOTAFE | Wangaratta
8 | RMIT University
9 | Swinburne
10 | Victoria Polytechnic
Table 3. Response rate for qualitative and quantitative data collection.
TAFE/Dual-Sector (Name Coded) | Interviews, Qualitative Data (N) | Successful Interview Experience, Quantitative Data (N) | Unsuccessful Interview Experience, Quantitative Data (N) | Hiring Member, Quantitative Data (N)
1 | 5 | 9 | 2 | 3
2 | 6 | 22 | 2 | 7
3 | 8 | 22 | 5 | 13
4 | 2 | 6 | 1 | 3
5 | 4 | 45 | 4 | 28
6 | 2 | 11 | 0 | 7
7 | 2 | 9 | 3 | 4
8 | 5 | 14 | 4 | 5
9 | 5 | 21 | 5 | 6
10 | 7 | 59 | 14 | 28
11 | 6 | 16 | 8 | 8
12 | 2 | 10 | 2 | 5
13 | 2 | 3 | 0 | 1
14 | 6 | 19 | 2 | 6
15 | 5 | 24 | 5 | 5
16 | 5 | 18 | 4 | 12
17 | 2 | 8 | 3 | 5
Other | 0 | 52 | 27 | 0
Total | 74 | 368 | 91 | 146
Table 4. Gender distribution of survey participants.
Gender | Frequency | Percentage
Male | 104 | 51%
Female | 98 | 48%
Do not wish to answer | 2 | 1%
Table 5. Age distribution of survey participants.
Age | Frequency | Percentage
25–34 | 15 | 7%
35–44 | 32 | 16%
45–54 | 64 | 31%
55–64 | 79 | 39%
65–74 | 14 | 7%
Table 6. Citizenship distribution of survey participants.
Citizenship | Frequency | Percentage
Australian Citizen | 184 | 90%
Permanent Resident | 16 | 8%
Other | 4 | 2%
Table 7. Research question mapped with the online survey and interview questions.
Research Question: What are the dominant factors to be considered for improving the effectiveness of the current hiring process?
Quantitative Analysis—Online Survey Questions:
• The duration of the total hiring process was reasonable
• There was bias in the hiring decision
• All interview questions were relevant to the job
• The interview process was well organised
• Constructive interview feedback was provided
Qualitative Analysis—Interview Questions:
• What more information would you have liked when the hiring decision was conveyed to you?
• Can you describe the best interview you have had as an interviewee? Why is it the best?
• Can you describe the best interview you have had as an interviewer?
• How do you think we can underpin fairness, equity, and transparency in the hiring process?
Table 8. Kaiser–Meyer–Olkin test and Bartlett’s test.
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.807
Bartlett's Test of Sphericity: approximate chi-square = 2486.455; df = 703; significance = 0.000
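Both sampling-adequacy checks in Table 8 can be reproduced outside SPSS; the factor_analyzer package exposes equivalent tests. The sketch below is illustrative only and reuses the same hypothetical survey_items.csv assumption as above (note that df = 703 corresponds to the 38 × 37 / 2 item pairs):

    import pandas as pd
    from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

    responses = pd.read_csv("survey_items.csv")   # hypothetical 38-item dataset

    chi_square, p_value = calculate_bartlett_sphericity(responses)   # cf. 2486.455, p < 0.001
    kmo_per_item, kmo_total = calculate_kmo(responses)               # cf. overall KMO = 0.807

    print(round(kmo_total, 3), round(chi_square, 3), p_value)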
Table 9. Cronbach’s alpha reliability statistics.
Scale | Valid Cases (N) | Valid Cases (%) | Cronbach's Alpha | N Items
Case Processing Summary 1 | 350 | 95.1 | 0.903 * | 7
Case Processing Summary 2 | 138 | 37.5 | 0.815 * | 6
Case Processing Summary 3 | 135 | 36.7 | −0.696 | 5
Case Processing Summary 4 | 350 | 95.1 | 0.905 * | 3
Case Processing Summary 5 | 135 | 36.7 | 0.714 * | 3
Case Processing Summary 6 | 203 | 55.2 | 0.784 * | 2
Case Processing Summary 7 | 203 | 55.2 | 0.584 | 4
Case Processing Summary 8 | 203 | 55.2 | 0.117 | 5
Case Processing Summary 9 | 203 | 55.2 | 0.089 | 3
Case Processing Summary 10 | 135 | 36.7 | 0.021 | 2
Case Processing Summary 11 | 135 | 36.7 | 0.021 | 2
* Reliability scale of >0.7.
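Cronbach's alpha, reported per extracted scale in Table 9, can be computed directly from the items belonging to each scale. A minimal sketch, assuming a hypothetical DataFrame scale_items that holds only the items of one scale (illustrative, not the authors' code):

    import pandas as pd

    def cronbach_alpha(scale_items: pd.DataFrame) -> float:
        """Cronbach's alpha for a set of items (columns) answered by respondents (rows)."""
        items = scale_items.dropna()                    # listwise deletion, as in SPSS case processing
        k = items.shape[1]                              # number of items in the scale
        item_variances = items.var(axis=0, ddof=1)      # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Example: passing the seven items of the first scale should yield a value
    # close to the 0.903 reported above.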
Table 10. The Exploratory Factor Analysis components with labels (using SPSS).
# | EFA Component Label
1 | Training
2 | Planning and structured interviews
3 | Bias in the selection process
4 | Interviewer's personality
5 | Panel interview and transparency
Table 11. Thematic analysis: categories (using NVivo).
# | Thematic Analysis (Categories)
1 | Hiring members involvement
2 | Interview process enhancements
3 | Variation of bias in interviews
4 | Interview process problems
5 | Applicant feedback
