Article

Confirmatory Factor Analysis of Performance Measurement Indicators Determining the Uptake of CPS for Facilities Management

1 CIDB Centre of Excellence, Faculty of Engineering and the Built Environment, Doornfontein Campus, University of Johannesburg, Johannesburg 2006, South Africa
2 College of Design, Construction and Planning, University of Florida, Gainesville, FL 32611, USA
3 Department of Construction Management and Quantity Surveying, University of Johannesburg, Johannesburg 2006, South Africa
* Author to whom correspondence should be addressed.
Buildings 2022, 12(4), 466; https://doi.org/10.3390/buildings12040466
Submission received: 6 March 2022 / Revised: 20 March 2022 / Accepted: 22 March 2022 / Published: 11 April 2022
(This article belongs to the Section Construction Management, and Computers & Digitization)

Abstract

With the advanced computational capabilities of cyber-physical systems (CPS), facilities management (FM) mandates can be delivered efficiently and effectively. Since performance measurement is an important yardstick for ascertaining the outcome of FM approaches, this study assesses the performance measurement indicators that influence the uptake of CPS for FM functions. Using a structured questionnaire, data were collected from built environment professionals in the Gauteng province of South Africa. The data collected were analysed using a five-stage process comprising: data reliability and validity, descriptive statistics, establishing differences in groups' opinions, principal component analysis, and model testing and fit statistics for confirmatory factor analysis. Results indicate that three significant performance indicators are influential in determining the uptake of CPS for FM: operations efficiency, facility adaptation, and client satisfaction. The study contributes to the body of knowledge by unveiling the significant performance measurement indicators that would help organisations, facilities managers, and policymakers guide decisions hinged on the espousal of innovative technologies. Furthermore, the study serves as a solid theoretical base for further studies, showcasing a roadmap for the digitalisation of FM functions by unravelling the significant performance measurement indicators.

1. Introduction

According to Atkin and Brooks [1], facilities management (FM) encompasses a broad range of services, including building maintenance, real estate management, domestic services, health and safety, and contract management. Additionally, FM is defined as "the incorporation of multiple disciplines to ensure the functionality of the built environment by integrating people, process and technology" [2] (p. 54), and as "the integration of processes within an organisation in the built environment to maintain and develop the agreed services which support and improve the effectiveness of the organisation's primary activities and management of the impact of these processes upon the workplace" [3] (p. 15). FM is associated with tactical positioning, aiming to balance business concerns with the management of services and technical processes [4,5,6]. It can therefore be said that FM covers a wide spectrum of service solutions, including accessibility, sustainability, safety, hospitality, and productivity [7]. The task of FM during the lifecycle of built-up infrastructure is significant, owing to the strategic value that accompanies its delivery and its contribution to assessing the utilisation of facilities.
The construction industry is still characterised by inefficiencies in its processes resulting from outdated approaches and methods in service delivery [8,9,10]. FM functions are not spared, as they are still plagued with a myriad of challenges that hamper their efficiency and effectiveness. One of the major challenges posed to FM task functions is keeping abreast of innovative technologies, which aid in maintaining accurate data for appropriate planning and decision making [11,12,13]. McKinsey [14] reported that the construction/facilities management industry ranks 21st among 22 industries in the adoption of digital technologies. Furthermore, Ikuabe et al. [15] affirmed that, across the various phases of building projects, digital technologies are least encountered at the operations/maintenance phase. Oladejo [16] noted that inadequacy in the core and fundamental principles of FM, which yields low technical output, is a major challenge faced in FM functions. Hence, the non-alignment of FM tasks with evolving technological innovations is a setback for optimum performance. Islam et al. [17] observed that the evolving development of technologies forces the building industry to introduce efficient upkeep of service systems to attain longevity in the operations and functions of buildings.
The fourth industrial revolution (4IR) has ushered in intensified digital/information technology application, which is evident in different aspects of human activities [18,19,20]. The upsurge in digital technologies has been experienced in sectors such as banking, health and medical, manufacturing, and transportation [21,22,23]. Although still lagging in adopting digital technologies for its processes, the construction/facilities management industry is gradually imbibing their principles to propagate the benefits of their espousal [15]. With the numerous development problems facing the 21st century, as contained in the Sustainable Development Goals (SDGs), no people or profession can efficiently thrive without professional strategies, visions, and aspirations embedded in the technologically driven realm of knowledge [24,25]. Therefore, the espousal of digital technologies in FM activities would aid in abating some of the perennial challenges posed to the effective delivery of its core mandates. One such digital technology is cyber-physical systems (CPS): computing networks whose operation hinges on integrating computational abilities with physical processes. According to Kim and Park [26] and Alguliyev et al. [27], CPS serve as the connection between the virtual and physical realms. The system serves as an enabling platform for resolving real-time problems in the fast-changing technological age [25,28,29]. In addition, Herterich et al. [30] affirmed that CPS offer the prospect of reducing equipment downtime and enhancing operational efficiency. Hence, the integration of CPS into FM activities would aid the seamless delivery of the core objectives of an organisation.
Since FM connects the knowledge of design with the management of the completed facility through tactical implementation towards predetermined goals [31,32], the performance of the adopted digital technology must be evaluated. Performance measurement contributes perceptively to effective control by offering discernment on the need for, and choice of, various control systems. Through the support of measurement functions, a mechanism of performance measurement contributes to the attainment of an organisation's goals. Stakeholders' need for performance measurement can therefore be construed in the context of FM: fundamentally, the contributions of FM would be ascertained over a spectrum of well-defined performance criteria, which could entail maintenance, finance, or economics. The uptake of CPS for FM functions would thus be propagated when the measurement of the system's delivery shows better and enhanced output compared with the conventional systems used for FM mandates. With this in mind, this study is geared towards assessing the performance measurement factors that would necessitate the uptake of CPS for FM. The remaining sections of this paper comprise the review of extant literature, research methodology, findings and discussion, conclusion, and recommendations.

2. Theoretical Background

It is evident that in the present fast-changing world, organisations' performance and success depend hugely on their ability to gather and integrate massive information flows and to act intelligently on that information. According to [33,34], performance is a process or action to accomplish or carry out a function or task. Wong and Snell [35] (p. 54) elaborated further, noting that task performance refers to the ability to efficiently deliver the tasks stipulated in the "job description", recognised as part of the job and demanding the utilisation of a given set of knowledge and skills. Furthermore, Armstrong [36] (p. 240) noted that accomplishing an obligation "following predetermined procedures" is what connotes performance. Ismajli et al. [37] stated that performance is a function of motivation, skill, and knowledge factors: motivation in this regard refers to the "willingness" to accomplish a particular task, skill to the "ability" to engage in a specific task or job, and knowledge to "knowing what to do". The general performance perspective is a fusion of these three factors [33].
Performance measurement offers insightful contributions to efficient control by recognising the need for, and choice of, control mechanisms. Through the support of measurement functions, a system of performance measurement contributes to the organisation's attainment of better goals. In a well-defined context, the need for performance measurement by general management can be construed in the context of FM [38,39]. The contributions of FM would be adjudged by the stakeholders of an organisation over a spectrum of outlined performance criteria, which could include metrics of economics, finance, or maintenance. The contributions of FM to organisations' delivery are viewed in different facets, including service delivery, resource control, culture, strategy, supply chain management, and change management [40]. These all point towards the effectiveness of FM's contribution in organisations and the several ways in which performance can be influenced. Medne and Lapina [41] emphasised the necessity of focusing performance measurement on continuous improvement. In addition, an efficient system of performance measurement ought to provide accurate and timely feedback on operational effectiveness [42]. On this basis, Reference [43] outlined the dimensions for measurement as follows: change management, planning, controlling and evaluating, allocation of resources, measurement and improvement, communication, and motivation.
Goals and objectives are related but distinct concepts. New Mexico City [44] outlines a goal as a broad statement of a programme's intended achievement, while an objective is a defined, measurable condition that must be attained to accomplish a given programme goal. It can be deduced that objectives, being verifiable and finite, serve to target a set goal. Lehigh [45] (p. 1) posits that a goal "is a broad statement about the desired outcome with one or more specific objective(s) that define in precise terms what is to be accomplished within a designated time frame". Despite the abstract nature of goals, they still serve as starting points for management processes in an organisation. The duty of management is only carried out successfully when there is a clear and adequately articulated goal statement [46]. Organisations set goals for several reasons. Goals provide a standard for performance, focus the organisation's activities, and give a clear direction to all stakeholders: what is expected of everyone is known, and efforts are channelled to deliver specific outcomes [47]. Moreover, Cascio [48] affirms that goals focus on the requirement of a specific performance, mobilise efforts towards higher levels of performance, and foster determination to achieve them; they also give a clear direction of the organisation's destination. When goal setting is properly and appropriately executed, it improves both performance behaviours and outcomes [49,50]. Goals influence performance positively through their effect on direction, attention, effort, and persistence, as well as by stimulating strategy development [50].
Moreover, when tasks are synergised with goals, it breeds more interest in stakeholders, leading to more directed attention cognitively and behaviourally [51].
In organisations, exploitation pursues efficiency by making all tasks, activities, and processes predictable and repeatable. By contrast, innovative concepts and ideas are by nature non-routine [52] and involve a certain degree of uncertainty and ambiguity [53]. Therefore, it is advised that planning and organisation be considered differently when initiating innovations, since there are some basic incompatibilities between innovation and efficiency [53]. Furthering this, Govindarajan [52] (p. 14) noted that "while break all the rules is toxic as a leadership mantra, there is some truth in that notion, because there are different rules for innovation". As stated earlier, goal setting with specific and challenging goals is strongly recommended, since such goals deliver better task performance than goals that are easy, ambiguous, or vague, usually referred to as "do your best" goals [54]. Moreover, when putting a goal-setting plan into action, the stipulated criteria should detail the goal's specificity, measurability, attainability, relevance, and time frame [55,56]. However, some schools of thought hold that some of these goal attributes may not be beneficial under certain conditions, such as the search for innovation and creativity [57,58].
For non-operational goals, the primary objective is the legitimisation of the organisation; such goals are ambiguous and abstract, designed to encourage stakeholders to embrace the organisation's values and culture [59]. For operational goals, the primary objective is to aid the measurement of the organisation's effectiveness and performance [60]. Managers deploy distal and proximal goals when implementing goal-setting strategies [61]. Proximal goals are short-term, benchmark goals, while distal goals are longer-term, outcome goals [62]; distal goals are usually fragmented into proximal goals for clarity and response throughout the attainment of the long-term objective [62]. It is important to note that the eventual outcome in the pursuit of a particular goal will affect all of the organisation's stakeholders [49]; hence, the strategic plan put into action to achieve a stipulated goal should be inclusive of the organisation's various stakeholders. According to Grünig and Kühn [63], strategies can be designed and implemented at functional, business, and corporate levels. For proper effectiveness, the organisation's objectives and the adopted strategies must be aligned. In addition, Elbanna et al. [64] noted that the success of the adopted strategies is anchored on the accomplishment of organisational objectives.

3. Methodology

The study aims to assess the performance measurement factors that would influence the espousal of CPS for FM. It adopted a deductive approach premised on a post-positivist philosophical perspective, using quantitative data elicited from built environment professionals through a questionnaire survey. Tan [65] noted that a questionnaire helps gather information from a large number of respondents and permits the quantifiability and objectivity of the research; hence the choice of a questionnaire survey for the study. The study area was the Gauteng province of South Africa, which is characterised by a wide range and vast number of built-up facilities and a large pool of construction/facilities management professionals. Snowball sampling was adopted, owing to the difficulty of reaching built environment professionals knowledgeable in digital technologies for FM. This sampling technique tends to increase the sample size [66]; 218 respondents partook in the survey. The instrument for data collection had two sections: the first enquired about the respondents' background information, while the second elicited responses on the performance measurement factors that would influence the uptake of CPS for FM. The researchers distributed the questionnaire electronically, which aided rapid distribution and collection of responses. Percentages were used to analyse the respondents' background information. In addition, the reliability and validity of the research instrument were ascertained using Cronbach's alpha; a value of 0.899 was obtained, indicating high reliability since the value is close to 1.00 [67]. The normality of the sampled data was assessed using multivariate kurtosis, which indicated non-normality of the sampled data.
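As an illustration of the reliability check described above, Cronbach's alpha can be computed directly from an item-response matrix. The sketch below uses simulated Likert-scale data (not the survey's actual responses) with 218 respondents and 16 items, mirroring the study's dimensions:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) Likert-scale matrix."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Illustrative data: a shared component makes the items correlate,
# so alpha should come out high, as in the study's 0.899.
rng = np.random.default_rng(42)
base = rng.integers(1, 6, size=(218, 1))
noise = rng.integers(-1, 2, size=(218, 16))
responses = np.clip(base + noise, 1, 5)
print(round(cronbach_alpha(responses.astype(float)), 3))
```

With perfectly correlated items the formula returns exactly 1.0, which is a useful sanity check on any implementation.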
A total of sixteen factors were identified from the literature review and initially assessed using the mean item score (MIS), as shown in Figure 1. In addition, the Kruskal–Wallis H-test was used to determine whether there is a significant difference in responses across the respondents' working affiliations. Furthermore, exploratory factor analysis (EFA) was deployed to evaluate the unidimensionality and factorability of the identified factors using SPSS version 27; EFA is used to examine the measurement structure of a group of variables [68]. Following the outcome of the EFA, a confirmatory factor analysis (CFA) was conducted to assess the measurement equivalency of the performance measurement factors influencing the uptake of CPS for FM, using EQS version 6.4. A multifaceted approach was adopted for the model assessment, with the following indexes: Satorra–Bentler scaled chi-square (S−Bχ2), Bentler comparative fit index (CFI), goodness-of-fit index (GFI), standardised root mean square residual (SRMR), root mean square error of approximation (RMSEA), and RMSEA with a 90% or 95% confidence interval. These indexes provide a robust assessment of the fit of the identified factors to the data sample, presenting both comparative (incremental) fit indexes and the absolute fit index [69,70]. The analysis also yielded values for internal consistency, z-statistics, and construct validity. Cronbach's alpha and the Rho coefficient were employed to assess the internal reliability and consistency of the measurement constructs [69,71]; there is a strong consensus on using both measures to establish the internal consistency and validity of questionnaire items [72,73].

4. Results

4.1. Background Information of Respondents

The respondents' demographic details showed that, by professional designation, 26.1% of respondents were quantity surveyors, 21.6% facilities managers, and 17.9% construction managers, while project managers and engineers made up 11% and 10.6%, respectively, and architects and computer programmers 7.8% and 5%. Concerning years of professional experience, respondents with 6–10 years accounted for 30.7% of the total, those with 1–5 years made up 24.8%, and those with 11–15 years and 16–20 years formed 19.3% and 10.6%, respectively. By highest educational qualification, 41.7% held an honours degree, 27.1% a bachelor's degree, 19.7% a master's degree, and 8.3% a doctoral degree. By working organisation, 34.4% were affiliated with consultancy firms, 44% worked with contracting organisations, and 21.6% with government establishments.

4.2. Descriptive Statistics and Kruskal–Wallis H-Test

Table 1 outlines the results of the mean ranking and K–W test conducted on the measurement variables for the performance measurement construct. The highest-ranked variables are built-in capability of facility adaptation (MIS = 4.39), significance of quality assurance (MIS = 4.30), attainment of customers' specific needs (MIS = 4.29), timely communication of policy changes (MIS = 4.27), and improvement of the organisation's internal processes (MIS = 4.26). The lowest-ranked variables are anticipation of the attainment of future needs of the organisation (MIS = 4.17), identification of problems in facilities (MIS = 4.16), and improved customer satisfaction (MIS = 4.14). Among the individual working-organisation groups, respondents affiliated with government establishments rated highest: built-in capability of facility adaptation, evaluation of existing trends, and attainment of customers' specific needs; respondents in contracting organisations rated highest: built-in capability of facility adaptation, significance of quality assurance, and significance of cost savings; and those in consulting firms rated highest: built-in capability of facility adaptation, attainment of customers' specific needs, and significance of quality assurance. The overall group mean for all respondents is 4.24, while the group means for respondents in government establishments, contracting organisations, and consulting firms are 4.26, 4.20, and 4.27, respectively. The K–W test shows that 2 of the 16 variables have a p-value < 0.05, indicating a significant difference in respondents' opinions on these variables across working organisations: improvement of the organisation's internal processes and evaluation of existing trends, with p-values of 0.046 and 0.004, respectively. Moreover, the Cronbach alpha value for the measurement variables was 0.899, portraying good reliability of the research instrument and internal consistency of the measurement variables.
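The Kruskal–Wallis H-test reported in Table 1 can be reproduced in outline with `scipy.stats.kruskal`. The ratings below are hypothetical, standing in for one variable's Likert scores across the three working-organisation groups (they are not the survey's data):

```python
from scipy.stats import kruskal

# Hypothetical ratings of one variable from three working-organisation groups.
government  = [5, 4, 4, 5, 3, 4, 5, 4]
contracting = [3, 4, 3, 3, 4, 3, 2, 3]
consulting  = [4, 5, 4, 4, 5, 4, 4, 5]

h_stat, p_value = kruskal(government, contracting, consulting)
if p_value < 0.05:
    print(f"H = {h_stat:.3f}, p = {p_value:.4f}: opinions differ by organisation")
else:
    print(f"H = {h_stat:.3f}, p = {p_value:.4f}: no significant difference")
```

The test works on ranks rather than raw values, which is why the study could apply it despite the non-normality of the sampled data.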

4.3. Exploratory Factor Analysis

Exploratory factor analysis (EFA) was conducted on the identified performance measurement factors that influence the adoption of CPS for FM. Table 2 outlines a KMO value of 0.926, above the 0.6 threshold set by the study. In addition, the Bartlett test of sphericity gave a value of 1087.303 with a p-value of 0.000, which is significant. These results affirm the factorability of the data and its suitability for EFA. Furthermore, the correlation matrix of the output was inspected to ascertain the suitability of the data for analysis; most of the variables had a value ≥ 0.3, which upholds the suitability of the dataset. Moreover, the Cronbach alpha value of 0.899 was obtained, as reported in the preceding section.
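The KMO measure and Bartlett's test of sphericity reported in Table 2 can both be derived from the correlation matrix of the responses. The sketch below is a from-scratch implementation on simulated data (the study itself used SPSS); the factor-model data generation is purely illustrative:

```python
import numpy as np

def kmo_statistic(X: np.ndarray) -> float:
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    inv_R = np.linalg.inv(R)
    d = np.sqrt(np.diag(inv_R))
    partial = -inv_R / np.outer(d, d)      # partial (anti-image) correlations
    np.fill_diagonal(partial, 0.0)
    r = R.copy()
    np.fill_diagonal(r, 0.0)
    return float((r**2).sum() / ((r**2).sum() + (partial**2).sum()))

def bartlett_sphericity(X: np.ndarray):
    """Bartlett's test of sphericity: chi-square statistic and degrees of freedom."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    return chi2, p * (p - 1) // 2

# Simulated 218 x 16 response matrix with a common-factor structure
# (illustrative only -- not the survey data).
rng = np.random.default_rng(1)
scores = rng.normal(size=(218, 3)) @ rng.uniform(0.4, 0.9, size=(3, 16))
X = scores + 0.5 * rng.normal(size=(218, 16))
print(round(kmo_statistic(X), 3), bartlett_sphericity(X)[1])
```

KMO close to 1 indicates that partial correlations are small relative to raw correlations, i.e. the variables share enough common variance for factor analysis.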
Table 3 outlines the extracted communalities of the measurement variables. All the variables had a value above 0.5, the threshold set for the study, indicating that all the variables explain much of the variance. Furthermore, using PCA with varimax rotation, three components with an eigenvalue greater than 1 were extracted from the measurement variables. The total cumulative variance was 73.5%, above the 50% threshold set for this study.
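The extraction step above can be sketched as follows: the eigenvalues of the correlation matrix are ranked, components with eigenvalues above 1 are retained (the Kaiser criterion), and cumulative explained variance is read off. The data below are simulated with a three-factor structure for illustration only, not the survey's responses:

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated responses driven by three underlying factors plus noise.
factors = rng.normal(size=(218, 3))
loadings = rng.uniform(0.5, 0.9, size=(3, 16))
X = factors @ loadings + 0.6 * rng.normal(size=(218, 16))

R = np.corrcoef(X, rowvar=False)
eigenvalues = np.linalg.eigvalsh(R)[::-1]          # descending order
n_retained = int((eigenvalues > 1.0).sum())        # Kaiser criterion
cum_variance = eigenvalues.cumsum() / eigenvalues.sum()
print(n_retained, round(100 * cum_variance[n_retained - 1], 1))
```

Since the trace of a correlation matrix equals the number of variables, the eigenvalues always sum to 16 here, which is why "eigenvalue greater than 1" means "explains more than one variable's worth of variance".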
Table 4 shows the rotated component matrix of the measurement variables for performance measurement. The factor loadings of all variables are above 0.4, the threshold set for the study. In conjunction with the extracted communalities, these results show that the variables within each component relate well to one another. Three components were extracted: the first, with factor loadings ranging from 0.842 to 0.510, is named operations efficiency; the second, with loadings ranging from 0.764 to 0.514, is named facility adaptation; and the third, with loadings ranging from 0.787 to 0.513, is named client satisfaction.
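For readers unfamiliar with the rotation step, a minimal implementation of varimax is sketched below (the standard SVD-based algorithm; the study used SPSS, and the loading matrix here is made up for illustration). A useful sanity check is that an orthogonal rotation leaves each variable's communality unchanged:

```python
import numpy as np

def varimax(L: np.ndarray, n_iter: int = 100, tol: float = 1e-8) -> np.ndarray:
    """Orthogonal varimax rotation of a loading matrix L (variables x factors)."""
    p, k = L.shape
    R = np.eye(k)
    total = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        # Gradient-like matrix of the varimax criterion, rotated via SVD.
        u, s, vt = np.linalg.svd(
            L.T @ (Lr**3 - Lr @ np.diag((Lr**2).sum(axis=0)) / p)
        )
        R = u @ vt
        if s.sum() - total < tol:
            break
        total = s.sum()
    return L @ R

# Hypothetical unrotated loadings: two blocks of variables, two factors.
L = np.array([[0.8, 0.1], [0.7, 0.2], [0.2, 0.9], [0.1, 0.6]])
L_rot = varimax(L)
print(np.round(L_rot, 3))
```

Varimax drives each variable's loadings towards one dominant factor, which is what makes component naming (operations efficiency, facility adaptation, client satisfaction) tractable.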

4.4. Confirmatory Factor Analysis

A confirmatory factor analysis (CFA) was conducted to affirm the validity of the factor groups resulting from the EFA. The study used robust maximum likelihood (RML) estimation, since the data were shown to be non-normal; this estimation method caters for the non-normality of the dataset [72]. Table 5 shows the resulting groups of performance measurement indicators that would influence the uptake of CPS for FM functions. It portrays the standardised coefficient (λ), which indicates construct validity, the z-statistics, and the coefficient of determination (R2). In addition, the internal consistency of the model was ascertained with the Cronbach alpha and Rho alpha tests; using both tests to determine model reliability is encouraged for a more robust assessment of internal consistency [74]. As indicated in Table 5, the standardised coefficients of all the factors range from 0.686 to 0.883, portraying good construct validity, since all the variables explain above 50% (a value above 0.5) of the total variance of the model. Furthermore, the Cronbach alpha and Rho alpha tests gave coefficients of 0.899 and 0.786, respectively, indicating good reliability, since both values are above the 0.7 threshold [75]. Moreover, the z-statistics shown in Table 5 are all above 1.96, portraying the significant influence of the identified performance measurement indicators. In addition, the R2, which further indicates the significance of the identified factors, shows values close to 1.0 for all indicators. The R2 establishes the predictive accuracy of the identified factors; predictive accuracy is adjudged substantial, moderate, and weak at values of 0.75, 0.50, and 0.25, respectively [76,77]. Since the R2 values range from 0.649 to 0.884, the identified performance measurement indicators are affirmed to have substantial predictive accuracy. The formed groups indicate predictive accuracies of 0.673, 0.715, and 0.698 for operations efficiency, facility adaptation, and client satisfaction, respectively.
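The reliability and validity quantities in Table 5 are linked by simple formulas: in a fully standardised CFA solution an indicator's R2 equals its squared loading, and a composite-reliability (Rho) coefficient can be computed from the loadings alone. The loadings below are hypothetical values chosen within the reported 0.686–0.883 range, not the study's actual estimates:

```python
import numpy as np

def composite_reliability(loadings: np.ndarray) -> float:
    """Composite reliability (Rho) for standardised loadings of one construct."""
    s = loadings.sum()
    error_var = (1 - loadings**2).sum()    # residual variance per indicator
    return s**2 / (s**2 + error_var)

# Hypothetical standardised loadings for one construct.
lam = np.array([0.686, 0.74, 0.81, 0.883])
r2 = lam**2                                # indicator R^2 in a standardised solution
rho = composite_reliability(lam)
print(np.round(r2, 3), round(rho, 3))
```

This shows why loadings above roughly 0.7 are desirable: a loading of 0.7 already implies the construct explains about half of the indicator's variance.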
Table 6 outlines the fit indexes set for the analysis and the estimates derived from the data. As shown, the CFI and GFI had values of 0.996 and 0.985, respectively. Bentler [71] and Iacobucci [78] recommended that CFI and GFI values greater than or equal to 0.95 indicate good fit, while values greater than or equal to 0.90 can be adjudged acceptable fit; hence, the values obtained meet the good-fit requirement. Furthermore, the SRMR and RMSEA had values of 0.030 and 0.024, respectively. Bentler [71] noted that SRMR and RMSEA values less than or equal to 0.05 are considered good fit, while those less than 0.08 are considered acceptable fit; on this basis, the SRMR and RMSEA values also met the good-fit criteria. In addition, the sample data for the model gave an S−Bχ2 of 8.593 with 2 degrees of freedom and an accompanying p-value of 0.00. According to Zhong et al. [79], the chi-square is largely inadequate and less reliable owing to its high sensitivity to sample size and data normality; it is therefore recommended that a normed chi-square be adopted [80], derived by dividing the chi-square by its degrees of freedom. The normed chi-square obtained was 4.297. Byrne [69] recommended a normed chi-square value ranging from 3.00 to 5.00 as a good fit, affirming the adequacy of the hypothesised model.
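The fit assessment above can be summarised as a simple threshold check over the estimates reported in Table 6, using the cut-offs the paper cites (CFI/GFI ≥ 0.95 good; SRMR/RMSEA ≤ 0.05 good; normed chi-square between 3.00 and 5.00 per Byrne):

```python
# Fit estimates as reported in Table 6.
sb_chi2, dof = 8.593, 2
cfi, gfi, srmr, rmsea = 0.996, 0.985, 0.030, 0.024

normed_chi2 = sb_chi2 / dof     # normed chi-square = chi-square / degrees of freedom
checks = {
    "CFI": cfi >= 0.95,
    "GFI": gfi >= 0.95,
    "SRMR": srmr <= 0.05,
    "RMSEA": rmsea <= 0.05,
    "normed chi2": 3.00 <= normed_chi2 <= 5.00,
}
print(round(normed_chi2, 3), all(checks.values()))
```

All five checks pass for the reported values, which is the basis for the paper's conclusion that the hypothesised model fits adequately.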

5. Discussion

The findings of the study showed that the performance indicators influential to the uptake of a revolutionary digital technology such as CPS for the delivery of FM mandates are operations efficiency, facility adaptation, and client satisfaction. The R2 values from the analysis indicate that facility adaptation is the most decisive performance measurement indicator in the espousal of CPS for FM functions, followed by client satisfaction and operations efficiency. This finding is supported by goal-setting theory, which affirms that persons or organisations given exhaustive, challenging, but realistic goals deliver better than those whose mandates are non-specific, easy, and without properly defined goals [60,63,81,82]. Likewise, these persons and organisations must possess the required capability, set outlined goals, and establish mechanisms for performance feedback [55]. This is premised on the notion that performance can be improved when there is an effort to strive towards a defined goal [83]. Ascertaining and measuring the delivery of an innovative system is an indicator for determining the use of such a system; for facilities management, one such yardstick is the satisfaction users derive from the facility. Measuring the satisfaction derived from utilising a system would serve as a propelling factor in using it for the outlined tasks [84,85,86,87]. If the services and operations derived from infusing cyber-physical systems clearly represent an upscale from the previously utilised conventional approaches, then the espousal of the system would be encouraged. Measuring the improvement obtained from the applied innovative technology thus accelerates the system's adoption.
Similarly, when there is a significant improvement in a facility's standard resulting from the use of technological innovation, acceptance of such a system is likely. The complexities involved in facilities management encourage methods and approaches that seek to improve facilities for customers' use. Therefore, innovative technologies such as cyber-physical systems, advantageous for their high computational capabilities, would help improve the facility, giving credence to their espousal. Furthermore, the time savings derived from deploying non-conventional methods of facilities management would be a propelling measure for their usage; this is supported by Legris et al. [88], who identified the reduction of time in delivering tasks and activities as a significant indicator for measuring the performance of a system. Against this backdrop, it is imperative to note that the satisfaction derived by the end-users of facilities whose functional mandates are enhanced by digital technologies such as CPS is a significant driver of the uptake of such systems.

6. Conclusions

The study evaluated the influence of performance measurement indicators on the uptake of cyber-physical systems for facilities management. Through a review of extant literature, sixteen variables were identified and subsequently grouped into three distinct groups through exploratory factor analysis. Confirmatory factor analysis affirmed these findings with good model fit indices, construct validity, and high reliability, indicating that the major performance indicators influential in the espousal of CPS for FM are operations efficiency, facility adaptation, and client satisfaction. These findings provide vivid insight into the expected performance output, and its subsequent measurement, that would propel the espousal of emerging digital technologies such as CPS for FM functions. Hence, the practical implication of the study's findings lies in equipping policymakers and facility managers with the requisite knowledge of the performance measurement indicators that make adopting the system for FM more beneficial than the conventional systems currently used.
In addition, the study contributes theoretically to the body of knowledge on adopting technological innovations for FM practices. In the current era of the fourth industrial revolution, efforts are being made to propagate the benefits that accrue from various emerging digital technologies; hence, this study helps showcase a roadmap that clearly stipulates one of the drivers of the uptake of digitalising FM processes. Moreover, the study's findings provide a good theoretical base for further studies focusing on enhancing FM functions through the digitalisation of its processes. However, it is important to note that the study was conducted in the Gauteng province of South Africa; therefore, care must be taken not to generalise its findings. Future studies can be conducted in other provinces for a more robust and generalisable outcome.

Author Contributions

Conceptualisation, C.A. (Clinton Aigbavboa), C.A. (Chimay Anumba), A.O. and M.I.; methodology, M.I. and L.A.; writing—original draft preparation, M.I.; writing—review and editing, M.I., C.A. (Clinton Aigbavboa), C.A. (Chimay Anumba), A.O. and L.A.; supervision, C.A. (Clinton Aigbavboa), C.A. (Chimay Anumba), and A.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Atkin, B.; Brooks, A. Total Facility Management, 4th ed.; Wiley-Blackwell: Hoboken, NJ, USA, 2015. [Google Scholar]
  2. International Facilities Management Association. Definition of Facility Management. 2016. Available online: www.ifma.org (accessed on 17 September 2021).
  3. British Institute of Facilities Management. Facilities Management. 2011. Available online: www.cips.org (accessed on 22 September 2021).
  4. Jawdeh, H.B. Improving the Integration of Building Design and Facilities Management; University of Salford: Salford, UK, 2013. [Google Scholar]
  5. Kamaruzzaman, S.N.; Zawawi, E.M.A. Development of facilities management in Malaysia. J. Facil. Manag. 2010, 8, 75–81. [Google Scholar] [CrossRef] [Green Version]
  6. Isa, N.M.; Kamaruzzaman, S.N.; Mohamed, O.; Berawi, M.A. Review of Facilities Management Functions in Value Management Practices. Int. J. Technol. 2017, 8, 830. [Google Scholar] [CrossRef] [Green Version]
  7. Kok, H.B.; Mobach, M.P.; Omta, O.S. The added value of facility management in the educational environment. J. Facil. Manag. 2011, 9, 249–265. [Google Scholar] [CrossRef]
  8. Aghimien, D.; Aigbavboa, C.; Oke, A. Digitalisation for Effective Construction Project Delivery in South Africa. In Proceedings of the Contemporary Construction Conference, Innovative and Dynamic Built Environment, Coventry, UK, 5–6 July 2018. [Google Scholar]
  9. Ikuabe, M.O.; Aghimien, D.O.; Aigbavboa, C.O.; Oke, A.E. Inhibiting Factors to the Adoption of Digital Technologies in the South African Construction Industry. In Proceedings of the 5th Research Conference of the NIQS (RECON5), Minna, Nigeria, 9–10 November 2020; Nigerian Institute of Quantity of Surveyors: Lagos, Nigeria, 2020; pp. 455–461. [Google Scholar]
  10. Windapo, A. The construction industry transformation and the digital divide: Bridging the gap. S. Afr. J. Sci. 2021, 117, 1–4. [Google Scholar] [CrossRef]
  11. Ensafi, M.; Thabet, W. Challenges and Gaps in Facilities Management Practices. In Proceedings of the Annual Associated Schools of Construction International Conference, Long Beach, CA, USA, 5–9 April 2021; pp. 237–245. [Google Scholar]
  12. Hudson, M. Facility Management for Universities. In Proceedings of the Conference in Hong Kong on New World Order in Facility Management, Hong Kong, China, 3–5 June 2004. [Google Scholar]
  13. Nidhi, M.; Ali, S. The Challenges and Benefits of Facility Management. J. All Res. Educ. Sci. Methods 2020, 8, 248–253. [Google Scholar]
  14. McKinsey Global Institute. A Route to Higher Productivity. Reinventing Construction; McKinsey Global Institute: Washington, DC, USA, 2017. [Google Scholar]
  15. Ikuabe, M.O.; Aghimien, D.O.; Aigbavboa, C.O.; Oke, A.E. Exploring the Adoption of Digital Technology at the Different Phases of Construction Projects in South Africa. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Dubai, United Arab Emirates, 10–12 March 2020; pp. 1553–1561. [Google Scholar]
  16. Oladejo, E.I. Evaluation of Challenges of Facilities Management in Tertiary Healthcare Institutions in South East Nigeria. Ph.D. Thesis, Nnamdi Azikiwe University Awka, Awka, Anambra State, Nigeria, 2014. [Google Scholar]
  17. Islam, R.; Hassan, T.; Sarajul, N.; Mohamed, F. Factors Influencing Facilities Management Cost Performance in Building Projects. J. Perform. Constr. Facil. 2019, 33, 04019036. [Google Scholar] [CrossRef]
  18. Hoosain, M.S.; Paul, B.S.; Ramakrishna, S. The Impact of 4IR Digital Technologies and Circular Thinking on the United Nations Sustainable Development Goals. Sustainability 2020, 12, 10143. [Google Scholar] [CrossRef]
  19. Kayembe, C.; Nel, D. Challenges and Opportunities for Education in the Fourth Industrial Revolution. Afr. J. Public Aff. 2019, 11, 79–94. [Google Scholar]
  20. Sima, V.; Gheorghe, I.G.; Subić, J.; Nancu, D. Influences of the Industry 4.0 Revolution on the Human Capital Development and Consumer Behavior: A Systematic Review. Sustainability 2020, 12, 4035. [Google Scholar] [CrossRef]
  21. Ajibade, P.; Mutula, S.M. Big Data, 4IR and Electronic Banking and Banking Systems Applications in South Africa and Nigeria. Banks Bank Syst. 2020, 15, 187–199. [Google Scholar] [CrossRef]
  22. Lee, J.-Y.; Lim, J.-Y. The Prospect of the Fourth Industrial Revolution and Home Healthcare in Super-Aged Society. Ann. Geriatr. Med. Res. 2017, 21, 95–100. [Google Scholar] [CrossRef] [Green Version]
  23. Serumaga-Zake, J.M.; van der Poll, J.A. Addressing the Impact of Fourth Industrial Revolution on South African Manufacturing Small and Medium Enterprises (SMEs). Sustainability 2021, 13, 11703. [Google Scholar] [CrossRef]
  24. Aghimien, D.O.; Aigbavboa, C.O.; Oke, A.E. Critical Success Factors for Digital Partnering of Construction Organisations—A Delphi Study. Eng. Constr. Archit. Manag. 2020, 27, 3171–3188. [Google Scholar] [CrossRef]
  25. Ikuabe, M.O.; Aigbavboa, C.; Oke, A. Cyber-Physical Systems: Matching up Its Application in the Construction Industry and Other Selected Industries. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Dubai, United Arab Emirates, 10–12 March 2020; pp. 1543–1552. [Google Scholar]
  26. Kim, K.; Park, K. An Overview and Some Challenges in CPS. J. Indian Inst. Sci. 2013, 93, 1–8. [Google Scholar]
  27. Alguliyev, R.; Imamverdiyev, Y.; Sukhostat, L. Cyber-Physical Systems and Their Security Issues. Comput. Ind. 2018, 100, 212–223. [Google Scholar] [CrossRef]
  28. Bhrugubanda, M. A Review on Applications of Cyber Physical Systems. J. Innov. Sci. Eng. Technol. 2015, 2, 728–730. [Google Scholar]
  29. Taymanov, R.; Sapozhnikova, K.; Ionov, A. Topical Metrology Problems in the Era of Cyber-Physical Systems and Internet of Things. In Proceedings of the 18th International Congress of Metrology, Paris, France, 19–21 September 2017; Corletto, C., Ed.; EDP Sciences: Les Ulis, France, 2017; p. 09006. [Google Scholar] [CrossRef] [Green Version]
  30. Herterich, M.; Uebernickel, F.; Brenner, W. The Impact of Cyber-Physical Systems on Industrial Services in Manufacturing. Procedia CIRP 2015, 30, 323–328. [Google Scholar] [CrossRef] [Green Version]
  31. Wang, Y.; Wang, X.; Wang, J.; Yung, P.; Jun, G. Engagement of Facilities Management in Design Stage through BIM: Framework and a Case Study. Adv. Civ. Eng. 2013, 2013, 189105. [Google Scholar] [CrossRef] [Green Version]
  32. McCarroll, P. The House That Facilities Management Built: A Story of Identity and Metaphor in a Secondary Service. Cult. Organ. 2017, 12, 291–305. [Google Scholar] [CrossRef]
  33. Dubnick, M. Accountability and the Promise of Performance: In Search of the Mechanisms. Public Perform. Manag. Rev. 2005, 28, 376–417. [Google Scholar]
  34. Marques-Quinteiro, P.; Curral, L.; Passos, A.M.; Lewis, K. And Now What Do We Do? The Role of Transactive Memory Systems and Task Coordination in Action Teams. Group Dyn. Theory Res. Pract. 2013, 17, 194–206. [Google Scholar] [CrossRef]
  35. Wong, Y.L.; Snell, R. Employee Workplace Effectiveness: Implications for Performance Management Practices and Research. J. Gen. Manag. 2003, 29, 53–69. [Google Scholar] [CrossRef]
  36. Armstrong, M.; Murlis, H. Reward Management: A Handbook of Remuneration Strategy and Practice, 1st ed.; Hay Group: London, UK, 2000. [Google Scholar]
  37. Ismajli, N.; Zekeri, J.; Qosja, E.; Krasniqi, I. The Performance of Motivation Factors on Employee Performance in Kosovo Municipalities. J. Public Adm. Gov. 2015, 5, 23–39. [Google Scholar]
  38. Amaratunga, D.; Haigh, R.; Sarshar, M.; Baldry, D. Assessment of Facilities Management Process Capability: A NHS Facilities Case Study. Int. J. Health Care Qual. Assur. 2002, 15, 277–288. [Google Scholar] [CrossRef]
  39. Amaratunga, D.; Baldry, D. A Conceptual Framework to Measure Facilities Management Performance. Prop. Manag. 2003, 21, 171–189. [Google Scholar] [CrossRef]
  40. Amos, D.; Musa, Z.N.; Au-Yong, C.P. A Review of Facilities Management Performance Measurement. Prop. Manag. 2019, 37, 490–511. [Google Scholar] [CrossRef]
  41. Medne, A.; Lapina, I. Sustainability and Continuous Improvement of Organisation: Review of Process-Oriented Performance Indicators. J. Open Innov. Technol. Mark. Complex. 2019, 5, 49. [Google Scholar] [CrossRef] [Green Version]
  42. Lampe, H.W.; Hilgers, D. Trajectories of efficiency measurement: A bibliometric analysis of DEA and SFA. Eur. J. Oper. Res. 2015, 240, 1–21. [Google Scholar] [CrossRef]
  43. Koleoso, H.A.; Omirin, M.M.; Adewunmi, Y.A. Performance measurement scale for facilities management service in Lagos-Nigeria. J. Facil. Manag. 2017, 15, 128–152. [Google Scholar] [CrossRef]
  44. New Mexico State University. The Difference between a Goal and an Objective; New Mexico State University: Las Cruces, NM, USA, 2013. [Google Scholar]
  45. Lehigh, J. Setting Goals: Key Accountability or Goal...What’s the Difference? 2013. Available online: www.hr.lehigh.edu (accessed on 4 March 2021).
  46. Naidu, A.; Joubert, R.; Mestry, R.; Mosoge, J.; Ngcobo, T. Educational Management and Leadership: A South African Perspective; Oxford University Press: Cape Town, South Africa, 2008. [Google Scholar]
  47. Rundansky-Kloppers, S. Planning. In Principles of Business Management; Strydom, J., Ed.; Oxford University Press: Cape Town, South Africa, 2009; pp. 62–73. [Google Scholar]
  48. Cascio, W. Managing Human Resources; Irwin/McGraw-Hill: New York, NY, USA, 2006. [Google Scholar]
  49. Locke, E. Linking Goals to Monetary Incentives. Acad. Manag. Exec. 2004, 18, 130–133. [Google Scholar] [CrossRef]
  50. Locke, E.; Latham, G. Building a Practically Useful Theory of Goal Setting and Task Motivation: A 35-Year Odyssey. Am. Psychol. 2002, 57, 705. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  51. Shah, J.Y.; Friedman, R.; Kruglanski, A.W. Forgetting All Else: On the Antecedents and Consequences of Goal Shielding. J. Personal. Soc. Psychol. 2002, 83, 1261. [Google Scholar] [CrossRef]
  52. Govindarajan, V.; Trimble, C. The Other Side of Innovation: Solving the Execution Challenge; Harvard Business Press: Boston, MA, USA, 2010. [Google Scholar]
  53. Sicotte, H.; Langley, A. Integration Mechanisms and R&D Project Performance. J. Eng. Technol. Manag. 2000, 17, 1–37. [Google Scholar]
  54. Heslin, P.; Carson, J.; VandeWalle, D. Practical Applications of Goal Setting Theory to Performance Management. In Performance Management: Putting Research into Practice; Smither, J., Ed.; Jossey Bass: San Francisco, CA, USA, 2008; pp. 1–5. [Google Scholar]
  55. Latham, G. Goal Setting: A Five-Step Approach to Behavior Change. Organ. Dyn. 2003, 32, 309–318. [Google Scholar] [CrossRef]
  56. MacLeod, L. Making SMART Goals Smarter. Physician Exec. 2012, 38, 68–72. [Google Scholar] [PubMed]
  57. Lund, S.K. Innovation under Pressure: Reclaiming the Micro-Level Exploration Space. Ph.D. Thesis, KTH Royal Institute of Technology, Stockholm, Sweden, 2015. [Google Scholar]
  58. Prather, C.W. The Dumb Thing about SMART Goals for Innovation. Res. Technol. Manag. 2005, 48, 14. [Google Scholar] [CrossRef]
  59. Vancouver, J.; Milsap, R.E.; Peters, P. Multilevel Analysis of Organisational Goal Congruence. J. Appl. Psychol. 1994, 79, 666–679. [Google Scholar] [CrossRef]
  60. Young, G.; Smith, K.G. Units, Divisions, and Organisations: Macro-Level Goal Setting. In New Developments in Goals Setting and Task Performance; Locke, E., Latham, G., Eds.; Routledge: New York, NY, USA, 2013; pp. 311–327. [Google Scholar]
  61. Stock, J.; Cervone, D. Proximal Goal-Setting and Self-Regulatory Processes. Cogn. Ther. Res. 1990, 14, 483–498. [Google Scholar] [CrossRef]
  62. Brown, T.C.; Warren, A.M. Distal Goal and Proximal Goal Transfer of Training Interventions in an Executive Education Program. Hum. Resour. Dev. Q. 2009, 20, 265–284. [Google Scholar] [CrossRef]
  63. Grünig, R.; Kühn, R. The Strategy Planning Process: Analyses, Options, Projects; Springer: Berlin/Heidelberg, Germany, 2015. [Google Scholar]
  64. Elbanna, S.; Andrews, R.; Pollanen, R. Strategic Planning and Implementation Success in Public Service Organisations: Evidence from Canada. Public Manag. Rev. 2016, 18, 1017–1042. [Google Scholar] [CrossRef]
  65. Tan, W.C. Practical Research Methods; Pearson Custom: Singapore, 2011. [Google Scholar]
  66. Atkinson, R.; Flint, J. Accessing Hidden and Hard-to-Reach Populations: Snowball Research Strategies. Soc. Res. Update 2001, 33, 1–4. [Google Scholar]
  67. Pallant, J. SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS for Windows (Version 12), 2nd ed.; Allen and Unwin: Crows Nest, Australia, 2005. [Google Scholar]
  68. Abbasi, M. Culture, Demography and Individuals’ Technology Acceptance Behaviour: A PLS Based Structural Evaluation of an Extended Model of Technology Acceptance in South-Asian Country Context. Ph.D. Thesis, Brunel Business School, Brunel University, Uxbridge, UK, 2011. [Google Scholar]
  69. Byrne, B. Structural Equation Modelling with EQS-Basic Concepts, Applications and Programming; Lawrence Erlbaum Associates: New York, NY, USA, 2006. [Google Scholar]
  70. Kaplan, D. Structural Equation Modelling: Foundations and Extensions, 2nd ed.; Sage: Riverside County, CA, USA, 2009. [Google Scholar]
  71. Bentler, P. EQS 6 Structural Equation Program Manual; Encino: Los Angeles, CA, USA, 2005. [Google Scholar]
  72. Field, A. Discovering Statistics Using SPSS for Windows, 3rd ed.; Sage Publications: London, UK, 2009. [Google Scholar]
  73. Hair, J.; Hult, G.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage Publication: London, UK, 2013. [Google Scholar]
  74. Cepeda-Carrion, G.; Cegarra-Navarro, J.-G.; Cillo, V. Tips to Use Partial Least Squares Structural Equation Modelling (PLS-SEM) in Knowledge Management. J. Knowl. Manag. 2019, 23, 67–89. [Google Scholar] [CrossRef]
  75. Hair, J.; Risher, J.; Sarstedt, M.; Ringle, C. When to Use and How to Report the Results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  76. Aghimien, L.M.; Aigbavboa, C.O.; Anumba, C.J.; Thwala, W.D. A Confirmatory Factor Analysis of the Challenges of Effective Management of Construction Workforce in South Africa. J. Eng. Des. Technol. 2021. [Google Scholar] [CrossRef]
  77. Henseler, J.; Ringle, C.M.; Sinkovics, R.R. The Use of Partial Least Squares Path Modeling in International Marketing; Emerald Publishing Limited: Manchester, UK, 2009; pp. 277–319. [Google Scholar] [CrossRef] [Green Version]
  78. Iacobucci, D. Structural Equations Modeling: Fit Indices, Sample Size, and Advanced Topics. J. Consum. Psychol. 2010, 20, 90–98. [Google Scholar] [CrossRef]
  79. Zhong, X.; Yuan, K. Bias and Efficiency in Structural Equation Modeling: Maximum Likelihood versus Robust Methods. Multivar. Behav. Res. 2011, 46, 229–265. [Google Scholar] [CrossRef]
  80. Kline, R.B. Principles and Practice of Structural Equation Modelling, 2nd ed.; Guilford Press: New York, NY, USA, 2005. [Google Scholar]
  81. Dubrin, A. Essentials of Management; Mason: San Francisco, CA, USA, 2012. [Google Scholar]
  82. Oettingen, G.; Gollwitzer, P. Strategies of Setting and Implementing Goals. In Social Psychological Foundations of Clinical Psychology; Maddux, J., Tangney, J., Eds.; The Guilford Press: New York, NY, USA, 2010; pp. 114–135. [Google Scholar]
  83. du Plessis, P.; Conley, L.; du Plessis, E. Teaching and Learning in South African Schools; Van Schaik: Pretoria, South Africa, 2007. [Google Scholar]
  84. Agarwal, R.; Prasad, J. The Antecedents and Consequents of User Perception in Information Technology Adoption. Decis. Support Syst. 1998, 22, 15–29. [Google Scholar] [CrossRef]
  85. Hernandez-Ortega, B.; Serrano-Cinca, C.; Gomez-Meneses, F. The Firm’s Continuance Intentions to Use Inter-Organisational CTs: The Influence of Contingency Factors and Perception. Inf. Manag. 2014, 51, 747–761. [Google Scholar] [CrossRef]
  86. Man, W.; Toorn, W. Culture and the Adoption and Use of GIS within Organisations. Int. J. Appl. Earth Obs. Geoinf. 2002, 4, 51–63. [Google Scholar]
  87. Phichitchaisopa, N.; Naenna, T. Factors Affecting the Adoption of Healthcare Information Technology. EXCLI J. 2013, 12, 413–436. [Google Scholar]
  88. Legris, P.; Ingham, J.; Collerette, P. Why Do People Use Information Technology? A Critical Review of the Technology Acceptance Model. Inf. Manag. 2003, 40, 191–204. [Google Scholar] [CrossRef]
Figure 1. Framework for Data Analysis.
Table 1. Performance Measurement (Mean Ranking and Kruskal–Wallis H-test).
| Indicators | Govt. M (R) | Contra. M (R) | Consult. M (R) | Total M (R) | χ² | Sig. |
|---|---|---|---|---|---|---|
| Built-in capability of facility adaptation | 4.40 (1) | 4.28 (1) | 4.51 (1) | 4.39 (1) | 4.160 | 0.125 |
| Significance of quality assurance | 4.26 (7) | 4.28 (1) | 4.35 (3) | 4.30 (2) | 0.291 | 0.864 |
| Attainment of customers' specific needs | 4.36 (3) | 4.20 (9) | 4.36 (2) | 4.29 (3) | 1.985 | 0.371 |
| Timely communication of policy changes | 4.32 (5) | 4.22 (6) | 4.29 (6) | 4.27 (4) | 0.381 | 0.826 |
| Improvement of internal processes of the organisation | 4.36 (3) | 4.19 (11) | 4.29 (6) | 4.26 (5) | 2.121 | 0.046 ** |
| Significance of time savings | 4.26 (7) | 4.20 (9) | 4.33 (4) | 4.26 (5) | 1.268 | 0.531 |
| Improvement in facilities' standards | 4.30 (6) | 4.26 (4) | 4.21 (11) | 4.25 (7) | 1.050 | 0.591 |
| Improvement in evaluation process through customers' involvement | 4.26 (7) | 4.22 (6) | 4.27 (8) | 4.24 (8) | 0.016 | 0.992 |
| Informed decision making | 4.17 (13) | 4.22 (6) | 4.32 (5) | 4.24 (8) | 1.562 | 0.458 |
| Significance of cost savings | 4.19 (10) | 4.28 (1) | 4.21 (11) | 4.24 (8) | 1.802 | 0.406 |
| Evaluation of existing trends | 4.40 (1) | 4.08 (14) | 4.24 (9) | 4.21 (11) | 4.519 | 0.004 ** |
| Economic utilisation of the facility | 4.11 (16) | 4.24 (5) | 4.17 (15) | 4.19 (12) | 1.188 | 0.552 |
| Stakeholders' perception of facilities performance | 4.17 (13) | 4.17 (13) | 4.21 (11) | 4.18 (13) | 0.219 | 0.896 |
| Anticipation of the attainment of future needs of the organisation | 4.19 (10) | 4.18 (12) | 4.13 (16) | 4.17 (14) | 0.384 | 0.825 |
| Identification of problems in facilities | 4.19 (10) | 4.07 (15) | 4.24 (9) | 4.16 (15) | 1.917 | 0.383 |
| Improved customer satisfaction | 4.15 (15) | 4.07 (15) | 4.21 (11) | 4.14 (16) | 1.085 | 0.581 |
| Group Mean | 4.26 | 4.20 | 4.27 | 4.24 | | |
| Cronbach Alpha | 0.899 | | | | | |
NB: M = Mean Item Score, R = Rank, Govt. = Government, Contra. = Contracting organisation, Consult. = Consulting firm, χ² = Chi-square (Kruskal–Wallis H-test), ** = significant (p < 0.05).
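The group comparison in Table 1 relies on the Kruskal–Wallis H-test, a non-parametric check on whether the three respondent groups' rating distributions differ for an indicator. A minimal sketch using scipy on hypothetical five-point Likert ratings (illustrative data, not the study's raw responses):

```python
from scipy.stats import kruskal

# Hypothetical 5-point Likert ratings for one indicator,
# split by respondent group (not the study's raw data).
government = [5, 4, 5, 4, 4, 5, 3, 4]
contracting = [4, 4, 3, 5, 4, 4, 4, 3]
consulting = [5, 5, 4, 4, 5, 4, 5, 4]

# H-statistic (chi-square distributed) and p-value; a p-value
# below 0.05 would indicate the groups' opinions differ, as with
# the two indicators flagged ** in Table 1.
h_stat, p_value = kruskal(government, contracting, consulting)
print(f"H = {h_stat:.3f}, p = {p_value:.3f}")
```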
Table 2. KMO and Bartlett’s Test.
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.926
Bartlett's Test of Sphericity: Approx. Chi-Square = 1087.303; df = 120; Sig. = 0.000
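The two adequacy checks in Table 2 have closed-form definitions: Bartlett's test compares the correlation matrix against an identity matrix, and KMO compares squared correlations with squared partial correlations. A numpy/scipy sketch of the standard formulas on hypothetical data (not the authors' code or dataset):

```python
import numpy as np
from scipy.stats import chi2


def bartlett_sphericity(data):
    """Chi-square test that the correlation matrix differs from an
    identity matrix, i.e., the items are related enough to factor."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, df, chi2.sf(stat, df)


def kmo(data):
    """Kaiser-Meyer-Olkin measure: ratio of squared correlations to
    squared correlations plus squared partial correlations
    (off-diagonal elements only). Values near 1 indicate adequacy."""
    R = np.corrcoef(data, rowvar=False)
    inv_R = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv_R), np.diag(inv_R)))
    partial = -inv_R / d  # partial correlation matrix
    np.fill_diagonal(R, 0)
    np.fill_diagonal(partial, 0)
    return np.sum(R**2) / (np.sum(R**2) + np.sum(partial**2))


# Hypothetical correlated ratings driven by one common factor.
rng = np.random.default_rng(1)
common = rng.normal(size=(100, 1))
data = common + rng.normal(scale=0.6, size=(100, 5))

stat, df, p = bartlett_sphericity(data)
kmo_value = kmo(data)
print(f"Bartlett chi2 = {stat:.3f}, df = {df:.0f}, p = {p:.4f}")
print(f"KMO = {kmo_value:.3f}")
```

A KMO around or above 0.9, as reported here (0.926), together with a significant Bartlett statistic, justifies proceeding to factor extraction.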
Table 3. Communalities.
| Label | Performance Measurement Indicators | Initial | Extraction |
|---|---|---|---|
| PME1 | The built-in capability of facility adaptation | 1.000 | 0.617 |
| PME2 | Identification of problems in facilities | 1.000 | 0.491 |
| PME3 | Improved customer satisfaction | 1.000 | 0.640 |
| PME4 | Improvement in evaluation process through customers' involvement | 1.000 | 0.884 |
| PME5 | Informed decision making | 1.000 | 0.756 |
| PME6 | Significance of cost savings | 1.000 | 0.543 |
| PME7 | Significance of time savings | 1.000 | 0.545 |
| PME8 | Significance of quality assurance | 1.000 | 0.536 |
| PME9 | Attainment of customers' specific needs | 1.000 | 0.626 |
| PME10 | Timely communication of policy changes | 1.000 | 0.879 |
| PME11 | Stakeholders' perception of facilities performance | 1.000 | 0.673 |
| PME12 | Improvement in facilities' standards | 1.000 | 0.636 |
| PME13 | Economic utilisation of the facility | 1.000 | 0.788 |
| PME14 | Evaluation of existing trends | 1.000 | 0.662 |
| PME15 | Improvement of internal processes of the organisation | 1.000 | 0.743 |
| PME16 | Anticipation of the attainment of future needs of the organisation | 1.000 | 0.619 |
Table 4. Rotated Component Matrix.
| Indicators | Component 1 | Component 2 | Component 3 |
|---|---|---|---|
| Significance of time savings | 0.842 | | |
| Significance of quality assurance | 0.714 | | |
| Significance of cost savings | 0.711 | | |
| Improvement in facilities' standards | 0.774 | | |
| Identification of problems in facilities | 0.656 | | |
| Improvement of internal processes of the organisation | 0.510 | | |
| Built-in capability of facility adaptation | | 0.764 | |
| Economic utilisation of the facility | | 0.726 | |
| Timely communication of policy changes | | 0.584 | |
| Improvement in evaluation process through customers' involvement | | 0.526 | |
| Significance of cost savings | | 0.520 | |
| Identification of problems in facilities | | 0.514 | |
| Improved customer satisfaction | | | 0.787 |
| Significance of time savings | | | 0.638 |
| Significance of quality assurance | | | 0.525 |
| Stakeholders' perception of facilities performance | | | 0.513 |
Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalisation.
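The rotated component matrix in Table 4 comes from principal component extraction followed by varimax rotation, and the communalities in Table 3 are the sums of squared loadings per item. A self-contained numpy sketch of that pipeline on hypothetical two-construct data (the study itself used SPSS; this is only an illustration of the technique, with raw rather than Kaiser-normalised varimax):

```python
import numpy as np


def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Raw varimax rotation: an orthogonal rotation that drives each
    item's loadings towards a single component."""
    p, k = loadings.shape
    rotation = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - (gamma / p) * rotated
                          @ np.diag(np.sum(rotated ** 2, axis=0))))
        rotation = u @ vt
        d_old, d = d, np.sum(s)
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return loadings @ rotation


# Hypothetical ratings with two underlying constructs (not the
# study's data): items 0-2 load on one factor, items 3-5 on another.
rng = np.random.default_rng(42)
factors = rng.normal(size=(120, 2))
data = np.column_stack([factors[:, 0]] * 3 + [factors[:, 1]] * 3)
data += rng.normal(scale=0.5, size=data.shape)

# Principal component extraction from the correlation matrix,
# retaining components with eigenvalue > 1 (Kaiser criterion).
R = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = max(int(np.sum(eigvals > 1)), 1)  # guard against degenerate draws
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])

rotated = varimax(loadings)
# Sums of squared loadings per item, analogous to Table 3's
# "Extraction" column; unchanged by the orthogonal rotation.
communalities = np.sum(rotated ** 2, axis=1)
print(np.round(rotated, 3))
```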
Table 5. Factor loading, z-statistics, and internal consistency of the model.
| Group | Label | Standardised Coefficient (λ) | Z-Statistics | R² | Significant at 5% Level? | Group R² |
|---|---|---|---|---|---|---|
| Operations efficiency | PME7 | 0.799 | 7.469 | 0.736 | Yes | 0.673 |
| | PME8 | 0.766 | 7.724 | 0.841 | Yes | |
| | PME6 | 0.761 | 7.726 | 0.842 | Yes | |
| | PME12 | 0.822 | 8.153 | 0.832 | Yes | |
| | PME2 | 0.752 | 7.294 | 0.743 | Yes | |
| | PME15 | 0.774 | 8.382 | 0.664 | Yes | |
| Facility Adaptation | PME1 | 0.734 | 9.931 | 0.743 | Yes | 0.715 |
| | PME13 | 0.811 | 11.827 | 0.672 | Yes | |
| | PME5 | 0.738 | 10.826 | 0.706 | Yes | |
| | PME14 | 0.883 | 9.926 | 0.794 | Yes | |
| | PME16 | 0.699 | 11.837 | 0.649 | Yes | |
| | PME10 | 0.717 | 10.294 | 0.783 | Yes | |
| Client's Satisfaction | PME9 | 0.792 | 13.893 | 0.811 | Yes | 0.698 |
| | PME11 | 0.686 | 13.274 | 0.793 | Yes | |
| | PME3 | 0.839 | 9.294 | 0.729 | Yes | |
| | PME4 | 0.827 | 9.783 | 0.884 | Yes | |
Internal consistency of the model: Cronbach's alpha = 0.899; rho coefficient = 0.786.
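Table 5 reports Cronbach's alpha as the internal-consistency measure for the indicator set. Its formula uses only item variances and the variance of the summed scale; a short sketch on hypothetical ratings (illustrative data, not the study's):

```python
import numpy as np


def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array:
    internal consistency of items measuring one construct."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Hypothetical ratings for four items of one construct
# (not the study's raw data).
rng = np.random.default_rng(7)
construct = rng.normal(size=(60, 1))
ratings = construct + rng.normal(scale=0.5, size=(60, 4))

alpha = cronbach_alpha(ratings)
print(f"alpha = {alpha:.3f}")
```

Values above roughly 0.7 are conventionally read as acceptable reliability, so the reported 0.899 indicates a highly consistent indicator set.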
Table 6. Robust Fit Indexes for Performance Measurement Indicators.
| Fit Index | Cut-Off Value | Estimate | Indication |
|---|---|---|---|
| S−Bχ² | | 8.593 | |
| df | x > 0.002 | | Good fit |
| CFI | x ≥ 0.90 (acceptable); x ≥ 0.95 (good fit) | 0.996 | Good fit |
| GFI | x ≥ 0.90 (acceptable); x ≥ 0.95 (good fit) | 0.985 | Good fit |
| RMSEA | x ≤ 0.08 (acceptable); x ≤ 0.05 (good fit) | 0.024 | Good fit |
| SRMR | x ≤ 0.08 (acceptable); x ≤ 0.05 (good fit) | 0.030 | Good fit |
| NFI | x ≥ 0.90 (acceptable); x ≥ 0.95 (good fit) | 0.965 | Good fit |
| NNFI | x ≥ 0.90 (acceptable); x ≥ 0.95 (good fit) | 0.993 | Good fit |
| RMSEA 90% CI | | 0.001:0.024 | Acceptable range |
| p-value | x > 0.05 | 0.00 | Acceptable range |
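The incremental and absolute fit indices in Table 6 are derived from the model and baseline chi-square statistics. A hedged sketch of the standard CFI and RMSEA definitions, evaluated on illustrative values (not the study's chi-square statistics, which are not fully reported here):

```python
import math


def cfi(chi2_model, df_model, chi2_null, df_null):
    """Comparative Fit Index: improvement of the fitted model over
    the independence (null) model on the noncentrality scale."""
    num = max(chi2_model - df_model, 0.0)
    den = max(chi2_null - df_null, chi2_model - df_model, 0.0)
    return 1.0 if den == 0 else 1.0 - num / den


def rmsea(chi2_model, df_model, n):
    """Root Mean Square Error of Approximation: misfit per degree of
    freedom, scaled by sample size (n - 1 convention used here)."""
    return math.sqrt(max(chi2_model / df_model - 1.0, 0.0) / (n - 1))


# Illustrative chi-square values only.
model_cfi = cfi(105.2, 101, 1190.4, 120)
model_rmsea = rmsea(105.2, 101, 85)
print(f"CFI = {model_cfi:.3f}, RMSEA = {model_rmsea:.3f}")
```

With these definitions, a CFI near 1 and an RMSEA well below 0.05, as reported in Table 6, both indicate a well-fitting model.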
