Article

Construction Technology Adoption Cube: An Investigation on Process, Factors, Barriers, Drivers and Decision Makers Using NVivo and AHP Analysis

by Samad M. E. Sepasgozar 1,* and Steven Davis 2
1 Faculty of Built Environment, University of New South Wales, Sydney 2052, Australia
2 Faculty of Engineering, University of New South Wales, Sydney 2052, Australia
* Author to whom correspondence should be addressed.
Buildings 2018, 8(6), 74; https://doi.org/10.3390/buildings8060074
Submission received: 11 April 2018 / Revised: 18 May 2018 / Accepted: 24 May 2018 / Published: 26 May 2018

Abstract:
Due to the complexity, high risk and conservative character of construction companies, advanced digital technologies are not widely adopted in the short term, even though vendors make determined efforts to overcome this and disseminate their technologies. This paper presents the methods of an investigation addressing the highly complex issues surrounding current practices of digital technology adoption in construction. It discusses how construction companies follow a specific logical process linked to need, project objectives, the characteristics of the adopting organization and the characteristics of the new technology to be adopted. The study aims to demonstrate a novel method of data collection and analysis, built on data and methodological triangulation techniques, including the use of NVivo and AHP, to explore how companies decide to take up new technology (e.g., an advanced crane, tunnel boring machine or drone) by focusing on customer and vendor activities, their interactions, contributing factors and the people involved in the process. The major original contribution of this paper is an innovative methodological cube for investigating the Construction Technology Adoption Process (CTAP), covering the concepts of technology adoption, acceptance, diffusion and implementation. CTAP is a framework that delineates the phases of the process that customer organizations use when deciding to adopt a new digital technology, together with the parallel vendor activities. The significance of these contributions is that they enable vendors to match their strategies to customer expectations in each phase of the CTAP. The framework also provides a benchmark of current best practice in decision making for new construction companies. Future research is warranted to delineate any differences with respect to developing nations or related industries such as mining and property management.

1. Introduction

Innovation plays an important role in maintaining a competitive advantage for firms and meeting the evolving demands of industry [1,2,3]. In construction, considerable research indicates that new technologies have a large beneficial effect on overall performance, productivity, safety and efficiency [4,5,6,7,8,9,10]. For example, adding vibration to the rolling action of compactors produced a 260% increase in productivity when Caterpillar introduced this technology in the mid-1980s [11]. This has led to a growing focus on introducing new technologies [4,12,13,14,15] to the construction industry by various firms in order to benefit from such advantages [16,17]. According to Goodrum et al. [18], awareness of a particular technology does not ensure its adoption, and a “series of interrelated events” is required for its successful implementation. It is therefore desirable to expedite the rate of technology diffusion, and so technology vendors and suppliers try to facilitate the technology adoption process with supportive activities [19]. However, knowing which activities are truly supportive requires a deep understanding of the process construction companies undertake when deciding to use the technologies.
The global construction technology market was estimated to reach US$145 billion in 2015 [20]. The report also shows that the top five manufacturers sold construction technology valued at US$72 billion in 2013 [21]. The continuous increase in the number and value of new construction technologies, coupled with the accelerating technological advancement of their functions, has produced a growing appetite for facilitating the technology uptake process by customers. Understanding this process is also important because the construction market is frequently subject to booms and collapses. For example, between 2012 and 2013, U.S. construction machinery exports suffered a considerable decline of 21 percent, from US$13.7 billion to US$10.8 billion [22]. These fluctuations have a huge impact on equipment and tool manufacturers, particularly recently-established vendors. Thus, there is an urgent need to understand the process of technology adoption from two perspectives: (a) customers, who make the adoption decision and seem to be under a “bombardment of technological products” [2]; and (b) vendors, who use different strategies to encourage the adoption of their technologies.
Since technology has the potential to enhance competency, it is recognized as an important element of business strategy [23,24,25]. In addition, many governments encourage organizations to implement new technologies [26,27,28,29]. Knowing how, why and where construction companies adopt new technologies is also critical, because it makes it possible to expedite the rate of technology diffusion by facilitating adoption. According to Art [30], understanding this process is a critical insight for managers involved in marketing innovation. However, the process by which construction companies select and implement new construction technologies for their projects remains an open question.
Models of the technology adoption process exist outside construction. However, these are not suitable for the construction industry, as construction is widely recognized as a complex system that is distinctly different from other industries [30,31,32,33,34,35]. There is, therefore, a need to investigate current industry decision-making practice in order to provide a systematic framework that assists industry players in facilitating the process and in becoming more successful at utilizing the most appropriate technology in the available timeframe.
The goal of this study is to thoroughly understand the process of technology adoption decision making in construction. Specifically, the research aims to provide a systematic picture of the customers’ decision-making practice including their interaction with vendors and the associated vendor activities. The objectives are outlined as follows:
(1)
to investigate the process used by customers of new technologies as they move from recognizing a need to actually using a new technology;
(2)
to investigate the interaction and relationships between the activities of customers and vendors;
(3)
to explore influential factors affecting the process;
(4)
to identify individuals involved in the process;
(5)
to formulate the understanding of these activities into a comprehensive framework;
(6)
to validate this framework empirically against industry practice.
This paper presents the construction technology adoption cube, which provides a systematic multi-phase framework spanning from the investigation of possible solutions to the commencement of the technology’s operation. The aim of this paper is to present the methods used to investigate how a customer commits to using a new technology and how the vendor supports the customer in the decision-making process.
This paper goes beyond previous studies focusing on the individual customer’s intention to use a specific technology at a specific single stage [36], by developing a multi-stage framework for understanding the decision process at the organizational level. In addition to the customers’ activities, the related vendors’ activities were investigated, which have been overlooked in previous research [37]. In addition, factors contributing to the adoption decision process are identified and mapped to the framework. Furthermore, the individuals involved in the process are identified.

2. The Concept of Construction Technology

2.1. Innovation

Innovation, as a general definition, refers to the actual use of any idea, practice, material artefact or technology “perceived to be new to the relevant adopting unit” [38,39]. Innovation can be “technological” or “organizational” [40]. This study focuses on technological innovation [41], namely technology embodied in a product, rather than organizational innovation, which refers to the introduction of advanced management techniques [39].

2.2. Construction Technology

‘Technology’ generally comprises artefacts, the knowledge about them and the practices of operating them [42,43]. In the context of the construction industry, ‘technology’ refers to tools, machines and modifications of them that are used to achieve a goal, perform a specific function or solve a problem [44,45]. ‘Construction technology’ embraces systems, tools, equipment and any combination of resources used in the process of construction from design to demolition [46,47]. The literature shows that there is a shift from manually-operated systems and equipment to “machine-dominated” construction operations [48,49]. Further research is needed to explore and evaluate how digital technologies can be utilized to improve productivity and safety in construction projects. Technologies that have received attention in the literature include: building information modelling; virtual and augmented reality; mobile and wearable technologies; LiDAR technology; automated material identification; real-time locating and tracking systems; and GPS-guided plant and machinery [7,15,50,51,52]. In this study, ‘construction technology’ refers to any tool, plant or equipment used for carrying out physical construction activities, and advanced technology refers to the latest models of such plant and equipment. Examples of construction and mining job-site technologies are:
(i)
the new Autonomous Haulage System developed by Komatsu Ltd., using a high-precision GPS navigation system, millimetre wave radar and optic-fibre gyro technology to control unmanned trucks on predetermined courses [53];
(ii)
the universal piling and drilling rig LRB 18 with vibrator type LV 20 with virtual reality and a positioning system.

2.3. Technology Adoption, Acceptance, Diffusion and Implementation

This paper mainly describes an investigation of the process construction companies follow in reaching the technology adoption decision, as shown in Figure 1. It presents an example of a technology adoption investigation intended to develop a decision framework, contributing to the body of knowledge in this area by delineating the stages that customers and vendors pass through, up to the analysis of the collected information [54]. The framework highlights the needs and efforts of customers and vendors, paying attention to the various issues and constraints of customer-vendor relationships.
Specifically, the research aims to provide a systematic picture of the customers’ decision-making practice, including their interaction with vendors and the associated vendor activities. In order to attain these aims, the study followed an exploratory qualitative and quantitative research approach. This approach was employed because it supports a broad explanation of the process and provides rich insight into the adoption process by investigating current adoption practice in the construction industry. Previous works classified the studies in the field of technology adoption into three key perspectives [47]:
(i)
the “socio-economic perspective”, focusing on diffusion theories [55];
(ii)
the “managerial perspective”, focusing on adoption considering organization procedures [56,57,58]; and
(iii)
the “psychological perspective”, focusing on technology acceptance [59,60].
The three perspectives investigate the process at the industry, company and individual levels, respectively [47]. The technology dissemination process involves actively marketing the technology to ensure that a potential customer knows about a new technology [19]. Dissemination differs from diffusion because diffusion refers to a “let it happen” strategy, which is a passive approach [47].
Technology adoption refers to the steps a customer takes along a specific path to reach a decision to accept or reject the new technology. While technology adoption research investigates the customer’s processes at the organizational level, technology acceptance research mainly explores the factors influencing end-user behaviour at the individual level [61]. When the technology is adopted, the technology implementation process commences. Technology implementation is the process of using the technology as a result of the customer decision that has already been made [40,53,62].
In order to create this picture, it will be necessary to investigate the practices and interactions using qualitative methods. The picture will then be in the form of a framework that groups related activities and interaction with the sequential process of technology adoption. The most appropriate research approach for creating such a framework is to collect data from industry practitioners and analyse these data using grounded theory techniques. Ensuring that the framework is accurate requires rich data; therefore, substantial datasets have been collected from different sources.

3. Research Design

The research employed exploratory and mixed methods to collect and analyse high-quality data to achieve each research objective.

3.1. Mixed Research Method for Construction

Mixed research methods including qualitative and quantitative methods were chosen to explore the overall structure of the construction technology adoption process. Qualitative methods were specifically chosen rather than exclusively quantitative methods because the literature shows that research is scarce in this area, and so, there is a need to investigate and interpret the basic processes occurring, the aim being to produce new insights and understanding of the phenomena concerned [63,64,65,66]. The chosen qualitative method is recognized as a useful research approach to determine richer information about the process of technology adoption [33].

3.2. Grounded Theory

Grounded Theory (GT) is employed in order to transform qualitative data such as transcriptions and photos into a theoretical or conceptual framework, as it is the best qualitative analysis approach for systematically discovering theory from data [67,68,69]. GT involves fragmenting empirical data (in this case, the transcripts of the semi-structured interviews), adding codes to these fragments, collecting fragments with common codes into categories and then theorizing based on the interactions between these categories. “Based on GT, data collection and analysis reciprocally inform and shape each other through an emergent iterative process” [69,70]. Collecting codes into categories is also called thematic analysis, since it focuses on the themes of the interviews [71]. GT helps identify dimensions of contrast, formulate typologies, present conceptual models and generate propositional statements [72]. This approach has recently been used in the construction industry; for example, see [73,74,75,76].

3.3. The Methodological Cube

Data were collected through semi-structured interviews recruiting both customers and vendors in different regions and then carefully analysed using a specific thematic analysis procedure. As the data were collected from different sources in different formats (transcriptions, photos, etc.), several techniques were employed to analyse the data in appropriate ways, each revealing unique aspects of the process and/or cross-validating the others. The methodological cube (Figure 2) was designed to collect and analyse data in an innovative way across the different dimensions of the complex technology adoption process. The cube illustrates the four techniques that were used to analyse the data (an illustrative sketch follows the list):
(1)
cluster analysis of the exhibition data to explore how vendors disseminate their technologies and to classify them based on their dissemination strategies;
(2)
factor analysis of the exhibition data to explore how vendors support decision makers at the early stages of the adoption process, validating the first phase of the framework;
(3)
thematic analysis of transcriptions of semi-structured interviews to examine the adoption process based on the current best practices of the industry by purposely selected experienced customer and vendor participants who have recently purchased or sold a new technology; and
(4)
the Analytic Hierarchy Process (AHP) to prioritize the factors influencing adoption and the importance of individuals holding various positions in the decision-making process; this last technique is not the main object of the study but provides additional insight into the factors and individuals.
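As a reading aid only, the four faces can be summarized as a simple data structure pairing each data source with its analysis technique; the labels below paraphrase the list above and are not code or terminology from the original study.

```python
# Illustrative summary of the methodological cube's four analysis faces;
# the wording paraphrases the list above and is not an artefact of the study.
CUBE_FACES = [
    {"data": "exhibition observations", "method": "cluster analysis",
     "purpose": "classify vendors by dissemination strategy"},
    {"data": "exhibition observations", "method": "factor analysis",
     "purpose": "validate early-stage vendor support activities"},
    {"data": "interview transcripts", "method": "thematic analysis (NVivo)",
     "purpose": "derive the stages of the adoption process"},
    {"data": "structured interviews", "method": "AHP",
     "purpose": "rank contributing factors and decision makers"},
]

for face in CUBE_FACES:
    print(f"{face['method']:26} <- {face['data']:24} : {face['purpose']}")
```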

3.4. Triangulation Methods

Triangulation techniques involving a variety of data types and participants are valuable for building theories of the adoption process because they increase the broadness of the concepts and the scope of the theory [77]. The variation and large number of samples prevent observational bias [78]. Utilizing both prolonged engagement and persistent observation in Technology Exhibitions (TEs) enables the researcher to be confident in obtaining an accurate interpretation of the meaning of data both from TEs and interviews, which increases the credibility of the results [78,79,80,81,82].
  • Data triangulation was achieved by collecting data from both sides involved in the process (i.e., customers and vendors); different companies (e.g., family businesses and corporations); a diverse range of businesses (e.g., pumping and earthmoving); and different regions (e.g., Australia and North America).
  • Methodological triangulation was achieved by using different analysis methods (clustering, factor and thematic analysis), different data types (e.g., photos, voice records, checking structured forms) and from different sources (e.g., TEs or outside exhibitions).
In order to investigate vendor and customer activities and interactions, two strategies were chosen to collect data:
(a)
Technology Exhibitions (TEs) were visited to immerse the investigator in the technology market [19]. TEs are the best environment to investigate vendor and customer interactions and to record vendor dissemination activities. These visits also enabled the researcher to learn about new technologies on the market, which helped in interpreting some of the interview data.
(b)
Selective participants were recruited using a wide range of strategies including criterion-chain and comparative sampling methods and interviewed to collect the details of the technology adoption process from both the customer and vendor perspectives. Obtaining views from both sides enabled cross-validation of the findings. Each of the TE visits and sampling methods are explained in the following sections.

3.5. Technology Exhibitions

TEs are market places where vendors set up stalls to market their products and customers visit to learn about these products. Customers may discover new technologies at the exhibitions or they may learn new things about technologies that they already know of. Attending these exhibitions enabled the researcher to see how vendors present their technologies and observe vendor customer interactions ‘in the wild’. This provided the researcher with the background and understanding necessary to shape the adoption framework. In order to give more detail to the framework based on data that might not be readily apparent from observing the TEs, experienced participants were recruited to be interviewed. These interviews were conducted face-to-face in order to enable the researcher to take advantage of body language and other such cues to direct the interviews to be more in-depth where appropriate [83]. The combination of data resources from exhibitions and interviews provides a valuable originality for this exploratory research and opens the possibility to triangulate the findings in order to increase the validity of the research [69,82,84,85].
Protocols were designed for collecting data from each of the technical exhibitions visited. This mainly consisted of a consistent set of measures covering the attributes presented by each vendor at the exhibition. This consistency enabled comparisons to be made between exhibitions both within one country and across two continents. Similarly, protocols were designed and tested for the semi-structured interviews to ensure that a consistent dataset was collected. Furthermore, in order to fully represent industry practice, participants who had experienced involvement in the process of technology adoption in either Australia or North America (Canada and the U.S.) were recruited. Several participants were additionally recruited who had experienced the process in other countries such as in Latin America, Germany, Japan, China, Singapore and Finland, to ensure that the findings do in fact generalize to the worldwide construction industry. The main advantages and disadvantages of the methodology are also discussed with references to the literature in the following sections.

3.6. Selective Participants Using Criterion-Chain and Comparative Sampling Strategies

Chain sampling involves selecting extra participants based on the recommendations of previous participants to cover gaps in the sample [86]. Comparative sampling [86,87,88,89] was used to select participants with major differences on various scales (i.e., smaller and larger companies) and locations (i.e., Australia and North America), so that comparisons could be made along each of these dimensions. This combination of strategies was used to pick participants from different companies serving a diverse range of businesses such as drilling, pumping and earthmoving. According to Corbin and Strauss [77], variation is significant in theory building because “it increases the broadness of concepts and scope of the theory” [37].
Figure 3 schematically illustrates the profile of five experienced participants from Sydney, Melbourne, Nevada and North Dakota who were sequentially recruited from the crane industry. This method of sampling was designed specifically for this study, because the investigator aimed to become immersed in the construction technology market community (e.g., the crane technology market) and to elicit facts rather than individual behaviour [90]. This method of collecting data is more robust than the more common chain sampling technique [86], whereby researchers collect ‘convenience’ data based on the availability of participants [91]. This gives significantly greater originality to the study, since it provides unique data for exploring the technology adoption process.
The participants were recruited using a combination of three strategies: (a) a criterion-based strategy; (b) a comparative strategy; and (c) a chain strategy [87,88,89]. The combination is designed to select appropriate participants in order to maximize the value and quality of the data from the interviews. For example, one technology that was investigated was advanced mobile cranes. Five participants were recruited based on the combination sampling strategy of the ‘criterion-chain’ from the crane business. Each of these participants was recruited specifically because they had either purchased or sold one or more cranes from a particular crane manufacturer (Brand x) with a large market share, and it was desired to see both sides of the purchasing operation for the same company.
The criteria used for selecting the participants were that they:
  • had been involved in the technology adoption process for at least two major purchases of different technology types in the previous three years; and
  • had been with their present company for at least three years, such that they had good knowledge of their company’s procedures.

3.7. Semi-Structured Interview

The semi-structured interview technique was chosen as the best tool to collect data about the vendors’ and customers’ experiences of the technology adoption process (refer to Figure 4), because it enables the researcher to get in-depth data about the process being studied [63,92,93].
This type of interview is a flexible tool that allows the researcher to generate rich data to advance understanding and consequently develop an empirically- and theoretically-grounded argument about the process [65,93]. However, this type of interview is time consuming, both in data gathering and in analysis, compared to structured interviews. Structured interviews assist the researcher in gathering data in a highly structured way. However, they suffer from the disadvantage that they do not allow the investigator to obtain a deep understanding of the adoption process, or to explore the cause-and-effect relationships between activities, perspectives and indicators [94]. The problem is that in structured interviews the respondents are forced to choose among predefined alternatives rather than give their own unique opinions. Thus, the respondents’ replies may not reflect the ‘true’ variation in practices [93]. On the other hand, surveys and structured interviews are easier to conduct and create data that are easy to analyse, and so they have been used extensively in research on technology adoption in construction (e.g., Alkalbani, Rezgui [95]). Unstructured interviews have the problem that each interview is unique, with different content. This prevents the researcher from comparing and contrasting the different interviews, leading to difficulty in generalizing and modelling the phenomena.
The format of semi-structured interviews has been applied in many research studies in construction to investigate a process and the associated factors, such as Agapiou [96], Sarshar and Isikdag [97], Bassioni et al. [98], Redmond et al. [99], Aziz and Salleh [100] and Samuelson and Björk [101]. For example, Samuelson and Björk [101] investigated factors that affect the decisions to implement different techniques of information technology in construction, as well as the actual adoption process. They justified their choice of semi-structured interviews because it allowed a wider discussion, while the interview was held within defined areas and the selected theoretical framework. The semi-structured interview is a tool for collecting rich data that leaves participants free to decide what is important and relevant to discuss. This flexibility allows participants to describe the process based on facts rather than merely agreeing with or rejecting structured questions. Figure 4 shows the structure of the semi-structured interviews used in this study. The interview questions were designed based on the proposed Construction Technology Adoption Process (CTAP) framework.
Step 1 involved the invitation letter, the consent form, the interview agenda and some questions regarding the participant’s background. The answers to the background questions were used in selecting who would be in the sample to ensure that it was representative. Step 2 was designed to explore whether there were different adoption processes employed for dissimilar technologies.
Step 3 was designed to ask the participant to examine the proposed stages of CTAP as it applies to their selected technology cases. Step 4 was the main focus of the interview, and allowed participants to freely explain each stage of the process revolving around the main topics such as relevant activities and contributing factors in each stage. Step 5 was added to give the participant a chance to reflect on his or her responses regarding the importance of the stages now that they had spent time focusing on their own specific details. Samples of participants’ responses including their sketches and comments to these steps are shown in Figure 5 and Figure 6.
The data from the interviews were analysed and used to: (a) explore the key stages of the ‘investigation, adoption decision, and implementation’ phases of the technology adoption process; (b) identify contributing factors and individuals involved in the adoption process; and (c) classify customers based on their adoption process similarities. Details are provided in the relevant papers, e.g., [10].

3.8. Questions in the Survey Instrument

In order to investigate the construction technology adoption process, semi-structured interviews were employed [63,77,93]. Participants were asked to compare the decision process for two different technologies that they had been involved in for purchasing or selling. In order to ensure that the technologies were significantly different, participants were first asked to identify two different technology groups that they had been involved in for purchasing and/or utilization. They were then asked to select two different technologies, one from each group, as case studies for the interview.
These participants’ quotations were separated into a subsample in ‘NVivo 10’ for analysis. NVivo is a powerful workspace for qualitative analysis, which enables parts of interviews and the ideas contained therein to be tracked without losing access to the source data.
In order to generalize the technology adoption process across countries and to cross-validate the findings, it was checked that the subsample included sufficient participants from the construction industry. Samples of supplementary data are provided in Figure 7.
The dataset was systematically analysed in two iterations. First, different stages of the process were identified and the framework developed. Second, both the contributing factors and the individuals who were involved in the adoption decision process were discussed.

3.9. Exploring the Process

In this section, the transcriptions from the interviews are examined using thematic analysis techniques [102,103] in order to identify themes representing the stages a customer passes through after technology investigation. The themes were identified based on customer and vendor activities and were used to structure the adoption decision framework.
Figure 8 shows the flowchart of the research method. The first five steps developed and validated the framework. Next, all of the findings were merged to define each stage of the framework. Finally, the factors and individuals involved in the adoption decisions were identified.
In Step 1, the transcriptions were coded in a systematic way, as shown in Figure 9. In Step 2, relevant passages were linked to nodes related to customers’ and vendors’ activities, where each child node represented one core idea. In Step 3, the child nodes were allocated to parent nodes and sorted into basic themes. In Step 4, a web-like map was developed reflecting child-parent relationships and nodes, e.g., [10]. In Step 5, the candidate themes were closely examined in order to ensure that they did not overlap. The examination resulted in three coherent patterns representing the key activities that make up the adoption decision process.
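To make Steps 3–5 concrete, the following minimal Python sketch shows how child nodes (core ideas) might be grouped under parent themes and then checked for overlap; the node and theme names are invented for illustration and are not the study’s actual coding scheme.

```python
# Hypothetical child-parent node map (Steps 3-4); all names are illustrative.
theme_map = {
    "analysis": ["compare quotes", "check site conditions"],
    "substantiation": ["trial demonstration", "visit referee site"],
    "final decision": ["negotiate price", "sign contract"],
}

# Step 5: candidate themes must not overlap, i.e., no child node may sit
# under two parent themes at once.
seen = {}
for parent, children in theme_map.items():
    for child in children:
        assert child not in seen, f"'{child}' overlaps {seen[child]} and {parent}"
        seen[child] = parent

print("coherent, non-overlapping themes:", list(theme_map))
```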

3.10. Micro Analysis and Coding Data

The transcripts were broken down into smaller parts called passages in order to classify and create meaningful concepts [104] from which appropriate themes were extracted. In order to analyse the data without missing useful material [105,106], six criteria were used to choose passages. This involved selecting:
(1)
words and sentences describing any aspects of the process [77]; an example is shown in Figure 9;
(2)
any incidents describing the process, to discover patterns and contrasts between groups that might be used ultimately to evaluate the process investigated [107];
(3)
causal sentences with signal words (e.g., because, reason, do you know why) that would be used in the next step of the analysis to identify relationships [108];
(4)
preceding and sequential statements that might have led to a process (e.g., before, after, then, the next stage);
(5)
any new idea or sentence related to the adoption decision [107];
(6)
paying attention to the participants’ use of special terms as in vivo codes [107]; these codes were created to ensure the concepts stayed close to participants’ own words, because participants captured a key element of a phenomenon that was being described [109].
These criteria were used to carefully investigate passages using line-by-line analysis, a process called micro analysis [65]. Each passage that fit one of these criteria was associated with a child node that represented the core idea.
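As an illustration of how criteria (3) and (4) could be operationalized during this line-by-line screening, the short Python sketch below flags passages containing causal or sequential signal words; the word lists are taken from the criteria above, while the function name and sample sentence are invented for illustration.

```python
import re

# Signal words drawn from criteria (3) and (4) above.
CAUSAL = re.compile(r"\b(because|reason|why)\b", re.IGNORECASE)
SEQUENTIAL = re.compile(r"\b(before|after|then|next stage)\b", re.IGNORECASE)

def screen_passage(passage: str) -> list[str]:
    """Return the selection criteria that a passage satisfies."""
    hits = []
    if CAUSAL.search(passage):
        hits.append("causal statement (criterion 3)")
    if SEQUENTIAL.search(passage):
        hits.append("sequential statement (criterion 4)")
    return hits

sample = "We trialled the rig first, then we negotiated because the price was high."
print(screen_passage(sample))
```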
In order to increase the reliability of the results, immediate analysis of the data took place by writing memos concurrently with the coding. Charmaz [107] recommends the memo as a useful technique for interpreting results. “Memos are the theorizing write-up of ideas about substantive codes and their theoretically coded relationships as they emerge during coding, collecting and analyzing data, and during memoing” [71].

3.11. Create Activity and Factor Nodes

At this point, the passages that indicated a part of the adoption process or a related activity (e.g., training, delivery and commissioning) had been identified by applying the six criteria from the previous section. In the next step, these passages were assigned to relevant child nodes, a process referred to as coding the passage. Each of these child nodes represented one activity related to the purchase decision. In order to increase the consistency of the analysis, active words such as ‘I go’, ‘I try’ and ‘we ask’ were treated as signal words for coding a passage; if no relevant node existed for a passage, a new node was created (a minimal sketch follows). Figure 10 shows the fishbone diagram of all factors proposed at the beginning of the study. The factors that were verified by participants were assigned to the nodes and used in this study.
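A minimal sketch of this coding step, assuming toy passages: active signal words trigger assignment of a passage to a child node, and a node is created on first use. This only loosely mirrors coding in NVivo; the passages, signal list and node names are placeholders, not the study’s data.

```python
from collections import defaultdict

# Toy interview fragments standing in for real transcript passages.
passages = [
    "I go to the exhibition to see the new rigs first.",
    "We ask the operators what attachments they need.",
    "I try the machine on a dry test before committing.",
]

# Active words treated as signals that a passage describes an activity.
ACTIVE_SIGNALS = ("i go", "i try", "we ask")

child_nodes = defaultdict(list)  # child node -> coded passages (created on demand)

for passage in passages:
    text = passage.lower()
    for signal in ACTIVE_SIGNALS:
        if signal in text:
            # An analyst would name the node after the activity described;
            # this placeholder simply reuses the matched signal word.
            child_nodes[signal].append(passage)

for node, coded in child_nodes.items():
    print(f"node '{node}': {len(coded)} passage(s)")
```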

4. Results

The purpose of the interview was also to identify the individuals assisting in developing the decision makers’ network. Previous literature indicates that the only individuals involved at this stage whose opinions are sought regarding the factors in the previous section are top managers. Top managers will then support the end users with training and rewards. This literature mainly focused on user behaviour in technology acceptance from a psychological perspective in terms of usefulness and ease of use. However, during the interviews, it emerged that engineers, operating crews and fitters are also consulted during the purchase decision-making process. Results of coding transcriptions and the questions about individuals who were involved in each phase are presented in this section.
Some participants pointed out that individuals from the production level may not be available when the purchase decision is being made. In this case, the company was advised by those who have performance expertise with similar technology use for similar tasks. For example, they ask advice from a plant manager of a similar project when they are going to buy a Tunnel Boring Machine (TBM). When the production team was appointed after technology delivery, they were responsible for any necessary customizing or upgrading of the technology.
Additional coding shows that there is a stage in the framework in which the customer decides to commit to using the technology. The interviews show that the two previous stages, ‘analysis’ and ‘substantiation’, are the basis of the decision. For example, a customer describes that:
“evaluation of support is the most critical part. That will make or buy the sale. The sale will come later; it will come automatically”.
The main activity in this stage is that customers negotiate with at least two vendors, choose a technology and commit to use it by contract. Before any commitment, the customer and the vendor would negotiate in order to clarify and modify specifications and details about the proposed commercial terms and conditions. In addition, the customer negotiates with the candidate vendor about the price based on the competition’s price. For example, an experienced vendor describes:
“[In this stage,] he knows that he wants to buy [brand A], and he gets the price. The pricing difference between [brand A] and [brand B], for example, may only be marginally different. There might only be, let’s say $20,000 in it. [...] So, he will go back to the [brand A] bloke and he will say, ‘This is the quote from [brand B]. He is cheaper than you’. And he will haggle with the price. [The brand A vendor] then says ‘Well, I will match that’ or he says ‘I cannot do it.’”
Based on the negotiation, the customer and vendor will finalize the contract. Thus, the customer has selected a particular technology to adopt. However, whether the customer continues to use this particular technology long term will depend on how it and the vendor perform once it has been implemented. It is important to focus on the implementation process in order to understand how the adoption process will be continued and assessed. The next sections will examine the factors used in the decision and the roles of the people involved.
Figure 11 shows the participants identified (using AHP analysis) as being involved in the ‘analysis’ and ‘final decision’ stages of the adoption process. Previous literature has only discussed the role of top managers in this process and has been silent regarding the involvement, or lack thereof, of other workers in the different phases of the decision-making process [110,111]. For example, Peansupap and Walker [110] identified that top managers initiate the adoption process and support and encourage end users to use ICT. This study identifies other relevant individuals involved in the decision phase of the adoption. The interviews show that project managers have a critical input in the decision. For smaller purchases, they have the authority to authorize the expenditure; for larger purchases, they need to obtain approval from higher managers in the company, but the decision starts with them. In making the decision, the project manager will normally collect the opinions of operators and engineers (e.g., mechanics) to evaluate the technology and obtain suggestions about technology attributes and aspects of operation. A company manager states: “These guys [pointing to operators] give advice, the guys that run the equipment.” Managers understand information systems, as most of them have basic skills in using the Internet and software, and they can easily imagine the output of such a system based on experience. However, they do not necessarily know how a new construction technology (e.g., a crane or tunnel boring machine) operates in different site conditions. This makes the decision difficult, as top managers may have limited skill and individual knowledge regarding such an expensive expenditure.
The interviews show that for smaller companies, the owner might be a machine operator as well. However, he/she would still consult the expected future operator of the specific machine about to be purchased. If the operator of that specific machine is not available (e.g., the project has not yet started) or does not yet have enough experience with the specific type of technology, the company recruits an experienced individual from outside the company. A participant stated:
“[We make the decision] based on experience; if you find someone that is an equipment person, that has been in the field for many years and has been around different pieces, you get his knowledge: ‘what have you experienced has been the best one for this?’.”
The interviews show that project managers also analyse financial and technical information about their technology choices using a comparison matrix to pick two of three technologies and vendors. A vendor describes:
“Normally, the purchasing guy takes three offers. This is the rule. Then he checks the three offers with the [project] manager, and says can we take all three? Do you have any problem if we take three? Or should any company [be] deleted because of technical competency (let’s say technically is not good)? and the manager says, for example, ‘from these three companies we took only two.’ Then they invite both companies.”
Some owners described getting advice from several individuals at the company to make the decision, because “Somebody else’s decision might be better”. In fact, the finding shows that the decision is a result of the interaction and consultation with individuals in different positions (top, middle and production levels) from the corporate head office, project and production levels, including both technical and financial discussions. For example, a contractor states that “Everybody [of our management team of five] is involved in the decision.” Another description of the decision makers’ arrangement is “The decision is a collaboration of all the ideas, of all the thinking, of all the people in our company.”
The results of coding transcriptions and the questions (see Figure 9) about factors (see Figure 12) and individuals who were involved in the decision phase were investigated, and examples are presented in Table 1.
The participants were also asked to define who was involved in the implementation phase. The purpose of this question was to find who might influence the adoption decision by providing feedback about the operation and maintenance of the technology, which helps to identify the individuals assisting in developing the decision makers’ network.
As noted above, the previous literature mainly focused on user behaviour in technology acceptance from a psychological perspective, in terms of usefulness and ease of use, with top managers supporting end users through training and rewards. However, during the interviews, it emerged that engineers, operating crews and fitters were also consulted during the purchase decision-making process. Results of coding the transcriptions and the questions about the individuals who were involved in the implementation phase are presented in Table 2.

5. Discussion and Future Directions

This section is divided into two main parts. First, it discusses the findings of this exploratory study; and second, and more importantly, it provides a framework for future studies. To do so, Figure 13 illustrates the overall CTAP, based on the analysis method utilized, and refers to the objectives of CTAP studies. The major elements of CTAP are shown in Figure 13 including dissemination, investigation, adoption and implementation and are reviewed as follows.

5.1. Technology ‘Dissemination’ Strategy Patterns and Vendors’ Classification

This section discusses the distinction between ‘diffusion’, previously studied in the literature, and ‘dissemination’ in the construction technology market, the latter referring to the proactive process of vendors’ activities for promoting their technology. The study suggests that there are clusters of vendor strategy patterns that different vendors use to a greater or lesser extent, forming a spectrum of vendor business behaviours. Future studies should examine to what extent the physical appearance of the product and of vendors’ dissemination booths can contribute to the adoption process, in addition to how technology demonstrations can attract more customers and contribute to the adoption process. The interviews showed that vendors were actively involved in the adoption process and must be considered part of it for a more accurate prediction of technology adoption, something that has been ignored in the construction literature.

5.2. The ‘Investigation’ Process

The interviews indicated that the first phase, namely the pre-adoption process, relates to a company’s readiness to use a new technology. This phase covers activities up to the point where the customer is in a position to shortlist the technology options, and it is linked to needs and objectives. It comprises: (a) the identification of possible solutions; (b) knowledge acquisition; and (c) the comparison of possible solutions. These three main elements should be mirrored by vendor activities, which respond to potential adopters by offering: (a) solutions; (b) knowledge; and (c) inducements. Further analysis may reveal different applications of the CTAP, showing how customers start the process from different pathways comprising various mechanisms related to the identification of need.

5.3. The ‘Adoption Decision’ Process

The interviews show that the decision phase can be the most complex part of the process, culminating in the point where the customer makes the decision to purchase a new technology. This phase of the CTAP may consist of: (a) analysis; (b) substantiation; and (c) the final decision. These three stages should be mirrored by corresponding vendor activities, which respond to potential buyers by offering: (a) specific information about their technology; (b) trial demonstrations or access to referees; and (c) contracts of sale.

5.4. The ‘Implementation’ Process

The interviewees also indicated that the third phase of the CTAP covers the period from delivery of the technology up to the point where the technology either becomes part of the customer’s normal operations or is rejected. Following a thematic analysis procedure similar to that employed for the previous phase, the findings suggest that the implementation process consists of: (a) commencement of operation; (b) maintenance setup; and (c) assessment. These three stages should be mirrored by vendor activities, which respond to potential adopters by offering: (a) delivery and training; (b) repair support; and (c) feedback mechanisms. Further analysis of the interview data should distinguish four pathways that a customer may follow depending on the assessment outcome [10].

5.5. The ‘CTAP Cube’

The combination of this study and previous findings resulted in developing the CTAP cube representing a systematic adoption process. The CTAP cube was assembled by combining the three main phases and all relevant factors and individuals involved in the process as shown in Figure 14. Applications of the CTAP for different technology types should be investigated.
The CTAP cube shows the associated factors and individuals, as well as the groups of decision makers. The relative importance of the assessment criteria factors and of the personnel involved in the adoption process should also be specifically evaluated. The findings go beyond previous studies, which primarily considered one type of decision maker, by exploring key themes such as ‘pioneers vs. followers’, ‘corporations vs. family businesses’ and ‘Australian vs. American buyers’.
The figure shows that the next step is to identify factors contributing to the adoption process and the people involved in it. This study identified the level of importance of each person.
This paper refers to the multi-stage framework assembled by combining the three phases explored in the previous papers, as shown in Figure 14. In addition, applications of the CTAP for accurately predicting the adoption process for each technology type were discussed previously.

5.5.1. Technology Exhibitions and Cluster Analysis

The first face of the cube refers to technology exhibition analysis using cluster analysis. Technology exhibitions have been overlooked as a subject of investigation in construction, although this multibillion-dollar business has received intensive attention in the literature outside construction (e.g., [112,113,114,115,116,117,118,119,120,121,122,123,124,125,126]). Technology adoption studies can take advantage of technology exhibitions through immersion [127] in order to test the hypothesis that vendors proactively use a variety of dissemination strategies to support customers in the technology adoption process in the construction industry. The strength of the immersion is reinforced by the inherent ability to use the more flexible contextual data from technology exhibitions to lead an exploration of dissemination strategies. Such flexible first-hand data, covering vendors in various businesses who are disseminating a wide range of technologies, cannot readily be collected from other sources in the construction industry.
A large number of samples from technology exhibitions enables the investigator to apply quantitative analysis techniques: both hard (e.g., hierarchical and k-means) and fuzzy (c-means) clustering techniques were used (see [128,129,130,131]). These techniques are known to be powerful tools appropriate for examining the hypothesis that dissemination strategies form recognizable patterns, by classifying vendors and customers based on their strategies: the results of each individual analysis method cross-validated the others, and the fuzzy solutions were verified using the Partition Coefficient (PC) and Classification Entropy (CE) validity tests.
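A minimal sketch of hard clustering plus the two fuzzy validity indices follows, assuming a synthetic vendor-strategy matrix; it illustrates the named techniques in generic form and is not the study’s actual code or data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.random((60, 6))  # synthetic stand-in for vendor strategy scores

# Hard clustering: group vendors by dissemination-strategy profile.
km = KMeans(n_clusters=3, n_init=10, random_state=1).fit(X)

# Compact fuzzy c-means loop yielding a membership matrix U (N x c).
def fuzzy_cmeans(X, c=3, m=2.0, iters=100):
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        centers = (U**m).T @ X / (U**m).sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d**(2 / (m - 1)) * (1.0 / d**(2 / (m - 1))).sum(axis=1, keepdims=True))
    return U

U = fuzzy_cmeans(X)
pc = (U**2).sum() / len(X)                    # Partition Coefficient: 1 = crisp
ce = -(U * np.log(U + 1e-12)).sum() / len(X)  # Classification Entropy: 0 = crisp
print("hard cluster sizes:", np.bincount(km.labels_), "PC:", round(pc, 3), "CE:", round(ce, 3))
```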

5.5.2. Technology Exhibitions and Factor Analysis

The second face of the cube refers to technology exhibition analysis using principal component analysis (see [132,133,134,135]). This technique is the most appropriate for testing the hypothesis of distinct stages in the adoption process, because it can uncover hidden patterns in vendors’ activities within the samples.
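A hedged sketch of how principal component analysis could surface such hidden activity patterns, using a synthetic vendor-by-activity matrix; the data and the activity framing are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic exhibition data: rows = vendor booths, columns = observed
# dissemination activities (e.g., live demo, brochures, trial offers).
rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(40, 8)).astype(float)

X_std = StandardScaler().fit_transform(X)  # centre and scale the activity scores
pca = PCA(n_components=3).fit(X_std)

print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("component 1 loadings:", pca.components_[0].round(2))
# Activities loading strongly on one component would suggest a latent
# vendor-support pattern that can be matched to a hypothesized adoption stage.
```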

5.5.3. Semi-Structured Interviews and Thematic Analysis

The third face of the cube refers to semi-structured interview analysis. The flexibility of the analysis allowed the interviewer to narrow down the questions when participants were senior, with specific experience of the adoption process [136], particularly when it concerned unique technologies such as tunnel boring machines. The researcher asked additional questions to fill gaps in the collected data. This flexibility was also valuable in allowing questions about new related themes that came up in the interviews and had not been proposed in the framework, such as the ‘maintenance setup’ and ‘pioneer’ themes. The open-ended questions allowed the practitioners to describe the process using their own words and special terms (e.g., ‘crane roadability’, ‘dry and wet test’) that were labelled later in the data analysis as in vivo codes [107]. This enriched the findings by keeping them close to participants’ own words, because participants “captured a key element of a phenomenon” that was being described [109].
Much research in the innovation adoption field (e.g., information systems, new materials and administrative technologies) has used questionnaire surveys (e.g., [36,137,138,139]), which all suffer from the disadvantage that they do not allow the investigator to explore new activities in such a context-specific topic, or the cause-and-effect relationships between them. Semi-structured interviewing has recently been used in construction as a flexible tool to collect rich data (e.g., [33,99,140,141]). However, small samples were often used, preventing the exploration of any patterns of similarities or differences and thus suffering from observational bias in the sample [78].
A specific procedure of data analysis based on coding and thematic analysis was used because this allowed recognition of commonalities and discernible themes to constitute the framework [66,102,103]. The procedure enabled the researcher to break down the data to identify constructs first, and then, the framework was structured (bottom-up) rather than considering the whole framework and using data to prove it. The procedure was separately applied to both customer and vendor data (i.e., transcriptions), and corresponding themes were cross-validated with findings of the analysis. The thematic map [142] exercise revealed the richness of the data’s structure and its underlying patterns.

5.5.4. Structured Interviews and AHP Analysis

The fourth face of the cube refers to the Analytic Hierarchy Process (AHP) interview analysis. AHP is a powerful multi-criteria analysis technique that assists in evaluating the factors and people involved in adoption decisions. Using this reliable technique [143,144,145], the contributing factors to the decision were ranked in order to evaluate the importance of the factors and of the individuals who were involved in the adoption process.
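The following is a minimal AHP sketch, assuming a hypothetical 3×3 pairwise-comparison matrix over invented decision factors; the judgement values and factor names are illustrative only, and the study’s actual analysis was performed in Expert Choice.

```python
import numpy as np

# Saaty's random consistency index for matrix sizes n = 1..10.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_weights(A):
    """Priority weights (principal eigenvector) and consistency ratio."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)           # index of the principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                          # normalized priority vector
    ci = (eigvals[k].real - n) / (n - 1)  # consistency index
    cr = ci / RI[n] if RI[n] else 0.0     # CR < 0.10 is conventionally acceptable
    return w, cr

# Hypothetical judgements: price vs. vendor support vs. productivity gain.
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```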

5.5.5. Cube Development

Overall, the combination of the four above-mentioned faces gave significantly greater originality to the CTAP cube analysis, since it provided extremely valuable data in which to ground the theory. For example, the technology exhibition visits gave the opportunity to identify experienced participants from both the customer and vendor sides by applying a combination of sampling strategies (e.g., chain sampling; refer to Figure 3). Identifying such eligible people, who had recently been involved in the adoption process, would not have been possible using other strategies. This purposeful sampling increased the transferability of the research findings, and the large sample of data from different sources drastically increased the external validity [89].
The concept of the CTAP cube enabled the researchers to apply both data and methodological triangulation techniques [146,147,148] in order to establish the validity of the results.

5.6. Triangulation Techniques and Validation

The triangulation techniques, which included a variety of data types and participants, prevented observational bias [78]. Utilizing both prolonged engagement and persistent observation in TEs enabled the researcher to be confident in the accurate interpretation of the meaning of data both from TEs and interviews, which increased the credibility of the results [54,78,79,80,81,82].
Achieving the benefits discussed above from the CTAP cube gave greater originality to the study through a higher level of reliability and validity than any single method could achieve. According to Abowitz and Toole [86], using mixed methods increases the reliability and validity of the data and provides greater confidence in tests of the hypotheses compared to singular methods. However, the method of prolonged engagement with the technology market utilized for this study was costly and took four years of persistent investigation. It involved travel to five cities in two countries to collect data, transcribing voice records, editing transcriptions and coding thousands of words to identify themes. This was coupled with the study of many exhibition papers, reports, catalogues and photos, and entering data into four software programs: MATLAB, SPSS, NVivo and Expert Choice. That is why utilizing such methods to obtain substantial datasets is neither common nor feasible in construction [86], and researchers instead collect “convenience” data [86] based on the availability of participants [91]. For example, Rahman [138] published the results of a survey on the barriers to an innovation adoption (i.e., modern methods of construction) in a credible construction journal without achieving a reasonable response rate. Seventy-five percent of the participants in that sample had never experienced any type of modern method in construction, and the participants’ average reported experience was 5.67 years, showing that most of them were young. These participants could not disclose the actual barriers, as they had not experienced them, whereas the researcher could have employed a mixed-method approach and obtained other sources of data to triangulate the “true” result [86].
As a final validation of the findings, the developed CTAP was discussed with industry experts by showing them the framework. In the remaining interviews, the participants’ views about the developed CTAP were sought. The industry experts’ responses were also used for validation purposes and added more information about the implementation process. This technique of validation has already been used in different forms in construction [149,150,151]. For example, Lucko and Rojas [149] suggest that interviews with industry experts allow richer feedback, as the researcher can clarify and extend individual factors ad hoc in a semi-structured manner [152].
Each face of the cube represents an achievement of the study, including the vendors’ dissemination strategy patterns, the vendor classification and the development of the dissemination framework. The objectives achieved are shown in Table 3.
The main method for testing the staging-process hypothesis was a thematic analysis that systematically explored each possible stage within each phase. The exploratory analysis revealed details of customer-vendor activities, relationships and interactions that resulted in modifications to the hypothesized stages in terms of naming (to appropriately reflect current practice), demarcating the boundaries of each stage (to represent distinctions) and identifying customer and vendor activities separately. The CTAP cube enabled the researchers to cross-validate the findings by:
  • Using different datasets from both sides involved in the process (i.e., customers and vendors); different types of companies (e.g., family businesses and corporations); diverse types of businesses (e.g., pumping and earthmoving); and different regions (e.g., Australia and North America; see Figure 12);
  • Using different analysis methods (clustering, factor and thematic analysis); different data types (e.g., photos, voice records and structured forms); and different sources (e.g., inside or outside the TEs). A minimal illustration of the cross-tabulation underlying such comparisons follows this list.
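As an illustration of the kind of cross-tabulation that underlies a matrix coding query (cf. Figure 12), the following minimal sketch assumes a hypothetical export of coded references; the region, side and theme values are placeholders, not the study's data.

```python
import pandas as pd

# Hypothetical NVivo-style export: one row per coded reference, recording
# the source group and the theme (node) it was coded to. Values are
# illustrative placeholders only.
refs = pd.DataFrame({
    "region": ["Australia", "Australia", "North America", "North America",
               "Australia", "North America", "Australia", "North America"],
    "side":   ["customer", "vendor", "customer", "vendor",
               "customer", "customer", "vendor", "vendor"],
    "theme":  ["need recognition", "demonstration", "need recognition",
               "demonstration", "negotiation", "negotiation",
               "need recognition", "demonstration"],
})

# Matrix-coding-style query: theme frequencies by region, used to check
# whether the same stages emerge in both regional datasets.
print(pd.crosstab(refs["theme"], refs["region"]))

# The same cross-tabulation by customer/vendor side supports the
# data-triangulation comparison between the two samples.
print(pd.crosstab(refs["theme"], refs["side"]))
```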

5.7. Statement of CTAP Cube Novelty

The original contributions of the CTAP cube analysis lie in its combination of data collection methods and its analysis of extensive empirical data to establish a scientifically sound understanding of technology dissemination and adoption processes. The data cover a wide range of construction technologies rather than focusing on a single-technology case study. The study differs from previous studies, which focused only on the customer, in that it considers both customer and vendor activities and their interactions. In addition, the findings apply to organizations (i.e., technology adoption) as the potential adopter rather than to individuals (i.e., technology acceptance).
The study is the first attempt to comprehensively investigate how vendors follow specific patterns in disseminating construction technologies, and it addresses the gap in knowledge about how classifying vendors by their dissemination strategies potentially shapes the adoption process. The major contribution of the study is the creation of the methodological cube for investigating the Construction Technology Adoption Process (CTAP). This goes beyond previous studies, which focused on an individual customer’s intention to use a specific information technology at a single stage, by developing a systematic framework (a minimal illustrative sketch follows this list) where:
  • the adoption process consists of separate stages;
  • each stage comprises unique activities;
  • the process steps of the decision makers (customers) are paralleled by clearly identifiable steps taken by vendors;
  • the characteristics of the technology (e.g., a large crane) and the need of the customer (e.g., starting a new project) result in discernible pathways within the adoption process;
  • the applications of the CTAP cube reveal discernible sub-patterns within the process, which would be applicable to different technology types;
  • the study also identified factors contributing to the adoption decision process, which could be mapped to the different phases of the adoption process in future work; and
  • the study identified the individuals (or, more specifically, their roles) involved in different phases of the process.
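To make the shape of this framework concrete, the following minimal structural sketch expresses the staged, two-sided nature of the process in code. The three phase names follow the paper (investigation, adoption decision, implementation), while the stage and activity labels are illustrative placeholders rather than the study's findings.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Stage:
    """One stage of a CTAP phase, pairing customer and vendor activities."""
    name: str
    customer_activities: List[str] = field(default_factory=list)
    vendor_activities: List[str] = field(default_factory=list)

@dataclass
class Phase:
    """A CTAP phase composed of separate, ordered stages."""
    name: str
    stages: List[Stage] = field(default_factory=list)

# Phase names follow the paper; stage/activity labels are illustrative only.
ctap = [
    Phase("investigation", [
        Stage("need recognition",
              customer_activities=["define project need"],
              vendor_activities=["advertise capabilities"]),
    ]),
    Phase("adoption decision", [
        Stage("evaluation",
              customer_activities=["compare alternatives"],
              vendor_activities=["demonstrate technology"]),
    ]),
    Phase("implementation", [
        Stage("deployment",
              customer_activities=["train crew"],
              vendor_activities=["provide after-sales support"]),
    ]),
]

# Every customer-side stage has a parallel, identifiable vendor-side step.
for phase in ctap:
    for stage in phase.stages:
        print(phase.name, "->", stage.name, "|", stage.vendor_activities)
```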
Another contribution resides in the exploration of different patterns of decision makers and how these affect the details of their decision making. A previously unreported disparity between the behaviour of family businesses and that of corporate contractors in adopting new construction technology was also identified; this was traced to the different interaction networks that arise from the different organizational structures. The new comparison between Australian and American customers and vendors (see Figure 12) shows that the adoption process is similar across these regions.
As with any empirical research, there are limits to how representative the samples collected from the two developed regions (Australia and North America) are of the industry worldwide, including developing regions, or even of the entire industry within Australia, Canada and the U.S. Each developing country has notable differences in its use of construction technologies. For example, the interviews show that Latin American customers are less feature oriented than North American customers. The interviews also show that in some regions of African and Middle Eastern countries, customers prefer to purchase second-hand equipment from well-known manufacturers rather than new technologies, obtaining reliable equipment without paying high capital costs. Future studies should investigate different regions to find the most important factors in each, to help vendors align their dissemination strategies with local customer preferences.
The data cover a wide range of construction technologies, but the full range of construction technologies is wider still, and other patterns may be found in larger samples from different regions. Therefore, further investigation is needed to generalize the findings of this study across different countries and to explore new patterns of decision-maker characteristics.
The scope of the study was limited to the construction industry. Future research could examine whether the framework is applicable to transportation and mining, given the similarity of many of the technologies used. The study covered technological innovation (e.g., tools and heavy equipment); further research could examine the application of the framework to other types of innovation in construction, such as materials and temporary works.

6. Conclusions

The purpose of this paper was to develop a deep understanding of possible methodologies for investigating digital technology adoption: how customers decide to adopt a technology and how vendors support them in this decision process. The paper presents how the hypotheses were systematically tested, namely that the industry follows specific decision processes linked to a pre-adoption (investigation) process and a post-adoption (implementation) process. It showed that literature on the decision process within construction technology adoption is scarce. For example, existing studies in information technology claim that the adoption decision involves only one stage, occurring after “persuasion”; a detailed explanation of how the decision is made and what happens within this single stage has not been provided.
Significant and clear gaps in the understanding of construction technology adoption at the organizational level were identified, particularly regarding vendor involvement in the process. This led to the hypothesis of a conceptual framework delineating the construction technology adoption process, which formed the basis for the remainder of the study. In order to validate the framework, three main strategies were chosen: technology exhibition visits, which immersed the investigator in the customer-vendor market community over a period of four years; semi-structured interviews, which collected details of the current practices of the decision-making process; and an AHP survey, which evaluated the contributing factors. Using these strategies, substantial first-hand data were collected at technology exhibitions and through semi-structured face-to-face interviews across Australia and North America, involving both customers and vendors so that the results could be cross-validated.
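Because the AHP survey weights contributing factors through pairwise comparisons, the following minimal sketch illustrates the standard principal-eigenvector method of AHP [142] on a hypothetical 3 × 3 comparison matrix. The factor names, judgment values and resulting weights are illustrative assumptions, not the survey's results.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three factors
# (e.g., cost, reliability, vendor support); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector gives the priority weights (Saaty's method).
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
cr = ci / ri

print("weights:", np.round(weights, 3))    # ~[0.648, 0.230, 0.122] here
print("consistency ratio:", round(cr, 3))  # CR < 0.10 is conventionally acceptable
```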
An important message of this study is that construction technology adoption largely follows a systematic, sequential process involving two sides, customers and vendors, and their interactions. The proposed CTAP, including its stages and the customer-vendor interactions in each stage, which constituted the major original hypotheses, was confirmed.
The original contributions of the findings of this paper lie in the careful design, collection and analysis of two different samples, from customers and from vendors, to establish a scientifically sound understanding of the stages of adopting a new technology. For example, testing the prepared hypotheses led to four key observations: each adoption phase consists of three stages; each stage comprises unique activities and tasks towards technology adoption; the process stages of the decision makers (customers) are paralleled by clearly identifiable stages taken by vendors; and multiple individuals in the organization are commonly consulted.

Author Contributions

Samad Sepasgozar designed and conducted the research, and Steven Davis contributed to the interpretation of the data.

Funding

This research received no external funding.

Acknowledgments

There are no grants associated with this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Damanpour, F. Organizational Innovation: A Meta-Analysis of Effects of Determinants and Moderators. Acad. Manag. J. 1991, 34, 555–590. [Google Scholar]
  2. Lee, E.-J.; Lee, J.; Schumann, D.W. The Influence of Communication Source and Mode on Consumer Adoption of Technological Innovations. J. Consum. Aff. 2002, 36, 1–27. [Google Scholar] [CrossRef]
  3. Porter, M. Competitive Advantage: Creating and Sustaining Superior Performance; Simon and Schuster: New York, NY, USA, 2008. [Google Scholar]
  4. Skibniewski, M.J. Information Technology Applications in Construction Safety Assurance. J. Civ. Eng. Manag. 2014, 20, 778–794. [Google Scholar] [CrossRef]
  5. Heller, A.; Orthmann, C. Wireless Technologies for the Construction Sector—Requirements, Energy and Cost Efficiencies. Energy Build. 2014, 73, 212–216. [Google Scholar] [CrossRef]
  6. Goodrum, P.M.; Haas, C.T. Long-Term Impact of Equipment Technology on Labor Productivity in the U.S. Construction Industry at the Activity Level. J. Constr. Eng. Manag. 2004, 130, 124–133. [Google Scholar] [CrossRef]
  7. Hong, Y.; Sepasgozar, S.M.E.; Ahmadian, A.F.F.; Akbarnezhad, A. Factors influencing BIM adoption in small and medium sized construction organizations. In Proceedings of the 33rd International Symposium on Automation and Robotics in Construction, Auburn, AL, USA, 18–21 July 2016. [Google Scholar]
  8. Sepasgozar, S.M.; Forsythe, P. Lifting and Handling Equipment: From Selection to Adoption Process. In Proceedings of the 40th Australasian Universities Building Education Association (AUBEA) Conference, Cairns, Australia, 6–8 July 2016. [Google Scholar]
  9. Ying, H.; Sepasgozar, S.M.; Akbar, N. Key Factors Affecting Construction Organizations’ Acceptance of BIM: A Comparative Study. In Proceedings of the 2016 Modular and Offsite Construction [MOC] Summit, Edmonton, AB, Canada, 29 September–1 October 2016. [Google Scholar]
  10. Sepasgozar, S.M.; Davis, S.R.; Li, H.; Luo, X. Modeling the Implementation Process for New Construction Technologies: Thematic Analysis Based on Australian and US Practices. J. Manag. Eng. 2018, 34, 05018005. [Google Scholar] [CrossRef]
  11. Haas, C.T.; Borcherding, J.D.; Allmon, E.; Goodrum, P.M. U.S. Construction Labor Productivity Trends, 1970–1998. J. Constr. Eng. Manag. 1999, 126. [Google Scholar] [CrossRef]
  12. Miettinen, R.; Paavola, S. Beyond the BIM utopia: Approaches to the development and implementation of building information modeling. Autom. Constr. 2014, 43, 84–91. [Google Scholar] [CrossRef]
  13. Skibniewski, M.J. Research Trends in Information Technology Applications in Construction Safety Engineering and Management. Front. Eng. Manag. 2015, 1, 246–259. [Google Scholar] [CrossRef]
  14. Shirowzhan, S.; Sepasgozar, S.M.E.; Zaini, I.; Wang, C. An integrated GIS and Wi-Fi based Locating system for improving construction labor communications. In Proceedings of the 34th International Symposium on Automation and Robotics in Construction (ISARC 2017), Taipei, Taiwan, 28 June–1 July 2017. [Google Scholar]
  15. Didehvar, N.; Teymourifard, M.; Mojtahedi, M.; Sepasgozar, S. An Investigation on Virtual Information Modeling Acceptance Based on Project Management Knowledge Areas. Preprints 2018, 2018050024. [Google Scholar] [CrossRef]
  16. Manley, K.; McFallan, S.; Kajewski, S. Relationship between Construction Firm Strategies and Innovation Outcomes. J. Constr. Eng. Manag. 2009, 135, 764–771. [Google Scholar] [CrossRef] [Green Version]
  17. Sexton, M.; Barrett, P. Appropriate innovation in small construction firms. Constr. Manag. Econ. 2003, 21, 623–633. [Google Scholar] [CrossRef]
  18. Goodrum, P.M.; Haas, C.T.; Caldas, C.; Zhai, D.; Yeiser, J.; Homm, D. Model to Predict the Impact of a Technology on Construction Productivity. J. Constr. Eng. Manag. 2010, 137, 678–688. [Google Scholar] [CrossRef]
  19. Sepasgozar, S.M.E.; Davis, S. Diffusion Pattern Recognition of Technology Vendors in Construction. In Proceedings of the Construction Research Congress, Atlanta, GA, USA, 19–21 May 2014; pp. 2106–2115. [Google Scholar]
  20. Statista. Largest Construction Machinery Manufacturers Worldwide—Sales 2013. 2014. Available online: http://www.statista.com/statistics/280343/leading-construction-machinery-manufacturers-worldwide-based-on-sales/ (accessed on 1 September 2014).
  21. Statista. Global Construction Machinery Market—Outlook through 2015. 2014. Available online: http://www.statista.com/statistics/280344/size-of-the-global-construction-machinery-market/ (accessed on 1 September 2014).
  22. Association of Equipment Manufacturers (AEM). U.S. Construction Machinery Exports Down 21 Percent at Midyear 2013; Association of Equipment Manufacturers (AEM): Milwaukee, WI, USA, 2013. [Google Scholar]
  23. Frambach, R.T.; Schillewaert, N. Organizational Innovation Adoption: A Multi-level Framework of Determinants and Opportunities for Future Research. J. Bus. Res. 2002, 55, 163–176. [Google Scholar] [CrossRef]
  24. Devapriya, K.A.K.; Ganesan, S. Technology transfer subcontracting in developing countries through. Build. Res. Inf. 2002, 30, 171–182. [Google Scholar] [CrossRef]
  25. Drejer, I.; Vinding, A.L. Organisation, anchoring of knowledge, and innovative activity in construction. Constr. Manag. Econ. 2006, 24, 921–931. [Google Scholar] [CrossRef]
  26. MacGauran, P.; Macfarlane, I. Construction 2020: A Vision for Australia’s Property and Construction Industry; CRC for Construction Innovation: Brisbane, Australia, 2004. [Google Scholar]
  27. Business Council of Australia (BCA). Pipeline or Dream—Securing Australia’s Investment Future; Business Council of Australia: Sydney, Australia, 2012. [Google Scholar]
  28. GCS. Government Construction Strategy; UK Cabinet Office: London, UK, 2011; p. 14.
  29. Brewer, G.; Gajendran, T.; Goff, R.L. Building Information Modelling (BIM): Australian Perspectives and Adoption Trends; Tasmanian Building and Construction Industry Training Board: Battery Point, Australia, 2012. [Google Scholar]
  30. Barlow, J. Innovation and Learning in Complex Offshore Construction Projects. Res. Policy 2000, 29, 973–989. [Google Scholar] [CrossRef]
  31. Dubois, A.; Gadde, L.-E. The Construction Industry as a Loosely Coupled System: Implications for Productivity and Innovation. Constr. Manag. Econ. 2002, 20, 621–631. [Google Scholar] [CrossRef]
  32. Straub, E.T. Understanding Technology Adoption: Theory and Future Directions for Informal Learning. Rev. Educ. Res. 2009, 79, 625–649. [Google Scholar] [CrossRef]
  33. Hinkka, V.; Tätilä, J. RFID Tracking Implementation Model for the Technical Trade and Construction Supply Chains. Autom. Constr. 2013, 35, 405–414. [Google Scholar] [CrossRef]
  34. Ozorhon, B.; Oral, K. Drivers of Innovation in Construction Projects. J. Constr. Eng. Manag. 2017, 143, 04016118. [Google Scholar] [CrossRef]
  35. Sardroud, J. Perceptions of Automated Data Collection Technology Use in the Construction Industry. J. Civ. Eng. Manag. 2014, 21, 54–66. [Google Scholar] [CrossRef]
  36. Lee, S.; Yu, J.; Jeong, D. BIM Acceptance Model in Construction Organizations. J. Manag. Eng. 2013, 31, 04014048. [Google Scholar] [CrossRef]
  37. Robertson, T.S.; Gatignon, H. Competitive Effects on Technology Diffusion. J. Mark. 1986, 50, 1–12. [Google Scholar] [CrossRef]
  38. Slaughter, E.S. Models of Construction Innovation. J. Constr. Eng. Manag. 1998, 124, 226–231. [Google Scholar] [CrossRef]
  39. Blayse, A.; Manley, K. Key Influences on Construction Innovation. Constr. Innov. 2004, 4, 143–154. [Google Scholar] [CrossRef]
  40. Damanpour, F.; Aravind, D. Managerial Innovation: Conceptions, Processes, and Antecedents. Manag. Organ. Rev. 2012, 8, 423–454. [Google Scholar] [CrossRef]
  41. Sepasgozar, S.; Lim, S.; Shirowzhan, S.; Kim, Y.; Nadoushani, Z.M. Utilisation of a New Terrestrial Scanner for Reconstruction of As-built Models: A Comparative Study. In Proceedings of the International Symposium on Automation and Robotics in Construction, Oulu, Finland, 15–18 June 2015. [Google Scholar]
  42. MacKenzie, D.; Wajcman, J. The Social Shaping of Technology; Open University Press: Buckingham, UK, 1999. [Google Scholar]
  43. Bijker, W.E. How Is Technology Made?—That Is the Question! Camb. J. Econ. 2010, 34, 63–76. [Google Scholar] [CrossRef]
  44. Arditi, D.; Kale, S.; Tangkar, M. Innovation in Construction Equipment and Its Flow into the Construction Industry. J. Constr. Eng. Manag. 1997, 123, 371–378. [Google Scholar] [CrossRef]
  45. Skibniewski, M.J.; Zavadskas, E.K. Technology Development in Construction: A Continuum From Distant Past into the Future. J. Civ. Eng. Manag. 2013, 19, 136–147. [Google Scholar] [CrossRef]
  46. Tatum, C.B. Technology and Competitive Advantage in Civil Engineering. J. Prof. Issues Eng. Educ. Pract. 1988, 114, 256–264. [Google Scholar] [CrossRef]
  47. Sepasgozar, S.M.E.; Loosemore, M.; Davis, S.R. Conceptualising information and equipment technology adoption in construction: A critical review of existing research. Eng. Constr. Architect. Manag. 2016, 23, 158–176. [Google Scholar] [CrossRef]
  48. Brandon, P.S.; Lu, S.-L. Clients Driving Innovation; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  49. Shirowzhan, S.; Sepasgozar, S.; Liu, C. Monitoring Physical Progress of Indoor Buildings Using Mobile and Terrestrial Point Clouds. In Proceedings of the Construction Research Congress, New Orleans, LA, USA, 2–4 April 2018. [Google Scholar]
  50. Hooper, B.; Haris, M. 2020 Vision; Royal Institution of Chartered Surveyors: London, UK, 2010; pp. 34–36. [Google Scholar]
  51. Shirowzhan, S.; Lim, S.; Trinder, J. Enhanced Autocorrelation-Based Algorithms for Filtering Airborne Lidar Data over Urban Areas. J. Surv. Eng. 2016, 142, 04015008. [Google Scholar] [CrossRef]
  52. Sepasgozar, S.M.E.; Forsythe, P.; Shirowzhan, S.; Norzahari, F. Scanners and Photography: A Combined Framework. In Proceedings of the 40th Annual Australasian Universities Building Education Association (AUBEA 2016) Conference, Cairns, Australia, 6–8 July 2016. [Google Scholar]
  53. Sepasgozar, S.; Loosemore, M. The role of customers and vendors in modern construction equipment technology diffusion. Eng. Constr. Archit. Manag. 2017, 24, 1–20. [Google Scholar] [CrossRef]
  54. Sepasgozar, S.M.; Bernold, L.E. Factors Influencing the Decision of Technology Adoption in Construction. In Proceedings of the Developing the Frontier of Sustainable Design, Engineering, and Construction—ICSDEC, Fort Worth, TX, USA, 7–9 November 2012. [Google Scholar]
  55. Rogers, E.M. Diffusion of Innovations, 4th ed.; Free Press: New York, NY, USA, 1995. [Google Scholar]
  56. Damanpour, F.; Schneider, M. Phases of the Adoption of Innovation in Organizations: Effects of Environment, Organization and Top Managers. Br. J. Manag. 2006, 17, 215–236. [Google Scholar] [CrossRef]
  57. Foroozanfar, M.; Sepasgozar Samad, M.E. Modeling Green Digital Technology Implementation in Construction. In Proceedings of the Construction Research Congress, New Orleans, LA, USA, 2–4 April 2018. [Google Scholar]
  58. Foroozanfar, M.; Sepasgozar, S.M.E.; Arbabi, H. An empirical investigation on construction companies’ readiness for adopting sustainable technology. In Proceedings of the 34th International Symposium on Automation and Robotics in Construction (ISARC 2017), Taipei, Taiwan, 28 June–1 July 2017. [Google Scholar]
  59. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef]
  60. Venkatesh, V.; Bala, H. Technology Acceptance Model 3 and a Research Agenda on Interventions. Decis. Sci. 2008, 39, 273–315. [Google Scholar] [CrossRef]
  61. Sepasgozar, S.; Shirowzhan, S.; Wang, C.C. A Scanner Technology Acceptance Model for Construction Projects. Procedia Eng. 2017, 180, 1237–1246. [Google Scholar] [CrossRef]
  62. Sepasgozar, S.; Bliemel, M.; Bemanian, M. Discussion of “Barriers of Implementing Modern Methods of Construction” by M. Motiar Rahman. J. Manag. Eng. 2015, 32, 07015001. [Google Scholar] [CrossRef]
  63. Ritchie, J.; Spencer, L. Qualitative Data Analysis for Applied Policy Research. In The Qualitative Researcher’s Companion; Routledge: London, UK, 2002; pp. 305–329. [Google Scholar]
  64. Cassell, C.; Symon, G. Essential Guide to Qualitative Methods in Organizational Research; Sage Publications: Thousand Oaks, CA, USA, 2004. [Google Scholar]
  65. Yin, R.K. Qualitative Research from Start to Finish; Guilford Press: New York, NY, USA, 2010. [Google Scholar]
  66. Gibbs, G.R. Qualitative Data Analysis: Explorations with NVivo; Open University Press: Buckingham, UK, 2002. [Google Scholar]
  67. Urquhart, C. Grounded Theory for Qualitative Research: A Practical Guide; Sage Publications: London, UK, 2012. [Google Scholar]
  68. Charmaz, K. Constructing Grounded Theory; Sage Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  69. Charmaz, K. Grounded Theory Methods in Social Justice Research. In The Sage Handbook of Qualitative Research; Denzin, N.K., Lincoln, Y.S., Eds.; Sage Publications: Thousand Oaks, CA, USA, 2011; pp. 359–380. [Google Scholar]
  70. Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; Transaction Publishers: Piscataway, NJ, USA, 2009. [Google Scholar]
  71. Belk, R.W. Handbook of Qualitative Research Methods in Marketing; Edward Elgar Publishing: Cheltenham, UK, 2008. [Google Scholar]
  72. Adriaanse, A.; Voordijk, H.; Dewulf, G. The Use of Interorganisational ICT in United States Construction Projects. Autom. Constr. 2010, 19, 73–83. [Google Scholar]
  73. Brockmann, C.; Brezinski, H.; Erbe, A. Innovation in Construction Megaprojects. J. Constr. Eng. Manag. 2016, 142, 04016059. [Google Scholar] [CrossRef]
  74. Leung, M.; Yu, J.; Chan, Y. Focus Group Study to Explore Critical Factors of Public Engagement Process for Mega Development Projects. J. Constr. Eng. Manag. 2014, 140, 04013061. [Google Scholar] [CrossRef]
  75. Singh, V. BIM and Systemic ICT Innovation in AEC: Perceived Needs and Actor’s Degrees of Freedom. Constr. Innov. 2014, 14, 292–306. [Google Scholar] [CrossRef]
  76. Onwuegbuzie, A.J.; Leech, N.L. Validity and Qualitative Research: An Oxymoron? Qual. Quant. 2007, 41, 233–249. [Google Scholar] [CrossRef]
  77. Shenton, A.K. Strategies for Ensuring Trustworthiness in Qualitative Research Projects. Educ. Inf. 2004, 22, 63–75. [Google Scholar] [CrossRef]
  78. Lincoln, Y.S. Emerging Criteria for Quality in Qualitative and Interpretive Research. Qual. Inq. 1995, 1, 275–289. [Google Scholar] [CrossRef]
  79. Morse, J.M.; Barrett, M.; Mayan, M.; Olson, K.; Spiers, J. Verification Strategies for Establishing Reliability and Validity in Qualitative Research. Int. J. Qual. Methods 2008, 1, 13–22. [Google Scholar] [CrossRef]
  80. Whittemore, R.; Chase, S.K.; Mandle, C.L. Validity in Qualitative Research. Qual. Health Res. 2001, 11, 522–537. [Google Scholar] [CrossRef] [PubMed]
  81. Irvine, A.; Drew, P.; Sainsbury, R. ‘Am I Not Answering Your Questions Properly?’ Clarification, Adequacy and Responsiveness in Semi-structured Telephone and Face-to-Face Interviews. Qual. Res. 2013, 13, 87–106. [Google Scholar] [CrossRef]
  82. Takhar-Lail, A.; Ghorbani, A. Innovative Research Methodology. In Market Research Methodologies: Multi-Method and Qualitative; Wolfe, K., Ed.; Business Science Reference: Hershey, PA, USA, 2015. [Google Scholar]
  83. Sepasgozar, S.M.E.; Bernold, L.E. A Technology pre-adoption model for construction. In Proceedings of the 37th Annual Conference of Australasian University Building Educators Association (AUBEA), Sydney, Australia, 4–6 July 2012. [Google Scholar]
  84. Abowitz, D.; Toole, T. Mixed Method Research: Fundamental Issues of Design, Validity, and Reliability in Construction Research. J. Constr. Eng. Manag. 2009, 136, 108–116. [Google Scholar] [CrossRef]
  85. Patton, M.Q. Qualitative Evaluation and Research Methods; Sage Publications: Thousand Oaks, CA, USA, 1990. [Google Scholar]
  86. Mason, J. Qualitative Researching; Sage Publications: London, UK, 2002. [Google Scholar]
  87. Teddlie, C.; Yu, F. Mixed Methods Sampling: A Typology With Examples. J. Mixed Methods Res. 2007, 1, 77–100. [Google Scholar] [CrossRef]
  88. Schultze, U.; Avital, M. Designing Interviews to Generate Rich Data for Information Systems Research. Inf. Organ. 2011, 21, 1–16. [Google Scholar] [CrossRef]
  89. Cheng, E.; Li, H. Construction Partnering Process and Associated Critical Success Factors: Quantitative Investigation. J. Manag. Eng. 2002, 18, 194–202. [Google Scholar] [CrossRef]
  90. Venkatesh, V.; Brown, S.A.; Maruping, L.M.; Bala, H. Predicting different conceptualizations of system USE: The competing roles of behavioral intention, facilitating conditions, and behavioral expectation. MIS Q. Manag. Inf. Syst. 2008, 32, 483–502. [Google Scholar] [CrossRef]
  91. Bryman, A. Social Research Methods; Oxford University Press: New York, NY, USA, 2012. [Google Scholar]
  92. Stewart, R.A.; Mohamed, S.; Marosszeky, M. An Empirical Investigation into the Link Between Information Technology Implementation Barriers and Coping Strategies in the Australian Construction Industry. In Construction Innovation; Sage Publications, Ltd.: Thousand Oaks, CA, USA, 2004; pp. 155–171. [Google Scholar]
  93. Alkalbani, S.; Rezgui, Y.; Vorakulpipat, C.; Wilson, I.E. ICT adoption and diffusion in the construction industry of a developing economy: The case of the sultanate of Oman. Archit. Eng. Des. Manag. 2013, 9, 62–75. [Google Scholar] [CrossRef]
  94. Agapiou, A. Perceptions of gender roles and attitudes toward work among male and female operatives in the Scottish construction industry. Constr. Manag. Econ. 2002, 20, 697–705. [Google Scholar] [CrossRef]
  95. Sarshar, M.; Isikdag, U. A survey of ICT use in the Turkish construction industry. Eng. Constr. Archit. Manag. 2004, 11, 238–247. [Google Scholar] [CrossRef]
  96. Bassioni, H.A.; Price, A.D.; Hassan, T.M. Building a conceptual framework for measuring business performance in construction: An empirical evaluation. Constr. Manag. Econ. 2005, 23, 495–507. [Google Scholar] [CrossRef]
  97. Redmond, A.; Hore, A.; Alshawi, M.; West, R. Exploring How Information Exchanges can be Enhanced through Cloud BIM. Autom. Constr. 2012, 24, 175–183. [Google Scholar] [CrossRef]
  98. Aziz, N.; Salleh, H. Case studies of the human critical success factors in information technology (IT) implementation in Malaysian construction industry. J. Build. Perform. 2013, 5. [Google Scholar]
  99. Samuelson, O.; Björk, B.-C. Adoption processes for EDM, EDI and BIM technologies in the construction industry. J. Civ. Eng. Manag. 2013, 19 (Suppl. 1), S172–S187. [Google Scholar] [CrossRef]
  100. Boyatzis, R.E. Transforming Qualitative Information: Thematic Analysis and Code Development; Sage Publications: Thousand Oaks, CA, USA, 1998. [Google Scholar]
  101. Roulston, K. Data analysis and ‘theorizing as ideology’. Qual. Res. 2001, 1, 279–302. [Google Scholar] [CrossRef]
  102. Dey, I. Qualitative Data Analysis: A User Friendly Guide for Social Scientists; Routledge: London, UK, 1993. [Google Scholar]
  103. Tuckett, A.G. Applying thematic analysis theory to practice: A researcher’s experience. Contemp. Nurse 2005, 19, 75–87. [Google Scholar] [CrossRef] [PubMed]
  104. Wicks, D. Coding: Axial Coding. Encyclopedia of Case Study Research; Sage Publications: Thousand Oaks, CA, USA, 2010; pp. 154–156. [Google Scholar]
  105. Charmaz, K. Constructing Grounded Theory: A Practical Guide through Qualitative Analysis; Pine Forge Press: London, UK, 2006. [Google Scholar]
  106. Auerbach, C.F.; Silverstein, L.B. Qualitative Data: An Introduction to Coding and Analysis; NYU Press: New York, NY, USA, 2003. [Google Scholar]
  107. Given, L.M. The Sage Encyclopedia of Qualitative Research Methods; Sage Publications: Thousand Oaks, CA, USA, 2008. [Google Scholar]
  108. Peansupap, V.; Walker, D. Factors Affecting ICT Diffusion: A Case Study of Three Large Australian Construction Contractors. Eng. Constr. Architect. Manag. 2005, 12, 21–37. [Google Scholar] [CrossRef]
  109. Huang, L.K. Top Management Support and IT Adoption in the Taiwanese Small and Medium Enterprises: A Strategic View. Int. J. Enterp. Netw. Manag. 2008, 2, 227–247. [Google Scholar] [CrossRef]
  110. Tanner, J.F., Jr.; Chonko, L.B. Trade show objectives, management, and staffing practices. Ind. Mark. Manag. 1995, 24, 257–264. [Google Scholar] [CrossRef]
  111. Smith, T.M.; Hama, K.; Smith, P.M. The Effect of Successful Trade Show Attendance on Future Show Interest: Exploring Japanese Attendee Perspectives of Domestic and Offshore International Events. J. Bus. Ind. Mark. 2003, 18, 403–418. [Google Scholar] [CrossRef]
  112. Hansen, K. Measuring Performance at Trade Shows: Scale Development and Validation. J. Bus. Res. 2004, 57, 1–13. [Google Scholar] [CrossRef]
  113. Golfetto, F.; Salle, R.; Borghini, S.; Rinallo, D. Opening the Network: Bridging the IMP Tradition and Other Research Perspectives. Ind. Mark. Manag. 2007, 36, 844–848. [Google Scholar] [CrossRef]
  114. Reychav, I. Antecedents to Acquisition of Knowledge in Trade Shows. Knowl. Process Manag. 2011, 18, 230–240. [Google Scholar] [CrossRef]
  115. Bettis-Outland, H.; Johnston, W.J.; Wilson, R.D. Using Trade Show Information to Enhance Company Success: An Empirical Investigation. J. Bus. Ind. Mark. 2012, 27, 384–391. [Google Scholar] [CrossRef]
  116. Lee, C.H.; Kim, S.Y. Differential Effects of Determinants on Multi-dimensions of Trade Show Performance: By Three Stages of Pre-show, At-show, and Post-show Activities. Ind. Mark. Manag. 2008, 37, 784–796. [Google Scholar] [CrossRef]
  117. Ling-yee, L. Marketing Resources and Performance of Exhibitor Firms in Trade Shows: A Contingent Resource Perspective. Ind. Mark. Manag. 2007, 36, 360–370. [Google Scholar] [CrossRef]
  118. Berne, C.; García-Uceda, M.E. Criteria involved in evaluation of trade shows to visit. Ind. Mark. Manag. 2008, 37, 565–579. [Google Scholar] [CrossRef]
  119. Yuksel, U.; Voola, R. Travel Trade Shows: Exploratory Study of Exhibitors’ Perceptions. J. Bus. Ind. Mark. 2010, 25, 293–300. [Google Scholar] [CrossRef]
  120. Ling-yee, L. Relationship Learning at Trade Shows: Its Antecedents and Consequences. Ind. Mark. Manag. 2006, 35, 166–177. [Google Scholar] [CrossRef]
  121. Tafesse, W. Understanding How Resource Deployment Strategies Influence Trade Show Organizers’ Performance Effectiveness. Eur. J. Mark. 2014, 48, 1009–1025. [Google Scholar] [CrossRef]
  122. Rinallo, D.; Borghini, S.; Golfetto, F. Exploring Visitor Experiences at Trade Shows. J. Bus. Ind. Mark. 2010, 25, 249–258. [Google Scholar] [CrossRef]
  123. Rinallo, D.; Golfetto, F. Exploring the Knowledge Strategies of Temporary Cluster Organizers: A Longitudinal Study of the EU Fabric Industry Trade Shows (1986–2006). Econ. Geogr. 2011, 87, 453–476. [Google Scholar] [CrossRef]
  124. Gottlieb, U.; Brown, M.; Ferrier, L. Consumer Perceptions of Trade Show Effectiveness: Scale Development and Validation within a B2C Context. Eur. J. Mark. 2014, 48, 89–107. [Google Scholar] [CrossRef]
  125. Schultze, U. Reflexive Ethnography in Information Systems Research. Qualitative Research in IS: Issues and Trends; Idea Group: Hershey, PA, USA, 2000; pp. 78–103. [Google Scholar]
  126. Tan, Y.; Shen, L.; Langston, C. Competition Environment, Strategy, and Performance in the Hong Kong Construction Industry. J. Constr. Eng. Manag. 2011, 138, 352–360. [Google Scholar] [CrossRef]
  127. Hambrick, D.C. Taxonomic Approaches to Studying Strategy: Some Conceptual and Methodological Issues. J. Manag. 1984, 10, 27–41. [Google Scholar] [CrossRef]
  128. Kaufman, L.; Rousseeuw, P.J. Finding Groups in Data: An Introduction to Cluster Analysis; John Wiley & Sons: Hoboken, NJ, USA, 2009; Volume 344. [Google Scholar]
  129. Bezdek, J.C. Pattern Recognition with Fuzzy Objective Function Algorithms; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1981. [Google Scholar]
  130. Gnanadesikan, R. Methods for Statistical Data Analysis of Multivariate Observations; Wiley-Interscience: Saint Nom, France, 1997; Volume 321. [Google Scholar]
  131. Park, H.S.; Dailey, R.; Lemus, D. The use of Exploratory Factor Analysis and Principal Components Analysis in Communication Research. Hum. Commun. Res. 2002, 28, 562–577. [Google Scholar] [CrossRef]
  132. Abdi, H.; Williams, L.J. Principal Component Analysis. Wiley Interdiscip. Rev. Comput. Stat. 2010, 2, 433–459. [Google Scholar] [CrossRef]
  133. Davies, R.; Harty, C. Measurement and Exploration of Individual Beliefs about the Consequences of Building Information Modelling Use. Constr. Manag. Econ. 2013, 31, 1110–1127. [Google Scholar] [CrossRef]
  134. Rabionet, S.E. How I Learned to Design and Conduct Semi-Structured Interviews: An Ongoing and Continuous Journey. Qual. Rep. 2011, 16, 563–566. [Google Scholar]
  135. Nikas, A.; Poulymenakou, A.; Kriaris, P. Investigating Antecedents and Drivers Affecting the Adoption of Collaboration Technologies in the Construction Industry. Autom. Constr. 2007, 16, 632–641. [Google Scholar] [CrossRef]
  136. Rahman, M. Barriers of Implementing Modern Methods of Construction. J. Manag. Eng. 2014, 30, 69–77. [Google Scholar] [CrossRef]
  137. Cao, D.; Li, H.; Wang, G. Impacts of Isomorphic Pressures on BIM Adoption in Construction Projects. J. Constr. Eng. Manag. 2014, 140. [Google Scholar] [CrossRef]
  138. Fulford, R.; Standing, C. Construction Industry Productivity and the Potential for Collaborative Practice. Int. J. Proj. Manag. 2014, 32, 315–326. [Google Scholar] [CrossRef]
  139. Samuelson, O.; Björk, B. Adoption Processes for EDM, EDI and BIM Technologies in the Construction Industry. J. Civ. Eng. Manag. 2013, 19 (Suppl. 1), S172–S187. [Google Scholar] [CrossRef]
  140. Olawale, Y.; Sun, M. PCIM: Project Control and Inhibiting-Factors Management Model. J. Manag. Eng. 2013, 29, 60–70. [Google Scholar] [CrossRef]
  141. Attride-Stirling, J. Thematic Networks: An Analytic Tool for Qualitative Research. Qual. Res. 2001, 1, 385–405. [Google Scholar] [CrossRef]
  142. Saaty, T.L. What is the Analytic Hierarchy Process? Springer: Berlin/Heidelberg, Germany, 1988. [Google Scholar]
  143. Lai, V.S.; Wong, B.K.; Cheung, W. Group Decision Making in a Multiple Criteria Environment: A Case Using the AHP in Software Selection. Eur. J. Oper. Res. 2002, 137, 134–144. [Google Scholar] [CrossRef]
  144. Hasnain, M.; Ullah, F.; Thaheem, M.J.; Sepasgozar, S.M.E. Prioritizing Best Value Contributing Factors for Contractor Selection: An AHP Approach. In Proceedings of the 21st International Symposium on Advancement of Construction Management and Real Estate; Chau, K.W., Chan, I.Y.S., Lu, W., Webster, C., Eds.; Springer: Singapore, 2018; pp. 1121–1131. [Google Scholar]
  145. Denzin, N.K. Interpretive Biography; Sage Publications: Thousand Oaks, CA, USA, 1989. [Google Scholar]
  146. Myers, M.D. Qualitative Research in Business and Management; Sage Publications: Thousand Oaks, CA, USA, 2013. [Google Scholar]
  147. Flick, U. An Introduction to Qualitative Research; Sage Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  148. Lucko, G.; Rojas, E.M. Research validation: Challenges and opportunities in the construction domain. J. Constr. Eng. Manag. 2009, 136, 127–135. [Google Scholar] [CrossRef]
  149. Ameyaw, E.E.; Chan, A.P. Risk allocation in public-private partnership water supply projects in Ghana. Constr. Manag. Econ. 2015, 33, 187–208. [Google Scholar] [CrossRef]
  150. Yang, Y.N.; Kumaraswamy, M.M.; Pam, H.J.; Mahesh, G. Integrated qualitative and quantitative methodology to assess validity and credibility of models for bridge maintenance management system development. J. Manag. Eng. 2010, 27, 149–158. [Google Scholar] [CrossRef]
  151. Sepasgozar, S.M.; Davis, S.R. Decision Analysis and Negotiations for Technology Adoption Decision: An Exploratory Study. In Proceedings of the 29th Australian and New Zealand Academy of Management (ANZAM), Queenstown, New Zealand, 2–4 December 2015. [Google Scholar]
  152. Rogers, E.M. Diffusion of Innovations; Free Press: New York, NY, USA, 2010. [Google Scholar]
Figure 1. The structure of the study on the technology adoption framework development including three main phases of ‘investigation’, ‘adoption’ and ‘implementation’.
Figure 2. Schematic of the techniques employed to achieve the research objectives. Note: The full illustration of the framework (staging process) is on the front face of the cube. Pre-adoption, adoption and post-adoption refer to the investigation, adoption decision and implementation phases, respectively.
Figure 3. The illustration of the criterion-chain sampling from the crane industry to identify the experienced specialists familiar with the crane technology adoption process in Sydney, Melbourne, Nevada and North Dakota.
Figure 4. Interview samples: (a) The structure of two types of interviews for customers and vendors. CS refers to Customer Side, and VS refers to Vendor Side. (b) A diagram collected during a semi-structured interview, where drawing was the best way of conveying the information: the project manager of a residential tower in Sydney commented on the stages and sketched the equipment used in their project. (c) An alternative view of the construction technology adoption process: the project manager of an educational building in Sydney, Australia, sketched the sequential process of adopting a technology in their project.
Figure 5. Example of completed questions showing the structured questions to identify key factors influencing the technology adoption process and a general manager’s comments on the semi-structured questions.
Figure 6. (a) The plant manager of an infrastructure company in Sydney compared the hypothetical Construction Technology Adoption Process (CTAP) framework with the company’s own expenditure procedures for purchasing a truck mixer and a tunnel boring machine. (b) An alternative view of the construction technology adoption process provided by a tunnel boring machine vendor: a sample process sketched by the vice sales manager of a leading company in Melbourne, Australia, visually explaining the process of purchasing a tunnel boring machine from the identification of solutions to delivery.
Figure 7. Creating the dataset for technology adoption phases including the participants’ responses.
Figure 8. Systematic analysis flowchart to explore the CTAP framework.
Figure 9. A sample of coding analysis on the transcriptions in the dataset created with NVivo.
Figure 10. Fishbone diagram technique implemented to categorise factors from the literature review related to information system acceptance.
Figure 11. (a) A sample sketch of an organizational chart identifying who was involved in the technology adoption process in the participant’s company. This is a typical hierarchy indicating the parties involved in the technology adoption process. (b) The participants identified the people involved in the adoption decision, which formed the basis of the AHP questionnaire (GX and GY refer to two different technology groups). Note: This structured diagram was developed from the responses. It is a sample of a comparison-based analysis enabling the researcher to compare the extent to which managers influence the decision for each type of technology.
Figure 12. Applying the ‘matrix coding query’ to the stored data, including the nodes created in NVivo, to compare the Australian and North American contexts.
Figure 13. A summary of the research methods, findings and achieved objectives.
Figure 14. Schematic illustration of the methodological cube, showing the relationship between objectives and contributions. The data were entered into four software programs: MATLAB, SPSS, NVivo and Expert Choice.
Table 1. Individuals involved in the adoption decision phase.
Node | New Role | Confirmed Role
Top manager
Middle manager (project or construction manager)
Production manager (e.g., fleet manager) and operators
Financial manager
Engineer (mechanical, electrical)
Project manager advisor
Table 2. Individuals involved in the implementation phase.
Node | New Individual | Confirmed Individual
Top manager
Middle manager (project or construction manager)
Production manager (and/or fleet manager)
Fitter
Engineer (electrical and/or mechanic)
Crew (and/or operator)
Table 3. A summary of the accomplished objectives.
Description | Limitations and/or Future Direction
To investigate the vendors’ activities to expose their technology | To profile them based on their dissemination strategies in different contexts
To explore influential factors and barriers affecting the process | To examine the identified factors in different contexts
To identify individuals involved in the process | To investigate the interaction and relationships between the activities of customers and vendors
To formulate the understanding of these activities into a comprehensive framework | To investigate the process used by customers of new technologies as they move from recognizing a need to actually using a new technology
