Review

Analytics Maturity Models: An Overview

1 Department of Land Management and Landscape Architecture, Faculty of Environmental Engineering and Land Surveying, University of Agriculture in Kraków, Balicka 253c, 30-149 Kraków, Poland
2 Institute of Economics and Informatics, Faculty of Organization and Management, Silesian University of Technology in Gliwice, Akademicka 2A, 44-100 Gliwice, Poland
* Author to whom correspondence should be addressed.
Information 2020, 11(3), 142; https://doi.org/10.3390/info11030142
Submission received: 17 December 2019 / Revised: 26 February 2020 / Accepted: 28 February 2020 / Published: 2 March 2020
(This article belongs to the Section Review)

Abstract

This paper aims to review, characterize, and comparatively analyze selected models of organizations’ analytics maturity. Eleven different analytics maturity models (AMMs) were characterized. The models’ characteristics were developed based on a review of the academic literature as well as reports and publications shared by analytics sector operators. Most of the analyzed models comprised five analytics maturity levels. Comprehensive descriptions of an organization’s analytics maturity levels were available for all models. However, a detailed description of the assessment process and of the criteria for placing an organization at a specific analytics development level was not available in all cases. Selected analytics maturity models were described in such detail that they can be applied in an independent assessment of an organization’s analytics maturity. In the future, an increase is expected both in the number and availability of new analytics maturity models, in particular those personalized and dedicated to a specific sector or business, and in the number of entities involved in assessing organizations’ analytics maturity and implementing data analytics in organizations. The article presents and summarizes selected features of eleven different analytics maturity models. This is the first such extensive review of these models.

1. Introduction

Analytics is everywhere, from consumer gadgets and intelligent things to rapidly expanding applications. In a digital era fueled by data and automation, analytics has evolved from an afterthought into a necessity [1]. Some companies have built their very businesses on their ability to collect, analyze, and act on data. Every company can learn from what these firms do [2]. To survive and thrive, intelligence needs to be pervasively integrated into the entire customer journey, products, operations, and services [1]. Many functional areas within organizations increasingly look to data and analytics as sources of knowledge and influence [3].
Data analytics systems are an important strategic investment for many organizations, and can potentially contribute significantly to firm performance [4]. Advanced analytics is likely to become a decisive competitive asset in many industries and a core element in companies’ efforts to improve performance [5]. Recent developments such as real-time, predictive, and cloud business intelligence and analytics (BI&A) introduce extra ways for organizations to obtain insight and business value from an expanded range of data. Organizations have struggled with the strategy, implementation, and measurement of their BI&A efforts, and a series of business intelligence maturity models (BIMMs) and analytics maturity models (AMMs) has been introduced to identify strengths and weaknesses of their BI&A situation, and assist remedial action [6].
Today, business intelligence and analytics generally refers to the technology, applications, and processes used to gather, store, and analyze data in order to help people make sound business decisions. For the past five years, BI&A has been ranked a top priority for organizations globally as a tool for achieving a competitive advantage and attaining business value [7].
Recent years have seen an increase in interest in data analytics, mainly due to the business benefits it can provide. Currently, many organizations are aware that data analytics can bring them a significant competitive advantage. A mature analytics culture allows organizations to use data effectively in business decisions. An increasing number of entities decide to measure and assess the effectiveness of the measures they take, as this allows business processes to be optimized based on collected data rather than on intuition or speculation. Moreover, it enables a dynamic response to a changing environment. More and more organizations wish to develop their analytics strategies beyond spreadsheets or simple management dashboards. A growing number of organizations are attempting to build a widespread analytics culture in which data analysis plays an important role in the decision-making process [8]. Meanwhile, however, few organizations have sufficient analytics capacities, including infrastructure and human resources as well as the skills to manage them effectively, to be able to meet today’s analytics needs. What is more, few organizations are able to estimate the extent to which they make use of data analytics or to answer the question of how to increase the effectiveness of business processes based on analytics. An increasing number of organizations are searching for methods of assessing analytics maturity that would help steer the development of an organization’s analytics competencies and analytics culture [9].
Generally, the notion of “maturity” is very broad and means “fully developed” or “perfected.” A maturity model indicates a path to perfection. It is a guidepost for development, with strictly defined criteria and indicators that describe the current and target states against specific reference values. Organizations reach analytics maturity through evolution, which includes the integration, management, and use of various data sources at key decision-making points. This paper aims to review, characterize, and comparatively analyze selected models of organizations’ analytics maturity.

2. Materials and Methods

Interest in data analytics has been growing in recent years, mainly due to the advantages it provides to those who apply it skillfully. Many organizations face challenges that accompany the implementation of data analytics. The effectiveness of implementation is largely determined by the assessment of an organization’s position in the analytics continuum. An organization’s analytics maturity assessment enables the scope and rate of implementation work to be planned. It is a kind of “inventory of the existing state of affairs.” There are numerous models for assessing an organization’s analytics maturity. This assessment is frequently carried out in a pioneering manner, according to subjectively adopted criteria, by various IT sector service providers. This paper describes and analyzes eleven different maturity models that can be used to assess the analytics maturity of organizations. The models’ characteristics were developed based on a review of the academic literature as well as reports and publications shared by analytics sector operators.

3. The Analytics Continuum—Analytics Maturity Path

Understanding an organization’s capability to use data analytics to increase innovation and competitive advantage requires an assessment of its position in the so-called analytics continuum. Currently, data analytics can be divided into five categories, which are defined, inter alia, by tools, techniques, and the approach to data analytics: (1) descriptive analytics, (2) diagnostic analytics, (3) predictive analytics, (4) prescriptive analytics, and (5) cognitive analytics. In analytics practice, these categories co-exist and complement each other.
Particular analytics categories are placed on the analytics development path. The analytics maturity path shows an organization’s analytics maturity stages starting from the application of descriptive analytics and ending with cognitive analytics (Figure 1). Each step on this path moves the organization towards the solutions that allow soundly based decisions to be taken faster (on-demand enterprise) [10].
The evolution of the use of analytics in an enterprise is not linear in nature. The implementation of organizational changes may vary in terms of the order and intensity, depending on both an organization’s specificity and the business context. In the implementation and development of analytics in an organization, openness to changes is paramount [11].
In the Analytics 1.0 era, enterprises used data warehouses and copies of operational data as the basis for analyses (Figure 2). In the Analytics 2.0 era, the focus was on Hadoop clusters and NoSQL databases. Analytics 3.0 makes use of new, “agile” analytics methods and machine learning techniques that provide significantly quicker insight into data. The Analytics 3.0 environment does not reject the existing concepts of analytics but integrates new technologies with the tools used so far [12].
Descriptive analytics enables learning about and understanding reality through the characterization of data and the isolation of the patterns they contain (Hindsight—What Happened; technology: Files, RDBMS, ODS, Early Data Warehouse, OLAP). It is the primary source of information for the management team. Descriptive analytics provides an answer to the question “What happened?” and is applied, e.g., in research into economic efficiency and the effectiveness of marketing activities, and in customer profiling.
Diagnostic analytics provides an answer to the question “Why did it happen?” It is often considered equivalent to traditional analytics, in which decisions made with a certain delay are dominant. This delay results from the need to collect and systematize data, and then to analyze and interpret them. Data are collected and processed at defined time intervals; hence, the actions taken based on their analysis are carried out with a certain delay. Diagnostic analytics enables the detection of regularities and quantitative relationships between variables through the analysis of historical data. It is the manager who interprets the information and decides how to use it [11].
Predictive analytics falls into the category of advanced analytics. It involves modeling as well as the preparation of simulations and forecasts. Both current and historical data are analyzed in order to gain insight into what may happen in the future (Foresight—What Will Happen; technology: No/NewSQL, Mature In-Memory DB and Processing, Early Data Lake). Predictive analytics is focused on forecasts and provides answers to the question “What will happen in the future?” It is aimed at predicting future events and trends. It involves the search for patterns and relationships occurring in the past, and the use of the resulting conclusions to produce forecasts. Predictive analytics “learns” from experience (data) to predict the future behavior of individuals (users, customers) in order to make better decisions [13]. In the predictive model, the system can “make decisions” that allow it to achieve the set objective based on a specific policy of action, i.e., previously adopted and delivered rules.
Prescriptive analytics uses simulation and machine learning to suggest actions to be taken in order to achieve desired results (Simulation-Driven Analysis and Decision-Making; technology: Mature Data Lake). Prescriptive analytics supports the decision-making process with the aim of automating the actions taken. It provides an answer to the question “What actions should be taken?” In turn, cognitive analytics uses artificial intelligence (AI) technologies and high-performance data analysis to automate the decision-making process and increase the efficiency of the decisions taken through cooperation with intelligent machines (Self-Learning and Completely Automated Enterprise, Computerized Human Thought Simulation and Actions). Cognitive systems are based on real-time analytics. Data are collected primarily in order to detect general regularities and patterns. On this basis, analytics models are created and then placed in the data stream. This significantly affects the interaction between the customer and the company, i.e., the way of communicating with the customer and the reception of a brand. This involves monitoring the interaction between the customer and the company in real time, analyzing the context and the customer’s behavior patterns, and selecting the action that is optimal at the given moment. This is the so-called “perishable insight” that can be discovered and acted on only in real time.
Currently, most organizations have considerable experience in the use of descriptive and diagnostic analytics. Thanks to these foundations, the organizations are ready to move to more advanced analytics that may increase the effectiveness of the actions they take. As an organization’s analytics maturity increases, so does the extent to which forecasts and simulations are used. This higher level of analytics makes use of artificial intelligence techniques. Machine learning, a crucial technology supporting artificial intelligence, is a computational method that allows machines to act or “think” without being explicitly programmed to perform specific actions. Sets of algorithms and mathematical models “learn” from data. Future-oriented organizations are beginning to consider cognitive analytics. This level of analytics maturity is based on human–machine interaction in which human capabilities are enhanced by “machine intelligence.” This can provide groundbreaking (strategic) information [10].

3.1. Analytics Maturity Models

An assessment of an organization’s analytics maturity can be carried out in a variety of ways. Traditional methods of measuring analytics capacities include self-assessment, qualitative interviews, and quantitative studies. However, a traditional approach to the assessment of analytics capacities has certain limitations, mainly due to the lack of an opportunity to verify them using what is referred to as the “depth and width” of analytics capacity. Self-assessment and quantitative studies are usually carried out using a checklist. They enable an assessment as to whether specific technologies and analytics tools have been implemented. At the same time, however, they do not enable an assessment as to whether a particular organization uses them fully to make business decisions, and whether they affect the organization’s activities. In turn, qualitative interviews with the management can be anecdotal and selective in their scope. Moreover, studies conducted in this way may not reveal differences in analytics maturity that occur between various groups of employees [14]. An alternative to these methods is studies carried out using analytics maturity models.
The notion of maturity was first proposed by Phillip Crosby [15]. Maturity describes a “state of being complete, perfect or ready” [16]. To reach a desired state of maturity, an evolutionary transformation path from an initial to a target stage needs to be progressed [17].
A model is a schematic and simplified representation of a more complex reality. What is included or abstracted stems from hypotheses about what is essential and what is not. A model is a generalization of what we think we understand about a concept. It is elaborated and evolves through a process of theory development and validation [18].
Maturity models (MMs) have proliferated across a multitude of domains [19,20]. Since they were first proposed in the 1970s [21], more than a hundred MMs have been published in the field of information systems. Maturity models have been designed to assess the maturity (i.e., competency, capability, level of sophistication) of a selected domain based on a more or less comprehensive set of criteria. A maturity model consists of a sequence of maturity levels for a class of objects. It represents an anticipated, desired, or typical evolution path of these objects shaped as discrete stages. Typically, these objects are organizations or processes.
Maturity models are used to describe, explain, and evaluate growth life cycles. The basic concept of all models is based on the fact that things change over time and that most of these changes can be predicted and regulated [22]. The maturity model serves as the scale for the appraisal of the position on the evolution path. It provides criteria and characteristics that need to be fulfilled to reach a particular maturity level. During a maturity appraisal, a snapshot of the organization regarding the given criteria is made [23]. Important characteristics of MMs are the maturity concept, the dimensions, the levels, the maturity principle, and the assessment approach [9].
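The generic structure just described (an ordered sequence of levels, each with criteria appraised against a snapshot of the organization) can be sketched in a few lines of code. The following Python fragment is a minimal illustration only; the level names and criteria are hypothetical and do not come from any of the models reviewed below.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class MaturityLevel:
    rank: int                                # position on the evolution path (1 = lowest)
    name: str
    criteria: List[Callable[[Dict], bool]]   # predicates evaluated on an organization snapshot

    def fulfilled_by(self, snapshot: Dict) -> bool:
        return all(criterion(snapshot) for criterion in self.criteria)


def appraise(levels: List[MaturityLevel], snapshot: Dict) -> MaturityLevel:
    """Return the highest level whose criteria are all met; maturity is treated as cumulative."""
    reached = levels[0]
    for level in sorted(levels, key=lambda lvl: lvl.rank):
        if level.fulfilled_by(snapshot):
            reached = level
        else:
            break
    return reached


# Hypothetical usage with made-up criteria:
levels = [
    MaturityLevel(1, "Initial", [lambda s: True]),
    MaturityLevel(2, "Managed", [lambda s: s.get("has_data_warehouse", False)]),
    MaturityLevel(3, "Optimized", [lambda s: s.get("analytics_in_decisions", False)]),
]
print(appraise(levels, {"has_data_warehouse": True}).name)  # -> Managed
```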
Analytics maturity can be described as the evolution of an organization to integrate, manage, and leverage all relevant internal and external data sources into key decision points. It means creating an ecosystem that enables insight and action. In other words, analytics maturity is not simply about having some technology in place; it involves technologies, data management, analytics, governance, and organizational components. It can take years to create and instill an analytics culture in an organization [8] (p. 9). Analytics maturity describes how deeply and effectively the organization uses tools, people, processes, and strategy to manage and analyze data for the purpose of informing business decisions. Maturity models are used to guide this transformation process.
Here, eleven analytics maturity models developed by different authors are described, and a summary is shown in Table 1: Analytic Processes Maturity Model (APMM); Analytics Maturity Quotient Framework; Blast Analytics Maturity Assessment Framework; DAMM—Data Analytics Maturity Model for Associations; DELTA Plus Model; Gartner’s Maturity Model for Data and Analytics; Logi Analytics Maturity Model; Online Analytics Maturity Model (OAMM); SAS Analytic Maturity Scorecard; TDWI Analytics Maturity Model; and Web Analytics Maturity Model (WAMM). The models were selected following a review of the scientific literature and of reports and publications from the analytics sector.

3.1.1. Analytic Processes Maturity Model (APMM)

Analytic Processes Maturity Model (APMM) is a framework for evaluating the analytic maturity of an organization. The framework is based upon a few basic concepts: analytic models, analytic infrastructure, and analytic operations. The APMM is broadly based upon the Capability Maturity Model that is the basis for measuring the maturity of processes for developing software and identifies analytic-related processes in six key process areas: (1) building analytic models; (2) deploying analytic models; (3) managing and operating analytic infrastructure; (4) protecting analytic assets through appropriate policies and procedures; (5) operating an analytic governance structure; and (6) identifying analytic opportunities, making decisions, and allocating resources based upon an analytic strategy (Figure 3). Based upon the maturity of these processes, the APMM divides organizations into five maturity levels: (1) organizations that can build reports; (2) organizations that can build and deploy models; (3) organizations that have repeatable processes for building and deploying analytics; (4) organizations that have consistent enterprise-wide processes for analytics; and (5) enterprises whose analytics is strategy driven [24].

3.1.2. Analytics Maturity Quotient Framework

Analytics Maturity Quotient is based on an assessment of: (1) data quality (DQ)—bad data quality can severely impair an organization’s ability to learn about its customers and products through data; hence, data quality is the foundation on which analytics stands; (2) data-driven leadership (L)—data-driven leaders not only trust data to prove or disprove their own beliefs about business opportunities but are also open to learning from data, irrespective of their beliefs; (3) people with analytics skills (P)—people with the right analytics skills, including both technical skills to analyze data and interpersonal/business skills to bridge the gap from data to business; (4) a data-driven decision-making process (D)—once leadership and people are in place, data need to be inserted into the decision-making process; and (5) agile infrastructure (I). In mathematical terms, AMQ = DQ × (0.4 × L + 0.3 × P + 0.2 × D + 0.1 × I), where DQ represents data quality with a value between 0 and 10. L stands for the degree to which the leadership is data-driven and has a value between 0 and 10, with 0 for organizations in which no leaders believe in leveraging data for decision-making and 10 for organizations in which all leaders are data-driven. P is the degree to which the organization has people with the right analytics skills. D represents the degree to which the organization has data inserted within the decision-making process, and takes on values from 0 to 10. In turn, I represents the organization’s readiness to instrument quickly, and takes on values from 0 to 10 [25].
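For readers who prefer a worked example, the AMQ formula above can be computed directly. The rating values used below are illustrative, and the range checks are an added assumption that simply mirrors the 0–10 scales quoted in the text.

```python
def analytics_maturity_quotient(dq: float, l: float, p: float, d: float, i: float) -> float:
    """AMQ = DQ x (0.4*L + 0.3*P + 0.2*D + 0.1*I), with every input rated from 0 to 10."""
    for name, value in {"DQ": dq, "L": l, "P": p, "D": d, "I": i}.items():
        if not 0 <= value <= 10:
            raise ValueError(f"{name} must be between 0 and 10, got {value}")
    # Data quality multiplies the weighted sum, so poor data caps the whole quotient.
    return dq * (0.4 * l + 0.3 * p + 0.2 * d + 0.1 * i)


# Example ratings: DQ = 8, L = 7, P = 6, D = 5, I = 4.
print(analytics_maturity_quotient(8, 7, 6, 5, 4))  # 8 * (2.8 + 1.8 + 1.0 + 0.4) = 48.0
```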

3.1.3. Blast Analytics Maturity Assessment Framework

The Blast [26] analytics maturity assessment tests six key process areas and success factor dimensions—strategy, governance, data management, insights, evolution, and resources. In each assessment dimension, from 1 to 6 points are awarded, which allows an organization to be placed in a selected development stage (Figure 4). Blast Analytics Maturity Assessment Framework is based on the Online Analytics Maturity Model developed by S. Hamel [27].
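The six-dimension scoring described above can also be summarized programmatically. The sketch below is illustrative only: Blast does not publish an exact placement rule in the cited material, so the averaging and the "weakest dimension" heuristic are assumptions made for the example.

```python
BLAST_DIMENSIONS = ["strategy", "governance", "data management",
                    "insights", "evolution", "resources"]


def summarize_blast_scores(scores: dict) -> tuple:
    """Summarize one score (1-6 points) per dimension into an average and the weakest dimension."""
    assert set(scores) == set(BLAST_DIMENSIONS), "exactly one score per dimension is expected"
    assert all(1 <= s <= 6 for s in scores.values()), "each dimension is scored from 1 to 6"
    average = sum(scores.values()) / len(scores)   # overall position on a 1.0-6.0 scale
    weakest = min(scores, key=scores.get)          # dimension most in need of attention
    return average, weakest


average, weakest = summarize_blast_scores({
    "strategy": 3, "governance": 2, "data management": 4,
    "insights": 3, "evolution": 2, "resources": 3,
})
print(f"average maturity {average:.1f}/6, weakest dimension: {weakest}")
```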
An analytics maturity assessment includes a survey among employees carried out at quarterly intervals (Assessment). This provides an opportunity to assess the factual circumstances and progress in the implementation of analytics (Benchmark). The studies provide the basis for the development of an analytics development strategy, taking into account the current conditions (Strategic Roadmap). In turn, the strategy provides the basis for an action plan that is to ensure the fulfillment of set objectives (Figure 5).

3.1.4. DAMM—Data Analytics Maturity Model for Associations

In the information economy, data are king, but harnessing data is challenging, especially for associations [28]. The DAMM model was made available as a tool supporting an assessment of analytics maturity for associations and non-profit organizations (DAMM for Associations). It is a response to the demand for such analyses voiced by members of various organizations at industry meetings (Association Analytics Network meetings). The DAMM model was developed by Association Analytics (A2), a provider of tools for improving the management of associations. DAMM assesses four key elements of data analytics: organization and culture, architecture/technology, data governance, and strategic alignment. An assessment of an organization in these dimensions allows it to be placed in one of five analytics maturity stages (Figure 6).
Associations that are at the first stage (Learning) understand the data potential and value yet they lack knowledge, tools, and processes which prevents them from taking immediate actions. Departments often operate independently, and data are not integrated. Decisions are most frequently taken based on experience, the adopted policy, and tradition. Associations which are at the second stage (Planning) are aware of the benefits arising from the use of data analytics, and have employees willing to work with data who could form an analytics team.
Associations that are at the third stage (Building) have begun the implementation of data analytics throughout the organization. They have a strategy and an implementation plan that includes the central data repository and tools for data analysis and visualization. Associations at the fourth stage (Applying) use interactive visualizations and manager’s dashboards in obtaining information that is helpful in performance management in key business areas. They use data analysis to solve business problems. In turn, associations that are at the fifth stage (Leading) take decisions based on data analytics. The data-based decision-making process is widespread throughout the organization, which affects the effectiveness of actions taken.
The DAMM model distinguishes five areas of improvement that may affect an organization’s position in the analytics continuum [28]: (1) democratizing data—making data accessible to everyone; according to an ASAE study, 68% of employees of associations notice the potential of data while reporting the lack of procedures, processes, and tools to make the data accessible and useful for all; (2) instilling an analytics-guided culture—many organizations employ business leaders who think analytically and know the value of data; these are the champions driving the organization to be more data-guided, and the same staff members can form the core analytics team and begin to develop an effective data strategy; (3) developing clearly defined KPIs (Key Performance Indicators); (4) creating a central repository that encompasses all key data sources—it is advisable to store data in one data repository; and (5) implementing a data governance program—in the DAMM model, the final stage of the analytics continuum is the active use of data in the decision-making process.

3.1.5. DELTA Plus Model

Analytics Maturity Assessment (AMA) is a tool developed by the International Institute for Analytics (IIA), designed to examine an organization and to assess its capacity to apply corporate analytics. This assessment is carried out based on the DELTA Plus model and five analytics maturity stages [29]. AMA allows an organization to be placed on the analytics development path through an assessment of its analytics capacities, analytics culture, and the capacity to use analytics tools [14]. The DELTA model is based on five components relevant to the assessment of an organization’s analytics maturity [30]:
  • Data, D for available, high-quality data—in order to obtain valuable and reliable analysis results, data are required that are organized, integrated, available, and of a high quality.
  • Enterprise, E for an enterprise’s orientation towards analytics management—it includes the development of analytics culture in the organization (analytical ecosystem), the designing and implementation of a strategy for analytics, and the adoption of analytics goals.
  • Leadership, L for analytics leadership—analytical organizations have leaders who make full use of analytics and steer the organization’s development in such a manner that it makes use of the data analytics potential. It elevates the level of acceptance towards the analytics culture throughout the enterprise and streamlines the implementation of analytics initiatives.
  • Targets, T for strategic targets—analytics activities should be tailored to specific, strategic targets that should be in line with corporate objectives. The targets should be selected based on the organization’s advantages and potential. Analytics initiatives should correspond (be considered equivalent) to business goals.
  • Analysts, A for analysts. Organizations employ staff with various analytics skills, both those using spreadsheets (analytical amateurs) and experienced data scientists (analytical professionals).
The availability of big data coupled with new analytics techniques such as machine learning have resulted in the DELTA model being extended to include two additional components [29]:
  • Technology, T—An organization’s capacity to implement and manage the infrastructure, tools, and technologies is becoming increasingly important. With the emergence of big data, artificial intelligence, data clouds, and open source software, the development of an effective technological strategy for analytics is a key condition of success.
  • Analytics techniques, A—Falling costs of data storage, processing, and analysis, combined with widespread access to software, have resulted in the explosive development of analytics methods and techniques. At the same time, however, more traditional approaches to analytics, e.g., reporting and visual analyses, are still applied.
The application of AMA and the DELTA Plus model enables an assessment of an organization’s analytics maturity in seven dimensions. Depending on the scope of the research, a full AMA is carried out based on surveys that may cover up to 33 unique competencies of the organization. The rating of analytics maturity in the DELTA model takes the form of a score awarded on a scale ranging from 1.00 to 5.99 points. Each score represents a specific stage of analytics maturity (Figure 7). Such an assessment indicates the sensitive points that need to be improved [31].
In the DELTA Plus model, an organization is placed in one of the analytics continuum stages, which reflects the organization’s analytics maturity:
Stage 1: Analytically Impaired. Organizations that are “analytically lagging” are managed based on intuition, have no formal plans for becoming more analytical, and their leaders use no data analytics.
Stage 2: Localized Analytics. Analytics or reporting in such organizations takes place in the “back office”. It usually stays in the background of other activities and loses out to intuition-based management. Neither structures nor cooperation between particular units (management levels) in the use of data analytics are developed.
Stage 3: Analytical Aspirations. “Analytically ambitious” organizations recognize the value of data analytics and intend to make use of it to a greater extent. The progress they make, however, is slow and often insufficient.
Stage 4: Analytical Companies. Analytical organizations make effective use of data analytics. They are highly data-oriented, have analytics tools, and make extensive use of data analyses. At the same time, however, they lack the commitment needed to fully compete on analytics or to use analytics strategically.
Stage 5: Analytical Competitors. These organizations use an analytics strategy that provides the basis for the operation of the entire enterprise. They use analytics skills to gain a competitive advantage.
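As a rough illustration of how a DELTA score in the 1.00–5.99 range could be translated into the five stages listed above, the snippet below uses one whole-number band per stage. These cut-off points are an assumption made for the example; the IIA material cited here does not publish the exact thresholds.

```python
DELTA_STAGES = [
    (1.00, "Analytically Impaired"),
    (2.00, "Localized Analytics"),
    (3.00, "Analytical Aspirations"),
    (4.00, "Analytical Companies"),
    (5.00, "Analytical Competitors"),
]


def delta_stage(score: float) -> str:
    """Map a DELTA score (1.00-5.99) to the highest stage whose assumed lower bound it reaches."""
    if not 1.00 <= score <= 5.99:
        raise ValueError("DELTA scores range from 1.00 to 5.99")
    stage = DELTA_STAGES[0][1]
    for lower_bound, name in DELTA_STAGES:
        if score >= lower_bound:
            stage = name
    return stage


print(delta_stage(3.4))  # -> Analytical Aspirations (under the assumed bands)
```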

3.1.6. Gartner’s Maturity Model for Data and Analytics

According to Gartner’s Maturity Model for Data and Analytics, an organization is placed in one of five analytics maturity stages, each characterized by selected attributes (Figure 8). Level 1: Basic—data are not exploited, D&A (data and analytics) is managed in silos, people argue about whose data are correct, analysis is ad hoc, and work relies on spreadsheets and information firefighting. Level 2: Opportunistic—IT attempts to formalize information availability requirements; incentives are inconsistent; there are organizational barriers and a lack of leadership; the strategy is still only a document of over a hundred pages and is not business-relevant; data quality and insight efforts exist, but still in silos. Level 3: Systematic—different content types are still treated differently; a strategy and vision are formed (five pages); agile approaches emerge; exogenous data sources are readily integrated; business executives become D&A champions. Level 4: Differentiating—executives champion and communicate best practices; D&A is business-led/driven, with a chief data officer (CDO); D&A is an indispensable fuel for performance and innovation, and is linked across programs; it is linked to outcomes, and data are used for ROI (return on investment). Level 5: Transformational—D&A is central to business strategy, data value influences investments, strategy and execution are aligned and continually improved, an outside-in perspective is taken, and the CDO sits on the board [32].

3.1.7. Logi Analytics Maturity Model

The best way to make analytics a natural part of everyday work is to integrate it with business applications. In the Logi Analytics Maturity Model, an assumption has been adopted that the more sophisticated the data analysis capabilities embedded in the structure of the applications used daily, the greater the extent to which analytics will be used in everyday work. The Logi Analytics Maturity Model (LAMM) has five stages of analytics maturity, from Standalone Analytics (level 0) to Genius Analytics (level 4) (Figure 9). Logi also provides a self-assessment tool.
At Level 0, Standalone Analytics, the primary application shares data with a standalone analytics application. End users have to switch to the standalone solution to analyze their data. At Level 1, Bolt-On Analytics, the addition of security integration provides single sign-on functionality. However, users still have to toggle from one application to another. At Level 2, Inline Analytics, the addition of user interface integration allows the co-presentation of analytics content and functions with the primary application’s content and functions. Companies embedding analytics at this level can offer business intelligence in the context of an existing application, although they still face some limitations. At Level 3, Infused Analytics, the addition of workflow integration allows functional interactions between analytics and the primary application. At Level 4, Genius Analytics, the addition of embedded self-service analytics supports unanticipated use cases within a managed, seamless environment. With embedded self-service capabilities, users can ask new questions of the data as ideas occur to them. Companies have many opportunities here for competitive differentiation—their applications can deliver new business value [33].

3.1.8. Online Analytics Maturity Model (OAMM)

In the digital age, the ability to process large data sets to extract meaning and insights is a competitive advantage. The Online Analytics Maturity Model (OAMM) helps organizations look in the mirror and understand who they are and what they are capable of.
The Online Analytics Maturity Model provides an unbiased and easy-to-understand representation of an organization’s expectations of, and commitment to, analytics infrastructure and initiatives. OAMM offers a benchmark (with a free self-assessment survey) against extensive databases, identifying where the organization sits compared with other organizations in its industry. In the Online Analytics Maturity Model, the organization is evaluated in six dimensions: (1) management, governance, and adoption; (2) objectives definition (What is the primary objective of your current online analytics program?); (3) scoping (the scope defines the size of the playing field); (4) the analytics team and expertise (How is your online analytics team structured?); (5) the continuous improvement process and analysis methodology (How do you develop a hypothesis, define problems and opportunities, analyze, and provide insight?); and (6) tools, technology, and data integration [27].

3.1.9. SAS Analytics Maturity Scorecard

An organization’s SAS Analytics Maturity Scorecard is prepared based on an analysis carried out in four dimensions: (1) Culture: Decision-Makers’ Use of Data and Analysis, (2) Internal Process Readiness, (3) Analytical Capabilities, and (4) Data Environment: Infrastructure and Software. According to the SAS Analytics Maturity Scorecard [34], an organization can be placed at one of five analytics maturity stages (Figure 10): (level 1) Analytically Unaware—decision-makers rely on perceptions, historical decisions, and non-validated beliefs; there are no defined data management or analytic processes to support insight development or business decisions; the organization lacks analytics skills or executive interest and considers historical reporting to be analytics; only some projects have a defined scope and objectives, and software is inconsistent and duplicated; (level 2) Analytically Aware—decision-makers recognize the benefits of analytics in supporting decision-making but do not leverage analytics consistently; the full benefits of analytics are poorly understood, and activities are siloed and ad hoc, yet deliver reasonable results; (level 3) Analytically Astute—decision-makers adopt analytics for all decisions; the organization has common data management processes in place, and the use of data sets and analytics is established for decision-making; analytics capabilities are slow to change and analytics development is constrained, yet departments have their own experts and plans; (level 4) Empowered—decision-makers leverage analytics across the organization to support business decisions; widely deployed data processes support specific business insights; management supports analytics to bring business units into alignment; (level 5) Explorative—decision-makers search for new ways to use advanced analytics to support business decisions; processes around data enhancement and analytic methods for optimizing resources are continually refined; the organization commits to innovative analytic use for future growth and draws on advanced analytics and advances in new techniques [34] (p. 5).

3.1.10. TDWI Analytics Maturity Model

The TDWI (Transforming Data With Intelligence) Analytics Maturity Model provides a methodology for measuring and monitoring the status of analytics implementation in the organization. It indicates the actions that should be taken to develop the organization’s analytics culture [35]. In the TDWI model, an analysis of the organization’s analytics maturity is carried out based on 35 questions asked in five dimensions of analytics maturity: organization, infrastructure, data management, analytics, and governance. In the organizational dimension, the questions focus, inter alia, on the extent to which the organizational strategy, culture, leadership, skills, and financing support the analytics program. In the infrastructure dimension, the questions focus on issues related to the advancement and accessibility of the infrastructure for analytics applications. In the data management dimension, the questions concern data quality, accessibility, and processing, as well as the company’s data management method. In turn, in the analytics dimension, the questions focus on an assessment of the analytics culture and the extent to which analytics tools are used (How does analytics contribute to decision-making in the company?).
The TDWI Analytics Maturity Model consists of five stages: nascent, pre-adoption, early adoption, corporate adoption, and mature/visionary. As organizations move through these stages, they should gain greater value from their investments (Figure 11).
Organizations at the first stage (Stage 1: Nascent) most frequently use no analytics, except perhaps a spreadsheet. Decision-makers do not invest in the development of analytics competencies, even though there may be people working in the enterprise who are interested in the potential of analytics. In these enterprises, decisions are made based on intuition. Organizations at the second stage (Stage 2: Pre-Adoption) begin the process of developing an analytics culture in the enterprise. This takes place through training employees and forming an analyst team, and through the purchase of appropriate technologies. The management team recognizes the potential of analytics. In turn, in the initial implementation phase (Stage 3: Early Adoption), organizations make use of analytics tools and software. However, they are still gaining proficiency with them and searching for processes in which they could be used.
The Chasm is a stage that symbolizes the obstacles and difficulties that an organization has to face on the analytics development path. At this stage, organizational solutions that make the effective use of analytics possible are worked out. At the next stage (Stage 4: Corporate Adoption), organizations are characterized by a high analytics culture and the democratization of analytics. Data analytics shapes the way in which the entire organization operates.
Few organizations can be regarded as visionary in analytical terms (Stage 5: Mature/Visionary). At this stage, organizations efficiently carry out analytics programs using a highly tuned infrastructure with well-established data management strategies. Due to the democratization of analytics, users have access to data and can explore them independently without the participation of IT.
In the TDWI model, organizations are assessed in five dimensions. In each dimension, a maximum of 20 points can be awarded (score per dimension: 4.0–7.1—Nascent; 16.7–20.0—Mature/Visionary). The result of the evaluation is a final score. The survey also includes questions that are not scored but are used as guidelines in the process of data analytics implementation [8].
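A small sketch of this per-dimension scoring is given below. Only the two band edges quoted in the text (4.0–7.1 for Nascent and 16.7–20.0 for Mature/Visionary) come from the TDWI guide; the intermediate bands are placeholders inserted purely for illustration, and the real thresholds should be taken from the guide itself [8].

```python
TDWI_BANDS = [
    (4.0, 7.1, "Nascent"),               # band quoted in the text
    (7.1, 10.3, "Pre-Adoption"),         # placeholder band (assumption)
    (10.3, 13.5, "Early Adoption"),      # placeholder band (assumption)
    (13.5, 16.7, "Corporate Adoption"),  # placeholder band (assumption)
    (16.7, 20.0, "Mature/Visionary"),    # band quoted in the text
]


def tdwi_dimension_stage(score: float) -> str:
    """Place a single dimension score (max 20 points) into one of the five stages."""
    for low, high, stage in TDWI_BANDS:
        if low <= score <= high:
            return stage
    return "below the scored range"


dimension_scores = {"organization": 9.5, "infrastructure": 12.0, "data management": 6.0,
                    "analytics": 14.0, "governance": 8.0}
for dimension, score in dimension_scores.items():
    print(f"{dimension}: {tdwi_dimension_stage(score)}")
```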

3.1.11. Web Analytics Maturity Model (WAMM)

The extent to which Web analytics is used in an organization can be assessed with the Web Analytics Maturity Model (WAMM) [18]. The WAMM model scores an organization in six dimensions, which places it at a specific stage of analytics development: (1) management, governance, and adoption; (2) objectives definition; (3) scoping; (4) the analytics team and expertise; (5) the continuous improvement process and analysis methodology; and (6) tools, technology, and data integration.
In each of these areas, a score ranging from 0 to 5 points is awarded, with the lowest score indicating an organization that is “analytically impaired” in a particular analytics area and the highest score (“analytically addicted”) indicating that Web analytics is the main source of the organization’s competitive advantage.
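The WAMM scoring can likewise be turned into a simple maturity profile. In the sketch below, only the two endpoint labels ("analytically impaired" for 0 and "analytically addicted" for 5) are taken from the description above; intermediate scores are reported as raw values, since the text does not name them.

```python
WAMM_DIMENSIONS = ["management, governance, and adoption", "objectives definition", "scoping",
                   "team and expertise", "improvement process and methodology",
                   "tools, technology, and data integration"]


def wamm_profile(scores: dict) -> dict:
    """Label each dimension score from 0 to 5; only the endpoints have names in the model."""
    endpoint_labels = {0: "analytically impaired", 5: "analytically addicted"}
    profile = {}
    for dimension in WAMM_DIMENSIONS:
        score = scores[dimension]
        if not 0 <= score <= 5:
            raise ValueError(f"{dimension}: WAMM scores run from 0 to 5")
        profile[dimension] = endpoint_labels.get(score, f"{score}/5")
    return profile


example = dict(zip(WAMM_DIMENSIONS, [0, 2, 3, 1, 2, 5]))
print(wamm_profile(example))
```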
According to the WAMM model, an organization that is not analytically mature (Stage 1—data collection) has no clear objectives or measures of their fulfillment, even if its activities on the Internet are monitored and data are collected. In such an organization, data analysis typically involves browsing through reports that usually concern basic metrics such as the number of users or website hits. Everyone in the organization has access to the data, yet no one has been designated as directly responsible for their analysis. The organization’s employees have only basic knowledge of analytics tools, which is often reflected in incorrect tool configuration and leads to conclusions that are not always correct.

4. Results

Comprehensive descriptions of analytics maturity levels were available for all organization’s analytics maturity models; however, a detailed description of the assessment process and the criteria for assigning a specific level of analytics development to an organization was not always available. It can be observed that a detailed description of analytics maturity is not made available when the assessment is carried out as part of a service provided on a commercial basis.
Selected analytics maturity models are described in such a detailed manner that their application in an independent assessment of an organization’s analytics maturity is possible. An assessment of analytics maturity is usually carried out in a few dimensions, which most often include: (1) an assessment of technical infrastructure, including the equipment and software, as well as data collection method; (2) an assessment of organizational issues, the so-called analytics culture, the degree of support and democratization of analytics, and the level of acceptance towards an analytics culture in the entire enterprise; and (3) an assessment of human resources, including the staff’s analytics competencies.
A synthetic (illustrative) result of an analytics maturity assessment is the placement of an organization on the analytics development path, accompanied by appropriate conclusions and a plan of actions intended to lead to the organization’s analytics development. Sometimes, however, the result of an analytics maturity assessment is additionally presented using synthetic indicators, e.g., the AMQ or the DELTA score (Table 2). This is a good practice that renders the overall analysis result more understandable. Indicators and final point scores are more appealing to the customer’s imagination than extensive tabular summaries.
Most of the analyzed analytics maturity models comprised five analytics maturity levels, starting from organizations using no data at all or only collecting data (Analytically Impaired) and ending with an analytics organization (Analytical Nirvana, Genius Analytics). This shape of the analytics development path (analytics continuum) results, inter alia, from the use of the five-point Likert scale [36], one of the most common scaling devices in social research. The five-point scale might become 3 points or 7 points, and the agree–disagree scale may become approve–disapprove, favor–oppose, or excellent–bad, but the principle is the same. These are all Likert-type scales [37] (p. 328).
An analysis of the analytics development path for eleven models of an organization’s analytics maturity (Table 3) enabled the creation of a single coherent image of the organization at each of five analytics maturity stages (Figure 12).
At the first stage of the development of analytics competencies, organizations focus on data collection and reporting, or do not carry out such activities at all. The management team is often unaware of the possibilities arising from the use of data analytics or may even have a negative attitude towards such activities. Such an organization is referred to as analytically lagging behind, learning, or unaware of the possibilities and advantages offered by analytics. The first stage is the initiation stage, at which the organization is “infected” with the idea of data analytics (Figure 12).
The organization that has been “infected” with the idea of data analytics builds analytics momentum and executives’ interest through applications of basic analytics. It builds and develops models. It recognizes the opportunities and possibilities arising from data analytics. It becomes an enthusiast and proponent of data analytics. However, it is still in the process of searching and learning. It usually lacks adequate infrastructure, software, and staff capable of developing analytics competencies.
An organization at the third stage (Acceleration) uses data analytics to gain a competitive advantage. It develops analytically, and this development gains momentum and dynamics. It draws inspiration from data analysis. It enhances its analytics culture as well as strengthening its analytics infrastructure and its employees’ analytics competencies. It strengthens the role of data analytics in the decision-making process. In an organization at the fourth stage (Momentum, Impulse), data analytics is developed and implemented throughout the organization, and its implementation becomes a priority. Decisions are taken based on data analysis. Data analytics provides motivation and inspiration. The organization operates “in the spirit of data analytics,” and analytics operations are “the present and the future.” At the fifth stage of analytics development (Ahead, On the front), the company routinely reaps the benefits of its enterprise-wide analytics capability and focuses on continuous analytics review. The organization is analytically mature and “addicted” to data analytics. It is characterized by a high analytics culture and the democratization of analytics. Data analytics indicates strategies and directions of development, and is a source of innovation.

5. Discussion

Analytics software offers more and more opportunities. However, it is often not used due to the lack of ability to use the software, to analyze data, and to draw the right conclusions. Meanwhile, data analytics is created by humans [13,38]. A common mistake made by organizations is to consider analytics systems equivalent to tools for data analysis, and to ignore the development of employees’ analytics skills. Tools are supposed to unlock the user’s potential. This is related to the democratization of data analytics, which involves constructing an analytics environment that is available to every employee, easy to use, and free of restrictions on obtaining answers to business questions. The democratization of analytics involves breaking down the barriers arising from the implementation of analytics solutions. Moreover, due to the growing demand for data analysts and, consequently, their growing financial expectations, it is increasingly unprofitable for companies to employ them on a full-time basis. Instead, many organizations work to enhance the analytics competencies of the staff they already employ.
The use of data analytics to streamline and automate business processes is often associated with replacing humans with machines and reducing employment [11]. Meanwhile, with the development of analytics tools, the number of new professions and specializations will increase as well. The use and analysis of data, as well as the design of new products, create numerous new jobs. This is confirmed by the results of the survey “Analytics as a Source of Business Innovation,” in which 2500 respondents from all over the world participated. The survey included questions concerning enterprises’ approach to automation, the extension of the functionality of business processes arising from the use of data analytics, and the relevant conclusions and observations. The study showed that a significant percentage of companies either automated (36%) or extended the functionality (41%) of business processes with the use of analytics. At the same time, 34% of them observed the emergence of new tasks for employees, while 16% noted cases of assigning previously automated tasks to employees. Due to the growing analytics culture, more and more employees are relieved of tedious, routine, and repetitive activities, which allows them to respond to new, creative challenges [3,11].
A report of the survey “Analytics as a Source of Business Innovation” divided companies into three types according to their analytics maturity: (1) Analytical Innovators—companies with a strong analytics culture, that make decisions mainly based on data; (2) Analytical Practitioners—companies that make use of analytics mainly to improve operational efficiency or, less frequently, to work out innovative solutions; and (3) Analytically Challenged—companies that find analytics a challenge, that are beginning their “love affair” with analytics and make most decisions based on intuition [3]. In turn, the Accenture study revealed a specific landscape of new tasks resulting from the interaction between the human and an intelligent machine. New tasks may result in the emergence of new professions and specializations, for example, coaches to teach machines how to better understand the context and specificity of particular tasks; interpreters to explain the results of machine operation from both human and business perspectives, or evaluators to optimize machine operation [39]. New professions related to data analytics also include CAO (Chief Analytics Officer) or Chief Digital Officer, Chief Data Officer (CDO), and Chief Information Officer (CIO). CAO focuses mainly on business needs and the ways to satisfy them using both data and analytics, and develops the organization’s analytics culture. CDO focuses mainly on data management as well as data quality and availability [40]. One oft-cited goal of the chief information officer is “to get the right information to the right person at the right time” [3].
Research conducted by IBM and Burning Glass, as well as the organization Business-Higher Education Forum, revealed a quick increase in the number of new jobs related to data analytics. In 2015, there were approximately 2.35 million jobs related to BI&A in the United States. According to forecasts, demand for data analysts will increase by as much as approximately 40% by the year 2020. Moreover, research showed that job positions related to data analytics were better paid [41]. It is forecasted that the job positions whose number will be increasing the most, and which will be most difficult to fill (Top Analytical Occupations) while generating high labor costs, will include: Data Scientist, Data Engineer, Database Architects and Developers, Statistician, Chief Analytics Executives, Directors of Analytics/Data, and Database Administrator [41].
It is difficult to assess an organization’s analytics maturity without assessing its employees’ analytics competencies. For example, the Delta Plus model divided employees of an organization into four types related to the assessment of their analytics competencies: (1) analytical champions, (2) analytical professionals, data scientists, (3) analytical semiprofessionals, and (4) analytical amateurs [31]. Analysts should have business knowledge and interdisciplinary (including interpersonal and team management) skills.
Where appropriate people are in the right place, maintaining motivation in projects that require commitment and creativity becomes increasingly important [30,31]. At this point, it is worth mentioning the 10/90 rule (The 10/90 Rule for Magnificent Web Analytics Success). The rule was formulated by Avinash Kaushik and has attained global recognition [42]. According to the 10/90 rule, humans are responsible for analytics success (the effectiveness of the use of analytics tools), even though the rule itself concerns the financing of analytics tasks: 10% of the funds for Web analytics should be allocated to the purchase of industry software, and 90% to the employment of appropriate individuals with the ability to use the software.

6. Conclusions

On the analytics services market, more and more organizations’ analytics maturity models are developed, and the number of entities interested in carrying out such an assessment is on the increase. Selected models take a dedicated and personalized form, and are prepared in order to assess the analytics maturity of entities with a specific business profile. An assessment of an organization’s analytics maturity is usually a service provided on a commercial basis, and the presentation of the advantages of the model and data analytics is supposed to encourage making use of this service. Analytics maturity assessment models are developed based on generally available frameworks, and use checklists that are frequently prepared based on subjective experiences and observations. The assessment itself is most frequently carried out in the form of an audit.
An organization’s analytics maturity assessment models that have been made fully available usually have a simplified form that is limited to providing responses to a few or several control questions. An overall result is generated based on the responses, which allows the organization to be placed on the analytics development path. Simplified assessment patterns as well as self-assessment tools can be used in the assessment of analytics maturity of small entities, e.g., in the micro, small, and medium enterprise sectors. However, a result obtained in this way primarily has an illustrative value. Checklists used in advanced commercial models are not generally available. They are usually used in the assessment of corporate analytics maturity where wider-scale research is necessary. In all models, without exception, the role of three factors that are crucial for analytics development of an organization, namely human resources, infrastructure (equipment and software), and appropriate organization (resource management), is emphasized.
The dynamics of data analytics development is a certain limitation for this research. The way data are collected and processed changes very dynamically. This means that some analytics maturity models may age quickly and be replaced by newer, more refined models that are better suited to new tools and measurement techniques.
In the future, an increase is expected both in the number and availability of new analytics maturity models, in particular those personalized and dedicated to a specific sector or business, and in the number of entities involved in the implementation of data analytics in organizations. It can also be predicted that more self-service tools enabling an independent assessment of an organization’s analytics maturity will emerge on the Internet, similar to marketing tools such as the automated SEO audits generally available in the Web browser window. However, such tools produce an illustrative result that opens a discussion and encourages making use of a full service that is usually provided for a fee.

Author Contributions

Conceptualization, K.K.; methodology, K.K.; validation, K.K. and D.Z.; formal analysis, K.K. and D.Z.; resources, K.K.; data curation, K.K. and D.Z.; writing-original draft preparation, K.K.; writing-review and editing, K.K.; visualization, K.K.; supervision, K.K.; project administration, K.K.; funding acquisition, D.Z. All authors have read and agreed to the published version of the manuscript.

Funding

Funded with a subsidy of the Ministry of Science and Higher Education for the Silesian University of Technology in Gliwice and the University of Agriculture in Kraków for 2020.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Logi Analytics. 2018 State of Embedded Analytics. The Sixth Annual Review of Embedded Analytics Trends and Tactics; Logi Analytics: McLean, VA, USA, 2018; Available online: https://goo.gl/BL7euZ (accessed on 28 September 2019).
  2. Davenport, T.H. Competing on Analytics. Harv. Bus. Rev. 2006, 84, 98–107. [Google Scholar] [PubMed]
  3. Ransbotham, S.; Kiron, D. Analytics as a Source of Business Innovation. The Increased Ability to Innovate is Producing a Surge of Benefits across Industries. 2017. Available online: http://ilp.mit.edu/media/news_articles/smr/2017/58380.pdf (accessed on 2 March 2020).
  4. Cosic, R.; Shanks, G.; Maynard, S. Towards a business analytics capability maturity model. In Proceedings of the ACIS 2012: The 23rd Australasian Conference on Information Systems ACIS, Geelong, Australia, 3–5 December 2012; pp. 1–11. [Google Scholar]
  5. Barton, D.; Court, D. Making advanced analytics work for you. Harv. Bus. Rev. 2012, 90, 78–83. [Google Scholar] [PubMed]
  6. Muller, L.; Hart, M. Updating business intelligence and analytics maturity models for new developments. In International Conference on Decision Support System Technology; Springer: Cham, Switzerland, 2016; pp. 137–151. [Google Scholar] [CrossRef]
  7. Kappelman, L.; McLean, E.; Johnson, V.; Gerhart, N. The 2014 SIM IT key issues and trends study. MIS Q. Exec. 2014, 13, 237–263. [Google Scholar]
  8. Halper, F.; Stodder, D. TDWI Analytics Maturity Model Guide. Interpreting Your Assessment Score; TDWI Research; The Data Warehousing Institute: Renton, WA, USA, 2014. [Google Scholar]
  9. Lahrmann, G.; Marx, F.; Winter, R.; Wortmann, F. Business Intelligence Maturity Models: An Overview. In Information Technology and Innovation Trends in Organizations; D’Atri, A., Ferrara, M., George, J., Spagnoletti, P., Eds.; Italian Chapter of AIS: Naples, Italy, 2010. [Google Scholar]
  10. Intel. Getting Started with Advanced Analytics: How to Move forward with a Successful Deployment. Intel: Planning Guide. Available online: https://goo.gl/4jTB7S (accessed on 28 January 2019).
  11. Wiecka, A. Gospodarka Analityczna [The Analytical Economy]. 2018. Available online: https://goo.gl/FSrf4r (accessed on 28 January 2019).
  12. Davenport, T.H. The Rise of Analytics 3.0. How to Compete in the Data Economy; eBook; International Institute for Analytics: Portland, OR, USA, 2013. [Google Scholar]
  13. Siegel, E. How to Implement Predictive Analytics in the Organization? Available online: https://goo.gl/wy36u6 (accessed on 28 January 2019).
  14. Alles, D. Analytics Maturity Powers Company Performance. Available online: https://pages.dataiku.com/hubfs/IIA%20Report/iia-report-final.pdf (accessed on 2 March 2020).
  15. Crosby, P.B. Quality is Free; McGraw-Hill: New York, NY, USA, 1979. [Google Scholar]
  16. Simpson, J.A.; Weiner, E.S.C. The Oxford English Dictionary; Oxford University Press: Oxford, UK, 1989. [Google Scholar]
  17. Fraser, P.; Moultrie, J.; Gregory, M. The Use of Maturity Models/Grids as a Tool in Assessing Product Development Capability. In Proceedings of the IEMC 2002, Cambridge, UK, 18–20 August 2002; pp. 244–249. [Google Scholar]
  18. Hamel, S. The Web Analytics Maturity Model. A Strategic Approach Based on Business Maturity and Critical Success Factors. Available online: https://goo.gl/96nPyr (accessed on 28 January 2019).
  19. De Bruin, T.; Rosemann, M.; Freeze, R.; Kulkarni, U. Understanding the main phases of developing a maturity assessment model. In Australasian Conference on Information Systems, Proceedings of the 16th Australasian Conference on Information Systems (ACIS), Sydney, Australia, 29 November–2 December 2005; Campbell, B., Underwood, J., Bunker, D., Eds.; Australasian Chapter of the Association for Information Systems: Sydney, Australia, 2005. [Google Scholar]
  20. Tarhan, A.; Turetken, O.; Reijers, H.A. Business process maturity models: A systematic literature review. Inf. Softw. Technol. 2016, 75, 122–134. [Google Scholar] [CrossRef] [Green Version]
  21. Gibson, C.F.; Nolan, R.L. Managing the four stages of EDP growth. Harv. Bus. Rev. 1974, 52, 76–88. [Google Scholar]
  22. Hribar, R.I. Overview of business intelligence maturity models. Manag. J. Contemp. Manag. Issues 2010, 15, 47–67. [Google Scholar]
  23. Becker, J.; Knackstedt, R.; Pöppelbuß, J. Developing Maturity Models for IT Management—A Procedure Model and its Application. Bus. Inf. Syst. Eng. 2009, 1, 213–222. [Google Scholar] [CrossRef]
  24. Grossman, R.L. A framework for evaluating the analytic maturity of an organization. Int. J. Inf. Manag. 2018, 38, 45–51. [Google Scholar] [CrossRef]
  25. Piyanka, J. The Analytics Maturity Quotient Framework. Aryng LLC. Available online: https://goo.gl/3RwiYJ (accessed on 28 September 2019).
  26. Blast. Analytics Maturity Assessment. Blast Analytics & Marketing. Available online: https://goo.gl/v13P5r (accessed on 28 January 2019).
  27. Cardinal Path. Benchmarking Your Organization’s Analytics Maturity? ebook; Cardinal Path: New York, NY, USA; Available online: https://goo.gl/KgeAwH (accessed on 28 September 2019).
  28. 5 Areas to Assess Using the DAMM—Data Analytics Maturity Model. Association Analytics. Available online: https://goo.gl/jZBpgc (accessed on 28 September 2019).
  29. Davenport, T.H.; Harris, J.G. Competing on Analytics: Updated, with a New Introduction: The New Science of Winning; Harvard Business Review Press: Brighton, MA, USA, 2017. [Google Scholar]
  30. Davenport, T.H.; Harris, J.G.; Morison, R. Analytics at Work: Smarter Decisions, Better Results; Harvard Business School Publishing: Brighton, MA, USA, 2010. [Google Scholar]
  31. Davenport, T.H. DELTA Plus Model & Five Stages of Analytics Maturity: A Primer; ebook; International Institute for Analytics: Portland, OR, USA, 2018. [Google Scholar]
  32. Gartner Survey Shows Organizations Are Slow to Advance in Data and Analytics. Gartner Newsroom. Available online: https://goo.gl/pAhfbt (accessed on 28 January 2019).
  33. Logi Analytics. The 5 Levels of Analytics Maturity: From Basic BI to Sophisticated Differentiators; ebook; Logi Analytics: McLean, VA, USA; Available online: https://goo.gl/8x7vgQ (accessed on 28 September 2019).
  34. Five Steps to Analytical Maturity. A Guide for Pharma Commercial Operations, White Paper; SAS & PharmaVOICE: Stockholm, Sweden, 2014. [Google Scholar]
  35. Chuah, M.H.; Wong, K.L. A review of business intelligence and its maturity models. Afr. J. Bus. Manag. 2011, 5, 3424–3428. [Google Scholar] [CrossRef]
  36. Likert, R. A technique for the measurement of attitudes. Arch. Psychol. 1932, 22, 55. Available online: http://psycnet.apa.org/record/1933-01885-001 (accessed on 28 September 2019).
  37. Bernard, H.R. Research Methods in Anthropology: Qualitative and Quantitative Approaches; Rowman & Littlefield: Lanham, MD, USA, 2017. [Google Scholar]
  38. Davenport, T.H.; Harris, J.G.; Shapiro, J. Competing on talent analytics. Harv. Bus. Rev. 2010, 88, 52–58. [Google Scholar] [PubMed]
  39. Wilson, H.J.; Daugherty, P.R.; Morini Bianzino, N. The Jobs That Artificial Intelligence Will Create. MIT Sloan Manag. Rev. 2017, 58, 14–16. [Google Scholar]
  40. Morgan, L. What a Chief Analytics Officer Really Does. InformationWeek. Available online: https://www.informationweek.com/big-data/what-a-chief-analytics-officer-really-does/a/d-id/1328200 (accessed on 28 September 2019).
  41. Miller, S.; Hughes, D. The Quant Crunch: How the Demand for Data Science Skills is Disrupting the Job Market; Burning Glass Technologies: Boston, MA, USA, 2017. [Google Scholar]
  42. Kaushik, A. Web Analytics: An Hour a Day; Wiley Publishing: Indianapolis, IN, USA, 2007. [Google Scholar]
Figure 1. Advanced Analytics Maturity Path: Moving to Real-Time Enterprise. Source: own study based on [10].
Figure 2. Analytics continuum. Source: own elaboration.
Figure 3. Analytic Processes Maturity Model (APMM)—five maturity levels of organizations. Source: own elaboration based on APMM.
Figure 4. Stages of analytical maturity (Blast Model). Source: own elaboration based on Blast Model.
Figure 5. Analytics Maturity Assessment framework (Blast Model). Source: own elaboration based on Blast Model.
Figure 6. DAMM for Associations—Five Stages of Data Analytics Maturity. Source: own elaboration based on ASAE.
Figure 7. Analytics continuum—Delta Plus Model. Source: own elaboration based on [29,31].
Figure 8. Gartner’s Maturity Model for Data and Analytics. Source: own elaboration based on [32].
Figure 9. Logi Analytics Maturity Model—five stages of analytics maturity. Source: own study based on LAMM.
Figure 10. SAS Analytics Maturity Scorecard—stages of analytics maturity. Source: own elaboration based on [34].
Figure 11. TDWI Analytics Stages of Maturity. Source: own elaboration based on [8].
Figure 12. An organization’s analytics maturity at each of the five stages of analytics continuum. Source: own elaboration.
Table 1. Analytics maturity models.
Item | Model | Key Reference | Developer
1 | Analytic Processes Maturity Model (APMM) | 10.1016/j.ijinfomgt.2017.08.005 | Grossman, R.L.
2 | Analytics Maturity Quotient Framework | https://goo.gl/3RwiYJ | Aryng LLC
3 | Blast Analytics Maturity Assessment Framework | https://goo.gl/v13P5r | Blast Analytics & Marketing
4 | DAMM—Data Analytics Maturity Model for Associations | https://goo.gl/jZBpgc | Association Analytics
5 | DELTA Plus Model | Analytics at Work: Smarter Decisions, Better Results. Harvard Business School Publishing. https://goo.gl/LkutrU | Davenport, T.H., Harris, J.G., and Morison, R.
6 | Gartner’s Maturity Model for Data and Analytics | https://goo.gl/pAhfbt | Gartner, Inc.
7 | Logi Analytics Maturity Model | https://goo.gl/8x7vgQ | Logi Analytics
8 | Online Analytics Maturity Model | https://goo.gl/KgeAwH | Cardinal Path
9 | SAS Analytics Maturity Scorecard | Five Steps to Analytical Maturity. A Guide for Pharma Commercial Operations. White Paper. SAS & PharmaVOICE. https://goo.gl/sKPqdQ | SAS Institute Inc.
10 | TDWI Analytics Maturity Model | TDWI Analytics Maturity Model Guide. TDWI Research. The Data Warehousing Institute. https://goo.gl/UVH3ia | TDWI, Halper, F., Stodder, D.
11 | Web Analytics Maturity Model | https://goo.gl/96nPyr | Hamel, S.
Source: Own elaboration.
Table 2. A comparative analysis of selected attributes of an organization’s analytics maturity models (AMMs).
AMM Attributes | Analytics Maturity Models
 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11
Public availability of the methodology
Number of maturity levels | 5 | 1 * | 5 | 5 | 5 | 5 | 5 | 5 | 5 | 5 | 5
Number of assessment dimensions (key process areas, key elements) | 6 | 5 | 6 | 4 | 7 | 5 | 1 ** | 6 | 4 | 5 | 6
Score | AMQ; DELTA Score; Benchmark Scores; Score (names given to the overall assessment result in selected models)
1. Analytic Processes Maturity Model (APMM). 2. Analytics Maturity Quotient Framework. 3. Blast Analytics Maturity Assessment Framework. 4. DAMM—Data Analytics Maturity Model for Associations. 5. DELTA Plus Model. 6. Gartner’s Maturity Model for Data and Analytics. 7. Logi Analytics Maturity Model. 8. Online Analytics Maturity Model. 9. SAS Analytics Maturity Scorecard. 10. TDWI Analytics Maturity Model. 11. Web Analytics Maturity Model. * Overall (total) score of analytics maturity on a scale ranging from 0 to 100 points (AMQ score). ** An assessment of analytics maturity is carried out by means of surveys, in the “embedded analytics” dimension. Source: own elaboration.
Table 3. Comparison of the nomenclature of stages in the analytics continuum.
AMM | A Stage in the Analytics Continuum
 | 1 | 2 | 3 | 4 | 5
1 | Building reports | Building and deploying models | Building and deploying analytics | Enterprise-wide processes for analytics | Analytics is strategy driven
2 | | | |
3 | Laggard | Follower | Competitor | Leader | Innovator
4 | Learning | Planning | Building | Applying | Leading
5 | Analytically Impaired (Not Data Driven) | Localized Analytics (Use Reporting) | Analytical Aspirations (See the Value of Analytics) | Analytical Companies (Good at Analytics) | Analytical Competitors (Analytical Nirvana)
6 | Basic | Opportunistic | Systematic | Differentiating | Transformational
7 | Standalone Analytics | Bolt-On Analytics | Inline Analytics | Analytics Infused | Genius Analytics
8 | | | |
9 | Analytically Unaware | Analytically Aware | Analytically Astute | Empowered | Explorative
10 | Nascent | Pre-Adoption | Early Adoption | Corporate Adoption | Mature Visionary
11 | Impaired / Initiated | Operational | Integrated | Competitor | Addicted
1. Analytic Processes Maturity Model (APMM). 2. Analytics Maturity Quotient Framework. 3. Blast Analytics Maturity Assessment Framework. 4. DAMM—Data Analytics Maturity Model for Associations. 5. DELTA Plus Model. 6. Gartner’s Maturity Model for Data and Analytics. 7. Logi Analytics Maturity Model. 8. Online Analytics Maturity Model. 9. SAS Analytics Maturity Scorecard. 10. TDWI Analytics Maturity Model. 11. Web Analytics Maturity Model. Source: own elaboration.
