Article

An Evaluation Framework for Cybersecurity Maturity Aligned with the NIST CSF

by Luís Bernardo 1,†, Silvestre Malta 1,† and João Magalhães 2,*,†
1 ADiT-Lab, ESTG, Instituto Politécnico de Viana do Castelo, 4900-348 Viana do Castelo, Portugal
2 CIICESI, ESTG, Instituto Politécnico do Porto, Rua do Curral, 4610-156 Felgueiras, Portugal
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Electronics 2025, 14(7), 1364; https://doi.org/10.3390/electronics14071364
Submission received: 7 February 2025 / Revised: 19 March 2025 / Accepted: 26 March 2025 / Published: 28 March 2025
(This article belongs to the Special Issue Recent Advances in Information Security and Data Privacy)

Abstract:
Cybersecurity is critical for mitigating the economic and reputational impacts of cyberattacks. To address these risks, frameworks like the NIST Cybersecurity Framework (NIST CSF) provide standardized guidelines for managing and reducing cybersecurity threats. This paper presents a maturity assessment approach aligned with the NIST CSF, incorporating a dual-survey methodology. The first survey engages cybersecurity experts to calibrate question importance, while the second targets organizations across management, IT staff, and other roles. The approach employs algorithms to deliver consistent evaluations and facilitate cross-organization comparisons. Results from case studies illustrate cybersecurity maturity levels for each NIST CSF function and highlight priority controls for enhancing organizational cybersecurity.

1. Introduction

Cybersecurity threats are continuously evolving and becoming increasingly complex, posing a significant challenge to organizations around the world. Given the prevalence of data breaches, cyberattacks, and other ongoing malicious activities, it is essential that organizations adopt tools to improve their overall cybersecurity. The COVID-19 pandemic made cybersecurity even more essential in organizations by accelerating the digital transformation process. Another contributing factor is the rise of remote work: organizations have been forced to allow this new way of working, which further increases risk. These factors have led to significant increases in the number of cyberattacks, with cybercriminals exploiting the vulnerabilities left behind to break into systems and cause often irreparable damage. Cyberattacks come in many forms, and their consequences can be severe: impacts range from financial losses and damage to a company’s reputation to legal liabilities that can make the business nonviable and force it to close permanently. Due to this, cybersecurity has become a top priority for organizations, regardless of the industry in which they operate.
Given the scale of this challenge, many organizations have adopted cybersecurity frameworks to ensure the integrity of their operations and to minimize the risks generated by these cyber threats. The NIST Cybersecurity Framework (NIST CSF) is probably the most widely used by companies because it provides a flexible framework for managing cyber risks and is relatively easy to implement. However, despite the availability of these resources, not all organizations are able to adopt risk-minimizing measures: doing so requires resources and an awareness of the risks involved, and it can be blocked by a lack of commitment from people in leadership positions who could change the organization’s direction on this issue. Therefore, assessing the maturity level of an organization’s cybersecurity practices is critical, and this assessment must include identifying gaps and weaknesses in the framework that encompasses its cybersecurity. Through this, organizations can develop efficient and effective strategies to improve their cybersecurity performance and mitigate cyber risks.
The main objective of this work is to evaluate and improve the maturity of cybersecurity within an organization. This effort seeks to identify areas that can be improved, identifying potential changes in attitude and behavior, with the ultimate goal of consistently advancing cybersecurity management practices.
To attain this goal, the NIST CSF serves as a supporting tool, with two distinct surveys designed for different target groups. First, insights from cybersecurity experts (Group I) were sought to refine and validate the conclusions, ultimately culminating in the development of a new methodology for assessing cybersecurity maturity. Second, a comprehensive survey was conducted at various organizational levels (Group II), using online questionnaires as the main data collection tool. Based on the expert surveys, it was possible to develop a new framework that indicates the cybersecurity level of organizations and helps them improve their cybersecurity capabilities through the implementation of controls that strengthen the three pillars of cybersecurity: people, processes, and technology.
The remainder of this paper is structured as follows: Section 2 explores related work, focusing on the nuances of NIST CSF and studies related to cybersecurity maturity assessment. Section 3 outlines the methodology employed to evaluate maturity levels. Section 4 discusses use cases and presents the results obtained. Finally, Section 5 provides the conclusion.

2. Literature Review

This section presents the literature review focused on cybersecurity maturity assessment. Cybersecurity is a critical aspect of personal, organizational, and national security in today’s digital world. It encompasses the protection of computer systems, networks, and digital information from unauthorized access, use, disclosure, disruption, modification, or destruction. The increasing prevalence of cyber threats, including phishing, malware, ransomware, data breaches, and denial-of-service attacks, underscores the need for robust cybersecurity measures.
Cybersecurity frameworks provide a structured approach to organizing and implementing best practices to safeguard systems and networks against these threats. Some of the most widely used frameworks include International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) 27001, NIST CSF, Center for Internet Security (CIS), and PCI DSS.
The NIST CSF is a widely adopted framework that offers a common language and approach for organizations to manage and reduce cybersecurity risk. It is divided into six main functions: govern, identify, protect, detect, respond, and recover. By utilizing this framework, organizations can effectively identify, prioritize, and manage cybersecurity risks.

Cybersecurity Maturity Assessment—Related Work

In [1], the authors propose a new framework called Cyber Trust Index (CTI) to address the lack of a standardized and consistent approach to measuring cybersecurity performance in organizations. Their research found that most organizations lack a formal security performance rating system and struggle to consistently measure their security posture. The CTI framework aims to provide a structured approach to assessing and improving cybersecurity performance.
In [2], the use of capability levels (CLs) to enhance the implementation of the NIST CSF is proposed. This approach enables organizations to tailor their NIST CSF implementation to their specific needs, improving their overall cybersecurity posture and aligning cybersecurity efforts with business objectives. Organizations employing CLs reported improved alignment between cybersecurity efforts and business objectives, enhanced communication and collaboration among departments, a better understanding of cybersecurity risks, and a more formalized cybersecurity program.
A study [3] conducted at a university in Malaysia evaluated the security of academic information systems using NIST CSF SP 800-26. The study found that, while security policies were well established, there were significant gaps in implementation. User awareness of security policies was lacking, technical controls were deficient, software and hardware were outdated, passwords were weak, and wireless networks were insecure. These issues led to a moderate security posture for the systems. The study recommended improving security measures, implementing stronger technical controls, increasing user awareness, providing training and education programs, and conducting regular audits and reviews to ensure the effectiveness of security measures. The study highlights the importance of adopting and maintaining cybersecurity measures for the protection of information and demonstrates how NIST CSF SP 800-26 can be used to evaluate and improve security.
In [4], the authors proposed the adoption of NIST CSF as a methodological approach to cybersecurity management at government organizations. The NIST CSF, a widely recognized framework, offers guidance on managing and mitigating cybersecurity risks. To evaluate the effectiveness of this methodology, a case study was conducted at a government organization. The study employed a mixed-methods approach that combined employee interviews, observations, and document analysis. The results demonstrated that the implementation of the NIST CSF effectively improved the organization’s cybersecurity posture. The framework provided a common language and a structured approach to cybersecurity management, allowing the organization to identify and prioritize cybersecurity risks, implement appropriate controls, and strengthen cybersecurity practices. The authors advocate for adopting the NIST CSF as a methodological approach to cybersecurity management at government organizations and recommend further research to explore the framework’s effectiveness in other organizational contexts.
The authors in [5] proposed the Cybersecurity Maturity Assessment Framework (CMAF), which introduces a structured approach to evaluating and enhancing the cybersecurity posture of organizations, particularly operators of essential services (OESs) and digital service providers (DSPs), in compliance with the EU NIS Directive. The framework was designed to standardize feedback collection, assign security levels based on implemented controls, and provide benchmarking and comparative analysis across different entities. It incorporates 20 baseline security requirements categorized into identification, protection, and reaction and a six-level maturity scale ranging from incomplete to optimized. Validated through pilot implementations in healthcare, digital infrastructures, and air-transport sectors, the CMAF demonstrated its effectiveness in offering a comprehensive and adaptable method for cybersecurity assessment. This framework supports self-assessment and external audits and facilitates continuous improvement and alignment with best practices, making it a valuable tool for national competent authorities aiming to enhance cybersecurity across various sectors.
In [6], the authors analyze NIST CSF for its applicability within critical infrastructure sectors and propose a new information security maturity model (ISMM) to address gaps in measuring implementation progress. The study identifies that, while NIST CSF provides a comprehensive set of guidelines and best practices, it lacks mechanisms to measure the maturity levels of information security processes. By comparing NIST CSF with other frameworks such as Control Objectives for Information and Related Technologies (COBIT) [7], ISO/IEC 27001 [8], and the Information Security Forum (ISF) Standard of Good Practice (SoGP), the authors highlight the necessity of a maturity model that complements NIST CSF. They propose a five-level maturity model that includes 23 areas assessed, addressing the compliance and capability measurement needs overlooked with NIST CSF.
The authors of [9] provide a comprehensive evaluation of various cybersecurity frameworks, with a focus on the NIST CSF. These frameworks are essential for organizations to improve security, efficiently manage workloads, and minimize cyber risks. The paper discusses the five core functions of these frameworks: identify, protect, detect, respond, and recover, emphasizing their application across different domains such as information technology, cyber–physical systems, industrial control systems, and the Internet of Things. The authors highlight the increasing importance of protecting critical infrastructure from cyberattacks, citing recent incidents like the ransomware attack on the Colonial Pipeline. They advocate for continuous updates and employee training to maintain cybersecurity resilience and suggest the establishment of a national cyber-governance bureau to improve cooperation among academia, industry, and government entities for enhanced cyber defense.
According to the study by [10], the authors emphasize the critical necessity of small and medium-sized enterprises (SMEs) improving their cybersecurity protocols to align with technological progress and reduce associated risks. According to historical records, small and medium-sized enterprises have been slow to implement essential security measures [11,12], a concern that is growing in importance with the introduction of the proposed Consumer Data Protection Act. The authors advocate for the adoption of their Cybersecurity Evaluation Tool (CET), a practical framework for SMEs to assess and enhance their cybersecurity maturity. The CET enables IT leaders to identify vulnerabilities, benchmark their practices against industry standards, and prioritize key improvement areas. Given that surveyed SMEs fall short of the ideal maturity levels, it is imperative for these companies to proactively engage in robust cybersecurity practices to effectively safeguard sensitive data and demonstrate diligence in post-breach scenarios.
In [13], the authors present the Cybersecurity Vulnerability Mitigation Framework via Empirical Paradigm (CyFEr) and its advanced prioritized gap analysis (EPGA) approach, which leverages existing frameworks such as the NIST CSF and the Cybersecurity Capability Maturity Model (C2M2). CyFEr differentiates itself from traditional frameworks by not only focusing on qualitative analysis and vulnerability identification but also providing a mechanism for prioritized mitigation to achieve the desired cybersecurity maturity. The effectiveness of CyFEr is demonstrated by benchmarking against established models and being subjected to actual cyber-attacks targeting industrial control systems in critical infrastructure environments.
According to [14], the authors underscore the vital significance of data and information security for organizations in the digital age, emphasizing its pivotal role in attaining good corporate governance. The focus is on the ABC organization, a government body overseeing Indonesia’s critical infrastructure and digital economy which, despite adopting recognized international security standards such as NIST CSF, ISO 27002 [15], COBIT, and PCI DSS, has not yet achieved optimal information security management readiness. The research evaluates these standards and introduces a comprehensive cybersecurity maturity framework encompassing twenty-one integrated categories. This framework aims to assess and enhance the cybersecurity maturity of ABC’s ICT management.
In [16], a cybersecurity maturity model is presented to address the growing threat of cyberattacks on web applications. This model, grounded in NIST CSF and ISO 27032 [17] standards, outlines four maturity levels: initial, defined, established, and improved. It comprises twelve domains and four categories, forming a comprehensive framework. The model integrates two quality standards and is deployed as a web application, allowing automated, user-friendly evaluations with instant results and improvement suggestions. Validation by fifteen experts confirmed the model’s effectiveness in quickly assessing and enhancing web application security. Its dual-standard foundation, ease of use, and accessibility underscore its superiority over other models.
In [18], a conceptual framework termed the Cybersecurity Resilience Maturity Measurement (CRMM) is introduced. It is designed to mitigate the significant risk of cybersecurity breaches prevalent in African nations. The CRMM framework quantifies the effectiveness of cybersecurity controls in organizations, especially for critical information infrastructure (CII), before and after cyberattacks. It leverages the NIST CSF Cybersecurity Framework and other standards to assess an organization’s cybersecurity practices and resilience maturity. It defines four cybersecurity resilience quadrants (CRQs) showing levels of organizational preparedness and resilience. The framework’s model quantifies cybersecurity performance, identifying gaps, and improvement areas. It uses a performance-rating technique to compare current cybersecurity status with best practices and desired future performance.
The literature review on cybersecurity maturity assessment highlights the critical need for robust cybersecurity measures in today’s digital landscape, emphasizing the importance of frameworks such as ISO/IEC 27001, NIST CSF, CIS, and PCI DSS.
Notably, the NIST CSF, with its six core functions, offers a comprehensive approach to managing and reducing cybersecurity risks. Various studies propose enhancements and new models to address gaps in current practices, such as the CTI framework for consistent cybersecurity performance measurement [1], the use of capability levels (CLs) to tailor NIST CSF implementations [2], and the Cybersecurity Maturity Assessment Framework (CMAF) for compliance with the EU NIS Directive [5]. Additionally, the research underscores the need for tailored implementations, increased user awareness, and continuous improvement in cybersecurity practices across sectors, including government organizations, academic institutions, and critical infrastructure [3,4,6].

3. Work Methodology

The methodology adopted in this work uses the NIST CSF and proposes an additional tool to assess cybersecurity maturity. This methodology was initially conceptualized and partially validated in a master’s thesis [19], which served as the basis for this expanded and refined method. This approach aims to cover two distinct and essential audiences in the context of cybersecurity: the common user, who plays a crucial role in preventing threats, and cybersecurity experts, who have in-depth knowledge of the challenges and strategies in the area. To achieve this objective, personalized questionnaires have been developed for each group. Based on the analysis of the responses obtained, a maturity score is calculated, using the weighting between the questionnaires, which reflects the current state of the organization concerning the security of its data and systems.

3.1. Survey Design

To accurately assess cybersecurity maturity, a comprehensive survey was carried out involving system users, service routines, cybersecurity professionals, and a wide range of organization staff.
Participants were divided into two main groups: experts (Group I), made up of people with experience in cybersecurity, and the general group. The general group (Group II) was further subdivided into three subgroups: management, comprising directors or managers; technical (IT), covering IT infrastructure personnel; and non-technical, covering other organization staff.
The 108 NIST CSF cybersecurity benchmarks were consolidated into more general statements to facilitate the administration of the questionnaire. Participants rated their level of agreement with each statement using a five-point scale.
In addition, a separate survey was distributed to cybersecurity experts via LinkedIn. The experts were divided into two groups according to their level of experience (less than five years or five years and more). The objective in this case is to identify what is most relevant and a priority for each respondent, who assigned a score to each statement ranging from 1 to 10, with 1 being the least relevant and 10 being the most relevant.
This approach aims to gain a more comprehensive view of perceptions and knowledge related to cybersecurity. Data collection through multiple groups and with different experiences is expected to contribute to obtaining more robust and representative insights in the context of the study.
Figure 1 presents the survey structure that has been used. Group I survey, with 15 questions, focuses on cybersecurity experts, gathering opinions on the importance levels attributed to NIST CSF functions (identify, protect, detect, respond, and recover), their categories, and associated controls. Group II survey is uniquely designed for three distinct audiences: management board (24 questions), IT technicals (53 questions), and non-technical stakeholders (20 questions).
It is important to point out that, for all three groups, the statements covered the five functions, which allows a more global analysis. At all levels, participants rated their degree of agreement with each statement using a five-point scale: strongly disagree (SD), disagree (D), neither agree nor disagree (NAD), agree (A), and strongly agree (SA).
Concerning the survey statistical data, questionnaires were distributed across three distinct organizational groups (management, IT staff, and other stakeholders) in four small and medium-sized enterprises (SMEs), each comprising approximately 50 employees. Across the three surveyed organizational roles (management, IT, and others), response rates were substantial, reaching approximately 70.42% overall. The breakdown by company is as follows:
  • Company A contributed 3 management respondents (100% response rate within group), 2 IT respondents (100%), and 16 others (80%).
  • Company B provided 1 management response (100%), 1 IT response (100%), and 8 others (45%).
  • Company C provided responses from 1 management participant (100%), 1 IT respondent (100%), and 10 other stakeholders (93%).
  • Company D included 1 management respondent (100%), 1 IT respondent (100%), and 7 others (78%).
To ensure the integrity of our statistical analysis, incomplete responses were addressed using available data for each respondent, filling in missing values with neutral scores. Of the cybersecurity experts, half had over five years of experience, while the other half had less. These details underscore the credibility of calculating the importance levels as outlined in Algorithm 1.
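The neutral-fill handling of incomplete responses can be sketched as follows (a minimal illustration, not the authors' implementation; the function name and data shape are assumptions):

```python
# Minimal sketch of the neutral-fill strategy for incomplete responses.
# The five-point scale maps SD=1 ... SA=5, so the neutral score is 3
# ("neither agree nor disagree"). Names and data shapes are hypothetical.
NEUTRAL_SCORE = 3

def fill_incomplete(responses, num_questions):
    """Replace missing answers (None) and pad short response lists
    with the neutral score so every respondent has a full vector."""
    filled = [a if a is not None else NEUTRAL_SCORE for a in responses]
    filled += [NEUTRAL_SCORE] * (num_questions - len(filled))
    return filled

print(fill_incomplete([5, None, 4], 5))  # [5, 3, 4, 3, 3]
```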
Algorithm 1 Importance-degree calculation.
  • Input: 15 statements, based on the NIST CSF.
  • for each statement do
  •     A value is assigned (1–10).
  • end for
  • for each NIST function do
  •     The sum of all results is calculated.
  •     Calculate the average experience:
  •         E = (K · 1 + Z · 0.5) / (K + Z), where:
  •         K is the number of experts with experience greater than or equal to 5 years.
  •         Z is the number of experts with experience of less than 5 years.
  •     Calculate the importance degree:
  •         G = (Σ of all results) × E
  • end for
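Algorithm 1 can be sketched in Python as follows (an illustrative reading of the algorithm with hypothetical function names, not the authors' code):

```python
def average_experience(k, z):
    """E = (K * 1 + Z * 0.5) / (K + Z), where K experts have >= 5 years
    of experience and Z experts have less."""
    return (k * 1.0 + z * 0.5) / (k + z)

def importance_degree(scores, k, z):
    """G = (sum of all expert scores for a NIST function) * E."""
    return sum(scores) * average_experience(k, z)

# Example: 10 experts (6 with >= 5 years of experience), each scoring
# a given NIST CSF function 8 out of 10.
E = average_experience(6, 4)           # 0.8
G = importance_degree([8] * 10, 6, 4)  # 80 * 0.8 = 64.0
print(E, G)
```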
The assessment conducted adhered strictly to the guidelines outlined in the European General Data Protection Regulation (GDPR). As per these regulations, the identities of both experts and organization participants remain anonymized throughout the entire assessment process and subsequent reporting of results. This ensures that individual privacy and confidentiality are rigorously maintained, reflecting our commitment to upholding the highest standards of data protection and ethical practice mandated by the GDPR framework.

3.2. Cybersecurity Maturity Assessment

The calculation of cybersecurity maturity is the result of integrating two different matrices: one derived from responses collected through Group II surveys and the other consisting of contributions from cybersecurity experts (Group I). The inclusion of cybersecurity experts alongside distinct organizational groups such as management, IT personnel, and general staff provides a multi-dimensional perspective, reinforcing the scalability and adaptability of the proposed framework across organizations of varying sizes and sectors.

3.2.1. Importance-Level Determination

The Group I survey used a structured questionnaire that assigned questions to NIST CSF functions, categories, and controls. Cybersecurity experts provided professional opinions, assigning importance levels to each element. The subsequent analysis ensured a nuanced and weighted perspective, reflecting the relative significance of these elements in the context of cybersecurity maturity. Figure 2 illustrates the methodology used to calculate the value of each question, which is based on the importance levels established via the Group I survey.
Importance levels derived from the Group I survey responses become integral to the maturity calculation presented in Algorithm 1. The maturity level is not only influenced by direct survey responses but is also weighted by the relative importance assigned by cybersecurity experts. This layered approach ensures a balanced and informed evaluation. As stated in Algorithm 1, experts assign a score from 1 (least relevant) to 10 (most relevant) to each cybersecurity statement aligned with the NIST CSF functions. These scores are aggregated by function, taking into account the experience of the experts (full weight for experts with at least 5 years of experience and half weight otherwise). The aggregated scores yield an importance degree per function that influences subsequent maturity calculations. For clarity, consider a hypothetical scenario: if 10 experts (6 with at least 5 years of experience and 4 less experienced) rated the “Identify” function with an average score of 8, the weighted importance calculation would be as in Equations (1) and (2).
E = (6 · 1 + 4 · 0.5) / 10 = 0.8
The overall importance degree (G) per NIST CSF function is computed via the following:
G = (total score from all experts) × E

3.2.2. Audience-Specific Survey for Maturity Assessment

The Group II survey, directed at the board, IT staff, and other stakeholders, aims to assess the perceived effectiveness and priority of specific controls within the NIST functions. Tailored questions address the unique perspectives and priorities of each audience, allowing for a comprehensive understanding of the organization’s cybersecurity posture. Figure 3 presents the flow methodology for the value calculation on each question present in the survey.
Algorithm 2 outlines the computations performed at each stage to derive the weighted matrix. Each response is weighted by the recalculated importance levels derived from Algorithm 1. For example, if the importance degree for the “Protect” function is calculated to be 0.03, and the organization’s average response score for a particular control in the Protect function is 4, the final weighted value for that question would be computed as in Equation (3):
Weighted Response Value = 4 × 0.03 = 0.12
In practice, if the “Protect” function had an importance degree of 0.028, and the organization’s average response score for a relevant question was 4, the weighted result would be 4 × 0.028 = 0.112.
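Equation (3) amounts to a single multiplication; a minimal sketch (function name assumed):

```python
def weighted_response(avg_score, importance):
    """Equation (3): weight an organization's average Likert score by the
    importance degree of the corresponding NIST CSF function."""
    return avg_score * importance

print(round(weighted_response(4, 0.03), 3))   # 0.12
print(round(weighted_response(4, 0.028), 3))  # 0.112
```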
Algorithm 2 Calculation of the importance degree.
  • Step 1: Calculate the value of each question (organizations)
  •     Using the following formula: Value = 1 / (number of questions per NIST group and per survey type)
  • Step 2: Construction of the organizations matrix
  •     Each questionnaire has its own matrix, based on the provided answers, which have different weights (SD-1 | D-2 | NAD-3 | A-4 | SA-5).
  • Step 3: Calculate the value of each question (experts)
  •     R = (result obtained for each question in Step 1)
  •     I = (% importance degree of experts)
  •     Using the following formula: Value = R × I
  • Step 4: Construction of the expert matrix
  •     The matrix constructed in Step 2 is weighted according to the output of Step 3, generating a new matrix.
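The four steps of Algorithm 2 can be sketched as follows (an illustrative reading with hypothetical names and toy data, not the authors' implementation):

```python
# Likert weights from Step 2 of Algorithm 2.
LIKERT = {"SD": 1, "D": 2, "NAD": 3, "A": 4, "SA": 5}

def question_value(n_questions):
    """Step 1: each question in a NIST group/survey type is worth 1/n."""
    return 1.0 / n_questions

def expert_weighted_matrix(answers, importance):
    """Steps 2-4: build the organization matrix from Likert answers,
    then weight every cell by Value = R * I (per-question value R times
    the experts' importance degree I)."""
    r = question_value(len(answers[0]))
    return [[LIKERT[a] * r * importance for a in row] for row in answers]

# Toy example: two respondents, two questions, importance degree 0.03.
matrix = expert_weighted_matrix([["A", "SA"], ["NAD", "D"]], importance=0.03)
```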

3.2.3. Determining Final Maturity Level

Following the creation of the matrices (for the management group, for those who work in the IT area, for the group of other stakeholders, and for experts), which provided the values of the responses for each question, the calculation phase begins. With this calculation, it is possible to suggest, as long as the organization has responded at all levels, in which subcategory of the NIST CSF function there is space for improvements, or what the main points of attention are.
Figure 4 presents the workflow used in the process of final maturity level calculations.
It consists of the following:
  • Consolidation of research results.
    • The outcome is derived from combining the matrices based on the type of questionnaire (management board, IT, and others).
    • At this point, just the results (the computed value per response) from each respondent’s answers in the questionnaires are taken into account.
  • Average calculation per NIST CSF function.
    • The simple average of all responses is calculated.
  • Recalculation of the degree of importance of experts.
    • The experts’ importance score is recalculated based on the obtained responses.
  • Application of the degree of importance of experts.
    • The new importance score of experts is used to consolidate the research results.
  • Average results considering the recalculated expert index.
    • The simple average of all responses is calculated after the application of the recalculated expert index.
  • Calculation of the maturity index considering the recalculated expert index.
    • With the expert index recalculated, it is possible to calculate the maturity index from the experts’ perspectives.
    • This index is obtained by multiplying the new expert index by the total results of the initial research by NIST CSF function.
  • Average results considering the recalculated expert index.
    • The simple average considering the recalculated expert index of all responses is recalculated.
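The averaging steps above can be sketched as follows (hypothetical names and toy data; the recalculated expert index is taken as given):

```python
def maturity_per_function(responses, expert_index):
    """Average the consolidated response values per NIST CSF function,
    then apply the recalculated expert importance index to obtain the
    expert-weighted maturity index."""
    plain = {f: sum(v) / len(v) for f, v in responses.items()}
    weighted = {f: plain[f] * expert_index[f] for f in plain}
    return plain, weighted

# Toy data: consolidated per-question values for two functions.
plain, weighted = maturity_per_function(
    {"identify": [0.10, 0.12, 0.08], "protect": [0.12, 0.112]},
    {"identify": 0.8, "protect": 0.9},
)
```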
As demonstrated in Table 1, after all calculations are performed, it becomes feasible to compare the responses gathered from the questionnaires with those enhanced by the experts’ significance ratings. With these values, the organization’s standing on the cybersecurity maturity scale can be assessed.

4. Experimentation and Results Analysis

This section presents the results of applying the methodology presented in Section 3. The experimental analysis of the usefulness of the methodology includes the following: (i) consultation with experts and respective analysis of results; (ii) consultation with organizations and the respective maturity assessment by NIST CSF function; and (iii) an analysis of the results based on the percentage obtained by function, category, subcategory, and controls, thus allowing the identification of controls to be implemented or improved (GAP analysis).

4.1. The Experts Analysis

To effectively capture the diverse perspectives and knowledge levels within the cybersecurity domain, a tailored questionnaire approach was adopted. This questionnaire targets experts in the field and aims to capture the full breadth and depth of the NIST CSF, focusing on advanced concepts and best practices. Its main function is to calibrate the answers obtained in the other surveys, bringing a perspective based on the experiences of these experts. The expert survey consists of 15 questions. Each respondent assigned a score between 1 and 10 (total score) to each statement, reflecting the perceived importance of each cybersecurity aspect. The analysis of the results follows Algorithm 1.
To determine the individual value of each expert question, we consider the results of their completed surveys. Figure 5 presents a compilation of the importance assessments attributed by experts to each of the NIST CSF functions.
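Algorithm 1 is not reproduced in this excerpt; the following hypothetical sketch shows one plausible way the experts’ 1–10 ratings could be normalized into the per-function importance percentages of Figure 5 (the ratings below are illustrative, not the study’s data):

```python
# Hypothetical expert-calibration step: each expert rates every NIST CSF
# function from 1 to 10; ratings are normalized into percentages summing to 100.

def importance_percentages(ratings: dict[str, list[int]]) -> dict[str, float]:
    """Normalize per-function expert ratings into importance percentages."""
    totals = {fn: sum(scores) for fn, scores in ratings.items()}
    grand_total = sum(totals.values())
    return {fn: round(100 * t / grand_total, 1) for fn, t in totals.items()}

# Illustrative ratings from three hypothetical experts:
ratings = {
    "identify": [9, 8, 9],
    "protect": [4, 4, 5],
    "detect": [9, 10, 9],
    "respond": [2, 2, 1],
    "recover": [6, 6, 6],
}
print(importance_percentages(ratings))
```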
To determine the weight of each function, which is used in the final matrix calculation, the importance percentage is converted according to the formula presented in Equation (4).
Value of each question = (% importance / 100) / (number of questions per function)
This equation takes into account the number of questions in the survey directed to organizations, per target group (management board: 24; IT: 53; others: 20). The final result is presented in Table 2.
For example, the value 0.022 for the identify function in the management target group was obtained using Equation (5).
Value of each question (example) = (29.0 / 100) / 13 ≈ 0.022
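Equations (4) and (5) reduce to a one-line computation; a minimal sketch:

```python
# Equation (4): a question's weight is the function's importance percentage
# divided by 100 and then by the number of questions that function has in
# the target group's survey.

def question_weight(importance_pct: float, num_questions: int) -> float:
    return importance_pct / 100 / num_questions

# Worked example from Equation (5): identify function, management group
# (importance 29.0%, 13 questions).
w = question_weight(29.0, 13)
print(round(w, 3))  # -> 0.022
```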
Using the value of each question, the final value to be attributed to each answer was determined. For example, Table 3 presents the values used to score the answers provided by the management group in the identify function. Note that the value marked * 0.022 corresponds to the value of each question for the management group in the identify function.

4.2. Cybersecurity Maturity Assessment

The calculation of cybersecurity maturity is the result of integrating two different matrices (as illustrated in Figure 4): one derived from responses collected through the surveys made to organizations and the other consisting of contributions from cybersecurity experts.
The experimental analysis involved four different organizations. The surveys were made available for 40 days.
The results were determined using Algorithm 2. The results of step 3 of the algorithm are presented in Table 4. Q1, Q2, and Qn represent the different questions, and the values are aggregated per NIST CSF function, considering the target groups surveyed (management, IT, and others). The results are calculated per organization.
Given the results obtained from step 3 of Algorithm 2, the next step was to determine maturity by function, taking into account the weight that experts attribute to each function. This was achieved using Equation (6), where i is the question number; Id, Pr, De, Res, and Rec denote the NIST CSF functions identify, protect, detect, respond, and recover; and the result is the recalculated maturity score (b).
RecQ_i = (% Experts / 100) · (Q_i_Id + Q_i_Pr + Q_i_De + Q_i_Res + Q_i_Rec)
The maturity average (a) corresponds to the arithmetic mean of the questions’ maturity scores per function. The calculated values are presented in Table 5 and identified in red. For presentation purposes, only two questions were used.
The global maturity score is the arithmetic mean of the per-function maturity averages. In the scenario presented, it is 3.33 on a scale of 0–5, where 5 is the best maturity score and 0 is the worst.
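Equation (6) and the global score can be sketched as follows. The cap at 5 is an assumption inferred from Table 5, where no recalculated score exceeds the top of the scale; small differences from the published values stem from intermediate rounding:

```python
# Equation (6): a question's recalculated maturity weights the sum of its
# per-function scores by the experts' importance percentage for the function.
# Capping at 5.0 (the scale maximum) is an assumption inferred from Table 5.

def recalculated_score(importance_pct: float,
                       per_function_scores: list[float]) -> float:
    return min(5.0, importance_pct / 100 * sum(per_function_scores))

# Worked example from Table 5, question Q1 (scores for Id, Pr, De, Res, Rec):
q1 = [3.73, 4.40, 3.00, 3.00, 3.00]
print(round(recalculated_score(29.0, q1), 2))  # identify -> 4.97 (~4.96 in Table 5)
print(round(recalculated_score(14.2, q1), 2))  # detect   -> 2.43

# Global maturity: arithmetic mean of the per-function maturity averages (a).
averages = [4.89, 5.00, 2.35, 0.98, 3.45]
print(round(sum(averages) / len(averages), 2))  # -> 3.33
```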
Considering all the responses, the analysis shows that, for example, company B has different levels of maturity by function, with an overall maturity of 3.67 when only its own perception is considered and a cybersecurity maturity score of 3.52 when the experts’ opinions are taken into account.
The results obtained by function and overall can be seen in Figure 6.
For presentation reasons, only organization B is detailed. Organization A obtained 1.48 both for its own cybersecurity maturity assessment and after calibration with the experts’ opinions. Company C obtained a maturity of 4.45 on its own assessment and 3.81 after the experts’ calibration. Company D obtained 1.66 and 1.62, respectively.
The results obtained allow us to identify the functions where the organization is performing best and worst, and they also allow for a considered analysis based on the opinion of experts.

4.3. Practical Implementation Challenges and Recommendations

Implementing cybersecurity frameworks in practice frequently encounters notable challenges, especially for small enterprises or those with constrained resources. Typical obstacles encompass financial limitations, a shortage of skilled cybersecurity staff, and inadequate organizational awareness and dedication. To address these problems, various strategies can be suggested, including the following:
  • Cost-effective tools: utilize open-source cybersecurity tools and platforms (e.g., OpenVAS v23.16.1 for vulnerability assessments, Suricata v7.0.10 for intrusion detection, and OpenSCAP v1.3 for compliance checks), which can substantially reduce financial barriers.
  • Training programs: implement structured training and awareness programs leveraging freely available cybersecurity resources from reputable entities such as NIST and ENISA. Online training courses, webinars, and workshops can effectively increase organizational cybersecurity awareness without significant investment.
  • Phased adoption plans: recommend a phased adoption strategy, prioritizing critical controls first, based on the identified maturity gaps. Organizations can start with high-impact, low-cost initiatives to achieve immediate improvements, progressively addressing more complex areas as resources permit.
  • Collaboration and partnerships: encourage collaboration with industry groups, cybersecurity communities, or academia to leverage shared resources, expertise, and best practices, thus minimizing the internal resources required.
  • Continuous evaluation and feedback loops: establish an iterative process of continuous assessment and feedback to regularly adjust strategies based on lessons learned, ensuring incremental and sustainable improvements without overwhelming organizational resources.

4.4. GAP Analysis

From the results presented in the previous subsection, company B has a satisfactory level of maturity in cybersecurity. Functions such as identify and detect are very close to the maximum level of maturity, but the respond function is very poor; i.e., the company has a very low level of maturity in incident response. These findings indicate that, despite its overall score, the company is inadequately prepared to handle cyber threats once an incident occurs. The analysis carried out using the methodology presented here also allows comparison between organizations. This comparison can act as extra motivation for applying controls that improve cybersecurity posture.
The analysis adopted and presented in this paper allows the identification of the controls that need to be implemented or improved to enhance the cybersecurity posture. In the example, and based on the results obtained, the cybersecurity maturity analysis of company B identified that the protect, respond, and recover functions have maturity levels below expectations. The improvement suggestions therefore focus on the NIST CSF categories identified as having the greatest potential for improvement. The suggestions, focused on the categories with the worst results, are as follows:
  • Function: protect|category: data security.
    • Data classification: implement an effective data classification system, assigning sensitivity and restriction levels to each type of information.
    • End-to-end encryption: adopt end-to-end encryption to protect data in transit and at rest.
    • Granular access control: implement an access control system with levels of granularity that allow only authorized personnel to have access to data relevant to their functions. This will minimize the risk of information being leaked or misused.
    • Ongoing monitoring: establish an ongoing monitoring system to identify unusual behavior or suspicious activity in relation to the data.
    • Data retention policies: define clear data retention policies, determining how long different types of information will be kept.
    • Anonymization and pseudonymization: explore anonymization and pseudonymization techniques to reduce the identifiability of personal data while maintaining its usefulness for analysis and internal operations.
    • Vulnerability tests: conduct regular vulnerability tests on the systems that house the data, identifying possible security breaches and correcting them promptly.
    • Awareness training: promote regular training on data security awareness for all employees, ensuring that they understand the importance of protecting information and know how to do so properly.
  • Function: protect|category: identity management and access control.
    • Multi-factor authentication (MFA): Expand the adoption of multi-factor authentication across all critical systems and resources. MFA provides an additional layer of security by requiring users to provide multiple authentication factors before gaining access.
    • Strong password policies: Implement strict password creation and update policies, promoting strong and complex passwords. In addition, encourage the use of password managers to prevent unsafe practices.
    • Principle of least privilege: Adopt the principle of least privilege when granting access to resources and systems. Ensure that users only have the permissions they need to perform their roles, thereby reducing the potential attack surface.
    • Access monitoring: establish a real-time access monitoring system to detect unusual patterns or suspicious activity. This will allow for the early detection of unauthorized access attempts.
    • Regular access review: Perform regular access permission reviews, ensuring that only active and authorized users have access to resources. This will minimize the risk of unauthorized access by former employees or inactive users.
    • Identity management tools: implement identity management tools that facilitate efficient and secure user provisioning and de-provisioning.
    • Segregation of duties: ensure that users have distinct role assignments, preventing the overlapping of permissions and minimizing the risk of internal abuse.
    • Awareness training: promote regular awareness training on secure identity management and access control practices. This will ensure that employees understand the importance of protecting access credentials.
  • Function: respond|category: response planning.
    • Incident response team (IRT): strengthen the structure and capacity of the incident response team, including defining clear roles and responsibilities, as well as ongoing training to keep skills up to date.
    • Updated response plan: Keep an incident response plan up to date and accessible to all team members. This will ensure that everyone knows their roles and knows how to act in different scenarios.
    • Training scenarios: conduct regular incident simulations to train personnel and test the response plan in controlled situations.
    • Effective communication: define clear internal and external communication procedures during an incident, ensuring that all relevant parties are informed in a timely manner.
    • Recovery and mitigation: integrate recovery and mitigation measures into a comprehensive response plan to restore operational normality and minimize damage.
    • Impact assessment: incorporate a detailed assessment of the potential impact of incidents, considering operational, financial, and reputational aspects.
    • Post-incident communication strategy: define a post-incident communication strategy to manage information disclosure and mitigate the impact on the reputation of the organization.
    • External collaboration: establish collaboration protocols with external entities, such as suppliers, partners, and regulatory authorities, for a coordinated response.
    • Ongoing review: conduct periodic reviews of the response plan to identify areas for improvement and adjust strategies based on lessons learned.
  • Function: respond|category: improvements.
    • Structured post-incident assessment: institute a structured post-incident assessment, involving all relevant parties, to identify strengths and weaknesses of the response and identify opportunities for improvement.
    • Clear corrective actions: define specific and measurable corrective actions based on lessons learned from previous incidents.
    • Action tracking: implement a corrective action tracking system to ensure that they are implemented effectively and produce the desired results.
    • Best practice standards: incorporate best practice standards, such as NIST CSF recommendations, to consistently and comprehensively drive improvements.
    • Innovation and technology: explore the use of emerging technologies, such as artificial intelligence and automation, to optimize post-incident response and analysis processes.
    • Ongoing training: provide regular training to the incident response team, incorporating the lessons learned and the best practices identified.
    • Periodic reviews: conduct periodic reviews of implemented improvements to assess their effectiveness and make adjustments as necessary.
    • Knowledge sharing: foster a culture of knowledge sharing that allows security teams to learn from incidents collaboratively.
    • Transparent communication: maintain transparent communication about the improvements implemented and the results achieved, promoting trust between the interested parties.
  • Function: recover|category: communications.
    • Pre-defined communication plan: develop a pre-defined communication plan that covers the different recovery scenarios, identifying the interested parties, the communication channels, and the messages to be transmitted.
    • Assigned communications team: designate a team responsible for coordinating communication during the recovery process, ensuring that information is transmitted in a clear and coordinated manner.
    • Internal and external communication: establish protocols for communication both internally, with the teams involved in the recovery, and externally, with partners, suppliers, customers, and regulatory authorities.
    • Consistent messages: ensure that the messages transmitted are consistent and aligned with the current situation, avoiding conflicting information.
    • Transparency and regular updates: maintain transparent communication by providing regular updates on recovery progress, even when there are no significant developments to report.
    • Diverse communication channels: utilize a variety of communication channels, such as email, instant messaging, and collaboration platforms, to effectively reach stakeholders.
    • Feedback and questions: provide channels to receive feedback from stakeholders and respond to questions, demonstrating accountability and commitment to resolution.
    • Communication of completion: communicate clearly when the recovery process is complete and normal operations are restored.
    • Post-recovery assessment: conduct post-recovery assessments to review communication effectiveness, identify areas for improvement, and adjust processes as needed.
The application of these controls can now be framed in an implementation plan allowing the organization to improve its cybersecurity posture. The continuous assessment of cybersecurity maturity will contribute to security in cyberspace, digital trust, and protections against economic and reputational losses resulting from cyberattacks.
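The GAP-identification step described above can be sketched as a simple filter over the per-function scores. The threshold and the example scores below are hypothetical; the paper selects functions "below expectations" rather than naming a fixed cutoff:

```python
# GAP-identification sketch: flag the NIST CSF functions whose maturity falls
# below a chosen threshold so that improvement controls can be prioritized.
# The 4.0 threshold is an assumption, not a value from the paper.

def functions_below(maturity: dict[str, float],
                    threshold: float = 4.0) -> list[str]:
    """Return the below-threshold functions, weakest first."""
    return sorted((fn for fn, score in maturity.items() if score < threshold),
                  key=maturity.get)

# Hypothetical per-function scores for an organization:
example_scores = {"identify": 4.6, "protect": 3.1, "detect": 4.4,
                  "respond": 2.0, "recover": 2.8}
print(functions_below(example_scores))  # -> ['respond', 'recover', 'protect']
```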

5. Conclusions

This study aimed to introduce a supplementary tool for assessing cybersecurity maturity levels, guided by the NIST CSF. The findings indicate that organizations typically exhibit a moderate level of cybersecurity maturity, though there is considerable diversity among different organizations.
The study underscores the importance of cybersecurity prevention, urging organizations to take proactive steps, such as establishing strong security policies, improving employee education, and consistently updating systems and software. It stresses that every organization encounters distinct cybersecurity challenges, necessitating bespoke prevention strategies and measures tailored to their particular needs and resources.
According to the study, employing the maturity score as an additional approach can help assess and improve organizations’ security posture. By frequently assessing their maturity, organizations can detect and address weaknesses before their exploitation by attackers. Additionally, investing in preventive strategies is vital for safeguarding organizations from cyberattacks. Enforcing strong security protocols, increasing employee awareness, and keeping systems and software current significantly lower the chance of becoming vulnerable to cyber threats.
This study significantly augments the existing body of cybersecurity knowledge, providing critical insights into the developmental stages of organizational cybersecurity maturity. The results will aid organizations in evaluating their security stance and enhancing their defenses against cyber threats. Additionally, the results will help pinpoint strengths and weaknesses in cybersecurity, enabling organizations to formulate improvement strategies.
While the study makes significant contributions, it acknowledges some limitations, including the small sample size of companies. Future research should examine how cybersecurity maturity indices affect organizational security posture, assess the comparative effectiveness of various preventive measures across different organizations, and explore future directions in cybersecurity maturity.
Beyond the small sample size, we acknowledge that our methodology depends significantly on subjective expert opinions, which can introduce biases arising from individual experiences and viewpoints. These biases may affect assessment results, making them somewhat dependent on the expertise involved. To address this, future research could employ objective metrics and broaden the diversity of expert panels. Additionally, integrating thorough objective metrics, such as incident response times or the frequency and severity of security breaches, was beyond our current scope due to organizational data limitations. Recognizing the value of these metrics for enhancing the framework’s reliability, upcoming research will aim to include and assess these quantitative factors, enriching expert analyses with practical incident data and standard industry benchmarks.
Additionally, another relevant aspect is evaluating how our model aligns with emerging technological trends and frameworks. In particular, it is necessary to assess whether the NIST CSF remains effective in addressing specific security challenges associated with advanced technologies. It is possible that emerging frameworks may be more suitable for assessing maturity in highly dynamic or technologically complex contexts, such as environments involving the Internet of Things (IoT) and critical infrastructures. Therefore, future studies should directly compare the performance of our framework against emerging approaches like the proposal in [20] in order to validate or recommend evolutions in our current methodology.
Also, emerging technologies, particularly Artificial Intelligence (AI) and Machine Learning (ML), play a pivotal role in addressing contemporary cybersecurity challenges, yet their integration into our current evaluation framework remains underexplored. To bridge this gap and enhance our methodology’s effectiveness, we propose explicitly aligning our framework with AI and ML capabilities. Specifically, future enhancements can leverage AI-driven analytics to automate real-time threat detection, anomaly identification, and predictive incident analysis, thus significantly improving accuracy and response efficiency. For instance, ML-driven anomaly detection tools can process extensive log data to quickly identify unusual patterns indicative of potential security incidents, while AI-driven automation can streamline incident response procedures. Additionally, employing predictive analytics powered by AI could anticipate vulnerabilities and inform proactive measures, thus advancing organizational cybersecurity maturity substantially. Future iterations of our framework will explicitly integrate these advanced technological approaches, thereby bolstering its applicability and effectiveness across increasingly complex cyber-threat landscapes.
Although this study incorporates robust metrics and results derived from these metrics, we recognize that our current analysis primarily relies on averages for assessing cybersecurity maturity. Future research could benefit from integrating additional statistical measures such as median, mode, standard deviation, variance, and distribution analysis. Furthermore, the introduction of ML techniques to validate results and provide automated recommendations for cybersecurity enhancements represents an exciting and valuable direction for future work.
In summary, this study offers an extensive evaluation framework, along with insightful suggestions for enhancing cybersecurity maturity within organizations. By implementing proactive measures and continuously evaluating their maturity levels, organizations can greatly reduce the likelihood of cyberattacks and protect their digital assets.

Author Contributions

L.B.: Conceptualization of this study, Methodology, Writing. S.M. and J.M.: Writing—review & editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Malaivongs, S.; Kiattisin, S.; Chatjuthamard, P. Cyber Trust Index: A Framework for Rating and Improving Cybersecurity Performance. Appl. Sci. 2022, 12, 11174. [Google Scholar] [CrossRef]
  2. Dedeke, A. Cybersecurity Framework Adoption: Using Capability Levels for Implementation Tiers and Profiles. IEEE Secur. Priv. 2017, 15, 47–54. [Google Scholar] [CrossRef]
  3. Poningsih, P.; Lubis, M.R. Analysis and evaluation of academic information system security using NIST SP 800-26 framework. Sink. J. Dan Penelit. Tek. Inform. 2022, 7, 267–273. [Google Scholar] [CrossRef]
  4. Frayssinet Delgado, M.; Esenarro, D.; Juárez Regalado, F.F.; Reátegui, M.D. Methodology based on the NIST cybersecurity framework as a proposal for cybersecurity management in government organizations. Cuad. Desarro. Apl. Las Tic 2021, 10, 123–141. [Google Scholar] [CrossRef]
  5. Drivas, G.; Chatzopoulou, A.; Maglaras, L.; Lambrinoudakis, C.; Cook, A.; Janicke, H. A NIS Directive Compliant Cybersecurity Maturity Assessment Framework. In Proceedings of the 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC), Madrid, Spain, 13–17 July 2020; pp. 1641–1646. [Google Scholar] [CrossRef]
  6. Almuhammadi, S.; Alsaleh, M. Information Security Maturity Model for Nist Cyber Security Framework. In Proceedings of the Sixth International Conference on Information Technology Convergence and Services, Sydney, Australia, 25–26 February 2017; pp. 51–62. [Google Scholar] [CrossRef]
  7. COBIT 5; A Business Framework for the Governance and Management of Enterprise IT. ISACA: Rolling Meadows, IL, USA, 2012.
  8. ISO/IEC 27001:2022; Information Security, Cybersecurity and Privacy Protection—Information Security Management Systems—Requirements. ISO: Geneva, Switzerland, 2022.
  9. Saritac, U.; Liu, X.; Wang, R. Assessment of Cybersecurity Framework in Critical Infrastructures. In Proceedings of the 2022 IEEE Delhi Section Conference (DELCON), New Delhi, India, 11–13 February 2022; pp. 1–4. [Google Scholar] [CrossRef]
  10. Benz, M.; Chatterjee, D. Calculated risk? A cybersecurity evaluation tool for SMEs. Bus. Horizons 2020, 63, 531–540. [Google Scholar] [CrossRef]
  11. Bougaardt, G.; Kyobe, M. Investigating the factors inhibiting SMEs from recognizing and measuring losses from cybercrime in South Africa. In Proceedings of the ICIME 2011-Proceedings of the 2nd International Conference on Information Management and Evaluation: ICIME 2011 Ryerson University, Toronto, ON, Canada, 27–28 April 2011; p. 62. [Google Scholar]
  12. Stasiak, K. Middle Market Companies Underestimate Cybersecurity Risks, 2018. Available online: https://www.industryweek.com/leadership/article/22026028/middle-market-companies-underestimate-cybersecurity-risks (accessed on 15 July 2024).
  13. Gourisetti, S.; Mylrea, M.; Patangia, H. Cybersecurity vulnerability mitigation framework through empirical paradigm: Enhanced prioritized gap analysis. Future Gener. Comput. Syst. 2020, 105, 410–431. [Google Scholar] [CrossRef]
  14. Sulistyowati, D.; Handayani, F.; Suryanto, Y. Comparative Analysis and Design of Cybersecurity Maturity Assessment Methodology Using NIST CSF, COBIT, ISO/IEC 27002 and PCI DSS. JOIV Int. J. Inform. Vis. 2020, 4, 225–230. [Google Scholar] [CrossRef]
  15. ISO/IEC 27002:2022; Information Security, Cybersecurity and Privacy Protection—Information Security Controls. ISO: Geneva, Switzerland, 2022.
  16. Arenas, E.; Palomino, J.; Mansilla, J.P. Cybersecurity Maturity Model to Prevent Cyberattacks on Web Applications Based on ISO 27032 and NIST. In Proceedings of the 2023 IEEE XXX International Conference on Electronics, Electrical Engineering and Computing (INTERCON), Lima, Peru, 2–4 November 2023; pp. 1–8. [Google Scholar] [CrossRef]
  17. ISO/IEC 27032:2012; Information Technology—Security Techniques—Guidelines for Cybersecurity. ISO: Geneva, Switzerland, 2012.
  18. Mbanaso, U.; Abrahams, L.; Apene, O.Z. Conceptual Design of a Cybersecurity Resilience Maturity Measurement (CRMM) Framework. Afr. J. Inf. Commun. 2019, 23, 1–26. [Google Scholar] [CrossRef]
  19. Bernardo, L. Proposta de uma metodologia de avaliação da maturidade da cibersegurança com base no NIST CSF [Proposal of a Methodology for Assessing Cybersecurity Maturity Based on the NIST CSF]. Master’s Thesis, Escola Superior de Tecnologia e Gestão, Instituto Politécnico de Viana do Castelo, Viana do Castelo, Portugal, 2023. Available online: http://hdl.handle.net/20.500.11960/3989 (accessed on 18 March 2025).
  20. Qureshi, S.U.; He, J.; Tunio, S.; Zhu, N.; Nazir, A.; Wajahat, A.; Ullah, F.; Wadud, A. Systematic review of deep learning solutions for malware detection and forensic analysis in IoT. J. King Saud Univ.—Comput. Inf. Sci. 2024, 36, 102164. [Google Scholar] [CrossRef]
Figure 1. Survey structure.
Figure 2. Workflow of experts matrix creation.
Figure 3. Workflow of general public matrix creation.
Figure 4. Workflow of final maturity-level calculation, where (a) represents the workflow for organization members (management board, IT, and others) and (b) the workflow for the cybersecurity experts.
Figure 5. Degree of importance of NIST CSF functions.
Figure 6. Company B: cybersecurity maturity per function and global (company vision versus experts).
Table 1. Maturity scale.

Result    Maturity
≤1.99     Very poor
≤2.99     Poor
≤3.99     Fair
≤4.99     Good
=5.00     Excellent
Table 2. Calculation of the value of each NIST CSF function.

Function   Importance %   Number of Questions        Value for Each Question
                          Mgmt   Others   IT         Mgmt    Others   IT
Identify   29.0           13     11       15         0.022   0.026    0.019
Protect    14.2           5      5        26         0.028   0.028    0.005
Detect     30.6           1      1        4          0.306   0.306    0.076
Respond    5.8            2      1        5          0.028   0.057    0.011
Recover    20.4           3      2        3          0.067   0.101    0.067
Total      100.0          24     20       53         0.451   0.518    0.178
Table 3. Example of distribution by response type.

Answer              %     Value for the Response Type
Strongly disagree   0     0
Disagree            25    0.005
Neutral             50    0.011
Agree               75    0.016
Strongly agree      100   0.022
* Each value is the answer percentage multiplied by 0.022, the value of each question for the management group in the identify function.
Table 4. Consolidation of results.

Function   Q1     Q2     ...   Qn
Identify   3.73   3.45
Protect    4.40   4.20
Detect     3.00   3.00
Respond    3.00   3.00
Recover    3.00   3.00
Table 5. Calculation of the maturity score.

Function   Maturity      Importance %   Initial Maturity   Recalculated
           Average (a)                  Score              Maturity Score (b)
                                        Q1      Q2         Q1      Q2
Identify   4.89          29.0           3.73    3.45       4.96    4.82
Protect    5.00          30.6           4.40    4.20       5.00    5.00
Detect     2.35          14.2           3.00    3.00       2.43    2.36
Respond    0.98          5.8            3.00    3.00       0.99    0.97
Recover    3.45          20.4           3.00    3.00       3.49    3.40
