A Comprehensive Data Maturity Model for Data Pre-Analysis
Abstract
1. Motivation
2. Related Work
3. Methodology
3.1. Questionnaire
- C1 (Completeness): The completeness criterion describes whether sufficient data have been collected for the analysis and whether the dataset is populated accordingly. This criterion also covers the completeness of the paradata and metadata.
- C2 (Uniqueness): The uniqueness criterion describes the uniqueness of the data. The goal of this criterion is to record each entry in the dataset only once in order to avoid duplicates within the data.
- C3 (Timeliness): The timeliness criterion describes the delay between the receipt of the data and their inclusion in the corresponding dataset. This criterion is highly dependent on the application of the data preparation process and the subsequent data analysis.
- C4 (Validity/Interoperability): The validity/interoperability criterion describes the correctness of the data in terms of formats, syntax, and further processing steps. Mixed formats within a dataset, text responses that could have been converted to numbers, or incorrectly chosen scales can make further processing of the data very difficult.
- C5 (Accuracy): The accuracy criterion describes how accurately the data reflect the underlying reality, i.e., whether the collected data accurately represent the real world or the event being described. In addition, this criterion indicates whether the data have potentially been influenced during the data preparation process.
- C6 (Consistency): The consistency criterion describes whether the data are logically interconnected, i.e., whether the dataset is free of contradictions. When contradictions do occur, it is important to distinguish between plausible, valid contradictions in individual records and true contradictions, i.e., entries that contradict related data. The consistency criterion focuses only on true inconsistencies, requiring that similar records provide the same information and the data remain consistent.
- C7 (Credibility/Accessibility/Findability): The credibility/accessibility/findability criterion describes the reliability of the data and the underlying data sources. This criterion depends on factors such as the verifiability of the data sources, the use of known and widely used methods in the data preparation process, and the authorization to use the collected and prepared data.
- C8 (Relevance/Interpretability): The relevance/interpretability criterion describes the relevance and affiliation of the data to the defined objectives of the subsequent data analysis. Accordingly, this criterion is concerned with understanding the defined objectives of the subsequent analysis in order to prepare the data in the best possible way during the data preparation process.
- C9 (Reusability): The reusability criterion describes whether the data can be reused for other analyses. Reusability depends not only on the publication of the prepared data, paradata, and metadata, but also on the nature of the dataset: complete, cleaned, and organized datasets are well suited for reuse.
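Several of these criteria can be probed directly on a tabular dataset. The maturity model itself assesses them via the questionnaire described below rather than via automated checks, but the following minimal pandas sketch illustrates what completeness, uniqueness, and validity checks can look like in practice; the input file and column names are hypothetical.

```python
import pandas as pd

# Illustrative only: the maturity model assesses these criteria via a
# questionnaire, not via automated checks. File and column names are hypothetical.
df = pd.read_csv("survey_responses.csv")  # hypothetical input file

# C1 Completeness: share of missing values per column.
missing_share = df.isna().mean()

# C2 Uniqueness: number of fully duplicated records.
duplicate_count = df.duplicated().sum()

# C4 Validity/Interoperability: text responses that should be numeric.
age_numeric = pd.to_numeric(df["age"], errors="coerce")        # hypothetical column
invalid_age_count = age_numeric.isna().sum() - df["age"].isna().sum()

print(missing_share, duplicate_count, invalid_age_count, sep="\n")
```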
- Each question in the questionnaire is clearly assigned to only one criterion.
- The criteria are not communicated to the respondent in advance; therefore, the respondent is unaware of which question is assigned to which criterion.
- The order of the questions gives no indication of the criteria.
- All questions are decision questions that can be answered with yes, no, or not relevant (see the data-model sketch after this list).
- All questions are formulated independently of any topic.
- If a question is unclear, an explanation or an example is provided.
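These design rules map naturally onto a small data model. The sketch below is one possible representation, assuming Python dataclasses; the class and field names, the identifier scheme, and the example item are illustrative and not taken from the paper.

```python
from dataclasses import dataclass
from enum import Enum

class Answer(Enum):
    YES = "yes"
    NO = "no"
    NOT_RELEVANT = "not relevant"

@dataclass
class Question:
    """One questionnaire item, assigned to exactly one criterion (C1-C9).
    The criterion assignment is internal and not shown to the respondent."""
    qid: str               # e.g. "Q01" (identifier scheme is hypothetical)
    text: str              # topic-independent decision question
    criterion: str         # e.g. "C1" -- exactly one criterion per question
    explanation: str = ""  # optional clarification or example shown on demand

# Hypothetical example item:
q = Question(qid="Q01",
             text="Have all planned data sources been collected completely?",
             criterion="C1",
             explanation="E.g., no survey waves or sensor logs are missing.")
answer = Answer.NOT_RELEVANT
```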
3.2. Importance Weighting via Expert Interviews
- How important are the predefined individual criteria in the data preparation process?
3.3. Qualitative Content Analysis
3.4. Implementation
4. Determination of the Level of Maturity
- If all criteria are completely fulfilled, the level of maturity is 100%.
- A question assigned to a criterion influences that criterion itself as well as all other criteria that depend on it (see Figure 1).
- The effect of a question on a criterion is referred to as the influence factor in this model.
- Each question, and thus each influence factor, is weighted according to the weighting scheme.
- The influence factor equals 1 for the criterion to which the question is assigned.
- The influence factor is halved for each dependency step (a computational sketch follows after this list).
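To make these rules concrete, the following minimal Python sketch implements them under stated assumptions: the dependency graph shown is hypothetical (the actual dependencies are given in Figure 1), the question identifiers and weights are illustrative, and the final aggregation (achieved influence divided by achievable influence per criterion, with questions marked not relevant excluded) is one plausible reading of the model rather than the paper's exact formula.

```python
from collections import deque

# HYPOTHETICAL dependency graph between criteria; the real dependencies are
# defined in Figure 1 of the paper and are not reproduced in this section.
DEPENDS_ON = {
    "C4": {"C1"},          # hypothetical: C4 depends on C1
    "C6": {"C4"},          # hypothetical: C6 depends on C4
    "C9": {"C1", "C6"},    # hypothetical: C9 depends on C1 and C6
}
CRITERIA = [f"C{i}" for i in range(1, 10)]

# Reverse view: which criteria depend directly on a given criterion.
DEPENDENTS = {c: {d for d, deps in DEPENDS_ON.items() if c in deps} for c in CRITERIA}


def influence_factor(assigned: str, target: str) -> float:
    """1 for the criterion the question is assigned to, halved once per
    dependency step towards criteria that depend on it, 0 otherwise."""
    if assigned == target:
        return 1.0
    queue, seen = deque([(assigned, 0)]), {assigned}
    while queue:
        criterion, distance = queue.popleft()
        for dependent in DEPENDENTS[criterion]:
            if dependent == target:
                return 0.5 ** (distance + 1)
            if dependent not in seen:
                seen.add(dependent)
                queue.append((dependent, distance + 1))
    return 0.0


def maturity_per_criterion(answers, question_weights):
    """answers: {question_id: (assigned_criterion, "yes" | "no" | "not relevant")}
    question_weights: {question_id: weight from the weighting scheme}.
    Aggregating achieved over achievable influence is an assumption; the
    paper's exact formula may differ."""
    result = {}
    for criterion in CRITERIA:
        achievable = achieved = 0.0
        for qid, (assigned, answer) in answers.items():
            if answer == "not relevant":
                continue  # excluded from both sums
            contribution = question_weights[qid] * influence_factor(assigned, criterion)
            achievable += contribution
            if answer == "yes":
                achieved += contribution
        if achievable:
            result[criterion] = achieved / achievable
        else:
            result[criterion] = 1.0  # no relevant influence: treated as fulfilled (sketch choice)
    return result


# Tiny usage example with three hypothetical questions and weights:
answers = {"Q1": ("C1", "yes"), "Q2": ("C4", "no"), "Q3": ("C6", "not relevant")}
weights = {"Q1": 0.70, "Q2": 0.29, "Q3": 0.32}
print(maturity_per_criterion(answers, weights))
```

With this aggregation, a questionnaire in which every relevant question is answered with yes yields 100% for every criterion, which matches the first bullet above.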
5. Result Representation
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Interview Guide
Appendix B. Coding Frame
| Name of the code | Completeness |
|---|---|
| Content description | Describes if sufficient data have been collected for the analysis, if the dataset is filled accordingly, and if the corresponding paradata and metadata are available. |
| Use of the code | Completeness is to be coded if the following aspects are mentioned: |

| Name of the code | Uniqueness |
|---|---|
| Content description | Describes the impact of duplicates in the dataset. |
| Use of the code | Uniqueness is to be coded if the following aspects are mentioned: |

| Name of the code | Timeliness |
|---|---|
| Content description | Describes effects of delays between the receipt and the recording of the data. |
| Use of the code | Timeliness is to be coded if the following aspects are mentioned: |

| Name of the code | Validity/Interoperability |
|---|---|
| Content description | Describes the correctness of the data in terms of formats, syntax, and further processing steps. |
| Use of the code | Validity/Interoperability is to be coded if the following aspects are mentioned: |

| Name of the code | Accuracy |
|---|---|
| Content description | Describes how accurately the data reflect the underlying event that is being described. |
| Use of the code | Accuracy is to be coded if the following aspects are mentioned: |

| Name of the code | Consistency |
|---|---|
| Content description | Describes if the data are logically interconnected and the influences of genuine contradictions. |
| Use of the code | Consistency is to be coded if the following aspects are mentioned: |

| Name of the code | Credibility/Accessibility/Findability |
|---|---|
| Content description | Describes the reliability of the data and the underlying data sources. |
| Use of the code | Credibility/Accessibility/Findability is to be coded if the following aspects are mentioned: |

| Name of the code | Relevance/Interpretability |
|---|---|
| Content description | Describes the affiliation of the data to the objectives of the data analysis. |
| Use of the code | Relevance/Interpretability is to be coded if the following aspects are mentioned: |

| Name of the code | Reusability |
|---|---|
| Content description | Describes the aim to reuse the data for other analyses or projects. |
| Use of the code | Reusability is to be coded if the following aspects are mentioned: |
Appendix C. Checklist
References
Criterion | Number of Questions in the Questionnaire |
---|---|
Completeness | 12 |
Uniqueness | 4 |
Timeliness | 1 |
Validity/Interoperability | 6 |
Accuracy | 8 |
Consistency | 6 |
Credibility/Accessibility/Findability | 7 |
Relevance/Interpretability | 6 |
Reusability | 2 |
| Interview | C1 | C1 (−) | C2 | C2 (−) | C3 | C3 (−) | C4 | C4 (−) | C5 | C5 (−) | C6 | C6 (−) | C7 | C7 (−) | C8 | C8 (−) | C9 | C9 (−) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| B1 | 5 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 3 | 1 | 0 | 0 | 0 | 0 | 12 | 0 | 0 | 0 |
| B2 | 5 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 6 | 1 | 4 | 0 | 9 | 0 | 1 | 0 | 0 | 0 |
| B3 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 | 0 | 2 | 0 | 4 | 0 | 10 | 1 | 2 | 0 |
| B4 | 0 | 1 | 0 | 0 | 0 | 0 | 4 | 0 | 6 | 0 | 4 | 0 | 3 | 0 | 8 | 0 | 1 | 0 |
| B5 | 6 | 0 | 1 | 1 | 1 | 0 | 5 | 0 | 2 | 0 | 1 | 0 | 10 | 0 | 2 | 0 | 1 | 0 |
| B6 | 4 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 3 | 0 | 1 | 0 | 7 | 0 | 3 | 0 | 1 | 0 |
| B7 | 3 | 1 | 0 | 1 | 0 | 0 | 5 | 0 | 2 | 1 | 7 | 0 | 3 | 0 | 8 | 0 | 0 | 0 |
| B8 | 13 | 1 | 0 | 1 | 0 | 0 | 2 | 0 | 4 | 0 | 2 | 1 | 2 | 0 | 6 | 0 | 0 | 0 |
| B9 | 1 | 0 | 2 | 0 | 1 | 0 | 3 | 1 | 10 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 4 | 1 |
| B10 | 5 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 | 1 | 4 | 0 | 4 | 0 | 3 | 2 | 0 | 0 |
| B11 | 10 | 1 | 0 | 0 | 0 | 0 | 3 | 0 | 2 | 0 | 6 | 1 | 5 | 0 | 8 | 0 | 0 | 2 |
| B12 | 5 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 6 | 0 | 0 | 1 | 1 | 0 | 6 | 1 | 0 | 1 |
| B13 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 |
| B14 | 2 | 2 | 0 | 2 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 3 | 2 | 7 | 0 | 0 | 0 |
| B15 | 16 | 0 | 0 | 2 | 0 | 0 | 3 | 2 | 8 | 1 | 4 | 1 | 4 | 0 | 3 | 2 | 2 | 0 |
| ∑ | 80 | 8 | 6 | 9 | 2 | 0 | 31 | 4 | 70 | 5 | 36 | 4 | 56 | 3 | 82 | 6 | 11 | 4 |
| Question Category \ Criterion | Completeness | Uniqueness | Timeliness | Validity/Interoperability | Accuracy | Consistency | Credib./Accessib./Findab. | Relevance/Interpretability | Reusability |
|---|---|---|---|---|---|---|---|---|---|
| Completeness | 1 | 0.5 | 0.0625 | 0.5 | 0.25 | 0.5 | 0.25 | 0.125 | 0.5 |
| Uniqueness | 0.25 | 1 | 0.015625 | 0.5 | 0.0625 | 0.125 | 0.0625 | 0.03125 | 0.5 |
| Timeliness | 0.25 | 0.125 | 1 | 0.125 | 0.5 | 0.125 | 0.25 | 0.5 | 0.25 |
| Validity/Interoperability | 0.5 | 0.25 | 0.03125 | 1 | 0.125 | 0.25 | 0.125 | 0.0625 | 0.5 |
| Accuracy | 0.25 | 0.125 | 0.25 | 0.125 | 1 | 0.125 | 0.5 | 0.5 | 0.5 |
| Consistency | 0.25 | 0.125 | 0.125 | 0.5 | 0.5 | 1 | 0.5 | 0.25 | 0.5 |
| Credib./Accessib./Findab. | 0.25 | 0.125 | 0.25 | 0.125 | 0.5 | 0.125 | 1 | 0.5 | 0.5 |
| Relevance/Interpretability | 0.5 | 0.25 | 0.5 | 0.25 | 0.25 | 0.25 | 0.125 | 1 | 0.25 |
| Reusability | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
| Criterion | Importance | Scheme Step 1 | Scheme Step 2 (Maximum) | Final Weights |
|---|---|---|---|---|
| Completeness | 2.25 | 2.25 | 3.21 | 0.70 |
| Uniqueness | −0.09 | 0.00 | 3.21 | 0.00 |
| Timeliness | 0.07 | 0.07 | 3.21 | 0.02 |
| Validity/Interop. | 0.93 | 0.93 | 3.21 | 0.29 |
| Accuracy | 2.57 | 2.57 | 3.21 | 0.80 |
| Consistency | 1.03 | 1.03 | 3.21 | 0.32 |
| Credib./Access./Find. | 1.81 | 1.81 | 3.21 | 0.56 |
| Relevance/Interpret. | 3.21 | 3.21 | 3.21 | 1.00 |
| Reusability | 0.23 | 0.23 | 3.21 | 0.07 |
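The table above suggests that the weighting scheme first clamps negative importance values to zero (Scheme Step 1) and then normalizes by the maximum of the clamped values, 3.21 (Scheme Step 2). The following minimal Python sketch reproduces the final weights under that assumption; it is an inference from the table values, not the paper's authoritative definition, and the variable names are illustrative.

```python
# Sketch of the weighting scheme inferred from the table above:
# Step 1 clamps negative importance values to zero, Step 2 divides by the
# maximum of the clamped values. This is an assumption, not the paper's
# authoritative definition.
importance = {
    "Completeness": 2.25, "Uniqueness": -0.09, "Timeliness": 0.07,
    "Validity/Interop.": 0.93, "Accuracy": 2.57, "Consistency": 1.03,
    "Credib./Access./Find.": 1.81, "Relevance/Interpret.": 3.21,
    "Reusability": 0.23,
}

step1 = {c: max(v, 0.0) for c, v in importance.items()}    # clamp negatives
maximum = max(step1.values())                               # 3.21
final_weights = {c: round(v / maximum, 2) for c, v in step1.items()}

print(final_weights)
# {'Completeness': 0.7, 'Uniqueness': 0.0, 'Timeliness': 0.02, ...,
#  'Relevance/Interpret.': 1.0, 'Reusability': 0.07}
```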