Article

AHP-Based Evaluation of Discipline-Specific Information Services in Academic Libraries Under Digital Intelligence

1 Faculty of Humanities and Social Sciences, Macao Polytechnic University, Macao 999078, China
2 Fujian Vocational and Technical College of Forestry, Nanping 353000, China
* Author to whom correspondence should be addressed.
Information 2025, 16(3), 245; https://doi.org/10.3390/info16030245
Submission received: 11 February 2025 / Revised: 6 March 2025 / Accepted: 15 March 2025 / Published: 18 March 2025

Abstract
Over recent years, digital and intelligent technologies have been driving the transformation of discipline-specific information services in academic libraries toward user experience optimization and service innovation. This study constructs a quality evaluation framework for discipline-specific information services in academic libraries, incorporating digital-intelligence characteristics to provide theoretical references and evaluation guidelines for enhancing service quality and user satisfaction in an information-ubiquitous environment. Drawing on the LibQUAL+™, WebQual, and E-SERVQUAL service quality evaluation models and integrating expert interviews with the contextual characteristics of academic library discipline-specific information services, this study develops a comprehensive evaluation system comprising six dimensions—Perceived Information Quality, Information Usability, Information Security, Interactive Feedback, Tool Application, and User Experience—with fifteen specific indicators. The analytic hierarchy process (AHP) was applied to determine the weight of these indicators. To validate the practicality of the evaluation system, a fuzzy comprehensive evaluation method was employed for an empirical analysis using discipline-specific information services at Tsinghua University Library in China as a case study. The evaluation results indicate that the overall quality of discipline-specific information services at Tsinghua University Library is satisfactory, with Tool Application, Perceived Information Quality, and Information Usability identified as key factors influencing service quality. To further enhance discipline-specific information services in academic libraries, emphasis should be placed on service intelligence and precision-driven optimization, strengthening user experience, interaction and feedback mechanisms, and data security measures. These improvements will better meet the diverse needs of users and enhance the overall effectiveness of discipline-specific information services.

1. Introduction

For a long time, discipline-specific information services in academic libraries have been profoundly influenced by the transformation of information technology and the evolving demands of users, leading to continuous advancements in service models and quality. Particularly with the rise of big data, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI), academic libraries are transitioning from traditional resource management models to intelligent, personalized, and proactive information service models. In recent years, AI has achieved breakthroughs in knowledge representation, speech recognition, semantic analysis, and computer vision, driving transformative advancements across various industries. These developments have also led to an increasing adoption of intelligent service approaches in the field of discipline-specific knowledge consulting in academic libraries. The integration of generative AI and large language models (LLMs), such as ChatGPT, has fundamentally reshaped academic users’ information retrieval methods. With their capabilities in semantic understanding and natural language processing (NLP), LLMs have enabled more efficient information extraction and knowledge generation, further advancing the intelligent acquisition of academic resources. This transformation has enhanced ubiquitous information access, significantly influencing user search behaviors and information demand patterns, while also posing new challenges for traditional discipline-specific information services in academic libraries. As information-ubiquitous environments continue to evolve, these shifts in information-seeking behaviors and retrieval methods demand that academic libraries enhance the quality of discipline-specific information services. While generative-AI-driven service functionalities bring technological disruptions to traditional library services, they also present opportunities to address users’ diverse and personalized information needs. 
The ability to deliver sustainable and effective services, while reinforcing the library’s role in knowledge dissemination and academic support, has become a critical challenge for academic libraries in the digital-intelligence era [1].
In the modern information environment, quality is critical in shaping users’ expectations and satisfaction with the services provided by academic libraries [2]. Conducting information service performance evaluations based on service quality is a key initiative for shifting academic libraries from resource-centered institutions to user-centered service providers. This transition plays a practical and strategic role in enhancing the core competitiveness, collaboration capacity, and academic service value of libraries. For a long time, libraries have been under pressure to prove their worth [3], particularly as commercial information service providers (e.g., Elsevier, Springer, and AI-driven intelligent academic search engines) increasingly challenge traditional models of academic information services. To remain competitive and enhance the user experience, academic libraries must establish a refined discipline-specific information service quality evaluation system based on a profound understanding of service quality and its influencing factors from a user perspective [4]. The rapid advancement of information technology has not only reshaped our understanding and organization of data and knowledge but has also profoundly impacted the role of academic libraries as information hubs. As the core of the knowledge innovation ecosystem in higher education institutions, academic libraries provide an integrated environment for both formal and informal learning. Moreover, their discipline-specific information services serve as a critical support system for research innovation and the enhancement of teaching quality.
In the digital-intelligence era, where AI serves as the core driving force, Chinese academic libraries are actively supporting university innovation and the development of specialized disciplines. Leading institutions such as Tsinghua University Library have been at the forefront of integrating AI-driven information consultation systems to optimize users’ knowledge discovery processes. For instance, Tsinghua University Library has introduced the “AI+ Workshop”, which offers AI-powered academic tool training to help students and researchers develop skills in intelligent search and information analysis. Similarly, the University of Chinese Academy of Sciences Library has developed the “ChatLibrary AI Intelligent Service Platform”, which integrates AI-assisted Q&A, intelligent recommendations, and automated information analysis to enhance user interaction with library resources. Additionally, Peking University Library has launched a specialized program titled “AI for Academic Research”, aimed at assisting faculty and students in leveraging LLMs as academic assistants for efficient information retrieval and research support. However, it is important to note that while the rise of LLMs has significantly enhanced proactive knowledge acquisition, libraries remain irreplaceable in systematic knowledge management, deep discipline-specific information services, reliability filtering, and knowledge integration. At the same time, in the AI era, the rapid evolution and integration of digital-intelligence technologies are accelerating the transformation of traditional information services. In particular, in the post-pandemic era, the emergence of generative AI platforms has led to more specialized and profound changes in user information behaviors [5].
In the midst of societal transformation and waves of innovation, academic libraries face the critical challenge of whether their internal transformation capabilities can meet the evolving demands of academic information systems [6].
To promote the sustainable development of discipline-specific information services in academic libraries and enhance their institutional value proposition, this study aims to construct and validate a quality evaluation system tailored for academic library discipline-specific information services in an intelligent environment. With the rapid advancement of information technology, academic libraries are transitioning from traditional resource management-driven models to user experience optimization and intelligent service integration. To ensure the adaptability and sustainability of library information service quality in the information-ubiquitous environment of the digital-intelligence era, this study proposes a multidimensional evaluation framework for discipline-specific information services.
This study first draws on established service quality evaluation models, including LibQUAL+™, WebQual, and E-SERVQUAL, and integrates expert interviews with the operational characteristics of discipline-specific information services in academic libraries. As a result, a quality evaluation framework was developed, covering six core dimensions—Perceived Information Quality, Information Usability, Information Security, Interactive Feedback, Tool Application, and User Experience—with a total of 15 specific evaluation indicators. Subsequently, the analytic hierarchy process (AHP) was applied to determine the weighting of each indicator, and a fuzzy comprehensive evaluation method was employed to conduct an empirical analysis on discipline-specific information services at Tsinghua University Library, one of China’s leading academic institutions. This empirical analysis was designed to validate the scientific rigor, applicability, and operational feasibility of the proposed evaluation framework. The constructed evaluation system not only serves as a benchmark for assessing the quality of discipline-specific information services in academic libraries in the digital-intelligence era but also provides data-driven support for library administrators, enabling them to identify key service areas for improvement and optimize the allocation of critical academic information resources.

2. Literature Review

2.1. Research on the Evaluation of Discipline-Specific Information Services

Services are considered intangible entities [7], and their conceptualization has been extensively explored across various fields [8]. Service quality, as the core aspect of service evaluation, is defined as users’ subjective judgment of the superiority and excellence of a product or service [9]. In the information services domain, service quality has garnered significant attention [10]. With the advancement of computing and network technologies, the scope and content of information services have continuously expanded, evolving from traditional document delivery and information retrieval into a more diversified and interactive service landscape, encompassing information consultation, training, and knowledge management. At its core, the fundamental goal of information services is to facilitate an effective match between information needs and information acquisition [11]. As user needs have become the central driving force in both the research and practice of information services, the supply–demand logic has laid a solid foundation for subsequent theoretical advancements and has also provided a theoretical basis for the development of subsequent service evaluation models.
While examining the intrinsic principles of information services, information need-driven service models have gained increasing research attention. Early-stage information service evaluations primarily focused on the effectiveness of information production and utilization, contributing to the formalization of information services as an academic discipline. The introduction of the ISE model, which integrates both subjective factors and objective methodologies, has further refined information service evaluation research [12]. By placing users at the center of the evaluation framework, the model incorporates technological, social, and individual psychological factors to explain and predict users’ behavior and attitudes toward information services, thereby providing valuable analytical perspectives for subsequent user-perception-oriented studies.
Discipline-specific information services represent the application of information services in the academic and educational domain. As early as the 1970s, academic discussions on disciplinary specialization in academic libraries had already emerged [13]. With higher education libraries being profoundly influenced by technological transformations, the emergence of discipline-specific services, driven by information technology advancements, has facilitated the progressive evolution of service models. As digitalization and networking continue to deepen, numerous evaluation models related to information services have emerged, continuously enriching measurement dimensions. The library and information science community has actively adopted the best practices from the commercial and marketing sectors [14], including SERVQUAL, which has been utilized to analyze service quality and customer satisfaction in academic libraries [15]. Studies suggest that user satisfaction measurement in information services functions should integrate traditional user satisfaction models with the “reliability” and “empathy” dimensions of the SERVQUAL model, thereby providing a more comprehensive reflection of user perceptions and expectations [16]. Building upon SERVQUAL, the LibQUAL+™ model was further developed to assess user perception of service quality through three core dimensions: Affect of Service, Information Control, and Library as Place. Moreover, scholars have attempted to optimize the LibQUAL+™ model using fuzzy linguistic approaches to enhance the precision of user perception measurement [17]. Additionally, the E-SERVQUAL model has provided new insights into evaluating the quality of digital resources and remote services in libraries, while the WebQual model, which emphasizes user experience assessment, has been widely adopted and expanded across both traditional and digital library contexts [18,19]. 
Given the unique characteristics of academic libraries, targeted evaluation tools have been employed, incorporating elements from both SERVQUAL and E-S-QUAL to assess library e-services [20].
The development and application of these models have further enriched the theoretical framework of discipline-specific information service quality evaluation, providing a solid theoretical foundation and practical tools for assessing the quality of academic library information services.

2.2. Transformation of Discipline-Specific Information Services Driven by Digital and Intelligent Technologies

In the traditional model of discipline-specific information services, librarians or subject experts primarily relied on bibliographic resources and manual consultation methods to provide embedded and personalized reference services to faculty and students [21]. However, the growing scale and diversity of data in the era of ubiquitous information have introduced high-frequency, dynamic, and precise characteristics in information supply and demand. This shift has driven university libraries to transition from “human-driven” to “human–machine collaboration” and even “intelligence-driven” service models [22].
On the one hand, in an information-ubiquitous environment, academic libraries need to enhance their sensitivity to factors such as service quality and value [23]. Big data technologies have been progressively integrated into the core information service domains of many academic libraries [24]. By performing semantic mining and relational analysis on vast datasets, including academic publications, patents, research projects, and scholar profiles, libraries construct multidimensional knowledge networks that provide institutional members with specialized support for information retrieval and discovery [3]. On the other hand, artificial intelligence (AI) and machine learning algorithms are gradually permeating library service processes. These include personalized recommendations based on user behavior data, intelligent dialogues in virtual reference systems, and automated association matching in academic social networks [7]. Technological advancements are profoundly reshaping the interaction patterns between libraries and users, driving the transition from traditional search tools to multimodal intelligent assistants.
At present, there is widespread exploration of how generative AI technologies can be extensively utilized in academic knowledge systems [25]. Simultaneously, the service transformations triggered by digital and intelligent technologies have also reshaped the evaluation dimensions of discipline-specific information services. It is worth noting that traditional service quality models, such as SERVQUAL and LibQUAL+TM, primarily focus on user satisfaction with service attitudes, resource accessibility, and environmental facilities [26]. However, in the digital-intelligence environment, information ubiquity has become a defining feature of the service landscape. User demands for service convenience, responsiveness, information accuracy, and system intelligence have become increasingly prominent [27].

2.3. Analytic Hierarchy Process (AHP) in Information Service Quality Evaluation

The evaluation of information service quality is one of the key methods for measuring the effectiveness of academic library services. A well-designed evaluation system not only accurately reflects users’ perceptions and needs regarding discipline-specific information services but also provides data-driven support for the continuous optimization of academic libraries.
In an information-ubiquitous environment, traditional evaluation systems often fail to comprehensively capture the multidimensional characteristics of discipline-specific information services in universities. The analytic hierarchy process (AHP), which combines qualitative and quantitative approaches, offers a structured decision-making capability, presenting a new perspective for optimizing evaluation frameworks. This method decomposes complex problems into multiple levels and factors, conducting pairwise comparisons between them to derive the relative weights of different decision-making criteria. As a result, AHP enhances the systematic and scientific nature of information service quality evaluation [28].
Previous studies have applied AHP in various information-service-related domains to optimize the decision-making process. In the field of e-commerce service quality assessment, researchers have developed a multilayered evaluation framework encompassing website design, response speed, and information quality. Through AHP-based quantitative analysis, they identified the perceived importance of different factors and demonstrated that online transaction security, website usability, and timely service delivery are the core determinants of overall user experience [29]. Similarly, in knowledge management and healthcare information systems, AHP has been utilized to assess IT service performance, confirming its effectiveness in multicriteria decision environments for optimizing resource allocation [30]. Additionally, AHP has been employed in multicriteria decision analysis to evaluate the suitability of renewable energy infrastructure locations, contributing to the development of efficient and environmentally friendly energy solutions [31]. Relevant studies have analyzed the key success factors in the information service industry’s expansion into international markets and proposed the adoption of the AHP method for strategic evaluation to optimize decision-making processes [32]. In the knowledge information service sector, research has explored the role of AHP in evaluating knowledge management tools, demonstrating that this method provides a systematic weighting analysis that assists organizations in selecting appropriate management tools [33]. Furthermore, studies indicate that the hierarchical structure of AHP contributes to the refinement of service quality evaluation standards, such as the accessibility of information, the breadth and depth of academic resources, and user satisfaction with service experience, thereby enhancing the accuracy and robustness of evaluation frameworks [34].

2.4. Summary

Libraries have a long-standing history of performance evaluation, which has become a key approach for enhancing effective management [18]. Academic libraries demonstrate their institutional value through three primary aspects: collections, information services, and education [35]. Existing research has developed various information service evaluation theories and models, gradually shaping a user-centered, professionalized, and technology-driven integration trend. These studies not only focus on users’ subjective perceptions of interactive feedback, information accuracy, and overall service experience but also explore how technological advancements can optimize the quality of discipline-specific information services.
In summary, existing evaluation models—such as LibQUAL+TM, WebQUAL, and E-SERVQUAL—provide a comprehensive set of indicator frameworks and operational tools for assessing library and information service quality. These traditional evaluation dimensions have been extensively refined in both process and outcome evaluations [8]. However, the applicability of standardized indicators remains a subject of debate across different application contexts [36]. Some studies suggest that discrepancies may exist between the SERVQUAL model and actual user experiences in library environments [37]. Therefore, given that ubiquitous information has become a defining characteristic of the information landscape, it is essential to recognize that the evaluation dimensions and application contexts of discipline-specific information services in academic libraries have also expanded and evolved [38].
With the advancement of digital-intelligence technologies, the factors influencing discipline-specific information services in academic libraries have become increasingly complex. Meanwhile, existing evaluation methods exhibit certain limitations, making it challenging to comprehensively assess the evolving characteristics of academic information services in this new environment. These challenges necessitate new theoretical and methodological explorations. In response, this study develops a discipline-specific information service quality evaluation system tailored for the digital-intelligence era. Given that the quality of discipline-specific information services encompasses multiple dimensions, including user experience, resource accessibility, and technological support, relying solely on objective data-driven methods (e.g., entropy weight method) may fail to fully capture users’ perceived realities. Therefore, this study employs the analytic hierarchy process (AHP) to integrate expert judgment via scoring, coupled with consistency testing, to mitigate subjective bias. Additionally, the fuzzy comprehensive evaluation (FCE) method is incorporated to conduct an empirical analysis of discipline-specific information services at Tsinghua University Library in China. This study contributes to the ongoing research on information services in academic libraries by enriching the methodological landscape. Furthermore, it provides both theoretical insights and practical guidance for enhancing the quality of discipline-specific information services in academic libraries within an information-ubiquitous environment.

3. Construction of the Discipline-Specific Information Service Quality Evaluation Indicator System

3.1. Research Process

To establish a quality evaluation system for discipline-specific information services in academic libraries within the digital-intelligence environment, this study adopts the analytic hierarchy process (AHP) as the core methodology. The construction process consists of the following key steps:
(1) Selection and Structuring of Evaluation Indicators: To overcome the limitations of a single evaluation method and address the challenges of operability and measurability in evaluating subjective indicators, this study follows the principles of objectivity, systematicity, scientific rigor, applicability, and operability. A comprehensive approach integrating a literature review and expert interviews is employed to select, refine, validate, revise, and finalize the evaluation indicators for discipline-specific information services in academic libraries. This ensures that the evaluation indicator system is both scientifically robust and widely applicable across different contexts. The selected indicators are further categorized into specific sub-indicators, covering various aspects of service quality, and structured into a hierarchical model.
(2) AHP-Based Weight Allocation: The AHP method is applied to assign weights to both the criterion-level indicators and sub-indicators. Expert judgments on the relative importance of these criteria are used to construct a pairwise comparison matrix, from which the weight of each criterion is derived. The AHP process consists of the following five steps:
  • Conceptualizing the research problem and decomposing it into multiple interrelated components, forming a multilevel hierarchical structure: In this study, the evaluation of discipline-specific information service quality in academic libraries is structured into a three-tier hierarchical model: the goal level, which represents the overall evaluation of discipline-specific information service quality; the criterion level, encompassing six core dimensions; and the alternative level, consisting of 15 specific evaluation indicators.
  • Using elements from a higher level as evaluation criteria, conducting pairwise comparisons of lower-level elements to establish a judgment matrix: In this study, the expert interview method was employed, involving 12 experts who conducted pairwise comparisons of the six dimensions and their 15 sub-indicators to construct the judgment matrix.
  • Based on the judgment matrix, calculating the relative weights of each element with respect to its higher-level criterion: In this study, the eigenvector method was employed to calculate the relative weights of the judgment matrix, thereby determining the final weight of each indicator.
  • Performing a consistency test on the judgment matrix, ensuring that the consistency ratio (CR) is below 0.10, which indicates an acceptable level of consistency in weight allocation.
  • Calculating the composite weights of all elements within the hierarchy to generate the final evaluation model, providing a foundation for subsequent comprehensive evaluation.
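The eigenvector weighting and consistency test described in the steps above can be sketched in a few lines of Python. This is a minimal illustration using a hypothetical 3×3 judgment matrix for three criteria; the study's actual six-dimension expert matrices are not reproduced here, and the numbers are purely for demonstration.

```python
import numpy as np

# Saaty's random consistency index (RI) for matrix orders 1-9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(A):
    """Derive priority weights from a pairwise comparison matrix A via the
    principal eigenvector, and compute the consistency ratio (CR)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # index of the principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                        # normalize weights to sum to 1
    lam_max = eigvals[k].real
    CI = (lam_max - n) / (n - 1)           # consistency index
    CR = CI / RI[n] if RI[n] > 0 else 0.0  # consistency ratio (accept if < 0.10)
    return w, CR

# Hypothetical judgment matrix for three criteria (illustrative values only):
# criterion 1 is moderately more important than 2, strongly more than 3, etc.
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, CR = ahp_weights(A)
print(np.round(w, 3), round(CR, 3))        # CR below 0.10 -> acceptable
```

The composite (global) weight of each of the 15 sub-indicators is then obtained by multiplying its local weight by the weight of its parent criterion, which corresponds to the final step listed above.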
(3) Comprehensive Evaluation and Validation: The hierarchical structure analysis of AHP provides the weight foundation, while the fuzzy comprehensive evaluation (FCE) method further compensates for AHP’s limitations in capturing users’ subjective evaluations, ensuring that the evaluation system integrates both a structured decision-making logic and a user-centered assessment approach. The FCE method, based on fuzzy mathematics, is adopted due to its advantages in handling ambiguous, difficult-to-quantify factors and providing clear, structured results [39]. It is utilized to quantify users’ qualitative evaluations of each indicator, thereby generating an objective and accurate assessment score for the evaluation system.
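The FCE step can likewise be sketched numerically: a membership matrix R (the share of respondents assigning each indicator to each comment grade) is combined with the AHP-derived weight vector W to produce a fuzzy evaluation vector B = W · R, which is then defuzzified into a single score. The setup below is a hypothetical example with three indicators and four grades; all values, including the grade scoring scale, are assumptions for illustration, not data from the Tsinghua University Library case.

```python
import numpy as np

# AHP-derived indicator weights (hypothetical, sum to 1)
weights = np.array([0.5, 0.3, 0.2])

# Membership matrix R: row i gives the proportion of respondents rating
# indicator i as (excellent, good, fair, poor); each row sums to 1.
R = np.array([
    [0.40, 0.35, 0.20, 0.05],
    [0.30, 0.40, 0.20, 0.10],
    [0.25, 0.45, 0.25, 0.05],
])

# Weighted-average fuzzy operator: B = W . R
B = weights @ R
B = B / B.sum()                            # normalize the membership vector

# Defuzzify with an assumed grade scoring scale to obtain one overall score
grade_scores = np.array([95, 80, 65, 40])
score = float(B @ grade_scores)
print(np.round(B, 3), round(score, 2))
```

The resulting vector B shows how overall service quality distributes across the comment grades, and the defuzzified score gives the single comprehensive rating used to judge whether service quality is satisfactory.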

3.2. Empirical Analysis

3.2.1. Preliminary Selection of Dimension Indicators Based on the Literature Review

This study first conducted a systematic literature review to analyze existing information service quality evaluation models, including LibQUAL+TM, WebQual, and E-SERVQUAL. By integrating findings from information service research, six core dimensions for evaluating information service quality were preliminarily established (see Table 1).
The existing literature provides a theoretical foundation for the selection of indicators in this study. However, several aspects still require further exploration. For instance, most existing indicators and models are based on traditional information service environments and have not fully accounted for the impact of digital-intelligence technologies on discipline-specific information services. Emerging factors such as AI-assisted search and intelligent recommendation systems have yet to be systematically integrated into evaluation frameworks. Given these uncertainties, this study builds upon the preliminary indicator system developed through the literature review and further employs expert interviews to validate the rationality of the proposed dimensions and refine specific indicators (see Section 3.2.2).

3.2.2. Selection of Dimensions and Indicators Based on Expert Interviews

1. Selection and Composition of Research Experts
To ensure the scientific rigor, representativeness, and credibility of the evaluation indicator system, this study invited 12 experts specializing in information security and user information behavior.
These experts were consulted to assess the importance of indicators, provide suggestions for modifications, and contribute to subsequent AHP weighting operations based on the findings from the literature review. The expert panel consisted of academic professionals and practitioners from discipline-specific information service departments at leading universities in mainland China, including eight experts from university library discipline-specific information service departments and four researchers specializing in library and information science.
All invited experts hold senior academic positions (associate professor or above) and have educational backgrounds in library and information science (LIS) or related fields. Their professional responsibilities involve engagement with discipline-specific information services in academic libraries, with over 10 years of research or practical experience in the relevant fields. These qualifications ensure that their insights are both authoritative and practically valuable. The background details of the selected experts are presented in Table 2.
2. Implementation of Expert Interviews
This study primarily adopted a semi-structured interview approach, which is grounded in the qualitative research methodology framework [43]. This method emphasizes the interactivity between data collection and analysis, ensuring that the summarization and interpretation of interview data are conducted in a scientific and systematic manner [44]. Specifically, flexibility was maintained throughout the interview process, allowing respondents to freely express their perspectives on discipline-specific information services in academic libraries, thereby facilitating a deeper exploration of key evaluation dimensions [45].
To ensure the scientific rigor and relevance of the interview content, the interview framework was designed by integrating theoretical foundations from existing information service quality evaluation models, including LibQUAL+TM, WebQual, and E-SERVQUAL. Additionally, to tailor the framework to the specific context of discipline-specific information services in university libraries, this study drew on the practical experiences of discipline-specific information service platforms at two representative Chinese universities—Tsinghua University and Peking University. From these cases, six core themes were identified for the in-depth interviews. A predefined question framework was developed around these six themes, while allowing experts the flexibility to explore and elaborate on key topics as needed. The primary objective of the interviews was to further refine the preliminary evaluation indicator system developed through the literature review, ensuring its applicability within the digital-intelligence environment. The interview discussions centered around the following key questions:
  • Scientific Rigor and Applicability of the Dimensions and Indicators: Do the six proposed dimensions comprehensively capture the core aspects of discipline-specific information service quality?
  • Technology-Driven and Intelligent Services: How do artificial intelligence, big data, and knowledge graphs impact discipline-specific information service quality in the digital-intelligence era? Which technologies hold the greatest potential for the future development of discipline-specific information services?
  • Practical Value of the Dimensions and Indicators: Does the evaluation system align with the practical needs of discipline-specific information services in university libraries?
During the interview process, the order and phrasing of questions were flexibly adjusted based on experts’ responses, and follow-up inquiries were made to further explore valuable insights. The expert interviews were conducted in October 2024, and with participants’ consent, the entire interview process was recorded in full. The aim of these interviews was to gain an in-depth understanding of the various aspects of discipline-specific information services in university libraries, thereby providing theoretical foundations and practical references for the subsequent development of the quality evaluation system.
3. Analysis of Expert Interview Results
To ensure the accuracy and reliability of the data, this study conducted a rigorous and detailed analysis of the interview materials. All responses were systematically categorized according to the predefined question themes, followed by in-depth interpretation and content extraction. The organized and analyzed data are presented in Table 3.

3.3. Selection of Model Dimensions and Indicators

Building on the LibQUAL+TM, WebQUAL, and E-SERVQUAL models, this study first fully references existing research findings on information service quality evaluation. Second, it analyzes the practical characteristics of discipline-specific information services in academic libraries. Finally, it incorporates expert interview results to refine and validate the selection of evaluation indicators. As a result, six core dimensions were identified, encompassing a total of 15 evaluation indicators, forming the quality evaluation system for discipline-specific information services in academic libraries within the digital-intelligence environment. The details are presented in Table 4.

3.4. Determination of Indicator Weights in the Evaluation System Based on the Analytic Hierarchy Process (AHP)

This study employed the analytic hierarchy process (AHP) to determine the weights of the evaluation indicators, inviting the same expert panel from the previous interviews to participate in the weight assignment process. The experts conducted pairwise comparisons to assess the importance of each model dimension and the relative importance of specific indicators within each dimension. The 1–9 scale method proposed by Thomas L. Saaty was adopted for the scoring process [46]. During data processing, the study aggregated and weighted the experts’ scoring results, constructing judgment matrices for each level of the discipline-specific information service quality evaluation system in academic libraries. Subsequently, the consistency ratio (CR) of the judgment matrices was computed to verify the consistency of expert evaluations. The final calculation results indicate that all CR values are below 0.1, meeting the consistency verification standard. This confirms that the expert assessments are rational and consistent, ensuring the scientific rigor and reliability of the weight allocation process. As shown in Figure 1:

Detailed Process Description

1. Establishing the Hierarchical Structure Model (Figure 2):
Goal Level (A-Level): Evaluating the quality of discipline-specific information services in academic libraries within the digital-intelligence environment.
Criteria Level (B-Level): Six core dimensions of evaluation:
        B1: Perceived Information Quality
        B2: Information Usability
        B3: Information Security
        B4: Interactive Feedback
        B5: Tool Application
        B6: User Experience
Alternative Level (C-Level): Specific evaluation indicators under each dimension. These 15 secondary indicators are designated as C1 to C15, covering aspects such as information relevance, comprehensiveness, usability, and security.
2. Expert Interviews and Scoring
Experts were invited to perform pairwise comparisons for each evaluation indicator at the same hierarchical level. The objective was to ensure the structure and content of the evaluation system aligned with actual business needs and professional understanding. Before distributing the scoring forms, one-on-one interviews were conducted to introduce the main dimensions and indicators of the evaluation system to each expert. Additionally, they were provided with guidelines and examples on how to use the 1–9 scale method for scoring, as shown in Table 5.
3. Scoring Scale and Data Aggregation
All expert scores were aggregated, and the geometric mean method was applied to reduce the influence of extreme values, ensuring the proportional relationship of the data.
The judgment matrices for each level of the discipline-specific information service quality evaluation system were constructed (see Table 6), reflecting the relative importance of different indicators. The specific analytical approach followed the calculation principles outlined below:
For any given pair of dimensions (e.g., $i$ and $j$), if expert $k$ assigns a relative importance score of $x_{ij}^{(k)}$, the geometric mean method was applied to combine the scores from all 12 experts, producing a composite score. The calculation formula is as follows:
$$\bar{x}_{ij} = \left( \prod_{k=1}^{12} x_{ij}^{(k)} \right)^{\frac{1}{12}},$$
This approach ensures that the final scores are objective and balanced, minimizing the influence of individual biases and strengthening the scientific reliability of the evaluation system.
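As a minimal sketch, the aggregation step above can be expressed in Python; the 12 scores below are hypothetical examples on Saaty's 1–9 scale, not values from the study:

```python
import math

def geometric_mean(scores):
    """Combine the pairwise-comparison scores of all experts for one
    cell (i, j) of the judgment matrix into a single composite score."""
    return math.prod(scores) ** (1.0 / len(scores))

# Hypothetical scores from 12 experts comparing dimension i against
# dimension j on Saaty's 1-9 scale (not values from the study).
expert_scores = [3, 3, 2, 4, 3, 2, 3, 5, 3, 2, 4, 3]
composite = geometric_mean(expert_scores)
```

Because the geometric mean of reciprocals equals the reciprocal of the geometric mean, aggregating each cell this way keeps the combined judgment matrix reciprocal, as AHP requires.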
4. Software Analysis and Consistency Testing
The analytic hierarchy process (AHP) requires the judgment matrix to exhibit consistency. Successfully passing the consistency test further validates the rationality and scientific rigor of the evaluation system, ensuring that data collected from multiple experts maintains logical coherence and structural consistency. This lays a solid foundation for the subsequent weight calculations and analysis.
This study utilized SPSSAU software for data analysis. SPSSAU is a powerful statistical analysis tool that supports AHP and fuzzy comprehensive evaluation. It enables the direct calculation of consistency ratios and indicator weights based on the expert-assigned judgment matrix.
Through the geometric mean method, the eigenvector values were obtained as follows:
(1.201, 1.081, 0.909, 0.897, 1.310, 0.601)
Using these eigenvector values, the maximum eigenvalue was computed as $\lambda_{\max} = 6.0088$. Subsequently, the consistency index was calculated as $CI = 0.0018$.
The $CI$ value is essential for consistency verification, ensuring that the expert judgments are reliable. The detailed results are presented in Table 7.
The calculation formula is as follows:
$$\lambda_{\max} = \sum_{i=1}^{n} \frac{(Aw)_i}{n w_i}$$
$$CI = \frac{\lambda_{\max} - n}{n - 1}$$
where $\lambda_{\max}$ represents the maximum eigenvalue, $A$ denotes the judgment matrix, $w$ refers to the weight vector, and $n$ represents the matrix order (the number of criteria).
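The full consistency check can be sketched in Python. This is an illustrative implementation assuming the row geometric-mean approximation of the principal eigenvector and Saaty's standard random index (RI) values; it is not the study's SPSSAU procedure:

```python
import math

# Saaty's random consistency index (RI) for matrix orders 1-9.
RI = [0.0, 0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45]

def ahp_weights_and_cr(A):
    """Approximate the principal eigenvector of judgment matrix A with
    the row geometric-mean method, then compute lambda_max, CI and CR."""
    n = len(A)
    gmeans = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gmeans)
    w = [g / total for g in gmeans]          # normalized weight vector
    # lambda_max = (1/n) * sum_i (Aw)_i / w_i
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
    CI = (lam_max - n) / (n - 1)
    CR = CI / RI[n] if RI[n] else 0.0        # CR < 0.1 passes the test
    return w, lam_max, CI, CR
```

For the paper's six-dimension matrix, n = 6 gives RI = 1.24, so the reported CI of 0.0018 corresponds to CR ≈ 0.0015, well below the 0.1 threshold.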
Following the same scoring calculation approach described earlier, experts were again asked to conduct pairwise comparisons across all hierarchical levels and elements, ensuring a comprehensive evaluation of all dimensions and indicators. After completing the full set of comparative evaluations, the weights of each indicator were determined. Subsequently, judgment matrices were constructed for the indicators within each dimension based on the expert scoring results. A consistency test was conducted on each judgment matrix, and the results confirmed that all matrices successfully passed the consistency check. The final indicator weights for each dimension are presented in Table 8.
By integrating the dimensions and their respective indicator weights, this study establishes the final weight distribution for the discipline-specific information service quality evaluation system in academic libraries, as detailed in Table 9.
The calculation formula is as follows:
Relative Weight of an Indicator within Its Dimension = Indicator Weight / Total Weight of the Dimension
Final Indicator Weight = Relative Weight of the Indicator within Its Dimension × Dimension Weight
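The two formulas above can be combined into a short helper; the dimension weight and indicator weights below are illustrative placeholders, not the study's values:

```python
def final_weights(dimension_weight, indicator_weights):
    """Normalize the indicator weights within a dimension, then scale by
    the dimension's own weight to obtain the global (final) weights."""
    total = sum(indicator_weights)
    return [w / total * dimension_weight for w in indicator_weights]

# Illustrative example: a dimension weighted 0.20 containing three
# indicators with raw within-dimension weights 0.5, 0.3 and 0.2.
fw = final_weights(0.20, [0.5, 0.3, 0.2])
```

By construction, the final weights of a dimension's indicators sum back to that dimension's weight, so the 15 global weights sum to 1.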
As shown in Table 9, from the dimension perspective, the weights in descending order are Tool Application (0.2184), Perceived Information Quality (0.2002), Information Usability (0.1802), Information Security (0.1515), Interactive Feedback (0.1496), and User Experience (0.1002). From the indicator perspective, based on the final indicator weights, the top five indicators are System Usability (0.0991), Response Efficiency (0.0972), User Privacy Protection (0.0909), Tool Availability (0.0874), and AI-Driven Features (0.0764).

4. Empirical Validation of the Quality Evaluation of Discipline-Specific Information Services in Academic Libraries

4.1. Sample Selection, Data Sources, and Processing

This study aims to empirically validate the applicability of the proposed evaluation system for discipline-specific information services in academic libraries by conducting a case study at Tsinghua University Library in China. Tsinghua University Library introduced discipline-specific information services in 1998, making it one of the earliest academic institutions in mainland China to implement such services. Given its long-standing experience in this field, the library serves as a representative case for verification and further exploration. Sample retrieval began on 27 October 2024, and the formal survey was conducted between December 2024 and January 2025.
The survey questionnaire was designed based on the six dimensions and fifteen evaluation indicators established in the proposed framework. The questionnaire consists of two sections: the first covers user demographics, including gender, age, education level, user identity, and frequency of using the university library’s discipline-specific information services; the second is the core evaluation component, focusing on users’ perceptions and emphasizing their actual experiences when using the services. A five-point Likert scale was employed, ranging from “Very Dissatisfied” to “Very Satisfied”, where 1 = Very Dissatisfied, 2 = Dissatisfied, 3 = Neutral, 4 = Satisfied, and 5 = Very Satisfied. Further information can be found in Appendix A. The survey was conducted through on-site distribution at the research location. A total of 300 questionnaires were distributed and 287 responses were collected; after careful screening, 280 were deemed valid, yielding an effective response rate of 93.3% and meeting the target sample size. Among respondents, 56.79% were male and 43.21% female. In terms of education level, undergraduates made up 38.57%, master’s students 32.86%, doctoral students and above 18.93%, and faculty members 9.64%. Regarding age, 62.15% of respondents were aged 20–30 years, 25.32% were aged 30–40 years, and 12.53% were 40 years and above. Regarding service usage frequency, most respondents reported frequent engagement with discipline-specific information services at their university library: 28.41% used the services more than five times per week (frequent users), while 54.92% accessed them at least four times per month (regular users).
These findings indicate that graduate students and young faculty members constitute the primary user base of discipline-specific information services, with strong reliance on such resources. At the same time, the sample also includes undergraduate students and faculty members, ensuring representativeness and enhancing the study’s reliability and reference value.

4.2. Evaluation Process

Due to the diversity and complexity of the evaluation process for discipline-specific information services in academic libraries, the fuzzy comprehensive evaluation method was adopted. This method adheres to systematic principles, combining qualitative and quantitative approaches to address the challenge of quantifying qualitative issues.
First, a fuzzy evaluation matrix was constructed for each dimension, integrating evaluation vectors of multiple specific indicators under the dimension. The dimension-level results help capture the overall macro-level performance of discipline-specific information services at the university. For the specific indicators corresponding to each dimension, evaluation vectors were based on actual survey data, revealing the specific performance of each dimension and related indicators in discipline-specific information services.
The survey results were used to construct membership degree vectors, and the calculation formula is as follows:
Membership Degree = Number of Responses for a Given Rating Level / Total Number of Responses
Based on the calculation formula, the membership degree values for the information relevance indicator are as follows:
[0.0357, 0.0536, 0.2143, 0.4286, 0.2679]
The full data are shown in Table 10.
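A minimal sketch of this calculation follows; the response counts used here (10, 15, 60, 120, and 75 out of 280) are inferred so as to reproduce the reported vector and are not given in the paper:

```python
def membership_vector(counts):
    """Convert raw response counts across the five rating levels into a
    membership-degree vector (each count divided by the total)."""
    total = sum(counts)
    return [c / total for c in counts]

# Inferred counts for the information relevance indicator; dividing by
# the 280 valid responses reproduces the vector reported in the text.
v = membership_vector([10, 15, 60, 120, 75])
```

Each vector is a discrete distribution over the five rating levels, so its components always sum to 1 (up to rounding).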
Further, to construct weighted evaluation vectors for the six dimensions, the calculation formula is as follows:
$$S_{\text{dimension}} = \sum_{i=1}^{n} \left( V_{\text{indicator}(i)} \times W_{\text{indicator}(i)} \right)$$
where $S_{\text{dimension}}$ represents the evaluation vector for the dimension, $V_{\text{indicator}(i)}$ represents the evaluation vector at the indicator level, i.e., the membership degree values (as shown in Table 10), and $W_{\text{indicator}(i)}$ represents the final weights of the corresponding indicators from the evaluation system (as shown in Table 9). Based on this calculation, the evaluation vector results for each dimension are obtained as follows.
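The dimension-level aggregation can be sketched as a weighted sum of indicator-level membership vectors; the inputs in any usage would be the Table 10 vectors and Table 9 weights, which are not reproduced here:

```python
def dimension_vector(indicator_vectors, indicator_weights):
    """S_dimension = sum_i ( V_indicator(i) * W_indicator(i) ):
    a weighted sum of the indicator-level membership vectors."""
    n_levels = len(indicator_vectors[0])
    return [sum(v[k] * w for v, w in zip(indicator_vectors, indicator_weights))
            for k in range(n_levels)]
```

If the weights are normalized within the dimension, the resulting vector is again a distribution over the five rating levels.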

4.3. Results Analysis

This study employs the fuzzy comprehensive evaluation method to conduct an in-depth empirical analysis of the discipline-specific information services at Tsinghua University Library in China. The objective is to assess service quality performance across the key dimensions and propose constructive recommendations for raising service quality to higher standards. To improve the interpretability and comparability of the results, both qualitative and quantitative analyses were conducted. In this study, the evaluation set for discipline-specific information service quality is defined as V = {very dissatisfied, dissatisfied, neutral, satisfied, very satisfied}, consistent with the five-point scale used in the questionnaire. Each qualitative evaluation level is assigned a numerical value, forming a quantifiable evaluation set V = {60, 70, 80, 90, 100}. On this basis, the final score D for the performance of discipline-specific information services at Tsinghua University Library is computed using the following formulas:
C = W × R
D = C × V
where W represents the weight of the dimension layer, R is the membership degree matrix at the indicator level, and C is the comprehensive evaluation vector. The final calculated score D = 86.93 indicates a high level of overall user satisfaction with the discipline-specific information services at Tsinghua University Library.
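A minimal sketch of the two formulas, using illustrative inputs rather than the study's actual weight and membership data:

```python
def comprehensive_vector(W, R):
    """C = W x R: weight each row of the dimension-level membership
    matrix R (rows = dimensions, columns = rating levels) by W."""
    n_levels = len(R[0])
    return [sum(W[i] * R[i][k] for i in range(len(W)))
            for k in range(n_levels)]

def final_score(C, V=(60, 70, 80, 90, 100)):
    """D = C . V: project the comprehensive evaluation vector onto the
    quantified rating set V = {60, 70, 80, 90, 100}."""
    return sum(c * v for c, v in zip(C, V))
```

With this scale, a score of 86.93 sits between the quantified anchors for "satisfied" (90) and "neutral" (80), matching the paper's reading of a generally satisfied user base.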
To visually present the evaluation of the academic library’s discipline-specific information services, a radar chart was generated with Matplotlib (version 3.6.3) under Python 3.11.8 (Figure 3), based on the survey results in Table 11. The chart illustrates the weighted evaluation outcomes across the five rating levels, with each contour line and its shaded area representing the distribution of membership degrees for each dimension at a given rating level.
Overall, the evaluation results for Tsinghua University Library’s discipline-specific information services demonstrate a high concentration around the “Satisfied” rating level. Among the six dimensions, Tool Application received the highest score, followed by Information Usability, Perceived Information Quality, Interactive Feedback, Information Security, and lastly, User Experience.
The library performed consistently across the key dimensions of discipline-specific information services. However, User Experience exhibited relatively weaker performance, particularly in the “Very Satisfied” category, where it had the lowest membership degree among all dimensions. As previously indicated by the AHP weight analysis, User Experience holds a significant weight in the evaluation system, yet its sub-indicators (such as user trust and learning support functions) require long-term systematic optimization and sustained user engagement to be fully realized. The fact that Tool Application ranked as the highest-rated dimension suggests that in an information-ubiquitous environment, users increasingly prioritize the convenience of information acquisition, with a growing reliance on various academic information retrieval and analytical tools. Consequently, their direct experience with these tools—such as their usability and functionality—appears to have a greater influence on their perceptions and satisfaction than their overall trust in the academic information service system itself. While the lower score for User Experience does not necessarily indicate a direct rejection of the current academic information service system, it does reveal an implicit unmet long-term expectation from users. This underscores the necessity for academic libraries to further enhance service capabilities in diversified and personalized service domains to better align with evolving user needs.

5. Discussion

Overall, in an information-ubiquitous environment, discipline-specific information services in academic libraries are shaped by the dual influences of traditional service strategies and technological advancements. The results indicate that Tool Application (B5) and Perceived Information Quality (B1) hold the highest weight in the evaluation system for academic library discipline-specific information services. This finding highlights the increasing demand for intelligent, precise, and efficient information services in the digital-intelligence era, while also reaffirming that users continue to place significant emphasis on the quality of information content itself. Despite the shift toward digital intelligence, the production, filtering, organization, and acquisition of high-quality information remain the core competitive advantages of discipline-specific information services.
As data elements become the primary medium for academic information exchange, new experiences in discipline-specific information services are emerging. The diversification and commercialization of academic information service platforms offer new opportunities for innovation in university libraries but simultaneously introduce service pressures, necessitating enhancements in information usability, interactive feedback, and user experience. In an information-ubiquitous environment, academic information services, characterized by specialized and scholarly data, not only facilitate information exchange but also reveal sensitive user attributes. Data extend beyond being mere information—they embody the intricate connections between individuals and society. Consequently, concerns regarding information security and data privacy have become inevitable in contemporary digital discourse.
Furthermore, Information Security (B3) ranked relatively high in this study, reflecting the growing significance of data privacy protection in an information-ubiquitous environment. With the continued adoption of generative AI and large language models (LLMs), concerns surrounding data security and intellectual property rights have become increasingly prominent. Users now demand greater transparency in data processing, enhanced algorithmic trustworthiness, and stronger personal information protection. As the central hub for academic knowledge dissemination, university libraries must strengthen privacy safeguards, optimize data-sharing mechanisms, and enhance the accessibility of academic resources to ensure their continued relevance and value in the digital age. From a dimensional perspective, combined with the indicator-level findings, this study reveals the growing integration of AI-driven services within discipline-specific information services. This shift signifies that modern university libraries are evolving beyond their traditional roles as static repositories of knowledge. In the digital-intelligence era, information services must transition toward an interactive, dynamic, and high-efficiency exchange model, where continuous, real-time feedback is essential to fulfilling personalized user needs.
Based on the findings of this study, it is recommended that traditional discipline-specific information service models transition from standardized, linear, and deterministic approaches to dynamic, nonlinear, and demand-driven models. The transformation brought by technological innovation fundamentally reshapes the knowledge generation and decision-making mechanisms within university libraries, granting them greater adaptability and resilience. University libraries should further leverage advanced data mining and machine learning techniques to conduct in-depth analyses of user behaviors and preferences. In the digital-intelligence environment, libraries must embrace emerging technologies, shifting from passive response mechanisms to interactive, multilayered discipline-specific information support systems. By repositioning themselves as co-creators of knowledge rather than merely providers of information, academic libraries can reinforce their role as central institutions for academic resource management and scholarly communication.

6. Conclusions

This study employs an empirical approach to evaluate and enhance the quality assessment of discipline-specific information services in academic libraries within an information-ubiquitous environment. In response to the current landscape of academic library discipline-specific information services, a user-centric evaluation framework was proposed, applying the analytic hierarchy process (AHP) to determine indicator weights. Furthermore, a fuzzy comprehensive evaluation method was employed to conduct an empirical study on the discipline-specific information services of Tsinghua University Library in China. The results indicate that Tool Application (B5) received the highest score, followed by Perceived Information Quality (B1) and Information Usability (B2). These findings suggest that users prioritize convenient academic information tools and AI-driven functionalities, the relevance, accuracy, and diversity of information, as well as the ease of system operation and the capacity for personalized services. Overall, users expressed satisfaction with the academic information services of Tsinghua University Library. At the same time, this study highlights critical challenges related to user experience, interactive feedback, and information security, providing empirical support for optimizing discipline-specific information services in academic libraries under the influence of digital-intelligence technologies.
Undoubtedly, technological advancements over the past decades have significantly expanded the depth and breadth of discipline-specific information services in academic libraries. The findings of this study further validate the increasing reliance of users on technological tools. However, information overload and the varying quality of information have also introduced new challenges for users. With the widespread adoption of large language models (LLMs), academic library discipline-specific information services are undergoing profound transformations. While LLMs offer advantages in knowledge integration, language comprehension, and automated content generation—enhancing the efficiency and personalization of academic information services—they also present challenges such as the lack of data traceability, hallucination issues, and limitations in exploratory knowledge discovery. These constraints suggest that LLMs cannot yet replace the fundamental roles of academic libraries in ensuring information authority, integrating multimodal resources, and supporting scholarly communities. At the same time, it is crucial to recognize that under the influence of AI and other critical information technologies, academic libraries will no longer serve merely as physical repositories of knowledge. Instead, they are likely to evolve into digitalized, intelligent academic information service platforms. The increasing integration of AI-driven tools is expected to reshape traditional discipline-specific information services, potentially leading to deeper knowledge production and innovation [47]. As AI-driven transformations continue to accelerate, future discipline-specific information service models will likely emphasize human–machine collaboration, knowledge creation, and open-access sharing. Within this context, a key research question will be how academic libraries can maintain their role as the central hubs of scholarly information amidst rapid technological advancements.
This study further enriches and refines the research on information service quality evaluation while providing valuable references for optimizing discipline-specific information service assessment in the digital-intelligence era. However, certain limitations remain. The study employs a combined approach of the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation (FCE) for a comprehensive assessment. AHP relies on expert scoring and pairwise comparisons, which significantly increases computational complexity when applied to large-scale datasets or multi-institutional comparative studies. Additionally, FCE, when processing highly subjective user perception data, still depends on expert-defined membership functions, potentially affecting the stability of computational results. Future research could explore integrating machine learning techniques to optimize weight calculations and improve computational efficiency. While AHP has demonstrated high reliability in evaluating discipline-specific information service quality, its computational demands increase substantially when applied to large-scale datasets, particularly in comparative analyses involving multiple universities. Similarly, FCE requires significant storage and computational resources for processing fuzzy data. Future studies should consider algorithmic optimizations to enhance efficiency and scalability, enabling evaluations across a broader range of academic libraries. Expanding empirical research to a larger number of institutions would further improve the objectivity and applicability of the evaluation framework.

Author Contributions

Conceptualization, S.Z.; methodology, S.Z.; software, S.Z.; validation, S.Z., T.Z. and X.W.; formal analysis, S.Z.; investigation, S.Z.; resources, S.Z.; data curation, S.Z.; writing—original draft preparation, S.Z.; writing—review and editing, T.Z. and X.W.; visualization, S.Z.; supervision, T.Z. and X.W.; project administration, T.Z. and X.W.; funding acquisition, T.Z. and X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

  • Survey Questionnaire: Survey on the Evaluation of Discipline-Specific Information Services at Tsinghua University Library
  • Section 1: Information Quality
  • Information Relevance
  • How would you evaluate the relevance of the library’s discipline-specific information to your research and teaching needs? Do you find the library’s resources closely aligned with your discipline and research focus?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Information Accuracy
  • How would you rate the accuracy and reliability of the library’s discipline-specific information? Do you find the retrieved documents and data credible and trustworthy?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Information Diversity
  • How would you evaluate the richness and diversity of the discipline-specific resources provided by the library? Are you satisfied with the library’s coverage of various types of literature and interdisciplinary resources?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Section 2: Information Usability
  • Ease of System Operation
  • How would you rate the user experience of the library’s discipline-specific information service platform, including search, browsing, and download functions? Is the system interface intuitive and easy to use?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Personalized Services
  • How would you evaluate the library’s ability to provide personalized information recommendations and customized services based on your preferences? Do you find the recommended content relevant to your needs?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Section 3: Information Security
  • User Privacy Protection
  • How satisfied are you with the library’s measures to protect your privacy and ensure data security? Do you trust the library to handle your personal and academic data responsibly?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • System Reliability
  • How would you evaluate the stability and reliability of the library’s discipline-specific information service system? Does the system frequently encounter issues or become inaccessible?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Section 4: Interaction Feedback
  • Efficiency of Communication and Feedback
  • How would you rate the speed and efficiency of the library in responding to your inquiries, suggestions, or feedback? Do you find the service responses timely?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Organization and Collaboration
  • How effective do you find the library’s efforts to build discipline-specific communities or virtual collaboration platforms? Do these platforms facilitate communication and collaboration among faculties, students, and library staff?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Section 5: Tools and Technology Application
  • Tool Availability
  • How would you evaluate the practicality of the information analysis tools and related training services provided by the library? Do these tools help you better utilize discipline-specific information?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • AI-Driven Features
  • How would you evaluate the library’s use of AI, big data, and other technologies to enable intelligent recommendations, automated classification, and other features? Do you believe these features enhance service quality and efficiency?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Discipline Data Management
  • How satisfied are you with the library’s support for the collection, storage, sharing, and reuse of discipline-specific research data? Are you satisfied with the quality and usability of these services?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Section 6: User Experience
  • Satisfaction and Trust
  • Overall, how satisfied are you with the library’s discipline-specific information services? How much do you trust the library’s brand and service quality?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Support for Learning and Research
  • How would you rate the contribution of the library’s discipline-specific information services to your learning and research activities? Have the services effectively supported your academic work?
  • (1) Very Dissatisfied
  • (2) Dissatisfied
  • (3) Neutral
  • (4) Satisfied
  • (5) Very Satisfied
  • Willingness for Continued Use
  • Based on your experience, would you be willing to continue using the library’s discipline-specific information services and recommend them to others?
  • (1) Very Unwilling
  • (2) Unwilling
  • (3) Neutral
  • (4) Willing
  • (5) Very Willing

References

  1. Tam, L.W.H.; Robertson, A.C. Managing change: Libraries and information services in the digital age. Libr. Manag. 2002, 23, 369–377. [Google Scholar] [CrossRef]
  2. Vinagre, M.H.; Pinto, L.G.; Ochôa, P. Revisiting digital libraries quality: A multiple item scale approach. Perform. Meas. Metr. 2011, 12, 214–236. [Google Scholar] [CrossRef]
  3. Town, J.S. Value, impact, and the transcendent library: Progress and pressures in performance measurement and evaluation. Libr. Q. 2011, 81, 111–125. [Google Scholar] [CrossRef]
  4. Cook, C.C. A Mixed-Method Approach to the Identification and Measurement of Academic Library Service Quality Constructs: LibQUAL+TM. Ph.D. Thesis, Texas A&M University, College Station, TX, USA, 2001. [Google Scholar]
  5. Adarkwah, M.A.; Okagbue, E.F.; Oladipo, O.A.; Mekonen, Y.K.; Anulika, A.G.; Nchekwubemchukwu, I.S.; Okafor, M.U.; Chineta, O.M.; Muhideen, S.; Islam, A.A. Exploring the Transformative Journey of Academic Libraries in Africa before and after COVID-19 and in the Generative AI Era. J. Acad. Librariansh. 2024, 50, 102900. [Google Scholar] [CrossRef]
  6. Horstmann, W. Are academic libraries changing fast enough? Bibl. Forsch. Prax. 2018, 42, 433–440. [Google Scholar] [CrossRef]
  7. Oh, D. Beyond Providing Information: An Analysis on the Perceived Service Quality, Satisfaction, and Loyalty of Public Library Customers. Libri 2020, 70, 345–359. [Google Scholar] [CrossRef]
  8. Kiran, K.; Diljit, S. Modelling web-based library service quality. Libr. Inf. Sci. Res. 2012, 34, 184–196. [Google Scholar] [CrossRef]
  9. Zeithaml, V.A. Consumer Perceptions of Price, Quality, and Value: A Means-End Model and Synthesis of Evidence. J. Mark. 1988, 52, 2. [Google Scholar] [CrossRef]
  10. Nicholson, S. A conceptual framework for the holistic measurement and cumulative evaluation of library services. Proc. Am. Soc. Inf. Sci. Technol. 2004, 41, 496–506. [Google Scholar] [CrossRef]
  11. Wilson, T. On user studies and information needs. J. Doc. 2006, 62, 658–670. [Google Scholar] [CrossRef]
  12. Schumann, L.; Stock, W.G. The Information Service Evaluation (ISE) model. Webology 2014, 11, 1–20. Available online: http://www.webology.org/2014/v11n1/a115.pdf (accessed on 16 March 2025).
  13. Guttsman, W.L. Subject Specialisation in Academic Libraries: Some preliminary observations on role conflict and organizational stress. J. Librariansh. 1973, 5, 1–8. [Google Scholar] [CrossRef]
  14. Ishar, N.I.M.; Masodi, M.S. Students’ perception towards quality library service using Rasch Measurement Model. In Proceedings of the International Conference on Innovation Management and Technology Research, Malacca, Malaysia, 21–22 May 2012; pp. 668–672. [Google Scholar]
  15. Kiran, K. Service quality and customer satisfaction in academic libraries: Perspectives from a Malaysian university. Libr. Rev. 2010, 59, 261–273. [Google Scholar] [CrossRef]
  16. Kettinger, W.J.; Lee, C.C. Perceived service quality and user satisfaction with the information services function. Decis. Sci. 2010, 25, 737–766. [Google Scholar] [CrossRef]
  17. Cabrerizo, F.J.; López-Gijón, J.; Martínez, M.A.; Morente-Molinera, J.A.; Herrera-Viedma, E. A fuzzy linguistic extended LibQUAL+ model to assess service quality in academic libraries. Int. J. Inf. Technol. Decis. Mak. 2017, 16, 225–244. [Google Scholar] [CrossRef]
  18. Brophy, P. Measuring Library Performance: Principles and Techniques; Facet Press: London, UK, 2006. [Google Scholar]
  19. Loiacono, E.T.; Watson, R.T.; Goodhue, D.L. WebQual: An Instrument for Consumer Evaluation of Web Sites. Int. J. Electron. Commer. 2007, 11, 51–87. Available online: http://www.jstor.org/stable/27751221 (accessed on 16 March 2025). [CrossRef]
  20. Hernon, P.; Calvert, P. E-Service quality in libraries: Exploring its features and dimensions. Libr. Inf. Sci. Res. 2005, 27, 377–404. [Google Scholar] [CrossRef]
  21. Pinfield, S. The changing role of subject librarians in academic libraries. J. Librariansh. Inf. Sci. 2001, 33, 32–38. [Google Scholar] [CrossRef]
  22. Hoffmann, D.; Wallace, A. Intentional informationists: Re-envisioning information literacy and re-designing instructional programs around faculty librarians’ strengths as campus connectors, information professionals, and course designers. J. Acad. Librariansh. 2013, 39, 546–551. [Google Scholar] [CrossRef]
  23. Gardner, S.; Eng, S. What students want: Generation Y and the changing function of the academic library. Portal Libr. Acad. 2005, 5, 405–420. [Google Scholar] [CrossRef]
  24. Shahzad, K.; Khan, S.A.; Iqbal, A. Effects of big data analytics on university libraries: A systematic literature review of impact factor articles. J. Librariansh. Inf. Sci. 2024. [Google Scholar] [CrossRef]
  25. Dempsey, L. Generative AI and Libraries: Seven Contexts. 29 July 2024. LorcanDempsey.net. Available online: https://www.lorcandempsey.net/generative-ai-and-libraries-7-contexts/ (accessed on 16 March 2025).
  26. Smith, H.J.; Milberg, S.J.; Burke, S.J. Information privacy: Measuring individuals’ concerns about organizational practices. MIS Q. 1996, 20, 167–196. [Google Scholar] [CrossRef]
  27. Li, S.; Jiao, F.; Zhang, Y.; Xu, X. Problems and changes in digital libraries in the age of big data from the perspective of user services. J. Acad. Librariansh. 2018, 45, 22–30. [Google Scholar] [CrossRef]
  28. Mo, Y.; Fan, L. Performance evaluation about development and utilization of public information resource based on Analytic Hierarchy Process. In Proceedings of the 2011 International Conference on Management and Service Science, Wuhan, China, 12–14 August 2011; pp. 1–4. [Google Scholar] [CrossRef]
  29. Yu, Y. Evaluation of e-commerce service quality using the analytic hierarchy process. In Proceedings of the 2010 International Conference on Innovative Computing and Communication and 2010 Asia-Pacific Conference on Information Technology and Ocean Engineering, Macao, China, 30–31 January 2010; pp. 123–126. [Google Scholar] [CrossRef]
  30. Oddershede, A.M.; Carrasco, R.; Barham, E. Analytic hierarchy process model for evaluating a health service information technology network. Health Inform. J. 2007, 13, 77–89. [Google Scholar]
  31. Villacreses, G.; Jijón, D.; Nicolalde, J.F.; Martínez-Gómez, J.; Betancourt, F. Multicriteria decision analysis of suitable location for wind and photovoltaic power plants on the Galápagos Islands. Energies 2023, 16, 29. [Google Scholar] [CrossRef]
  32. Chen, M.K.; Wang, S.-C. The Critical Factors of Success for Information Service Industry in Developing International Market: Using Analytic Hierarchy Process (AHP) Approach. Expert Syst. Appl. 2010, 37, 694–704. [Google Scholar] [CrossRef]
  33. Ngai, E.W.T.; Chan, E.W.C. Evaluation of Knowledge Management Tools Using AHP. Expert Syst. Appl. 2005, 29, 889–899. [Google Scholar] [CrossRef]
  34. Chang, T.-S.; Hsieh, Y.-C. Applying the Analytic Hierarchy Process for Investigating Key Indicators of Responsible Innovation in the Taiwan Software Service Industry. Technol. Soc. 2024, 78, 102690. [Google Scholar] [CrossRef]
  35. Thompson, J. Redirection in Academic Library Management; Library Association: London, UK, 1991. [Google Scholar]
  36. Landøy, A. Using statistics for quality management in the library. In New Trends in Qualitative and Quantitative Methods in Libraries: Proceedings of the 2nd Qualitative and Quantitative Methods in Libraries International Conference, Chania, Crete, Greece, 25–28 May 2010; Katsirikou, A., Skiadas, C., Eds.; World Scientific: Singapore, 2012; pp. 97–102. [Google Scholar] [CrossRef]
  37. Green, J.P. Determining the reliability and validity of service quality scores in a public library context: A confirmatory approach. Adv. Libr. Adm. Organ. 2008, 26, 317–348. [Google Scholar] [CrossRef]
  38. Ahmad, M.; Abawajy, J.H. Digital library service quality assessment model. Procedia-Soc. Behav. Sci. 2014, 129, 571–580. [Google Scholar] [CrossRef]
  39. Bai, Y.; Zhang, W.; Yang, X.; Wei, S.; Yu, Y. The Framework of Technical Evaluation Indicators for Constructing Low-Carbon Communities in China. Buildings 2021, 11, 479. [Google Scholar] [CrossRef]
  40. Parasuraman, A.; Zeithaml, V.A.; Malhotra, A. E-S-QUAL: A multiple-item scale for assessing electronic service quality. J. Serv. Res. 2005, 7, 213–233. [Google Scholar] [CrossRef]
  41. Rigotti, S.; Pitt, L. SERVQUAL as a measuring instrument for service provider gaps in business schools. Manag. Res. News 1992, 15, 9–17. [Google Scholar] [CrossRef]
  42. Zeithaml, V.A.; Parasuraman, A.; Malhotra, A. Conceptual Framework for Understanding E-Service Quality: Implications for Future Research and Managerial Practice; working paper, Report No. 00-115; Marketing Science Institute: Cambridge, MA, USA, 2000. [Google Scholar]
  43. Patton, M.Q. Qualitative Evaluation and Research Methods; Sage: Thousand Oaks, CA, USA, 2015. [Google Scholar]
  44. Kvale, S. Interviews: An Introduction to Qualitative Research Interviewing; Sage: Thousand Oaks, CA, USA, 1996. [Google Scholar]
  45. Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; Aldine de Gruyter: Chicago, IL, USA, 1967. [Google Scholar]
  46. Saaty, T.L. Multicriteria Decision Making: The Analytic Hierarchy Process; RWS Publications: Pittsburgh, PA, USA, 1980. [Google Scholar]
  47. Carroll, A.J.; Borycz, J. Integrating large language models and generative artificial intelligence tools into information literacy instruction. J. Acad. Librariansh. 2024, 50, 102899. [Google Scholar] [CrossRef]
Figure 1. AHP analysis process.
Figure 2. Hierarchical structure model of the evaluation system for discipline-specific information services in academic libraries.
Figure 3. Radar chart of evaluation results for discipline-specific information services across dimensions at Tsinghua University Library, China.
Table 1. Sources of literature for the dimensions in the evaluation of discipline-specific information service quality in academic libraries.
| Dimension | Definition | Representative Literature | Referenced Models |
| --- | --- | --- | --- |
| Perceived Information Quality | Evaluates the relevance, accuracy, timeliness, reliability, comprehensiveness, and diversity of discipline-specific information provided by the library. | Parasuraman [40]; Loiacono [19] | E-SERVQUAL |
| Information Usability | Assesses the convenience of accessing and utilizing discipline-specific information services, including system simplicity, search tool usability, and ease of information retrieval. | Aladwani, A. M. [5] | WebQual |
| Information Security | Evaluates the degree of confidentiality protection in discipline-specific information services, including user privacy protection and data security. | Smith et al. [26] | E-SERVQUAL |
| Interactive Feedback | Measures the library’s capability to facilitate communication and feedback between users, as well as between users and the service platform or providers. | Rigotti et al. [41] | LibQUAL+TM; E-SERVQUAL; WebQual |
| Tool Application | Assesses the types and quality of tools provided by academic libraries for viewing, analyzing, and utilizing information services. | Loiacono [19] | WebQual |
| User Experience | Reflects users’ overall perception and satisfaction regarding personalized discipline-specific information services in academic libraries. | Zeithaml et al. [42] | WebQual; SERVQUAL |
Table 2. Detailed introduction of experts.
| Expert No. | Type of Expert | Research/Practical Background |
| --- | --- | --- |
| 1 | University Library Discipline-Specific Information Service Expert | Background in history and library science; works in a university discipline-specific information service department, responsible for academic information resource management, subject-specific information organization, and retrieval. |
| 2 | | Background in computer science; responsible for university technological innovation, research commercialization, intellectual property information management consulting, and smart library system development. |
| 3 | | Background in history and information science; responsible for academic information service consulting and faculty–student information literacy education. |
| 4 | | Background in engineering; responsible for intelligence analysis and consulting services related to key disciplines in the university. |
| 5 | | Background in library and information science; responsible for graduate education information services and research data management. |
| 6 | | Background in library science; specializes in academic information resource sharing and inter-institutional collaboration; previously participated in academic library consortium projects. |
| 7 | | Background in information science and data analysis; responsible for academic information service consulting, strategy report writing, and related tasks. |
| 8 | | Background in information science; responsible for university digital resource procurement, evaluation, and research data management and mining. |
| 9 | Library and Information Science Researcher | PhD, professor, and doctoral supervisor; research focuses on discipline-specific information resource development and user information behavior analysis; has led multiple national research projects. |
| 10 | | PhD, professor; specializes in e-service quality and user behavior psychology; has been invited to participate in government digital service quality evaluations. |
| 11 | | PhD, professor; responsible for master’s and doctoral courses in information management; primary research areas include artificial intelligence and information ethics. |
| 12 | | PhD, associate professor, and master’s supervisor; research focuses on artificial intelligence and academic innovation, as well as university education management. |
Table 3. Expert interview results.
Question categories and summarized interview results:

Quality Elements of Discipline-Specific Information Content
1. Authenticity, accuracy, and applicability of information are crucial.
2. Timeliness of information ensures users access the latest discipline-specific knowledge.
3. Diversified types of information, including books, articles, research reports, and databases.
4. Originality of information supports the innovative development of disciplines.
5. Discipline-specific information should cover different research levels and directions.

Coverage Elements of Discipline-Specific Information
1. Provide comprehensive discipline-specific information covering various academic fields.
2. Supply diverse types of information, such as books, journals, databases, and electronic resources.
3. Introduce intelligent recommendation algorithms to identify personalized user needs and offer dynamic updates.

Acquisition and Navigation of Discipline-Specific Information
1. Easy-to-use retrieval systems with convenient keyword search, categorized search, and data download options.
2. Scientifically designed and intelligent navigation systems support fast resource location via voice or natural language.
3. Structured presentation of information for ease of understanding and application.
4. Provide necessary user guides and support for efficient system usage.

User Feedback and Interaction in Discipline-Specific Information Services
1. Establish user communities to encourage communication and collaboration among users.
2. Conduct regular user satisfaction surveys to understand changing user needs.
3. Develop evaluation and feedback mechanisms for service quality.
4. Facilitate user feedback through online Q&A, intelligent chatbots, and real-time support to improve efficiency and satisfaction.

Tools and Application Support for Discipline-Specific Information Analysis
1. Provide convenient online tools, such as data analysis and reference management tools.
2. Ensure usability of research tools to meet diverse user needs with technical, teaching, and training support.
3. Guarantee compatibility and stability of research tools.

User Experience in Discipline-Specific Information Services
1. Offer diverse service formats, such as hybrid online and offline services, to meet varying user needs.
2. Monitor user behavior trends to provide advanced and responsive services.
Table 4. Indicators of the quality evaluation model for discipline-specific information services in academic libraries.
| Dimension | Indicator | Indicator Definition |
| --- | --- | --- |
| Perceived Information Quality | Information Relevance | The degree to which the discipline-specific information provided by the library matches users’ research and teaching needs, including alignment with topics, fields, and research trends. |
| | Information Accuracy | Authenticity and reliability, such as the accuracy of document resources, credibility of discipline-specific data sources, and correctness of discipline knowledge graphs. |
| | Information Diversity | Richness of resource types (e.g., papers, conferences, patents, standards, experimental data) and coverage of interdisciplinary and emerging fields. |
| Information Usability | Ease of System Operation | The platform and resource repository offer an intuitive, convenient, and efficient user experience for searching, browsing, and downloading, covering interface design, navigation structure, and search functionality layout. |
| | Personalized Services | Discipline-specific information services provide recommendations based on user preferences or behavior data, such as customized topic subscriptions and focused discipline sections, offering tailored online and offline integrated services. |
| Information Security | User Privacy Protection | Security guarantees in information services, including authorized access, technical standards, and protection of confidential research data. |
| | System Reliability | The stability of the discipline-specific information service platform, such as ensuring data backup and emergency response plans. |
| Interactive Feedback | Efficiency of Communication and Feedback | The speed and efficiency of responses to user inquiries and suggestions by librarians or service platforms, including online consultations, offline appointments, and real-time communication tools. |
| | Organization and Collaboration | Building discipline communities or virtual collaboration platforms to support ongoing communication, resource sharing, and co-creation among faculties, research teams, and librarians. |
| Tool Application | Tool Availability | Information analysis tools and training services provided by libraries to facilitate the use of discipline-specific information and the presentation of research outcomes. |
| | AI-Driven Features | Capabilities enabled by AI, big data, or knowledge graphs, such as intelligent recommendations, automatic classification, semantic search, trend prediction, and digital literacy skills. |
| | Discipline Data Management | Support for the collection, storage, sharing, and reuse of discipline-specific research data, such as data repositories, metadata management, data visualization, and collaborative analysis. |
| User Experience | Satisfaction and Trust | Users’ overall satisfaction with discipline-specific information services and their trust in the library’s brand and the professional competence of librarians. |
| | Support for Learning and Research | The extent to which services contribute to users’ academic activities, including support for course instruction, thesis writing, and research project progress. |
| | Willingness for Continued Use | The likelihood that users will continue or repeatedly use discipline-specific information services and recommend them to others after their experience. |
Table 5. 1–9 scale definitions.
| Scale | Definition |
| --- | --- |
| 1 | Two factors are of equal importance. |
| 3 | Indicates that one factor is slightly more important than the other. |
| 5 | Indicates that one factor is significantly more important than the other. |
| 7 | Indicates that one factor is strongly more important than the other. |
| 9 | Indicates that one factor is extremely more important than the other. |
| 2, 4, 6, 8 | Intermediate values between two adjacent judgments to reflect a more precise level of importance. |
| Reciprocal (1/x_ij) | If factor i is x_ij times as important as factor j, then factor j is 1/x_ij times as important as factor i. |
Table 6. Judgment matrix of dimension levels.
| Relative Importance | Perceived Information Quality | Information Usability | Information Security | Interactive Feedback | Tool Application | User Experience |
| --- | --- | --- | --- | --- | --- | --- |
| Perceived Information Quality | 1.000 | 1.15 | 1.31 | 1.31 | 0.91 | 2.02 |
| Information Usability | 0.87 | 1.000 | 1.20 | 1.21 | 0.82 | 1.86 |
| Information Security | 0.77 | 0.83 | 1.000 | 1.02 | 0.68 | 1.54 |
| Interactive Feedback | 0.76 | 0.83 | 0.98 | 1.000 | 0.69 | 1.48 |
| Tool Application | 1.10 | 1.22 | 1.47 | 1.45 | 1.000 | 2.14 |
| User Experience | 0.50 | 0.54 | 0.65 | 0.69 | 0.47 | 1.000 |
Table 7. Summary of consistency check results.
| Maximum Eigenvalue | CI Value | RI Value | CR Value | Consistency Check Result |
| --- | --- | --- | --- | --- |
| 6.0088 | 0.0018 | 1.260 | 0.0014 | Passed |
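The consistency check in Table 7 can be reproduced from the Table 6 judgment matrix. The sketch below is our illustration (not the authors' code) of the standard AHP eigenvalue method, with RI = 1.26 for a 6 × 6 matrix as reported in Table 7:

```python
# Reproducing the AHP consistency check for the Table 6 judgment matrix
# using the standard eigenvalue method.
import numpy as np

# Rows/columns: Perceived Information Quality, Information Usability,
# Information Security, Interactive Feedback, Tool Application, User Experience.
A = np.array([
    [1.000, 1.15, 1.31, 1.31, 0.91, 2.02],
    [0.87, 1.000, 1.20, 1.21, 0.82, 1.86],
    [0.77, 0.83, 1.000, 1.02, 0.68, 1.54],
    [0.76, 0.83, 0.98, 1.000, 0.69, 1.48],
    [1.10, 1.22, 1.47, 1.45, 1.000, 2.14],
    [0.50, 0.54, 0.65, 0.69, 0.47, 1.000],
])

n = A.shape[0]
eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
lam_max = eigvals.real[k]          # maximum eigenvalue
w = eigvecs[:, k].real
w = w / w.sum()                    # normalized priority (dimension) weights

CI = (lam_max - n) / (n - 1)       # consistency index
RI = 1.26                          # random index for n = 6 (as in Table 7)
CR = CI / RI                       # consistency ratio; CR < 0.10 passes
```

For this near-consistent matrix, `lam_max` is close to the reported 6.0088, CR stays well below the 0.10 threshold, and the normalized eigenvector approximately reproduces the dimension weights in Table 9 (0.2002, 0.1802, 0.1515, 0.1496, 0.2184, 0.1002).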
Table 8. Weights of indicators within each dimension.
| Dimension | Indicator | Weight |
| --- | --- | --- |
| Perceived Information Quality | Information Relevance (C1) | 0.10010 |
| | Information Accuracy (C2) | 0.06006 |
| | Information Diversity (C3) | 0.04004 |
| Information Usability | System Usability (C4) | 0.09911 |
| | Personalized Services (C5) | 0.08109 |
| Information Security | User Privacy Protection (C6) | 0.09090 |
| | System Reliability (C7) | 0.06060 |
| Interactive Feedback | Response Efficiency (C8) | 0.09724 |
| | Collaboration and Engagement (C9) | 0.05236 |
| Tool Application | Tool Availability (C10) | 0.08736 |
| | AI-Driven Features (C11) | 0.07644 |
| | Discipline Data Management (C12) | 0.05460 |
| User Experience | Satisfaction and Trust (C13) | 0.04509 |
| | Learning and Research Support (C14) | 0.03507 |
| | Continued Usage Intent (C15) | 0.02004 |
Table 9. Weights of the evaluation system for discipline-specific information service quality in academic libraries.
| Dimension | Dimension Weight | Indicator | Relative Weight Within Dimension | Final Indicator Weight |
| --- | --- | --- | --- | --- |
| Perceived Information Quality | 0.2002 | Information Relevance (C1) | 0.50 | 0.1001 |
| | | Information Accuracy (C2) | 0.30 | 0.0600 |
| | | Information Diversity (C3) | 0.20 | 0.0400 |
| Information Usability | 0.1802 | System Usability (C4) | 0.55 | 0.0991 |
| | | Personalized Services (C5) | 0.45 | 0.0810 |
| Information Security | 0.1515 | User Privacy Protection (C6) | 0.60 | 0.0909 |
| | | System Reliability (C7) | 0.40 | 0.0606 |
| Interactive Feedback | 0.1496 | Response Efficiency (C8) | 0.65 | 0.0972 |
| | | Collaboration and Engagement (C9) | 0.35 | 0.0523 |
| Tool Application | 0.2184 | Tool Availability (C10) | 0.40 | 0.0874 |
| | | AI-Driven Features (C11) | 0.35 | 0.0764 |
| | | Discipline Data Management (C12) | 0.25 | 0.0546 |
| User Experience | 0.1002 | Satisfaction and Trust (C13) | 0.45 | 0.0451 |
| | | Learning and Research Support (C14) | 0.35 | 0.0351 |
| | | Continued Usage Intent (C15) | 0.20 | 0.0200 |
Table 10. Membership degree matrix for the quality evaluation of discipline-specific information services at Tsinghua University Library.
| Indicator | Very Dissatisfied | Dissatisfied | Neutral | Satisfied | Very Satisfied |
| --- | --- | --- | --- | --- | --- |
| Information Relevance (C1) | 0.0357 | 0.0536 | 0.2143 | 0.4286 | 0.2679 |
| Information Accuracy (C2) | 0.0517 | 0.0862 | 0.2414 | 0.3448 | 0.2069 |
| Information Diversity (C3) | 0.061 | 0.0458 | 0.2441 | 0.2905 | 0.1678 |
| System Usability (C4) | 0.0185 | 0.037 | 0.1852 | 0.5185 | 0.3148 |
| Personalized Services (C5) | 0.027 | 0.0405 | 0.1851 | 0.4363 | 0.2361 |
| User Privacy Protection (C6) | 0.0469 | 0.0586 | 0.2734 | 0.4297 | 0.1914 |
| System Reliability (C7) | 0.028 | 0.04 | 0.32 | 0.42 | 0.26 |
| Response Efficiency (C8) | 0.0296 | 0.0556 | 0.2037 | 0.4815 | 0.2407 |
| Collaboration and Engagement (C9) | 0.0315 | 0.0489 | 0.2449 | 0.4206 | 0.1641 |
| Tool Availability (C10) | 0.0156 | 0.025 | 0.125 | 0.4375 | 0.25 |
| AI-Driven Features (C11) | 0.0259 | 0.0444 | 0.1852 | 0.5 | 0.2593 |
| Discipline Data Management (C12) | 0.0288 | 0.036 | 0.2162 | 0.4505 | 0.2342 |
| Satisfaction and Trust (C13) | 0.0431 | 0.0647 | 0.3026 | 0.4741 | 0.2138 |
| Learning and Research Support (C14) | 0.048 | 0.072 | 0.32 | 0.42 | 0.22 |
| Continued Usage Intent (C15) | 0.0625 | 0.0833 | 0.375 | 0.4167 | 0.2083 |
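Each row of Table 10 is a membership-degree vector: the proportion of questionnaire respondents who chose each level of the five-point scale for that indicator. The sketch below is illustrative; the raw response counts are not reported in this excerpt, so the counts used here are hypothetical values chosen to reproduce the Information Relevance (C1) row.

```python
# Membership degrees for one indicator: each entry is the share of
# respondents who selected that rating level.
def membership_row(counts):
    total = sum(counts)
    return [round(c / total, 4) for c in counts]

# Hypothetical counts for the Information Relevance item across the scale
# (Very Dissatisfied ... Very Satisfied), e.g. 56 respondents in total:
row = membership_row([2, 3, 12, 24, 15])
```

With these assumed counts, `row` reproduces the C1 entries in Table 10 (0.0357, 0.0536, 0.2143, 0.4286, 0.2679).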
Table 11. Weighted evaluation vector calculation results for the quality of discipline-specific information services at Tsinghua University Library.
| Dimension | Very Dissatisfied | Dissatisfied | Neutral | Satisfied | Very Satisfied |
| --- | --- | --- | --- | --- | --- |
| Perceived Information Quality | 0.00913997 | 0.01238768 | 0.04579707 | 0.07532706 | 0.04600991 |
| Information Usability | 0.00402035 | 0.00694720 | 0.03334642 | 0.08672365 | 0.05032078 |
| Information Security | 0.00596001 | 0.00775074 | 0.04424406 | 0.06451173 | 0.03315426 |
| Interactive Feedback | 0.00452457 | 0.00796179 | 0.03260791 | 0.06879918 | 0.03197847 |
| Tool Application | 0.00491468 | 0.00754276 | 0.03687880 | 0.10103480 | 0.05444784 |
| User Experience | 0.00487861 | 0.00711117 | 0.03237926 | 0.04445791 | 0.02153038 |
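In the fuzzy comprehensive evaluation step, each dimension's weighted evaluation vector in Table 11 is the product of the final indicator weights (Table 9) and the corresponding membership rows (Table 10), i.e. B = w · R. A sketch for the Tool Application dimension (our illustration of the standard method, not the authors' code):

```python
# Weighted evaluation vector B = w @ R for the Tool Application dimension.
import numpy as np

# Final indicator weights from Table 9: C10, C11, C12.
weights = np.array([0.0874, 0.0764, 0.0546])

# Membership rows from Table 10 (Very Dissatisfied ... Very Satisfied).
R = np.array([
    [0.0156, 0.025, 0.125, 0.4375, 0.25],     # Tool Availability (C10)
    [0.0259, 0.0444, 0.1852, 0.5, 0.2593],    # AI-Driven Features (C11)
    [0.0288, 0.036, 0.2162, 0.4505, 0.2342],  # Discipline Data Management (C12)
])

B = weights @ R  # weighted evaluation vector for Tool Application
```

The result matches the Tool Application row of Table 11; repeating the calculation for the other five dimensions and aggregating (e.g., by summing and applying the maximum-membership principle) yields the overall evaluation.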

Zhang, S.; Zhang, T.; Wang, X. AHP-Based Evaluation of Discipline-Specific Information Services in Academic Libraries Under Digital Intelligence. Information 2025, 16, 245. https://doi.org/10.3390/info16030245

