Article

A Provisional System to Evaluate Journal Publishers Based on Partnership Practices and Values Shared with Academic Institutions and Libraries

University Libraries, University of Tennessee, Knoxville, TN 37996, USA
Publications 2020, 8(3), 39; https://doi.org/10.3390/publications8030039
Submission received: 27 May 2020 / Revised: 1 July 2020 / Accepted: 16 July 2020 / Published: 23 July 2020

Abstract

Background: Journals with high impact factors (IFs) are the “coin of the realm” in many review, tenure, and promotion decisions; ipso facto, IFs influence academic authors’ views of journals and publishers. However, IFs do not evaluate how publishers interact with libraries or academic institutions. Goal: This provisional system introduces an evaluation of publishers exclusive of IF, measuring how well a publisher’s practices align with the values of libraries and public institutions of higher education (HE). Identifying publishers with similar values may help libraries and institutions make strategic decisions about resource allocation. Methods: Democratization of knowledge, information exchange, and the sustainability of scholarship were the values identified to define partnership practices and develop a scoring system for evaluating publishers. Four publishers were then evaluated. A high score indicates alignment with the values of libraries and academic institutions and a strong partnership with HE. Results: The highest scores were earned by a learned society publishing two journals and a library publisher supporting over 80 open-access journals. Conclusions: Publishers, especially nonprofit publishers, could use the criteria to guide practices that align with mission-driven institutions. Institutions and libraries could use the system to identify publishers acting in good faith towards public institutions of HE.

Motto of the American Library Association in 1879:
“The best reading for the greatest number at the least cost.”

1. Introduction

Citation metrics, including journal-level impact factors, are the predominant means of evaluating scholarly journals and their publishers. Relying exclusively on such metrics to evaluate publishers or the scholarship they publish is fraught with problems. Furthermore, these numbers do not evaluate publishers’ business practices, business models, or behaviors towards institutions of higher education (HE) and academic libraries. As faculty across the disciplines express discontent with the state of scholarly journal publishing, and as an increasing number of institutions of HE and their libraries look closely at the subscription fees and open-access article processing charges they pay to publishers, a system to compare journal publishers exclusive of citation metrics may be of value. This paper outlines a scoring system that awards points to publishers that act as partners with HE through practices that align with the origin, values, and missions of institutions of HE (using the values of land-grant institutions in the United States as a basis), the profession of librarianship, and some learned society publishers. If it were to be used, the scoring system would require vetting, as well as resources for implementation, coordination, and sustainability. Even if never applied, the scoring system’s criteria may serve as a core set of practices to consider when evaluating any service/content provider working with HE, not just publishers.

1.1. Evidence of Discontent in Scholarly Journal Publishing

In the past three decades, discontent with the state of scholarly journal publishing has been growing in higher education. For example, among academic faculty, members of editorial boards have resigned to protest their inability to change a publisher’s terms related to their journal’s subscription cost and/or their journal’s access and copyright terms. The first such instance occurred in 1989, when the majority of the editorial board of Vegetatio left to launch a new journal, Journal of Vegetation Science [1]. Coverage of another journal’s transition or independence gained popular attention in a 1998 New York Times article [2]. Peter Suber and others have noted an annual occurrence of such events since 2012 [1], and in February 2020 a group of over 20 academics resigned from a Wiley law journal over questions of ownership and decision-making for the journal [3].
At the same time as Vegetatio’s editorial board resignation, academic libraries began to frame their discontent in terms of the “serials crisis,” that is, the discrepancy between the consumer price index and the extremely high institutional prices for serials subscriptions and/or access, especially in scientific, technical, and medical (STM) fields [4]. This crisis dominated discourse in library and information sciences for many years. A statement from a 1989 report by the Association of Research Libraries (ARL) sounds similar to statements that libraries make today: “Research libraries and the scholars that rely upon them face a major crisis … the publication of certain key STM serials is concentrated increasingly in the hands of a small group of publishers. More of the money spent on academic library subscriptions is going to fewer publishers and the cost of these serials is soaring” [5]. Ideas on how to approach the serials crisis have varied. One significant response has been for faculty and/or librarians to use shared governance structures to create open-access policies on their campuses. Such a policy was first established at the department level in 2003 [6]. Open-access policies come in many varieties, but are arguably most effective when they serve as a legal means for researchers to deposit a version, typically the post-print (also known as the author’s accepted manuscript), to their institution’s open-access repository with an embargo of 12 months or less. To bolster community awareness, librarians and campus partners engage in outreach to inform authors about how to deposit accepted manuscripts to the repository. While the policies show great potential, reported policy adherence hovers at only around 25%, and such policies remain relatively few in the United States [7]. If departmental or college-level policies are not considered, there are fewer than 80 institutional-level open-access policies in the United States [8].
In 2012, another sign of discontent: British mathematician Sir William Timothy Gowers set in motion a boycott of Elsevier, with thousands of individual academics signing their name to what became known as “The Cost of Knowledge” [9]. Gowers’ original complaints—high subscription costs, “bundling” of titles (now called “big deal packages”), “ruthless” practices in negotiations with libraries, and lobbying for legislation against open access [10]—continue to impact the scholarly publishing field today. Adding to the list of criticisms, Tal Yarkoni, a faculty member in Psychology at the University of Texas at Austin, included Elsevier’s organizing of international arms trade fairs and the publication of sponsored journals masquerading as peer-reviewed titles [11]. Despite these complaints and concerns, Yarkoni presented anecdotal evidence of a persistent lack of knowledge about, and/or indifference to, Elsevier’s actions among researchers [11]. In fact, not all academics who signed on to support the boycott have maintained it, suggesting the boycott had limited impact on the publisher [12].
The library profession and the broader scholarly communication community have expressed their discontent with big publishers’ big deal packages (BDP), in which academic libraries purchase access to journals in bundles rather than individually by title. Libraries have been working with their institutions to cancel these packages [13]; European countries, often with national publisher contracts, took the lead here. German institutions canceled their Elsevier subscriptions in 2016–17 due to unsuccessful negotiations, and in 2019–20, Swiss universities canceled agreements with Springer Nature [13]. In the United States, the University of California system canceled its Elsevier subscriptions in 2019 [13].
While it is too early to know what effect these mass cancellations may have on scholarly publishing, other means for registering dissatisfaction have not led to major changes in publishing. Journals with editorial resignations continue to exist and publish. Submissions to repositories on campuses with open-access policies do not even reach one quarter of eligible deposits. Authors have changed their minds about boycotts while Elsevier’s list of new journals grows yearly: 111 new journal titles in 2015 and 80 in 2016 [14]. In HE, there is not only discontent with scholarly publishing but also indecision about how to change it.

1.2. Sources of Discontent and Lack of Publisher Evaluations

A decreasing number of publishers in control of scholarly publishing fuels some of the discontent. Larivière et al. discussed the transformation of scholarly journal publishing, from a time in which learned societies were largely in control of the journal publishing enterprise to the current state in which an oligopoly of five large corporate entities (Elsevier, Wiley–Blackwell, Springer, Taylor & Francis, and Sage; a sixth, the American Chemical Society, is a learned society) owns the majority of academic journals across all fields [15]. Fyfe et al. described these changes in greater detail, contrasting scholarly publishing since the mid-20th century with the prior publishing culture that was largely led and owned by HE and academics [16]. Fyfe et al.’s report warrants extensive quotation: “The older model of academic publishing practised by learned societies and university presses had prioritized the wide circulation of high-quality scholarship, with little or no expectation of making money” [16]. Additionally, “in stark contrast to the ethos of the academic research community [which] had historically been non-commercial” and which largely worked with entities “such as learned societies and university presses—with a mission for scholarship rather than profit ... the new commercial model demonstrates that, in the new world order, it is possible not merely to break even but to make profit” [16].
It is important to contextualize that the financial gains for commercial publishers are not small. RELX, Elsevier’s parent company, makes substantially higher profit margins from its publishing arm alone than pharmaceutical companies, international banks, and automobile companies, often doubling or even tripling the profit margins of such corporations [17]. While a pairing between society publishers and RELX, a company listed on London’s Financial Times Stock Exchange 100 Index, or other large publishers may seem unlikely, nonetheless “smaller learned societies ... have increasingly entered into profit-sharing co-publishing relationships with larger publishers. By 2004, about half of all societies published via a third party; both commercial publishers and university presses are involved in this” [16]. Alongside this practice, the “societies who have outsourced their publishing have very little information about how their income is derived” [18]. Furthermore, while smaller publishers may have more restrictive copyright and access policies than larger publishers [18], entering into these relationships often means that a society no longer determines the copyright and archiving policies of its journals. Societies are thus not simply defaulting to the terms of the larger publisher: they are operating without a full picture of their publishing policies’ effects on authors, researchers, and others, without an understanding of how their outsourced publishing income is generated, and without the ability to change policies or practices themselves.
These changes—a smaller number of companies outside of HE publishing an ever-increasing number of journals; a tension between profit-driven and educational missions; a lack of awareness among smaller scholarly publishers about copyright, licensing, and the practices of the five biggest publishers with whom they might enter into agreements—contribute to the disgruntlement felt about scholarly journal publishing in institutions of HE. Many of the efforts outlined above, such as the resignation of faculty from editorial boards, are punitive actions aimed at shaming, or decreasing financial and intellectual support for, the biggest publishers. Yet there is little in the literature about how the five (or six) biggest publishers’ practices differ from those of smaller publishers. This is a notable lacuna, as it leaves no clear basis for comparison or change.
Instead of reactionary measures, organizations are emerging to develop proactive responses to concerns in journal publishing, suggesting better alignment of publishers’ practices with the goals and missions of funders as well as institutions of HE. As an example, Plan S is an ambitious European funder-led plan to require open access to grant-funded research. Though open access is their main goal, publishers and publishing organizations, including the Open Access Scholarly Publishers Association, have expressed concerns about unintended consequences of the plan, especially on publishers not in the big five; in particular, there are conditions that would be extremely challenging for smaller publishers to meet [17,19]. In another example, the Royal Society of Chemistry’s “read and publish” agreement outlines the UK-based society’s transition from hybrid publishing to fully open-access publishing with support from libraries and institutions of HE [20]. Furthermore, academic libraries are increasingly taking on roles as publishers, and several groups are engaging in related discussions [21,22].
These strategies address access to articles but they also begin to engage with broader concerns related to the sustainability of scholarship, which may be understood as the sustainability of creating and sharing scholarly findings. Björn Brembs, neurobiologist and professor at the University of Regensburg, asserted that open access is of less concern to the sustainability of scholarship than adopting solutions to improve its “reliability, affordability and functionality” [23]. With regard to reliability, many innovations have been discipline-driven rather than publisher-driven. For example, Registered Reports, a system for early peer review of a study’s methodology, has been adopted by 225 journals from various publishers in psychology, cognitive sciences, and medicine, but not by other journals in these publishers’ portfolios [24]. Regarding affordability and functionality, analysis from SPARC, a coalition of academic libraries promoting open access to scholarship, shows that publishing companies have become multi-armed corporations, which are less dependent on journals and monographs as sources of income and more interested in collecting and analyzing data from institutions of HE [25]. As the large publishing companies transform into data analytics and scholarly enterprise companies, their growing control over infrastructure and data could make “affordability” and “functionality” (that is, interoperability) a permanent problem, as has been seen already in their publishing practices.
Given these concerns, the ability to compare and rate publishers in terms of their practices and interactions with HE may be helpful, but there is no existing model or tool for doing so. While Plan S comes closest to this form of evaluation, it adopts a lens of funder, rather than HE, expectations, and largely excludes smaller publishers. The existing systems for evaluating publishers are similarly limited in that they do not take into account the wide range of practices that make up an institution’s interactions with a publisher. Schonfeld’s Red Light/Green Light model offers a budgetary perspective for libraries and suggests rating content providers based on return on investment; however, this proposal is not limited to publishers and does not present a holistic view of publisher practices [26]. Another, the list of “Principles of Transparency and Best Practice in Scholarly Publishing” from the Committee on Publication Ethics (COPE), relates to the processes of reviewing, publishing, and making editorial decisions, but does not identify preferred terms of access or licensing, nor address the political activities of the publisher [27]. Similarly, the Fair Open Access Principles evaluate aspects of a publisher’s practices through an important but narrow lens of access and copyright [28]. The Principles are also presented as absolutes, which, in a complex ecosystem such as scholarly publishing, may limit participant involvement.
It could also be argued that impact factors, which measure the citation impact of journals (rather than publishers), could be utilized as an evaluation tool. While a journal’s metrics of impact and acceptance/rejection rates are dominating forces in faculty review, tenure, and promotion (RTP) decisions [29], there are known flaws, as both impact factors and rejection rates can be manipulated [30,31,32,33]. Additionally, journals with high impact factors have higher retraction rates due to fraud and other serious forms of misconduct than journals with lower impact factors [34]. Furthermore, impact factors are not intended to serve as the primary criterion for what makes “good research,” nor are they designed to be the sole mark of “quality” of a journal [35,36,37,38,39]. In other words, the question of what constitutes “good research” is separate from what is, or is not, a “good journal.” If the impact factor is a flawed metric, and if “good journals” are determined by more than one metric, then the impact factor cannot serve as a sufficient metric for identifying “good publishers.”

1.3. Toward a Partnership-Based Method of Evaluation

The growing discontent in scholarly publishing is relevant to institutions of HE both holistically, at the institutional level (faculty, administrators, and campus units such as offices of research), and at the unit level, specifically among academic libraries, which are responsible for the purchasing and licensing of materials. While it is important to identify a publisher’s terms with libraries, this cannot serve as the sole means for evaluating a vendor’s partnership with HE. That is, a micro- or unit-level view of partnerships does not adequately reflect libraries’ roles, the effects of the agreements into which they enter with publishers and vendors, or the myriad ways that disciplinary faculty are also impacted by these practices.
From the library perspective, Finnie argued that the act of “voting with dollars” is vital to decisions related to support for publishers and other vendors [40]. That is, it is not just cost and fees that should be considered before buying access or subscriptions, but also the access method (open or subscription access), the publisher’s licensing terms, business model, and history of behaviors towards academic libraries and institutions of HE, which include faculty in varying roles as readers, authors, editors, reviewers, instructors, and sometimes as publishers. One such method for “voting with dollars” may be to identify and support publishers that act as partners with institutions of HE, and in particular, public institutions of HE.
Definitions of the word “partner” in the Oxford English Dictionary refer to “joint shares,” “a sharer or partaker,” “a person who takes part with others in doing something,” and “individuals with interests and investments in an enterprise, among whom expenses, profits, and losses are shared” [41]. Partners may have shared goals, interests, and activities. They may also have shared values. With these definitions in mind, this paper presents a new system for evaluating publishers that centers public institutions of HE, the profession of librarianship, and learned societies as partners in scholarship. The criteria have been informed by the shared origins, missions, and values of these partners and are intended to help libraries and institutions make informed decisions about who might, or might not, be considered partners in scholarship and to what degree they exhibit these characteristics. The criteria are also intended to help address some of the discontent in scholarly journal publishing outlined above.
In support of this thesis of partnership, there are three main points: (a) the professional values of equality in and equity of access to information upheld by libraries are mirrored in the origins and values of institutions of HE that emphasize democratization of access to knowledge (i.e., public and land-grant colleges and universities in the United States and many institutions of HE internationally); (b) librarianship’s professional values related to the findability and “shareability” of information, that is, the exchange of information, correlate with the origin, values, and missions of many learned societies; and (c) these entities are invested in the sustainability of scholarship, and the similarities among members of these three groups make them partners in scholarship.

1.3.1. Democratizing Access to Knowledge (Education and Information)

Equity, equality, and fairness are each a distinct perspective on how to expand access. In terms of educational opportunities in HE, any of these three perspectives can be generalized as a push to democratize access to HE beyond a single group. In the United States, this push was epitomized by the establishment of land-grant colleges through the Morrill Act of 1862, which prioritized giving the citizenry access to HE opportunities. The original agricultural- and mechanical-focused curricula at land-grant colleges addressed contemporary needs for improving agricultural practices and food security and targeted the nearly 60% of employed people who worked in agriculture [42]. However, Morrill’s vision for land-grants was not limited to particular fields of study [43]. Instead, he advocated “broad” education, both in terms of curricula and also accessibility to that education, which he said should be “placed in every State within the reach” of all citizens; the Morrill Acts of 1890 and 1994—which expanded land-grant funding to former Confederate States and Native American Tribal colleges, respectively—supported that vision of a broad education [43]. Notably, the Act stipulated that reports of “the progress of each college, recording any improvements and experiments made, with their ... results” be shared with all other land-grant colleges as well as the Secretary of the Interior, intending to make not only education but also the results of research publicly available [44].
The Morrill Act, which was largely about agricultural education, is relevant for its intent to purposefully democratize higher education; however, the social histories of land-grant colleges are complex. The Morrill Act was developed under the shadow of westward expansion and the forced removal of Native Americans in the United States. It is important to acknowledge that land-grant colleges were largely founded with, and expanded by, the sale of indigenous lands [45]. Today, many of the land-grants, both public and private, are top-tier universities [46,47]. The missions of these institutions, though mired in historical contradictions, stress access for people of all abilities, education of students to make positive and meaningful contributions to the state or region in which they live and the larger world, outreach to the community, lifelong learning, and a commitment to research, especially research that helps improve society [48,49]. Sharing knowledge and providing broad access to HE is, of course, neither limited to land-grant institutions in the United States, nor to public institutions of HE more widely. The desire to democratize access to education is a driving force in many institutions of HE throughout the world, but especially among those that welcome the public and especially since the latter half of the 20th century [50]. Still, HE institutions in the United States have been “expansion-minded”, offering “opportunities of universal access” for many more years than other Western nations [50]. The Morrill Act is one important source of such characteristics.
In the United States, democratizing access to knowledge was envisaged not only by the Morrill Act but also in the founding of Andrew Carnegie-funded public libraries and the development of the American Library Association (ALA), now the oldest library association in the world. For Carnegie, who donated much of his enormous 19th century steel industry fortune philanthropically, access to books was vital to his personal education and success [51]. He envisioned the United States as a meritocracy and viewed libraries as sources of education to support such a system [52]. Carnegie’s select funding of library buildings across the nation between 1886 and 1923 helped popularize the view that access to information should be equally available to all. Yet, in the early days of the ALA, both equality and equity in services and collections were under debate. The profession was divided over its responsibilities and priorities, including whether to select only the so-called “best” literature or to include popular literature, too, whether (and how) to work with immigrants, and so on [53]. Carnegie’s actions helped the library become an integral presence in local communities throughout the United States and beyond. Perhaps with his emphasis on equality, the ALA was able to prioritize additional facets of information access to be addressed by the profession, such as equity. In any case, the ALA solidified equity of access as a professional value around World War I [54].
Access to information continues to be a central professional value. The ALA Code of Ethics includes statements that the profession is “explicitly committed to intellectual freedom and the freedom of access to information” and that libraries “provide the highest level of service to all library users through ... equitable access” among other practices [55]. Besides the ALA, the Association of Research Libraries (ARL), a membership organization of over 120 major research universities’ libraries in the United States and Canada, has as part of its mission statement to “achieve enduring and barrier-free access to information” [56]. ARL’s Principles and Practices encompass support for equitable and open access, and ARL is a signatory on several open-access and open scholarship statements [57]. The interrelated values of democratizing access to education and information that are present within public HE institutions and libraries (especially academic libraries) are central and requisite criteria in identifying HE partners.

1.3.2. Information Exchange: Findable and Shareable

Another shared value is that of information exchange—making information findable to broader audiences and intentionally sharing information with a community of interested people. While equity of access may not be a value of learned societies, they, like libraries, do value sharing information with a community. Wider distribution and exchange of information have been values of many learned societies since the 19th century [58], as evidenced by the rapid growth of such organizations at that time, and the associated increases in publications produced by these organizations in both the United States and the United Kingdom [59]. Exchange of information is the reason many societies began to publish journals and shared them freely or inexpensively [60]. Though there are exceptions, most do not intend for their journals to be read by the general public [61,62]; however, it is not uncommon for societies to count among their members not just scholars or academics but also practitioners and individual citizens, giving members access to publications for relatively modest individual membership fees. In addition, some societies deposit all their journal articles in PubMed Central or other open repositories within six months of publication, effectively participating in delayed open access that results in 100% deposit in just a few months (see American Society for Microbiology, American Society of Plant Biologists, the Society for Neuroscience, and others). This practice is a significant contribution to, and prime example of, information exchange with the public.
For libraries, information exchange is more than access to publications. Academic libraries, which focus on campus researchers and students as their primary audiences, situate equity of access within the broader needs of these researchers. Through various services, such as interlibrary loan, librarians in HE are acutely aware of disparities of access among researchers and institutions, so even when immediate or open access to an information source is not available, librarians encourage discoverability by sharing information about the source and where to find it. Libraries, as memory institutions, also want to ensure that the information is preserved for generations to follow. In other words, libraries function not only to collect publications and other forms of information, but also to preserve them and make them findable. Examples of this work of librarianship include the development of shared online public access catalogs built on shared metadata (e.g., WorldCat); the development of metadata schema, such as MODS; the development and maintenance of open-access repositories; contributions to Linked Data in progress towards the Semantic Web; library contributions in digitization and copyright review for HathiTrust; the creation of the Digital Public Library of America, similar to Europeana in the European Union; and the creation and maintenance of PubMed by the United States’ National Library of Medicine. Each of these contributions reflects core practices in librarianship and reinforces the values in ALA’s Code of Ethics, which states that librarians have a “special obligation to ensure the free flow of information and ideas to present and future generations.” Information about publications and other information resources is created and openly shared by librarians to improve findability of resources, foster the exchange of knowledge and information, and ensure that copies are preserved for posterity.

1.3.3. Sustainability of Scholarship

Along with libraries and institutions of HE, many societies are also committed to the sustainability of scholarship. The fact that many societies contribute all research publications to PubMed Central within six months of publication reinforces the values of sharing information and preserving the intellectual outputs of researchers and scholars. Another example of this is the relatively low cost of access to journals published by nonprofit publishers. Bergstrom et al. found that cost-per-citation for journals from large commercial publishers was anywhere from two times to ten times the cost-per-citation for nonprofit publishers’ journals [63]. Though criticisms of several society publishers and their practices (such as executives’ salaries) are prevalent in the scientific and library communities, there are two membership organizations made up of nonprofit learned societies that publish and also substantially reinvest funds in scholars and HE: the Scientific Society Publisher Alliance in the United States and the Society Publishers’ Coalition in Europe [64,65]. Members of both organizations have retained a good deal of control over their journals’ policies and practices—determining for themselves the copyright agreements, the rights of authors, the cost of subscriptions, and so on—reflecting practices from an age when scholarly organizations’ publications were shared as widely as print materials allowed in the pre-digital era [16]. From the 1980s to today, the trend has been for societies to work with large publishers to manage their journals online and, in exchange, give the large publishers permission to guide and/or determine the policies and practices of the journal; however, the societies and associations that are members of these two groups typically control their own publishing systems and thus set their own publishing policies.
Collectively, members of these two groups demonstrate commitment to sustainable access costs, reinvestment in higher education, and various models of lifelong learning. They serve, in some capacity, as the voice of researchers, scholars, and faculty who retain an ownership interest in their publications, aiming to share their scholarship with the public first and to profit from it financially second. In contrast, the American Chemical Society has faced questions regarding practices such as the annual salary of its CEO and its lobbying against the National Institutes of Health’s free database, PubChem [66,67].

1.4. Summary

As academic libraries question the economic sustainability of supporting large publishers, some of their parent institutions are backing them in canceling expensive subscriptions and identifying unacceptable licensing and access terms (e.g., University of California). At this time, when institutions are considering the budgetary impacts of supporting large journal publishers, it is reasonable to also ask whether a publisher’s values and practices are additional aspects to consider with regard to both institutional resource allocation and measurement of an individual researcher’s “quality” of scholarly contributions. If so, the shared values of institutions of HE and libraries identified above that also align with learned societies that have stayed close to their founding principles—with, as Fyfe et al. wrote, “a mission for scholarship rather than profit” [16]—could form a basis for a systematic evaluation of publisher values and practices. The commonalities among these types of organizations and institutions may be used to characterize them as partners, in which each group contributes to similar activities—learning, experimenting, educating, and sharing—with the aim of benefitting others through knowledge. Importantly, each partner considers this knowledge to be more of a public good than a commodity. Thus, partnership could be used as a guiding principle in a publisher evaluation system.

2. Materials and Methods

This paper provides one possibility for a set of criteria that institutions of HE, and their libraries in particular, might use to evaluate publishers. The key values of institutions, libraries, and many learned societies discussed above are the basis from which these criteria were developed. The more a publisher’s practices align with these values, the more it can be considered a partner with HE.
These criteria relate to the sustainability of scholarship because they speak to Brembs’ concerns about the future of affordability and interoperability in research and scholarly infrastructure. Partners prioritize interoperability and affordability. They are also relevant to Finnie’s “voting with dollars” in that public institutions of HE might be more willing to enter into service agreements with publishers (or other vendors, providers, or suppliers) whose values and practices align with their own and less willing to work with those whose values and practices do not.
The criteria are presented as a scoring system in which publishers show themselves akin to HE not only through shared values but also by empowering researchers and scholars and the institutions of HE that support them. Because such publishers see faculty, students, and institutions of HE as essential partners, not customers, they emphasize the rights of content creators and disciplinary experts in the publishing process. The scoring system is presented in some detail in order to provide an example of how such a system might work. Tentatively called Publishers Acting as Partners with Public Institutions of Higher Education and Land-grant Universities (or PAPPIHELU, hereafter referred to as PAPPI), the system has not been vetted by any professional organization. It should not be considered the best, preferred, or only means for evaluating whether a publisher’s behaviors support the values above and thus make the publisher an HE “partner.” Instead, PAPPI is one way to demonstrate that publishers’ practices can be identified and measured for how well they mirror the common worldview and ethic of HE institutions (public institutions especially), their libraries, and many scholars.
Proposed scores are similar to the United States’ Green Building Council’s Leadership in Energy and Environmental Design, or LEED, certification in architectural and building projects: “Projects pursuing LEED certification earn points for various green building strategies across several categories. Based on the number of points achieved, a project earns one of four LEED rating levels: Certified, Silver, Gold or Platinum” [68]. Additionally, “LEED certification is a globally recognized symbol of sustainability achievement and leadership” [69]. Evaluation criteria are presented as a list of publishers’ practices that earn a publisher points. For example, publishers that use Creative Commons licenses or allow authors to retain all rights to their work earn substantial credit in the PAPPI system.

2.1. Value Scheme

The scoring system introduced in this paper attempts to provide proof of concept that evaluating and promoting partnership practices among vendors in HE is possible and beneficial, and that libraries have significant contributions to make in such activities. Librarians routinely weigh the experiences of readers of scholarship and of authors/creators of scholarship at the same time. As such, PAPPI makes clear judgments about scholarly journal publishing that could be valid in any similar evaluation system. For example:
  • Similar practices between large and small publishers often differ with regard to both justifications and resulting effects. An evaluation system should account for those differences. With regard to access and distribution, some publishers make money from publishing journals that incur both subscription fees and open-access fees, known as hybrid-open-access journals. This means the public can access some but not all of the journal’s articles. Typically, publishers make money in several ways: from hybrid journals, from journals that are solely subscription-funded, and from other journals that are completely open-access and solely article processing charge (APC)-funded. Libraries often consider APCs for hybrid journals to be “double dipping”, as institutions may be asked to pay a subscription rate on top of APCs for individual articles. Because publishers’ budgets are helped by a diverse portfolio of all models, the hybrid model can increase large publishers’ already high profits. On the other hand, many learned societies are nonprofit organizations with values similar to public institutions of HE. They often have much smaller publication lists (often fewer than 10 journal titles) with lower institutional subscription rates than large publishers. Societies often use the hybrid model to increase modest revenues used to fund scholarships, travel grants, and other activities that benefit researchers in HE [58]. Hybrid journals are not singled out in the PAPPI model. Instead, the publisher’s type of business, HE-support activities (such as travel grants), average fees for APCs, and public archiving activities are considered separately in the credits awarded.
  • When considering practices (means) and resulting effects (ends), avoid existing assumptions, conflations, and labels. For example, because public access to research is of great importance in evaluating partnership practices, PAPPI considers a publisher’s agreement with the author(s) holistically, giving credit for including terms not only for open publishing but also for open archiving. Even with a delay or embargo, open archiving can result in significant public access to scholarship, especially when open archiving is done routinely by the publisher (not the authors), as is the case with many scholarly societies. A partnership scoring system might evaluate journals that follow any distribution method and any label—open with publication fees (“gold”), open with no publication fees (“diamond” or “platinum”), subscription-only, or a hybrid-open-access model—with an eye toward the publisher’s true end result regarding public access (or any other practice).
  • Provide additional dimensions to determine “quality” journals, not replacement metrics. Partnership practices are additional metrics by which to judge publications and are separate from a journal’s IF. A PAPPI score is, in part, a measure of a publisher’s interest in publishing research for the sake of inquiry and science, with consideration for transparency, public access, and author rights, among other values. For example, see the high points awarded for COPE membership, or the credit earned for publishing studies with a methodology vetted by the Center for Open Science’s Registered Reports. In addition, publishers that allow authors to retain copyright and thus empower authors to decide how to share and use their own work earn high points. PAPPI helps academic authors identify publishers that value independent scholarly inquiry and support authors in sharing their own work as they wish, which often leads to the sharing of knowledge, not the restriction of it, to the benefit of scholars everywhere.
From a wider perspective, the question is not just, Are there publishers with practices that benefit authors and/or institutions of HE more than others? The question is also, When choosing the tools and resources for scholarship that an institution will support, do libraries’ and scholars’ values (e.g., interoperability and findability) and experiences (e.g., negotiations or observations) matter? This provisional list of partnership practices is in part intended to help libraries and campuses reflect on and quantify a vendor’s practices in order to participate in conversations about them.

2.2. Designations

Points earned by a publisher result in designations similar to LEED certification for buildings. LEED’s designations are Platinum (highest), Gold, Silver, and Certified, but identical or similar terms (“gold,” “platinum,” “diamond”) are already used in open-access designations, and Schonfeld’s proposed red/yellow/green designations relate to whether a library should continue, pause, or stop subscriptions or purchases of particular products or collections. For these reasons, a distinct classification has been created. The highest designation is PAPPI Tier 1, followed by PAPPI Tiers 2 and 3; the lowest-scoring publishers receive no PAPPI designation at all. The scoring range for each tier should be refined with further input from librarians and the greater HE community. For current purposes, the author assigned PAPPI Tier 1 status to publishers that earn 62 points or more, that is, 70% of all possible points (89 total). PAPPI Tier 2 publishers earn at least 45 points (51% of all possible points), and PAPPI Tier 3 publishers earn at least 27 points (30%).
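As a proof of concept, the provisional tier cutoffs above can be expressed as a simple lookup. The sketch below is purely illustrative (the function name and structure are not part of PAPPI), assuming the provisional thresholds of 62, 45, and 27 points out of 89:

```python
# Illustrative sketch of the provisional PAPPI tier cutoffs described above.
MAX_POINTS = 89  # total possible points in the provisional scoring system

def pappi_tier(points: int) -> str:
    """Map a publisher's total PAPPI points to a provisional tier label."""
    if points >= 62:      # roughly 70% of 89 possible points
        return "Tier 1"
    if points >= 45:      # roughly 51% of 89 possible points
        return "Tier 2"
    if points >= 27:      # roughly 30% of 89 possible points
        return "Tier 3"
    return "No designation"
```

Any real implementation would need the tier boundaries vetted by the wider HE community, as noted above.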

2.3. Scoring Library

A brief outline of the PAPPI criteria is below. The full scoring system, which includes points awarded per criterion, is available on a public wiki (DOI: 10.17605/osf.io/wgup2) and in static form in Appendix A. Scores are determined by how many points a publisher earns in the following categories:
  • Public Access (see Table 1 for a full list of points in this category): Highest credit (4 points) is given to publishers that make articles fully and immediately open access in at least 85% of their journals. These journals should also be listed in the Directory of Open Access Journals (DOAJ), and/or the publishers be members of the Open Access Scholarly Publishers Association (OASPA). Next highest credit (3 points) is given to publishers that automatically deposit/archive 100% of their articles to a nonprofit repository owned by institutions of higher education or a government entity, such as the NIH, within six months of publication.
    Justification: Making the work available to the public supports democratization of knowledge, and using government-owned or nonprofit open repositories to do so supports information exchange and sustainability of scholarship.
  • Article Processing Charges: Highest credit is given to publishers that have an average open-access article processing charge (APC) of $999 or less, factoring in all open-access journal titles (4 points). This dollar amount was informed by the University of California’s Pay It Forward project, which found that average APCs were between $1800 and $2000 (depending on data source) and estimated a sustainable APC range for an article published in the STEM fields (which, as Larivière showed, are dominated by the five big publishers) to be $1103 to $2566 [70]. This range does not reflect non-STEM fields, nor does it compare big publishers’ costs to those of smaller publishers. Smaller publishers may have lower overhead, a nonprofit mission, and more publishing subsidies through grants, library support, or other sources. Average APCs over $2500 do not earn any points. In the future, the number of journals with APCs over $2500 may detract from the overall credits earned by a publisher. (If additional fees are possible, such as fees for color images, these are taken into consideration in a separate category.) For publishers not providing open-access/APC options, article publication fees should be considered instead.
    Justification: Many open-access journals do not charge any APCs (see the sample of journal titles from University of California’s eScholarship portfolio). Many such journals are subsidized with support from funders or institutions of HE. Low or zero APCs are often the result of partnerships involving nonprofit funding organizations, publishers, and institutions of HE and support the value of sustainability of scholarship.
  • Copyright: Highest credit is given to publishers that allow authors to retain all rights to their work, or whose articles are openly licensed with a free cultural work license, in 100% of their journals (3 points). Fewer points are given for a Creative Commons license that prohibits derivatives. Embargoes are not considered here as they are included in the “Public Access” category.
    Justification: Retention of rights by authors relates to the sustainability of scholarship (putting authors in control of their work rather than outside, possibly for-profit, entities). Creative Commons licenses support both democratization of access to knowledge and information exchange through the rights they grant to users.
  • Author Use: Highest credit is given to publishers that allow authors to reuse or adapt the content they have authored without written permission if the work is cited and a link to the publisher’s version is provided (3 points). This is separate from the copyright criterion as some publishers give authors these rights even though copyright is transferred to the publisher.
    Justification: Allowing authors to exercise many or all rights related to their work indicates respect for authors as content providers and partners with publishers in scholarship, and relates to sustainability of scholarship.
  • Educational Use: Highest credit is given to publishers that allow reuse of articles for noncommercial educational use anywhere without either written permission or a fee if the work is cited and a link to the publisher’s version is provided (3 points). Though several of the Creative Commons licenses embody this right, not all publishers use a Creative Commons license even when their publication agreements match one of the license’s permissions. Again, this is separate from the copyright criterion as some publishers give authors these rights even though copyright is transferred to the publisher.
    Justification: Allowing authors to retain, and institutions to exercise, many or all rights related to a publication honors authors and institutions as partners with publishers in scholarship, is important to lifelong learning, and, similar to “Author Use,” relates to sustainability of scholarship.
  • Business Model: Highest credit (4 points) is given to publishers with missions that parallel and/or mirror the missions of HE institutions, and public institutions especially, and that also provide additional support for students and researchers through their services (e.g., a society providing graduate student members travel scholarships). This support, long existing among learned and professional societies, is core to the publishing organization’s mission, founding, and/or purpose. They exist outside of institutions and epitomize the meaning of “partners” with institutions of higher education.
    Justification: Both democratization of knowledge and sustainability of scholarship are core values of many nonprofit, scholarly organizations; they typically view institutions of HE more as collaborators and partners, less as exploitable funding sources. This category highlights similarity or dissimilarity between organizational missions more than any other category.
  • Discoverability: Publishers that provide open-access metadata and contribute to their journals’ discoverability are favored. Ideally, descriptive as well as administrative (intellectual property) data will be shared. Publishers can earn points through multiple practices/criteria within this category, up to a total of 13 points. Note: A journal being listed in the DOAJ does not count as metadata contributions to DOAJ; journals must add article-level metadata to DOAJ to earn points.
    Justification: This category is vital to information exchange.
  • Business Practices (see Table 1 for a full list of points in this category): The business practices evaluated in the system are limited to those apropos to the values of democratizing access to knowledge, information exchange, and sustainability of scholarship. The practices in this category represent best practices that may come from outside the field of librarianship (e.g., website accessibility standards) and/or from librarians’ licensing/negotiating experiences (e.g., absence of Nondisclosure Agreements). A total of up to 23 points can be earned in this category.
    Justification: Viewing scholars and institutions as partners is key in this category. From accessibility standards to cost transparency to free text mining, this category more than any other measures partnership practices.
  • Publishing Practices: Publishers earn points for providing clear rights statements, waiving APCs for authors and institutions unable to pay, and so on (up to 31 points total). While many of the criteria in this category are similar to COPE’s best practices and membership evaluation terms, PAPPI identifies preferred practices, whereas COPE looks for transparency in practices. Publishers that are not COPE members can still earn points in this category.
    Justification: Sustainability (e.g., preservation), democratization of knowledge (e.g., low submission fees), and information exchange (e.g., use of ORCID identifiers) are all represented in this category.
  • Other Innovations: Criteria in this category may grow, but examples include integration with ORCID, which is valuable for information exchange and proper attribution.
    Justification: ORCID may become a standard in “Discoverability,” but many journals are not yet integrated with it. Other innovations in this category are especially important in the sciences (such as CRediT taxonomy or Registered Reports) but are less relevant in disciplines in which one author on a paper is the norm, or in which methodologies are not scientific in nature. This category currently functions as a placeholder for important but far from universal practices. Community discussions about this category’s criteria and scoring would be necessary before awarding points.

2.4. Samples and Calculated Scores

In order to test the scoring system, the author chose four publishers to evaluate using PAPPI. Publishers were chosen to represent a wide range of publishing organizations. One was Elsevier, a large, for-profit publishing company owned by a large multinational corporation. It is one of the five big publishers Larivière identified and publishes over 2500 journals. A sample size of 30 journals published by Elsevier and Elsevier Cell Press (excluding Elsevier España and Elsevier Masson) was used in the evaluation. The next largest publisher was eScholarship Publishing, a member of the Library Publishing Coalition. eScholarship Publishing is subsidized by the University of California and managed by the California Digital Library. A total of 85 journals are supported on the eScholarship platform. All 85 are either gold or diamond/platinum open-access journals. Most are faculty-led peer-reviewed journals or proceedings, though some are student-led publications. A sample size of 23 eScholarship journals was evaluated.
Two smaller publishers were also included in the evaluation. The Society for Neuroscience is an international learned/professional society based in the United States and a member of the Scientific Society Publisher Alliance. It publishes two peer-reviewed journals, one of which is a hybrid, open-access-optional journal. The other is a fully “gold” open-access journal. Both were evaluated. Finally, Evolutionary Ecology Research was one of the first journals founded after an editorial board resigned in protest of publisher policies and subscription fees when their former journal was transferred to a larger publisher. The publisher is Evolutionary Ecology Limited, an independent company owned by the journal’s editor-in-chief, and it publishes only this one journal, which was evaluated for the test.
To create the sample for the largest publisher in the test, the lists of Elsevier and Elsevier Cell Press titles were screen scraped from Jisc’s SHERPA/RoMEO database and added to an Excel spreadsheet. Then, with each journal title in a separate numbered row, a random number generator determined which journals/rows to include in the sample. If a journal was not in English, or if a journal was no longer published, another title was chosen by moving to the next lower-numbered row in the spreadsheet until a currently published English-language journal was found. English was a requirement so that the author, whose native language is English, could read each journal’s website to gather additional data to score. The sample for eScholarship Publishing was handled similarly. All titles were copied and pasted into a new sheet of the same Excel file mentioned above, and random numbers were used to select journals for evaluation. If a journal was run by students (with the exception of law journals, in which law student leadership is the norm) or if the journal had not published an issue in two or more years, another journal was chosen by moving to the next lower-numbered row.
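The selection procedure just described (a random draw, then a walk to the next lower-numbered row until an eligible title is found) can be sketched as follows. The eligibility predicate here is a hypothetical stand-in for the checks performed by hand (English-language, currently published, not student-run), and wrapping from the first row back to the bottom of the list is an added assumption for completeness:

```python
import random

def select_sample(titles, is_eligible, sample_size, seed=None):
    """Draw sample_size distinct titles; when a draw is ineligible (or
    already chosen), walk to the next lower-numbered row, wrapping to the
    bottom of the list, until an eligible title is found."""
    rng = random.Random(seed)
    chosen = set()  # indices of rows already selected
    while len(chosen) < sample_size:
        start = rng.randrange(len(titles))
        for step in range(len(titles)):
            i = (start - step) % len(titles)  # next lower-numbered row
            if i not in chosen and is_eligible(titles[i]):
                chosen.add(i)
                break
        else:  # every row inspected without finding an eligible title
            raise ValueError("not enough eligible titles to fill the sample")
    return [titles[i] for i in sorted(chosen)]
```

Seeding the generator makes a draw reproducible, which would let others re-create a published sample.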
PAPPI categories that were evaluated at the publisher level included Business Model, Discoverability, and Business Practices, as well as several criteria from the category Publishing Practices, such as preservation. Searching the publishers’ websites sufficed to find most of that information. Occasionally, searching another organization’s site (such as COPE) was required to verify membership or participation. The PAPPI categories of Public Access, APCs, Copyright, Author Use, Educational Use, and several criteria from the category of Publishing Practices were evaluated by journal title. This evaluation involved searching individual journal web pages or sites, and sometimes consulting Jisc’s SHERPA/RoMEO database of journal publishing policies. The APCs of the sample titles were averaged to determine the score. Copyright scores were determined according to PAPPI documentation, which states, “When copyright ownership and licensing policies differ across journal titles, assign the score using the most restrictive permissions.” Screen captures of the publishers’ and journals’ websites that match the content evaluated by the author are available in the Internet Archive’s Wayback Machine.
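Two of the journal-level aggregation rules just described lend themselves to a short sketch: sampled APCs are averaged, while copyright is scored by the most restrictive terms found across titles, i.e., the minimum of the per-journal scores. These helpers are illustrative only, and any concrete point values used with them would be placeholders rather than PAPPI’s actual scale:

```python
def average_apc(apcs):
    """Average the APCs observed across the sampled journal titles."""
    return sum(apcs) / len(apcs)

def copyright_score(per_journal_scores):
    """Per the PAPPI documentation quoted above: when copyright and
    licensing policies differ across titles, assign the publisher the
    score of the most restrictive (lowest-scoring) journal."""
    return min(per_journal_scores)
```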
No scores were tallied for any title or publisher for the criterion “Meets the latest standard of the Web Content Accessibility Guidelines from the World Wide Web Consortium” (under Business Practices) nor for “Tables, graphs, illustrations, etc., published in each article clearly identify creator name(s) and include a clear rights statement” (under Publishing Practices) as these were either outside of the individual author’s knowledge (accessibility) or too time-consuming for one person to tabulate (rights statements). No scores were tabulated for the entire category of Other Innovations as the criteria are still nascent.

3. Results

The learned society publisher and the library publisher earned the most points, 67 and 66 respectively, and fell into the proposed PAPPI Tier 1 designation. Their scores largely resulted from high points for Discoverability, Business Model, and Business Practices, as well as Copyright, Author Use, and Educational Use. The independent, single-title publisher earned 45 points, placing it in PAPPI Tier 2. Discoverability and Publishing Practices hurt its score the most, though it scored well in Business Practices, Copyright, Author Use, and Educational Use. The large, commercial publisher earned the fewest points. A summary of the scores earned is in Table 2. For a more detailed breakdown of points awarded, see the Supplemental File Spreadsheet S1: Journal Data and Publisher Scores.

4. Discussion

A learned society that is a member of the Scientific Society Publisher Alliance and a library publisher that is a member of the Library Publishing Coalition earned the highest scores in the test, making them the publishers with the strongest HE and library partnership practices of the four evaluated. This result is not surprising because PAPPI scores are based on values shared by learned societies and librarians. It is notable that one of the learned society’s two journals is not fully “gold” open-access, but is a hybrid, open-access-optional journal, and the publisher still earned high marks. The publisher earning the third highest score is an independent, single-title operation owned by a faculty member. This mission-driven publisher was already a partner with HE in many ways (see points earned for affordability, copyright agreements, and business practices), but could increase its degree of partnership with further support involving metadata creation. Providing additional institutional resources to this publisher to meet these expectations may be justified and encouraged.
The publisher with the largest portfolio in the test, Elsevier, scored high in good publishing practices but its business practices and proprietary approach to author rights hurt its score overall. It was the publisher acting the least like a partner with HE in this comparison. An institution may be justified in seeking substantially different terms from such a publisher.
The scores were calculated by one individual, so errors are possible. Nonetheless, the scores in Table 2 should approximate what the publisher’s overall score would be because categories 7–9 include a high number of possible points to earn, and these same categories relate to a journal publisher’s overall policies and practices and generally should not differ between titles. There is less possibility for error in calculating these points. For example, all of a single publisher’s journals should have the same discoverability/metadata scores, which would match the publisher’s overall score because metadata practices are typically the same across a publisher’s portfolio.
PAPPI currently awards publishers 1–4 points for Public Access (category 1) and APCs (category 2), 1–3 points for Copyright terms (category 3), and so on. In the categories of Discoverability (category 7), Business and Publishing Practices (categories 8 and 9), and Other Innovations (category 10), points are accumulated for various terms, practices, and technological innovations. For smaller publishers with few staff, such as Evolutionary Ecology Limited, earning points for practices and technological innovations in categories 7 through 10 may prove difficult. If so, should weights be assigned to each category? Should points earned within particular categories be increased or decreased? These and similar questions should be considered before implementing any publisher scoring system.
For librarians, this method of evaluation is not typical practice. It moves vendor evaluations away from cost and access models alone. The profession might consider whether a more holistic evaluation of a publisher that is inclusive of ethics and practices, like this one, is worthwhile. In addition, it is important to consider whether or not this scoring system, as it stands, accurately reflects library and HE values. What is missing? What could be added? What other scoring systems are needed? Is a scoring system the best method for meeting the goals below, or is there something better? Additionally, how can libraries explain and share professional values and partnership practices in other ways that are meaningful outside the profession?

4.1. Goals

One of the goals in introducing a publisher evaluation system focused on partnership practices is to increase awareness of the scope of what libraries identify as publishers’ “more acceptable” (positive) practices and, conversely, their “less acceptable” (negative) practices. Because professional librarians are not typically prolific authors, they are members of HE well placed to advocate for evaluating practices outside of the established author-to-author and author–reviewer–publisher interactions that are the foundation of COPE’s Principles of Transparency and Best Practice in Scholarly Publishing. Partnership evaluations do not replace or substitute for the importance of COPE membership evaluations but add to a comprehensive understanding of a publisher’s practices.
Libraries overall naturally focus on practices related to licensing, costs, and access. These do not have to be discussions of absolutes. In the interest of sustainability, the future of scholarly publishing must be a topic of conversations among publishers, libraries, faculty, and institutional administrators. These conversations may benefit from framing issues in non-absolute terms, which is something PAPPI does. For example, open access may be viewed with skepticism or animosity by societies or small publishers. A scoring system that allows for gradations of access and licenses is more inclusive in identifying publishers with “positive” practices than a binary system, and could open opportunities for further dialogue.
The evaluation of publishers situates librarians’ professional view of themselves as stewards of public funds (or, at private institutions, community funds) within larger institutions that are likewise stewards of public (or community) funds. The PAPPI criteria reflect several professional principles of librarianship beyond cost and access that relate to the rights of the public, such as information exchange (especially findability and preservation). These principles are central values of public HE institutions and many societies. Sharing this professional perspective of stewardship may be helpful in justifying an institution’s exploration of different metrics for evaluating publishers and other vendors, especially metrics that evaluate the presence or absence of partner behaviors.
Libraries and HE already weigh the interests of departments, faculty, students, and their Boards of Trustees in decision-making, but they also have a responsibility to consider the interests of the public and the quality, application, and preservation of the scholarly record beyond the institution. Research integrity and public access are both aspects of these greater institutional responsibilities, in which not only libraries but also institutional research offices play a role. Thus, another goal is to help start and/or continue discussions about what it is that institutions, as stewards of public funds and publicly funded information, want from the parties with whom they enter into fiscal agreements. The threat of enclosure of the scholarly commons spurred by oligopolistic conditions is significant not just in publishing but also in systems for collecting and analyzing student data, research data, and institutional data—systems that also preserve, control access to, and develop unique tools to associate such data. For institutions of HE, this threat raises many questions about how to evaluate products and expenses. A system evaluating partnership practices might help answer some of them.

4.2. Audiences

A publisher evaluation system like PAPPI is not intended for exclusive use by any one entity in HE. Although PAPPI evaluates publishers' practices through several values of librarianship, it is not meant for librarians alone.
Faculty who have expressed concerns in any venue about the enclosure of the scholarly commons may be interested in a PAPPI-like evaluation system, including authors who signed the Cost of Knowledge boycott. Faculty serving on the editorial boards of journals that operate independently of the five biggest publishers may also be interested, as such a system may demonstrate ways in which their publications surpass others in the field. Additionally, a scoring system raises questions for any researcher: When choosing a journal in which to publish, are there characteristics to consider other than IF? What might those characteristics be? And should faculty RTP decisions take into account additional factors, such as publisher scores, that relate to the missions of public and land-grant universities?
Additionally, some small publishers and some society publishers may want to identify where they are on the partnership continuum, especially if doing so defines them as library-identified positive players in the publishing realm. This audience may overlap to a degree with faculty, as faculty members may lead a society or its publications as part of their service commitments. If partnership with institutions of HE is a priority for them, a system like PAPPI may help guide some of these publishers’ policy choices.
Institutional administrators wary of entering into million-dollar agreements with a single entity, and/or deliberating how best to steward public funds, may also find value in the evaluation PAPPI provides. Those already spurred to action by BDP cancellations, or interested in identifying vendors and partners that share their mission-driven priorities, may consider modifying PAPPI to evaluate non-publisher institutional agreements. PAPPI scores also raise the question: Do a vendor's acts as a publisher, seen from the perspective of a library as steward and customer, give insight into that vendor's likely behavior toward institutions in the future, regardless of the product being sold? And should that perspective (and/or insight) be considered before spending more institutional funds on a vendor's products? That is, if the publishing arm of a vendor scores low in partnership practices, what does that mean for decisions about buying other products from that vendor? A partnership scoring system would provide another data point for such decisions.
Finally, though a scoring system such as this is not limited to librarians' uses, it may help librarians determine which publishers to prioritize for resource allocation or transformative agreements. Libraries may identify publishers that act as strong partners with institutions of HE but publish only via subscription or hybrid, open-access-optional models; conversations on sustainability with these publishers may benefit both the publishers and libraries. The criteria in such systems may also be relevant when negotiating institutional licensing terms with publishers. Note that PAPPI alone would not suffice to evaluate BDPs. A BDP evaluation may include factors such as cost-per-use, perpetual access to journal backfiles, and so on, but a publisher's partnership score could serve as one component of a BDP evaluation when applicable.

4.3. Benefits of a Scoring System for Partnership Practices

In libraries, advocating for open access to scholarly publishing is a core value of the profession as it relates to equity of access. However, even if all journals were published open-access, other practices and characteristics would still distinguish publishers from one another. These distinctions are important for libraries and institutions of HE to consider, and this system outlines some of them.
Furthermore, if all journals were published open-access, some society publishers still in control of their own publications would be greatly affected, especially those that publish hybrid open-access journals as an important source of funding. Many societies have values that have much more in common with those of public institutions of HE than with large publishers and data analytics companies. Because PAPPI scores give publishers credit for partnership practices, high PAPPI scores may lead libraries and their institutions to consider more fully the benefits of the subscription model for these publishers. Additionally, PAPPI can be useful to libraries in deciding whether to support a new publisher.
PAPPI does not push for “gold” open access exclusively. A publisher’s PAPPI score takes into account gradations of access and rights and avoids absolutes. This aspect may be simultaneously frustrating to those wanting a more singular, access-focused standard for evaluation, and also helpful for those seeking inclusive conversations with a wide range of scholarly publishing organizations.
Finally, for those outside librarianship, PAPPI scores are quantitative, similar to impact factors or certification designations such as LEED-certified buildings, standardizing the analysis and comparison of publishers. They also address publisher practices in terms more comprehensive than BDPs or cost-per-use, which are more library-centric considerations.

4.4. Limitations of a Scoring System for Partnership Practices

PAPPI is rooted in the values of librarianship and public colleges and universities. It may or may not be directly applicable to or useful for other institutions of HE. However, it should be relevant to an academic library housed in any institution of HE.
Perhaps the biggest weakness of this scoring system is its lack of vetting. In its current form, PAPPI should be considered a provisional scoring method. Missing from the criteria is anything about user data and privacy, for example, or the percentage of review articles published, which can inflate impact factors. If this system or any similar system is deemed valuable, libraries and/or institutions will have work to do to reach consensus on the criteria and the points awarded per criterion.
Another significant limitation of this scoring system is that, even once PAPPI scores are calculated, there are no established incentives for faculty to publish with high-scoring publishers. This is a major hurdle for any similar scoring system. There are tensions between disciplinary review, RTP decisions, and public HE mission statements, due in part to the current reliance on limited and flawed metrics in RTP decisions. PAPPI scores could increase those tensions.
Other limitations are the time and effort required to correctly identify and score publishers, and to review scores on a regular basis for accuracy. The biggest publishers have large portfolios with policies that vary not only from field to field but also from title to title. Furthermore, many smaller publishers would need to be identified in order for this to be a comprehensive system. Crowdsourcing is a potential method for identifying publishers and calculating scores, but any effort, crowdsourced or not, requires coordination and leadership. No such activities have begun.
Additionally, this system was designed to apply to journal publishers (not monograph publishers). It does not fully evaluate larger parent companies that own publishers, nor vendors without a publishing arm whom a library or other campus unit may be considering paying. Neither does PAPPI evaluate other tools a publishing company may offer (researcher profiles or institutional grant funding comparisons, for example). However, aspects of this scoring system may be used in or adapted to developing a scoring system for other vendors or products.
Finally, as with other scoring systems and metrics, PAPPI scores are not comprehensive. Just as it is problematic to evaluate a researcher by their H-index alone, or a journal only by its IF, it is also problematic to evaluate a publisher by any single metric.

5. Conclusions

There have been signs of discontent with the state of scholarly publishing for the last 30 years. This coincides with the years in which large publishers began expanding their journal title portfolios by acquiring established journals typically run by nonprofit publishers and/or societies. This discontent is not limited to librarians, but includes many researchers concerned about publishers’ actions, the quality and reproducibility of research, and the sustainability of academic research and publishing. The system presented here, even if never actualized, is intended to help anyone involved in scholarly publishing consider what makes publishers partners with academics and their institutions from the perspectives of library, learned society, and HE values.
This system is similar to evaluations used in other realms to determine sustainability and best practices. In scholarly publishing, COPE and the DOAJ offer seals or memberships to those who meet particular criteria. In agriculture, organic certifications identify food grown and produced according to particular practices. In trade, fair trade certification and B corporation certification identify best practices with regard to environmental impact and human labor. PAPPI is similar to but less binary than most of these systems, making it akin to the LEED tiered certification in architecture.
Most, if not all, of these other certifications or scoring systems are the responsibility of an outside entity, independent of the parties involved. Systems like this require work: updating, maintenance, outreach, verification, and so on. This work is likely outside the realm of libraries alone. One question is whether academic libraries see any value in scoring publishers; the next question is whether institutions of HE see any value in it. If so, whose responsibility is it to implement such a system? Would relevant entities (ALA, Association of Public and Land-grant Universities, Scientific Society Publisher Alliance, and so on) take part? These questions remain to be answered.

Supplementary Materials

The following are available online at https://www.mdpi.com/2304-6775/8/3/39/s1, Spreadsheet S1: Journal Data and Publisher Scores.

Funding

This research was funded with financial support from the University Libraries Faculty Research Incentive Program at the University of Tennessee.

Acknowledgments

I would like to thank Ellen Finnie for encouragement and comments on the scoring system and criteria, as well as Georgie Donovan, Meredith Hale, Susan Payne, Elisabeth Shook, and Amanda Echterling for feedback on the scoring criteria, Rachel W. Gammons and Iman Tahamtan for comments and suggestions on the literature review, Peter Fernandez for early support of this concept, and Linda Phillips for recommending starting points.

Conflicts of Interest

The author declares no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Appendix A

Complete PAPPI Scoring Library
Scores are determined by how many points a publisher earns in the following categories:
  • Public Access
  • Article Processing Charges
  • Copyright
  • Author Use
  • Educational Use
  • Business Model
  • Discoverability
  • Business Practices
  • Publishing Practices
  • Other Innovations
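As a rough illustration, a publisher's total score under the rubric above is simply the sum of the points earned in each of the ten categories. The sketch below is hypothetical (PAPPI does not prescribe an implementation); the category names follow the list above, while the function name and structure are illustrative only:

```python
# Hypothetical sketch (not an official PAPPI implementation) of tallying
# a publisher's total score across the ten categories listed above.

CATEGORIES = [
    "Public Access", "Article Processing Charges", "Copyright",
    "Author Use", "Educational Use", "Business Model",
    "Discoverability", "Business Practices", "Publishing Practices",
    "Other Innovations",
]

def pappi_total(scores):
    """Sum the points a publisher earned across the scoring categories.

    `scores` maps category names to points earned; categories in which
    the publisher earned no points may simply be omitted.
    """
    unknown = set(scores) - set(CATEGORIES)
    if unknown:
        raise ValueError(f"Unrecognized categories: {sorted(unknown)}")
    return sum(scores.get(category, 0) for category in CATEGORIES)

# Example: a publisher earning credit in three categories.
print(pappi_total({"Public Access": 4, "Copyright": 3, "Business Model": 3}))  # 10
```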
(1) Public Access
Credit is given if the publisher favors public access as defined below.
  • 4 = Publisher makes articles fully and immediately available to the public in at least 85% of their journals (and they are DOAJ-listed journals or the publisher is an OASPA member)
  • 3 = Publisher auto-archives 100% of articles in a non-profit repository (owned by government or HE institution) within 6 months of publication
  • 2 = Publisher auto-archives 100% of articles in a non-profit repository (owned by government or HE institution) within 12 months of publication
  • 1 = Author-optional archiving of articles to a non-profit repository (owned by government or HE institution) within 12 months of publication for 100% of articles, using either the post-print/author-accepted manuscript or the publisher's version
(2) Article Processing Charges (APC)
Article processing charges should not vary widely between a publisher’s journal titles in the sciences, social sciences, and humanities. Consistency and reasonableness are the aim. Credit is given based on a publisher’s average APC across all open access titles. (In the future, the number of journals with APCs over $2500 may detract from the overall credits earned by a publisher. If additional fees are possible, such as fees for color charts, these are taken into consideration in a separate category.) These dollar amounts were informed by the University of California’s Pay It Forward project.
  • 4 = Average APC is $0–$999
  • 3 = Average APC is $1000–$1499
  • 2 = Average APC is $1500–$1999
  • 1 = Average APC is $2000–$2500
Note: For publishers not providing OA/APC options, article publication fees should be considered instead, using the same point-to-cost ranges above.
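The APC tiers above amount to a simple mapping from a publisher's average APC to points earned. A minimal sketch, assuming the tier boundaries listed (the function name is hypothetical, not part of PAPPI):

```python
def apc_points(average_apc):
    """Map a publisher's average APC (in USD) to PAPPI credit using the
    tiers listed above; averages above $2,500 earn no credit.
    Hypothetical helper, illustrating the published tiers only.
    """
    if average_apc < 1000:
        return 4
    if average_apc < 1500:
        return 3
    if average_apc < 2000:
        return 2
    if average_apc <= 2500:
        return 1
    return 0

print(apc_points(950))   # 4
print(apc_points(2100))  # 1
```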
(3) Copyright
Copyright agreements should be the same across all journal titles and favor true author ownership or open licenses. Credit is given if 100% of the publisher’s journal titles allow authors to retain all rights to their work, or the articles are openly licensed for community use and reuse. Any exclusive license to the publisher should be for 12 months or less, otherwise the score in this category will be zero. When copyright ownership and licensing policies differ across journal titles, assign the score using the most restrictive permissions.
  • 3 = Author retains all rights
  • 3 = CC BY or CC BY-SA license
  • 2 = CC BY-NC or CC BY-NC-SA license
  • 1 = CC BY-NC-ND with copyright belonging to the author, and/or author retains all rights to the post-print (including the right to share the accepted manuscript anywhere immediately)
(4) Author Use
Credit is given if 100% of a publisher’s journal titles allow authors to reuse the content they have authored without written permission if the work is cited and a link to the publisher’s version is provided. Any exclusive license to the publisher should be for 12 months or less. This is separate from the copyright criterion as some publishers give authors these rights even though copyright is transferred to the publisher.
  • 3 = Authors may reuse work they authored for any purpose, commercial or noncommercial
  • 2 = Authors may reuse work they authored for any noncommercial purpose
  • 1 = Authors have limited explicit permissions, such as explicit permission to reuse full articles in theses/dissertations
(5) Educational Use
Credit is given if 100% of a publisher’s journal titles allow reuse of articles for noncommercial educational use without either written permission or a fee if the work is cited and a link to the publisher’s version is provided. Though several of the Creative Commons licenses embody this right, not all publishers use a Creative Commons license even if their publication agreements match one of the license’s permissions.
  • 3 = Anyone, including authors, may reuse articles for noncommercial educational purposes (12 month embargo or less)
  • 2 = Authors may reuse work they authored for noncommercial educational purposes (No embargo)
  • 1 = Authors may reuse work they authored for noncommercial educational purposes at their institution (No embargo)
(6) Business Model
Publishers with missions that parallel and/or mirror the missions of HE institutions, and public land-grant institutions especially, are favored.
  • 4* = Nonprofit Publisher, not tied to institutions of higher ed (e.g., a scholarly society) providing additional support for students and researchers through its services (e.g., scholarly society providing graduate student members travel scholarships or developing educational resources for K-12 students). This support is core to their mission, founding, and/or purpose. They exist outside of institutions and epitomize the meaning of “partners” with institutions of higher ed.
  • 3 = Nonprofit Publisher, tied to institutions of higher ed (e.g., a university press or library press).
  • 2 = Nonprofit Publisher, other (e.g., not a society, not a university press).
  • 1 = Journal/Publisher Owned and Distributed by Faculty or Academics who determine policies and do not have a non-profit status.
*Note: If Supplemental Materials published for non-scholarly audiences, such as K-12 educational resources, are openly-licensed, the publisher earns an additional point.
(7) Discoverability
Publishers that provide rich open access metadata and contribute to their journals’ discoverability are favored. Note: A publisher may earn points in more than one criterion within this category.
  • 3 = Metadata can be shared following an established schema, such as NLM, RDF, or MODS
  • 3 = Metadata is open access
  • 3 = Articles are published using a permanent identifier, such as a DOI or Handle
  • 2 = Publisher provides rich metadata, improving discoverability and reuse, including descriptive, technical, structural, and administrative (IP) metadata elements
  • 2 = Publisher voluntarily contributes metadata to open initiatives, such as the DOAJ or PMC, even when not funder-required (Note: A journal being listed in the DOAJ does not count as metadata contributions to DOAJ; journals must add article-level metadata to DOAJ to earn points.)
(8) Business Practices
The practices outlined below represent best practices from outside the field of librarianship (e.g., website accessibility standards) and institutional considerations from librarians’ experiences (e.g., Nondisclosure Agreements). Note: A publisher may earn points in more than one criterion within this category.
  • 4 = The publisher’s website and its online publications meet the latest standard of the Web Content Accessibility Guidelines from the World Wide Web Consortium
  • 4 = The publisher does not allow the same editorial board, with the same review process, to lead multiple journals (such as one hybrid, one open access)
  • 4 = The publisher has not politically lobbied against public access policies.
  • 4 = No boards of journals have left en masse due to disagreements over subscription costs or other terms.
  • 4 = The publisher’s agreements with libraries (via journal subscription agreements and/or via the publisher’s licenses of other products) do not include Non-Disclosure Agreements or similar limitations.
  • 3 = The publisher permits text and data mining at no cost for scholarly purposes.
(9) Publishing Practices
Many of the criteria in this category relate to COPE’s best practices and membership evaluation; however, PAPPI identifies preferred practices. Publishers need not be COPE members to earn points in this category. Note: A publisher may earn points in more than one criterion within this category.
  • 5 = Publisher is a member of COPE (or at least 90% of its journals are members)
  • 4 = The publisher waives or reduces APCs to authors and institutions unable to pay the full APC. This applies not only to countries but also to smaller institutions, regardless of country. This applies to publication fees when OA is not an option.
  • 3 = Tables, graphs, illustrations, etc. published in each article clearly identify creator name(s) and include a clear rights statement.
  • 3 = The publisher clearly grants permission for academic reuse of author-created tables/graphs/illustrations without seeking further permission.
  • 3 = The publisher clearly grants permission for reuse of articles as chapters in theses/dissertations by the same author at no cost, provided appropriate citation/notice is given in thesis/dissertation (see COPE recommendations).
  • 3 = The publisher’s journals are preserved by participation in CLOCKSS, LOCKSS, or other established preservation effort.
  • 3 = Charges outside the APC, such as color image charges or page charges, are in line with actual costs; e.g., color image charges should be 10% of the APC or less, capped at $200 (i.e., 10% of a $2,000 APC).
  • 3 = The submission fee is $0-$150 and can be waived for hardship.
  • 2 = Authors, or at least the corresponding author, are identified with ORCIDs.
  • 2 = Authors are allowed to share their pre-prints on a disciplinary pre-print server.
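The color-charge ceiling in the list above (10% of the APC, capped at $200) can be expressed as a simple check. This is a hypothetical helper for illustration, not part of the published criteria:

```python
def color_charge_within_limit(charge, apc):
    """True if a color-image charge is at most 10% of the APC, capped
    at $200 (i.e., 10% of a $2,000 APC). Hypothetical helper.
    """
    return charge <= min(0.10 * apc, 200)

print(color_charge_within_limit(150, 1800))  # True: the limit is $180
print(color_charge_within_limit(250, 3000))  # False: the limit is capped at $200
```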
(10) Other Innovations
Innovations in openness, transparency, usability, community, or preservation are favored. Credit is given if a publisher has at least one journal with any of the innovations marked with one asterisk, or if 85–100% of journal titles include any of the innovations marked with two asterisks. Note: A publisher may earn points in more than one criterion within this category.
  • 3 = Article-level Metrics available*
  • 2 = Functional ORCID integration*
  • 3 = Registered Reports (peer-reviewed methodology from Center for Open Science)**
  • 3 = Publishes any scholarship reviewed as “good scholarship” (Eliminating arbitrary numbers of articles per issue)**
  • 2 = Open peer-review**
  • 2 = Articles provide proper author contributions using CRediT (Contributor Role Taxonomy by CASRAI)**
Note: Many of the criteria in this category are more applicable to journals published in the sciences and social sciences and less applicable to the humanities. This category needs further consideration to accommodate publishers in the humanities.
Notes on Additional Criteria
Consider earning points for:
  • Improving diversity, inclusion, and equity in scholarly publishing (perhaps regularly publishing and/or translating research from outside the Anglosphere)
  • Not charging higher fees for CC licenses (see Danny Kingsley’s 2017 blog post, “Flipping journals or filling pockets? Publisher manipulation of OA policies,” https://unlockingresearch-blog.lib.cam.ac.uk/?p=1726)
  • Protecting privacy and user data
  • Low percentage of review articles published

References

  1. Open Access Directory—Simmons University. Journal Declarations of Independence. Available online: http://oad.simmons.edu/oadwiki/Journal_declarations_of_independence (accessed on 8 January 2020).
  2. Yoon, C.K. Soaring Prices Spur a Revolt in Scientific Publishing. The New York Times, 8 December 1998. Available online: https://www.nytimes.com/1998/12/08/health/soaring-prices-spur-a-revolt-in-scientific-publishing.html (accessed on 27 January 2020).
  3. Mass Resignations at Wiley Journal over Academic Independence. Times Higher Education (THE), 7 February 2020. Available online: https://www.timeshighereducation.com/news/mass-resignations-wiley-journal-over-academic-independence (accessed on 7 February 2020).
  4. Kyrillidou, M.; Young, M. ARL Statistics 2004-2005; Association of Research Libraries: Washington, DC, USA, 2006. [Google Scholar]
  5. Association of Research Libraries. Report of the ARL Serials Prices Project: A Compilation of Reports Examining the Serials Prices Problem; Association of Research Libraries: Washington, DC, USA, 1989; p. 62. [Google Scholar]
  6. Open Access Directory—Simmons University. Timeline. Available online: http://oad.simmons.edu/oadwiki/Timeline (accessed on 8 January 2020).
  7. Basken, P. The U. of California’s Open-Access Promise Hits a Snag: The Faculty. The Chronicle of Higher Education, 7 July 2016. Available online: https://www.chronicle.com/article/The-U-of-California-s/237044 (accessed on 18 May 2020).
  8. Registry of Open Access Repository Mandates and Policies (ROARMAP). Policy Alignment to Horizon 2020. Available online: http://roarmap.eprints.org/dataviz.html (accessed on 31 January 2020).
  9. Gowers, W.T. The Cost of Knowledge. Available online: http://thecostofknowledge.com/ (accessed on 8 January 2020).
  10. Gowers, W.T. Elsevier—My Part in Its Downfall. Gowers’s Weblog, 21 January 2012. Available online: https://gowers.wordpress.com/2012/01/21/elsevier-my-part-in-its-downfall/ (accessed on 18 May 2020).
  11. Yarkoni, A.T. Why I Still Won’t Review for or Publish with Elsevier–and Think You Shouldn’t Either. Citation Needed [Blog]. 12 December 2016. Available online: https://www.talyarkoni.org/blog/2016/12/12/why-i-still-wont-review-for-or-publish-with-elsevier-and-think-you-shouldnt-either/ (accessed on 18 May 2020).
  12. Heyman, T.; Moors, P.; Storms, G. On the Cost of Knowledge: Evaluating the boycott against Elsevier. Front. Res. Metr. Anal. 2016, 1. [Google Scholar] [CrossRef] [Green Version]
  13. Scholarly Publishing and Academic Resources Coalition (SPARC). Big Deal Cancellation Tracking. Available online: https://sparcopen.org/our-work/big-deal-cancellation-tracking/ (accessed on 8 January 2020).
  14. Elsevier. Shop and Discover over 51,000 Books and Journals—Elsevier. Available online: https://www.elsevier.com/catalog?producttype=journals (accessed on 10 February 2020).
  15. Larivière, V.; Haustein, S.; Mongeon, P. The oligopoly of academic publishers in the digital era. PLoS ONE 2015, 10, e0127502. [Google Scholar] [CrossRef] [PubMed]
  16. Fyfe, A.; Coate, K.; Curry, S.; Lawson, S.; Moxham, N.; Røstvik, C.M. Untangling Academic Publishing: A History of the Relationship between Commercial Interests, Academic Prestige and the Circulation of Research. 2017. Zenodo. Available online: https://zenodo.org/record/546100#.XxfylShKhPY (accessed on 18 May 2020).
  17. cOAlition S. Principles and Implementation | Plan S. Available online: https://www.coalition-s.org/addendum-to-the-coalition-s-guidance-on-the-implementation-of-plan-s/principles-and-implementation/ (accessed on 10 February 2020).
  18. Inger, S.; Gardner, T. Scholarly Journals Publishing Practice: Academic Journal Publishers’ Policies and Practices in Online Publishing, Fourth Survey, 2013; Association of Learned & Professional Society Publishers: Hertfordshire, UK, 2013. [Google Scholar]
  19. Open Access Scholarly Publishers Association. OASPA Feedback on Plan S Implementation Guidance. Available online: https://oaspa.org/oaspa-feedback-on-plan-s-implementation-guidance/ (accessed on 10 February 2020).
  20. Royal Society of Chemistry. Read & Publish Scheme. Available online: https://www.rsc.org/journals-books-databases/open-access/read-and-publish/ (accessed on 10 February 2020).
  21. Schlosser, M. Building capacity for academy-owned publishing through the Library Publishing Coalition. Libr. Trends 2018, 67, 359–375. [Google Scholar] [CrossRef]
  22. Transitioning Society Publications to Open Access. Charge. Available online: https://tspoa.org/about/charge/ (accessed on 18 May 2020).
  23. Brembs, B. Scholarship Has Bigger Fish to Fry Than Access. björn.brembs.blog. 14 October 2019. Available online: http://bjoern.brembs.net/2019/10/scholarship-has-bigger-fish-to-fry-than-access/ (accessed on 10 May 2020).
  24. Center for Open Science. Registered Reports. Available online: https://cos.io/rr/ (accessed on 10 February 2020).
  25. Aspesi, C.; Allen, N.S.; Crow, R.; Daugherty, S.; Joseph, H.; McArthur, J.T.; Shockey, N. SPARC Landscape Analysis, 2019. Scholarly Publishing and Academic Resources Coalition. Available online: https://osf.io/preprints/lissa/58yhb/ (accessed on 12 February 2020).
  26. Schonfeld, R. Red Light, Green Light: Aligning the Library to Support Licensing. 2017. Ithaka S+R. Available online: https://sr.ithaka.org/publications/red-light-green-light-licensing/ (accessed on 31 January 2020).
  27. Principles of Transparency and Best Practice in Scholarly Publishing, 2018. Committee on Publication Ethics. Available online: https://publicationethics.org/node/19881 (accessed on 10 February 2020).
  28. Fair Open Access Alliance. The Fair Open Access Principles. Available online: https://www.fairopenaccess.org/the-fair-open-access-principles/ (accessed on 12 February 2020).
  29. McKiernan, E.C.; Schimanski, L.A.; Muñoz Nieves, C.; Matthias, L.; Niles, M.T.; Alperin, J.P. Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations. eLife 2019, 8, e47338. [Google Scholar] [CrossRef] [PubMed]
  30. Vanclay, J.K. Impact factor: Outdated artefact or stepping-stone to journal certification? Scientometrics 2012, 92, 211–238. [Google Scholar] [CrossRef]
  31. Seglen, P.O. Why the impact factor of journals should not be used for evaluating research. BMJ 1997, 314, 497. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  32. PLoS Medicine Editors. The Impact Factor Game. PLOS Med. 2006, 3, e291. [Google Scholar] [CrossRef] [Green Version]
  33. Falagas, M.E.; Alexiou, V.G. The top-ten in journal impact factor manipulation. Arch. Immunol. Ther. Exp. 2008, 56, 223–226. [Google Scholar] [CrossRef] [PubMed]
  34. Fang, F.C.; Steen, R.G.; Casadevall, A. Misconduct accounts for the majority of retracted scientific publications. Proc. Natl. Acad. Sci. USA 2012, 109, 17028–17033. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Garfield, E. Fortnightly Review: How can impact factors be improved? BMJ 1996, 313, 411–413. [Google Scholar] [CrossRef] [PubMed]
  36. Opthof, T. Sense and nonsense about the impact factor. Cardiovasc. Res. 1997, 33, 1–7. [Google Scholar] [CrossRef]
  37. Cole, J.R. A short history of the use of citations as a measure of the impact of scientific and scholarly work. In The Web of Knowledge: A Festschrift in Honor of Eugene Garfield; Cronin, B., Atkins, H.B., Eds.; Information Today: Medford, NJ, USA, 2000; ISBN 978-1-57387-099-3. [Google Scholar]
  38. Young, N.S.; Ioannidis, J.P.A.; Al-Ubaydli, O. Why current publication practices may distort science. PLoS Med. 2008, 5, e201. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  39. San Francisco Declaration on Research Assessment. Available online: https://sfdora.org/read/ (accessed on 15 May 2020).
  40. Finnie, E. What Organic Food Shopping Can Tell us about Transforming the Scholarly Communications System—IO: In The Open. Available online: https://web.archive.org/web/20180711230349/http://intheopen.net/2016/03/what-organic-food-shopping-can-tell-us-about-transforming-the-scholarly-communications-system/ (accessed on 25 February 2020).
  41. Partner, n.1. OED Online. Available online: https://www.oed.com/view/Entry/138316 (accessed on 25 February 2020).
  42. United States. Bureau of the Census Historical Statistics of the United States, Colonial Times to 1970; U.S. Dept. of Commerce; Bureau of the Census; U.S. Govt. Print. Off.: Washington, DC, USA, 1975. Available online: http://archive.org/details/historicalstatis00unit (accessed on 25 February 2020).
  43. Association of Public and Land-grant Universities. The Land-Grant Tradition. Available online: https://www.aplu.org/library/the-land-grant-tradition/file (accessed on 17 February 2020).
  44. 7 U.S. Code § 305—Conditions of Grant. Available online: https://www.law.cornell.edu/uscode/text/7/305 (accessed on 25 February 2020).
  45. Lee, R.; Ahtone, T.; Pearce, M. Land-Grab Universities. 2020. Available online: https://www.landgrabu.org/ (accessed on 19 July 2020).
  46. NCSES Academic Institution Profiles—Rankings by Total R&D Expenditures. NSF—National Science Foundation. Available online: https://ncsesdata.nsf.gov/profiles/site?method=rankingBySource&ds=herd (accessed on 25 February 2020).
  47. Doctorate Recipients from U.S. Universities 2018. NSF—National Science Foundation. Available online: https://ncses.nsf.gov/pubs/nsf20301/data-tables (accessed on 25 February 2020).
  48. Abramson, C.I.; Damron, W.S.; Dicks, M.; Sherwood, P.M.A. History and Mission. In The Modern Land-Grant University; Sternberg, R.J., Ed.; Purdue University Press: West Lafayette, IN, USA, 2014; ISBN 978-1-61249-335-0. [Google Scholar]
  49. Sternberg, R.J. Defining a Great University. Inside Higher Ed. Available online: https://www.insidehighered.com/views/2010/11/29/defining-great-university (accessed on 4 March 2020).
  50. Guri-Rosenblit, S. Access and equity in higher education: Historical and cultural contexts. Access Equity 2010, 9–34. [Google Scholar] [CrossRef]
  51. Nasaw, D. Andrew Carnegie; Penguin: New York, NY, USA, 2006; ISBN 978-0-14-311244-0. [Google Scholar]
  52. Krass, P. Carnegie; John Wiley & Sons: New York, NY, USA, 2002; ISBN 978-0-471-38630-8. [Google Scholar]
  53. Jones, P.A. Libraries, Immigrants, and the American Experience; Greenwood Press: Westport, CT, USA, 1999. [Google Scholar]
  54. McCook, K.P. Rocks in the Whirlpool, 2006. American Library Association. Available online: http://www.ala.org/aboutala/missionhistory/keyactionareas/equityaction/rockswhirlpool (accessed on 31 January 2020).
  55. Code of Ethics of the American Library Association, 2008. American Library Association. Available online: http://www.ala.org/tools/ethics (accessed on 25 February 2020).
  56. Association of Research Libraries. Who We Are. Available online: https://www.arl.org/who-we-are/ (accessed on 4 March 2020).
  57. Association of Research Libraries. Principles and Practices. Available online: https://www.arl.org/principles-and-practices/ (accessed on 4 March 2020).
  58. Cooney-McQuat, S.; Busch, S.; Kahn, D. Open access publishing: A viable solution for society publishers. Learn. Publ. 2010, 23, 101–105. [Google Scholar] [CrossRef]
  59. Meadows, A.J. The growth of journal literature: A historical perspective. In The Web of Knowledge: A Festschrift in Honor of Eugene Garfield; Cronin, B., Atkins, H.B., Eds.; Information Today: Medford, NJ, USA, 2000; ISBN 978-1-57387-099-3. [Google Scholar]
  60. Fyfe, A. Journals, learned societies and money: Philosophical Transactions, ca. 1750–1900. Notes Rec. R. Soc. J. Hist. Sci. 2015, 69, 277–299. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  61. Leslie, D.M. A shifting mosaic of scholarly publishing, scientific delivery, and future impact changing the face of learned societies. J. Mammal. 2007, 88, 275–286. [Google Scholar] [CrossRef] [Green Version]
  62. Fyfe, A.; Moxham, N. Making public ahead of print: Meetings and publications at the Royal Society, 1752–1892. Notes Rec. 2016, 70, 361–379. [Google Scholar] [CrossRef] [PubMed]
  63. Bergstrom, T.C.; Courant, P.N.; McAfee, R.P.; Williams, M.A. Evaluating big deal journal bundles. Proc. Natl. Acad. Sci. USA 2014, 111, 9425–9430. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  64. The Society Publishers’ Coalition. Home page. Available online: https://www.socpc.org (accessed on 27 January 2020).
  65. The Scientific Society Publisher Alliance. The Alliance. Available online: https://byscientistsforscience.org/the-alliance/ (accessed on 27 January 2020).
  66. Brumfiel, G. Director’s salary makes chemists see red. Nature 2004, 430, 957. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  67. Marris, E. Chemistry society goes head to head with NIH in fight over public database. Nature 2005, 435, 718. [Google Scholar] [CrossRef] [PubMed]
  68. U.S. Green Building Council. LEED Rating System. Available online: https://www.usgbc.org/leed (accessed on 13 May 2020).
  69. U.S. Green Building Council. Why LEED Certification. Available online: https://www.usgbc.org/leed/why-leed (accessed on 13 May 2020).
  70. Smith, M.; Anderson, I.; Björk, B.C.; McCabe, M.; Solomon, D.; Tananbaum, G.; Tenopir, C.; Willmott, M. Pay It Forward: Investigating a Sustainable Model of Open Access Article Processing Charges for Large North American Research Institutions. 2016. eScholarship. Available online: https://escholarship.org/uc/item/8326n305 (accessed on 10 February 2020).
Table 1. How points are awarded in two of the ten Publishers Acting as Partners with Public Institutions of Higher Education and Land-grant Universities (PAPPIHELU, or PAPPI) categories. See Appendix A for more.
Public Access: Credit is given if the publisher favors public access as defined below.
4 points | Publisher makes articles fully and immediately available to the public in at least 85% of their journals, and the journals are listed in the Directory of Open Access Journals (DOAJ) or the publisher is an Open Access Scholarly Publishers Association (OASPA) member
3 points | Publisher auto-archives 100% of articles in a nonprofit repository (owned by a government or HE institution) within six months of publication
2 points | Publisher auto-archives 100% of articles in a nonprofit repository (owned by a government or HE institution) within 12 months of publication
1 point | Author-optional archiving of articles in a nonprofit repository (owned by a government or HE institution) within 12 months of publication for 100% of articles, using either the post-print/author-accepted manuscript or the publisher’s version
Business Practices: The practices outlined below represent best practices from outside the field of librarianship (e.g., website accessibility standards) as well as institutional considerations drawn from librarians’ experiences (e.g., removal or absence of Nondisclosure Agreements). Note: A publisher may earn points for more than one criterion within this category.
4 points | The publisher’s website and its online publications meet the latest standard of the Web Content Accessibility Guidelines from the World Wide Web Consortium
4 points | The publisher does not allow the same editorial board, with the same review process, to lead multiple journals (such as one hybrid, one open-access)
4 points | The publisher has not lobbied politically against public access policies
4 points | No boards of the publisher’s journals have left en masse due to disagreements over subscription costs or other terms
4 points | The publisher’s agreements with libraries (via journal subscription agreements and/or via the publisher’s licenses of other products) do not include Nondisclosure Agreements or similar limitations
3 points | The publisher permits text and data mining at no cost for scholarly purposes
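The two categories in Table 1 follow different tallying rules: Public Access reads as a tiered rubric (one award), while Business Practices is explicitly additive. A minimal sketch of that distinction is below; the parameter names, and the assumption that Public Access awards only the single highest qualifying tier, are illustrative and not specified by the author.

```python
# Hypothetical tally for two PAPPI categories from Table 1.
# All field names are assumed for illustration.

def public_access_points(pct_fully_oa, doaj_or_oaspa, archive_months, author_optional):
    """Public Access: assume only the highest qualifying tier is awarded."""
    if pct_fully_oa >= 0.85 and doaj_or_oaspa:
        return 4  # fully and immediately OA, DOAJ-listed or OASPA member
    if archive_months is not None and archive_months <= 6:
        return 3  # auto-archived in a nonprofit repository within 6 months
    if archive_months is not None and archive_months <= 12:
        return 2  # auto-archived within 12 months
    if author_optional:
        return 1  # author-optional archiving within 12 months
    return 0

def business_practices_points(meets_wcag, no_shared_boards, no_lobbying,
                              no_board_departures, no_ndas, free_tdm):
    """Business Practices: criteria are additive per the Table 1 note."""
    total = 0
    total += 4 if meets_wcag else 0           # WCAG-conformant website/publications
    total += 4 if no_shared_boards else 0     # no duplicated editorial boards
    total += 4 if no_lobbying else 0          # no lobbying against public access
    total += 4 if no_board_departures else 0  # no en masse board resignations
    total += 4 if no_ndas else 0              # no NDAs in library agreements
    total += 3 if free_tdm else 0             # free text and data mining
    return total
```

Under this reading, the Business Practices category has a maximum of 23 points, while Public Access tops out at 4.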
Table 2. Scores for selected publishers.
Category | Elsevier | eScholarship Publishing (University of California and California Digital Library) | Evolutionary Ecology Limited (Evolutionary Ecology Research) | Society for Neuroscience
1. Public Access | 0 | 4 | 1 | 3
2. APCs | 0 | 4 | 3 | 0
3. Copyright | 0 | 1 | 3 | 3
4. Author Use | 2 | 3 | 3 | 3
5. Educational Use | 0 | 2 | 2 | 3
6. Business Model | 0 | 3 | 1 | 4
7. Discoverability | 6 | 9 | 0 | 11
8. Business Practices | 0 | 19 | 19 | 19
9. Publishing Practices | 23 | 21 | 13 | 21
10. Other Innovations (not tallied for any title) | – | – | – | –
Total Points | 31 | 66 | 45 | 67
PAPPI Designation | Tier 3 | Tier 1 | Tier 2 | Tier 1
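The totals in Table 2 are the simple sums of the nine tallied category scores (category 10 is not tallied for any publisher). A minimal sketch of that arithmetic, using the scores from the table; the tier cutoffs are not given in this excerpt, so only totals are computed:

```python
# Category scores (categories 1-9) per publisher, transcribed from Table 2.
scores = {
    "Elsevier":                        [0, 0, 0, 2, 0, 0, 6, 0, 23],
    "eScholarship Publishing":         [4, 4, 1, 3, 2, 3, 9, 19, 21],
    "Evolutionary Ecology Limited":    [1, 3, 3, 3, 2, 1, 0, 19, 13],
    "Society for Neuroscience":        [3, 0, 3, 3, 3, 4, 11, 19, 21],
}

def pappi_total(category_scores):
    """Total PAPPI points: the sum of the tallied category scores."""
    return sum(category_scores)

totals = {name: pappi_total(s) for name, s in scores.items()}
```

Summing each column reproduces the published totals of 31, 66, 45, and 67 points, respectively.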

Caldwell, R. A Provisional System to Evaluate Journal Publishers Based on Partnership Practices and Values Shared with Academic Institutions and Libraries. Publications 2020, 8, 39. https://doi.org/10.3390/publications8030039

