Publish or Perish? A Content Analysis of Scholarship Criteria in R1 Academic Libraries’ Promotion and Tenure Documentation
This study sought to understand how R1 libraries define scholarship and creative activities and how they address quality of scholarship through a content analysis of promotion and tenure documentation. Peer review is the most common indicator of quality mentioned in the documents, followed by the geographical reach of scholarship and the originality of the research. Other common scholarship criteria included the need to demonstrate sustained scholarship activity, while discussions of open access research were rare. Academic libraries that offer promotion and tenure should evaluate their documentation to ensure it provides clarity to candidates and remains up to date.
Introduction
Although tenure for academic librarians has existed for some time, one component—scholarship—tends to strike more fear into the hearts of librarians than any other. Various studies report that anywhere from 40% to 60% of U.S. academic libraries offer tenure to their faculty members (Duffy & Webb, 2017; Walters, 2016). Although scholarship has not always been part of tenure, recent studies have reported rates anywhere from 85% to 100% for libraries covered by the studies (Ackerman et al., 2018; Damasco & Hodges, 2012; Sassen & Wahl, 2017; Smith & DeVinney, 1984; Walters, 2016). This demonstrates that scholarship is now a common area for any tenure-seeking librarian to address. The Association of College & Research Libraries (ACRL) indicated the important role that scholarship plays in advancing faculty status for librarians, stating, “The function of the librarian as participant in the processes of teaching, research, and service is the essential criterion of faculty status” (ACRL Board of Directors, 2012, p. 2).
Yet, other studies continue to show that librarians struggle with scholarship. While this can be partly due to lack of training, confusion about what is expected can also play a role. The researchers behind this project, who work at an R1 institution and have all gone or are going through the tenure process, were inspired by their own experiences and questions about what counts as scholarship, particularly in terms of what the documentation actually says. This study sought to better understand documentation regarding the promotion and tenure (P&T) process by conducting a content analysis of P&T documents, primarily bylaws, gathered from R1 university libraries that provide tenure or similar procedures.
Literature Review
The literature review will first look at how library and information science scholarship has been evaluated, what has counted, and some problems with the evaluation criteria. It will then summarize what is known about the documentation of P&T for libraries, with a focus on the language used.
Evaluating Scholarship
How librarians are evaluated on their scholarship has often been ambiguous and shifting. Typically, evaluation has focused on the format of disseminated scholarship, with journal articles, books, and book chapters being seen as highly valuable; conference presentations are also commonly mentioned, if given slightly less weight (Leysen & Black, 1998; Sassen & Wahl, 2017). Artistic works, exhibits, and blogs are sometimes seen as legitimate forms of scholarship, but less so (Hendricks, 2010; Novara & Novara, 2017; Sassen & Wahl, 2017). Even within a university system, standards and rules can vary among institutions (Hecker & Smith, 2012).
Peer review is often seen as the primary criterion for rating these and other formats (Ackerman et al., 2018; Best & Kneip, 2010; Bradigan & Mularski, 1996; Sassen & Wahl, 2017; Wirth et al., 2010). Hendricks (2010) found that one of the reasons blogs were not as highly valued as other publication forms was their lack of peer review.
Other criteria that studies identify include the number of authors of a publication, whether a publication has a national or international audience, publishing in library science-focused outlets, the research metrics of a journal, reviews, and awards (Best & Kneip, 2010; Bradigan & Mularski, 1996; Leysen & Black, 1998). Nixon (2017) argued for a tiered rating of journals, saying such a system would assist reviewers in better determining the appropriateness of a journal, especially when reviewers were not familiar with it. Although the article prioritized peer review as the most important criterion, it also suggested criteria such as lists of top journals as identified by library administrators in studies from 1985 and 2005, low journal acceptance rates, and high journal-level metrics.
Evidence also exists that some libraries (although not necessarily a majority) specify how many publications a library faculty member should have when they go up for tenure, ranging from one to five (Ackerman et al., 2018; Shropshire et al., 2015). Others do not use a specific number but instead use “less easily defined qualities such as ‘progress,’ ‘consistency,’ or ‘competence’ in scholarship” (Ackerman et al., 2018, p. 555).
But many have critiqued the ways in which academia tries to evaluate scholarship, noting an overreliance on quantitative metrics. A case study at Oregon State University noted how the library had relied on a journal’s peer review status, as well as metrics such as Journal Impact Factor (JIF) and the Eigenfactor, but these “‘standard metrics’ called upon to evaluate the quality of library publications fall short, except for the use of peer review” (Wirth et al., 2010, p. 516). The San Francisco Declaration on Research Assessment (DORA) attempts to take these concerns into account in its principles, stating that evaluators should avoid putting too much weight on journal-level metrics like JIF and instead focus on assessing the content of a scholarly publication (Declaration on Research Assessment, n.d.). The declaration also asserts that institutions should “be explicit about the criteria used to reach hiring, tenure, and promotion decisions” (n.d., para. 11).
Promotion and Tenure Documentation
ACRL has stated the importance of documentation for promotion and tenure criteria and that this documentation should be developed and approved by library faculty (ACRL Board of Directors, 2012). At the same time, ACRL noted that these documents will likely vary among institutions. One survey of academic librarians found that 80% reported having documentation specific to their library, with the same amount reporting faculty handbooks played a role as well (Connell, 2018).
Bolin argues that the language in promotion and tenure documents is important because “documents encode values and tradition and perpetuate institutional memory” (Bolin, 2014, p. 214). The study found that while such documents broadly cover the entire tenure process, they are also vague. In analyzing their language, Bolin found that the documents show librarians are expected to direct their scholarship toward advancing the field of librarianship, while also demonstrating that librarians have embraced research as part of their work.
However, other studies have pointed out how the lack of P&T documentation or vague language can cause stressful confusion for librarians seeking tenure. One study found that P&T documents often fail to define specific phrases and terms, and do not specify what is needed to obtain tenure or how various criteria are weighted, with the authors saying that they are often “highly subjective” (Lo et al., 2022, p. 85). This can be especially stressful for librarians of color, as broad language can leave interpretation open to racial discrimination (Damasco & Hodges, 2012). The authors argued that P&T documentation is essential for librarians at the start of their tenure clock to ensure they can properly plan their time. They surveyed tenure-track and recently tenured librarians of color and found that most (80%) had been provided with P&T documentation when they were hired. However, about a third did not believe their library policies clearly defined standards of performance, the process of earning tenure, or the evaluation criteria. Another study found that librarians who reported clearly understanding tenure requirements at their institution also reported less job stress than those who did not (Cameron et al., 2021).
One case study at Boise State University details the library’s process of attempting to better clarify their evaluation rules and procedures. Meregaglia et al. noted that
There had been a growing dissatisfaction among tenure-track and tenured library faculty regarding the confusion and lack of information that was available to guide the annual evaluation process. Faculty noted that the absence of written guidelines created confusion and perceived unfair variation in how annual evaluations were conducted. (2021, p. 1)
Despite the desire for better documentation, though, the authors said library faculty could not agree on how much weight peer review should carry, leading the library to sidestep the question of scholarship for a later time.
While some research has sought to better understand how library leaders and specific libraries have addressed how scholarship is evaluated and how librarians themselves understand their institutions’ criteria, there’s a dearth of literature that analyzes what the documentation itself says. This study sought to help fill this gap by answering the following research questions:
- How do R1 academic libraries define scholarship and creative activities, and what counts?
- How do R1 academic libraries address quality of scholarship, including peer review and use of metrics?
Methodology
The researchers worked from a predeveloped list of R1 academic libraries with tenure from a recently published article (Lo et al., 2022). To find each institution’s promotion and tenure documentation, the researchers searched Google for the institution name plus words that would aid in finding the relevant documentation, such as “guidelines,” “tenure,” and “bylaws” (e.g., (site:www.clemson.edu OR site:library.clemson.edu) (tenure OR librarian) (guidelines OR criteria OR document OR policy OR policies)). If documentation was found for both main and branch libraries, only the main library documentation was pulled. If no library-specific documentation was found, the main institution’s documentation was used, but only if it specifically referenced librarians. When documentation could not be found, the researchers contacted libraries directly, targeting contacts in the library’s administration office or contacts the researchers knew professionally. The researchers located documentation for 47 of 51 institutions (see Appendix 1) using this method in December 2023 and January 2024, although they might have missed documentation, especially informal documentation that is not publicly accessible. Some documentation was also more than 10 years old or undated, so it is possible newer documentation exists that was not found using the stated methodology.1
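For readers who wish to adapt the search strategy, the query pattern described above can be assembled programmatically. The following Python sketch is purely illustrative; the function name and parameters are this sketch’s own and are not part of the study’s methodology:

```python
def build_pt_search_query(domains, role_terms, doc_terms):
    """Assemble a Google-style site-restricted query for locating
    promotion and tenure documentation (illustrative only)."""
    site_clause = " OR ".join(f"site:{d}" for d in domains)
    role_clause = " OR ".join(role_terms)
    doc_clause = " OR ".join(doc_terms)
    return f"({site_clause}) ({role_clause}) ({doc_clause})"

# Mirrors the Clemson example given in the text.
query = build_pt_search_query(
    ["www.clemson.edu", "library.clemson.edu"],
    ["tenure", "librarian"],
    ["guidelines", "criteria", "document", "policy", "policies"],
)
```

The same pattern can be repeated for each institution in the study list by swapping in that institution’s domains.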
Five institutions’ documentation was chosen at random for all researchers to review to inductively create an initial codebook informed by the research questions. The initial codebook identified types of scholarship mentioned (e.g., books, journal articles), quality indicators used (e.g., rubrics, metrics), and other common elements related to the understanding of what counts as scholarship. The initial codebook was then tested by all three researchers on three additional institutions’ documentation. The researchers met to discuss the test coding, resolving coding discrepancies; adding, merging, or refining codes; and adjusting code definitions (see Appendix 2). The 47 documents were then divided equally so that two researchers coded each document.2
During initial coding, the researchers discovered that some institutions in the study sample referred to continuing appointment instead of tenure in their documentation. The researchers met to discuss these institutions and decided that if continuing appointment was defined similarly to the American Association of University Professors’ (AAUP) definition of tenure, that is, “indefinite appointment” (2006, para. 1), then those institutions were valuable to include in the study, even if they did not require research specifically, because continuing appointment follows a similar trajectory and produces similar outcomes. The researchers then reviewed the remaining 95 R1 institutions to identify those without tenure but with continuing appointment, using the same Google search strategy; they then reviewed their documentation for the terms of continuing appointment contracts. During this review, it was discovered that the predeveloped list had missed some tenure-granting institutions; the researchers identified 12 additional institutions that met the study parameters. Of the 56 R1 institutions that the researchers confirmed offered tenure for librarians, the study located documents for 48 (86%); of the 13 institutions that met this study’s parameters for continuing appointment, the study located documents for 11 (85%), for a total of 59 documents. One institution that offers tenure also offers continuing appointment.
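As a quick sanity check (not part of the original methodology), the retrieval rates reported above follow from simple arithmetic:

```python
# Verify the document-retrieval rates reported in the text.
tenure_found, tenure_total = 48, 56   # tenure-granting institutions
ca_found, ca_total = 11, 13           # continuing-appointment institutions

tenure_rate = round(100 * tenure_found / tenure_total)  # 86%
ca_rate = round(100 * ca_found / ca_total)              # 85%
total_documents = tenure_found + ca_found               # 59 documents
```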
While each whole document was reviewed, emphasis was placed on the scholarship and creative activities sections due to the study design. Eight of the institutions did not specifically require research and scholarship to obtain tenure (one institution) or continuing appointment (seven institutions). Some of these institutions combined scholarship/creative activity and service under an umbrella category such as “professional development”; in those cases, the whole section was coded to remain consistent. Criteria for full professorship were not included because not all documents contained them and they are not generally applicable to obtaining tenure/continuing appointment. Each document was reviewed by two researchers, who met to resolve coding discrepancies.
Limitations
Researchers were unable to confirm whether 19 R1 universities provided tenure or continuing appointment for their libraries, so the study population might be missing some organizations. Researchers relied first on publicly available P&T documents and stopped once they found them for an institution, without contacting anyone at the library to ask whether additional or more current documents existed. This means the project could have missed documentation that was not publicly available. Also, analyzing documentation provides one view of how an institution or library defines and evaluates scholarship; it does not necessarily reflect how the institution defines and evaluates scholarship in practice.
Results
Quality Indicators
One of the main goals of this project was to assess how, specifically, P&T documentation addressed issues of quality: how did the documents define quality of scholarship, or provide direction for assessing it? From this analysis, several themes emerged.
Peer review is the most common marker of quality across the documents, appearing in 48 (81%) of them. The term peer review is usually used in reference to journal articles but also sometimes to conference presentations and book chapters. One outlier is Baylor University, which allows for peer-reviewed “software, data, or other nonprint media” in tenure and promotion applications. Several institutions ask librarians to explicitly identify which of their publications have been peer reviewed.3
While some institutions require peer-reviewed publications, most simply indicate that peer-reviewed products rank higher or count more in tenure and promotion decisions than non-peer-reviewed products. According to the University of New Hampshire’s document, for example, “[p]eer-reviewed scholarship in all cases outweighs non-peer-reviewed scholarship.” Several of the institutions with less stringent research expectations treated peer-reviewed publications as more of a nice-to-have: “Peer-reviewing of the publications provides additional evidence of quality, but its absence does not invalidate the value of the work,” says Baylor University’s document.
Another component was the geographical distribution or reach of a librarian’s scholarship. Just over half (30) of the libraries mentioned geography in some way. Generally, these referenced the dissemination venue (e.g., journal, conference) but could also refer to the candidate’s own reputation (i.e., they had established, or were working on establishing, a national reputation) or the university’s reputation. A large majority of these mentions indicated a desire for national recognition or reputation. More than half also indicated a desire for the international level, although all but one of these mentions indicated a national level was acceptable. Slightly less than half included the regional and state levels, and some included the local level. While most of these mentions of smaller areas also included national or international reputation and recognition, eight of the mentions did specify only the regional or small geographical level was required. Three mentions included all geographical levels, essentially rendering geography moot as an indicator of quality. Five mentions said geography mattered but did not indicate at what level.
Half (28) of the institutions included originality as another marker of quality in these documents. In the context of scholarship, originality often refers to how unique or groundbreaking a librarian’s research is. More specifically, original research fills a gap in the literature, signifies an entirely new line of inquiry, or showcases new approaches or new interpretations. The University of Nevada, Las Vegas’ bylaws, for instance, ask evaluators to consider whether a librarian’s research “is completely new and uncharted territory? Such as a new theoretical approach to a topic, an innovative use of research methodology, the first practical implementation of a theoretical concept, or development of a new technical application.” Likewise, the University of Colorado, Denver’s bylaws state that a “faculty member’s scholarship must provide compelling promise of continued creativity with respect to generating new observations, new concepts, and new interpretations related to the individual’s scholarly endeavors.”
An equal number of institutions (28) also address quantity of scholarship outputs in some way. Of these, 13 set a specific number of research outputs. A minimum requirement of three scholarly items was most mentioned (six institutions), followed by five and four items (four institutions each), and one and two items (two institutions each). In a few instances, a university’s documentation provided contradictory numbers; for example, one document noted that two substantial writings (i.e., a book chapter or journal article) were required but noted a minimum of five journal articles elsewhere. Some institutions gave a range, usually three to five, and some specified the type of work produced, which could be a journal article or a “major work.” Another 15 institutions mentioned that quantity mattered or would be considered but gave no indication of how many works were desired or required. Often this was done by simply including “quantity” in a list of items to be considered; however, some institutions used other language, such as “multiple publications” (University of Arizona) and “a regular pattern of scholarly activity” (Kent State University). Eight institutions noted either that quantity did not matter or that it mattered less, such as Clemson University’s document, which stated, “the quality of this research program will not be determined by the quantity of outputs.”
Twenty-two (37%) institutions mentioned the publication venue. Most of these focused on the prestige or reputation of the journal, the audience or circulation of the journal, and/or the scope of the journal. Language related to prestige varied and included phrases such as “prestige of venue,” “respected academic journals,” “standing of the outlet,” “reputable,” “influential or pioneering,” “major journal in the discipline,” “authority,” and “a venue indicating superlatively high regard from peers.” For the audience of the journal, some documentation instructed librarians to provide information about a journal’s circulation but did not indicate what was acceptable. Other mentions noted the need for a librarian to disseminate their research to an appropriate audience, and one institution noted that the audience would vary based on the needs of the research. Mentions of a journal’s scope were generally vague, such as Oregon State University’s note on the need to communicate research to “external audiences in appropriate outlets.” Less common themes regarding the publication venue were the reputation of the publisher (as opposed to the journal), statements of no preference in terms of publication venue, the impact factor of a journal, and how selective the journal is.
In terms of metrics, 17 (29%) institutions noted citations as a quality measure of research. Most indicated citation counts were important in demonstrating quality; however, two specifically mentioned that the citations needed to be “favorable” or “positive.” Two institutions also noted that citations should accompany awards, perhaps indicating that only the highest recognition of scholarship or engagement should be cited. One institution noted that citations would be used to demonstrate the “degree of dissemination” of the work.
Only 12 (20%) institutions mentioned other metrics of research quality, such as, “descriptions of stories in the media, pageview statistics, h-index, or alternative metrics” (Oregon State), “evidence of influence on the work of others” (Ohio State), download statistics, journal or conference acceptance rates, and even mentions on social media or the news.
The least common indicator—used by just 10 institutions—references the impact of a librarian’s research output as a criterion in a general, unqualified way. Impact as a marker of quality most often appears alongside a request for other metrics, such as the number of articles a librarian has published in peer-reviewed journals, citation counts, or download statistics. The University of Colorado, Denver, for instance, asks librarians to provide “formal or informal measures of impact, such as publisher download statistics for electronic versions of published articles.” Some documents ask for qualitative evidence of impact, such as an argument for how a librarian’s work has contributed to the field or affected various audiences (e.g., the university, the state, or the profession). For example, Oregon State has evaluators consider “how the [librarian’s research] findings have impacted the conversation in the candidate’s field.”
Overall, two institutions—Temple University and the University of Cincinnati—had no descriptions of how quality would be evaluated. Another two, the University of Hawai‘i and the University of Washington, had only one specific criterion listed, while five institutions had two criteria, three had three criteria, and five had four criteria listed. It should be noted that of the above institutions, seven do not specifically require some kind of research or scholarship, but eleven do.
Type of Scholarship
The study also analyzed which specific research outputs the documents mentioned. In addition to references to the most common forms of scholarship within librarianship (namely, journal articles, book chapters, and conference presentations), 54 of the documents included examples of scholarship not covered by another of this study’s codes. The most-mentioned scholarly products are as follows:
- Conference materials: 51 institutions; included papers, posters, or presentations given at conferences.
- Journal articles: 45 institutions.
- Grants: 43 institutions. Main differences concerned what stage of the grant counted for research credit; 14 institutions noted that both funded and unfunded grants counted, while 9 specified that only awarded grants counted.
- Reviews: 39 institutions; 15 of these mentions were of book reviews specifically.
- Creative works: 30 institutions; included art, films, compositions, performances, choreography, recitals, graphic design, and even audiocassettes.
- Books: 28 institutions.
- Book chapters: 27 institutions.
- Awards and honors: 25 institutions; eight specified the award should be related to research; two specified librarianship or service.
- Exhibits: 23 institutions; four institutions included language about virtual exhibits as well as physical.
- Code: 23 institutions; included databases, software development, technology development, computer programs, hardware, etc. In a few cases, this needed to be shared through peer review.
- Serving as journal editor: 22 institutions; a few had qualifiers, such as how it helped expand the applicant’s impact.
- Bibliographies or indexes: 20 institutions.
- Web content, including blogs: 20 institutions.
- Course materials: 19 institutions; included broad language, such as curricula, and more specific, such as textbooks.
Scholarly types listed by fewer than a quarter of the R1 institutions included graduate degrees other than an MLIS; white papers or reports; reference work entries; digital projects (including digital humanities projects); consulting work; abstracts; standards; entrepreneurship (including patents); data; lectures and speeches; translations; non-journal editorial work; essays; and participating in mentor relationships. A small number of institutions also specifically listed works that do not count as scholarship, which included in-house reports, preprints, work created as part of a librarian’s main duties (such as research guides), conference reports/minutes, and moderating a conference panel.
Additional Scholarly Indicators
The researchers also looked for other factors that provided details about what scholarship counted and how it would be evaluated.
Among these, 40 (68%) institutions included the concept of professional scholarship growth in some form or another. For the purposes of this study, the researchers defined professional scholarship growth as developing, fostering, and sustaining a research agenda or distinct line of inquiry during the pre-tenure years (with an understanding that this may take many different forms or need to change over time). The University of Utah, for instance, asks that “[a] librarian’s research/creative activity should reflect a coherent agenda in at least one topic area.” Likewise, Oregon State University’s document states that “[a]ll the pieces of [a librarian’s] scholarly output should form a cohesive picture of the faculty member as a librarian and a researcher.” Many of the institutions also state that librarians should show evidence of “maturity” when it comes to their research. That is, they are not only consistently producing scholarship but also strengthening their research skills and developing expertise in their chosen area of inquiry. The University of New Hampshire, for example, stipulates that “[a] candidate’s activities should show a steady progression or developing focus in expertise and selectivity of dissemination throughout the period before their application for promotion and/or tenure.” The researchers also used the professional growth code in cases where evaluators are asked to assess a librarian’s potential for sustained research beyond the awarding of tenure and promotion. For instance, Ohio State University’s library bylaws state:
It is therefore essential to evaluate and judge the probability that faculty, once tenured, will continue to develop professionally and contribute to University Libraries’ academic vision, mission, and values at a high level for the duration of their time at the university.
Interdisciplinary research is addressed by 37 (63%) of the institutions. In all these instances, interdisciplinary research is painted in a positive light; librarians are generally encouraged to engage in research that is interdisciplinary in nature. Many documents contained caveats, however. Most often, they stipulated that a librarian’s entire body of work cannot lie completely outside of the field of library and information science (LIS). For example, Montana State University’s bylaws state that, “[w]hile Library faculty are encouraged to pursue their interests in research and creative activities wherever they lie, their publication record is expected to include contributions to the field of Library and information science.” Some of the institutions likewise specify that research outside the field of librarianship is allowed or encouraged only if it falls within one of the subjects a librarian supports as part of their primary assignment or has some, even tenuous, connection to their day-to-day responsibilities. At the University of Memphis, for instance, “[s]cholarship and creative activities in non-librarianship fields may be included if they are germane to the faculty member’s professional duties, but they should not outweigh those in librarianship.”
Just over half (30 institutions) included language in their documentation about the weight research carries in P&T decisions relative to other job concentrations. Broadly, 15 institutions indicate that an excellent rating in research is encouraged or required for P&T; eight of those 15 indicated some combination of research excellence with primary-assignment excellence, usually requiring excellence in both or in one or the other. A further nine institutions said all aspects were to be excellent or well balanced to earn tenure, promotion, or continuing appointment. The remaining six were silent on how much the research rating counted in the decision, instead using general language such as, “Each candidate must present evidence of effectiveness in all of the professional domains in which he or she performs” (Utah State University), or “a faculty member is expected to have demonstrated excellence in the areas of expertise applicable to the candidate’s appointed position” (Ohio State University).
Twenty-five institutions referred to the university’s mission or values when discussing research. In many instances, producing scholarship of high quality and boosting the institution’s wider reputation is part of these universities’ missions, and this applies to all faculty, including librarians. Land-grant universities’ P&T documents suggest that faculty research should benefit the state and its people in some, even indirect, way. The University of Arkansas, for example, notes that the research its faculty produces should be “all in service to Arkansas.” A few documents mention the university’s mission and values in the context of definitions of what “counts” as scholarship. Virginia Tech University, for example, states that “[c]reative works that can be tied to the missions of the University Libraries or the university through context provided by the candidate in the dossier” are acceptable scholarly products.
Twenty (34%) institutions addressed collaboration in research. Most acknowledged the collaborative nature of librarianship and the ways in which research extends the collaborations that are so ingrained in the profession. Two institutions even described it the same way, saying, “Much of the advancement of librarianship depends on formal cooperative efforts” (University of Colorado Denver, University of Kansas). This value of collaboration often accompanied mentions of entities, colleges, or groups external to the libraries; however, Kansas State University says that for tenure/promotion, “it is essential that faculty members demonstrate the ability to work cooperatively and collaboratively with other Libraries personnel,” showing that even internal collaboration was valued. Of the 20 mentions of collaboration, 11 stipulate that collaborative and independent scholarship are weighted equally. Only two institutions mention the need to demonstrate independent scholarship apart from collaborative efforts. Finally, one institution mentions collaboration with students specifically, while two make a point that any collaborative effort should include an explanation of the individual’s role in the scholarship.
There were 19 (32%) institutions that expressed an expectation for individuals to explain their role in scholarship in formal documentation for tenure/promotion. The length of the explanation varied widely, from “not to exceed one page” (Montana State University) to “one to two sentence[s]” (Baylor University). Similarly, whether the explanation was to be narrative or quantitative was hard to parse from the language used, with phrases such as “estimate of the extent” (University of Hawai‘i) and “degree of responsibility” (University of Illinois). Two institutions stipulated that the role should be explained only for works in progress or grant applications.
A less common element was whether and how open access (OA) research is handled, with 11 (19%) institutions addressing it in some way. These institutions generally noted that open access work was accepted and often encouraged. For example, the University of Buffalo states that “publishing in open access venues is also encouraged and valued but not required for tenure or promotion at any rank.” A few institutions gave a reason for supporting OA. For example, the University of Nevada, Las Vegas, states that OA allows “wider dissemination and potential impact.” The University of New Hampshire notes that OA would be evaluated similarly to paywalled research, and Virginia Tech University states that it rates OA scholarship more highly.
Another minor area that emerged was the use of a rubric, with nine institutions using some type of rubric. Most rubrics divided scholarship into two or three ranked groups and provided instructions about producing a certain number of works from each group. For example, one library had a group for “major works” and another for “minor works.” Peer review and geographical reach (i.e., national versus regional/state) were often the indicators used to separate items. One institution used a single list of works, ranked in order of preference and the credit given.
Likewise, only eight universities defined their various ranking levels (e.g., excellent, satisfactory) in some way. Sometimes an institution defined only one level, such as what it took to be rated exemplary. Some definitions provided only one or two details, such as Auburn University’s, which stated “exemplary research/creative work and/or awards demonstrate progress towards a national reputation,” while others went into much greater detail. Sustained work showed up in multiple definitions of excellent, as did achieving national recognition. Some definitions included specific types of research products, such as presenting at major conferences.
Other Scholarship Language
Finally, the researchers coded and analyzed other research-related language that did not fit the definitions of the existing codes and was often found in only one or two institutions’ documents. Some language was ultimately neutral, such as one institution providing a range for what percentage of a librarian’s role is assigned to research. However, most of these statements fell under one of two broad categories: prescriptive or permissive/supportive.
Examples of prescriptive language center on placing limits on what is allowed or adding rules: noting that works in progress do not count toward tenure and promotion; requiring research to connect to general duties; requiring solo authorship; calling for proof that an article has been cited or accepted; setting further criteria for each department in a library; requiring that research be conducted ethically; and suggesting that tenure candidates consider spending their own money to attend conferences.
Permissive/supportive statements include those that specifically encourage flexibility or call for support for librarians seeking tenure. Some of the more common language found in this theme includes statements about providing support to the tenure-seeking candidate, such as a mentorship system, rules requiring that tenure documentation be shown to new employees, and a statement supporting the need for a healthy work/life balance. Another subtheme revolved around support for equity, diversity, and inclusion (EDI). For example, one institution specifically asks candidates to note whether they have incorporated EDI into a research project. One-off examples of permissive language include a call to be open to new forms of research; recognition that a research agenda can change with a job change, or that managers might not be able to perform as much research; a note that librarians going up at the same time will not be judged against each other; and encouragement to pursue research one is genuinely interested in.
Discussion
In their formal P&T documents, R1 academic libraries address quality of scholarship, including peer review and use of metrics, in both expected and unexpected ways. The researchers acknowledge that unstated expectations are prevalent in academia, so this study’s analysis might not tell the whole story (Cate et al., 2022). Nevertheless, peer review is the most common marker of research quality used by the libraries in the study sample. However, it is not a universal requirement, even among R1 institutions, and its dominance raises potential questions about how these institutions view emerging types of scholarship and publication venues. For example, what does the preference for peer-reviewed journal articles mean for preprints and overlay journals? Open access appears far less frequently in the documents, which is concerning because many documents still reference quality indicators that privilege paywalled journals (i.e., the publication venue). However, the low number is similar to the findings of a study of institutional P&T documents, which found that most of the small number of documents that did mention OA did so negatively (Alperin et al., 2019). Quantity (i.e., number of articles published pre-tenure) as an evaluative criterion is common as well and helps entrench the “publish or perish” view many faculty (including librarians) have of the entire P&T process. Yet, surprisingly, few documents explicitly mention research metrics. Considering the well-established problems with prevailing measures, like the Journal Impact Factor and the Eigenfactor, perhaps this absence is for the best (Declaration on Research Assessment, n.d.). Of course, these omissions do not necessarily mean that a certain number of publications and “good” metrics aren’t an unwritten expectation, as noted. Overall, the researchers were concerned about the small but real number of institutions whose documentation included few, if any, evaluative criteria. More criteria do not necessarily equal better criteria, but a lack of criteria could leave new librarians struggling to understand how their research will be judged.
In terms of how R1 academic libraries define scholarship—as well as which research output types count—journal articles do appear in most of the documents; however, they were not the most mentioned scholarly product. Instead, conference materials dominate. Considering that librarianship is a practice-based field, this makes some sense. The fact that grants and reviews of various kinds also appear in many of the documents further indicates that librarians’ research supports their day-to-day work. However, vague language abounds, and many of the documents merely list examples of research without much explanation. For example, “web material” as a type of scholarship appears in dozens of the documents but is almost never explicitly defined. Does all web content count toward tenure or promotion, or do the context and venue matter? Some descriptions of scholarship types are so vague and broad that they theoretically could refer to several different creations. Additionally, the institutions’ scholarly product lists seem outdated. For instance, bibliographies appear throughout the study sample, even though, the researchers argue, librarians have been publishing formal bibliographies with much less frequency than other forms of scholarship (Jabeen et al., 2015, p. 445). And, while many of the documents are long, they devote as much space to logistics as to substantive concerns, like criteria and definitions, or more. Others use grandiose or jargon-laden language, which the researchers felt obscured meaning even further. For example, one institution proclaims that “[a]ccepting weakness in any aspect of performance in making a tenure decision is tantamount to deliberately diminishing the department’s ability to perform and to progress academically.” Taken together, these characteristics might make it harder for newer librarians, or for external reviewers asked to evaluate candidates in accordance with the documentation, to make sense of the documents.
Likewise, urging faculty to conduct research aligned with the institution’s mission or values is an interesting, and potentially problematic, finding. Depending on the nature of the mission or values in question, this requirement could hamper academic freedom. However, the documents that link research to the university’s or library’s mission or values use especially vague language to do so. Moreover, none of the documents’ research quality indicators address this. Therefore, the researchers suspect that most of these institutions do not actually evaluate their librarians’ research according to how closely it aligns with stated missions or values. In contrast, support for interdisciplinary research is much more explicitly stated in the documents, with institutions using positive language to describe it. Conducting scholarship of this kind remains uncontroversial for librarians at R1s, so long as it somehow relates to the librarian’s work or to the overall field of library and information science. Given the supportive and service-oriented nature of librarianship, this encouragement of interdisciplinary research is heartening.
The researchers hope libraries use this study’s findings to reevaluate their own documentation. However, they realize libraries are not always fully in control of how their bylaws are written and that they might want to be less prescriptive in some respects to allow for unique research agendas and emerging forms of scholarship. Nevertheless, here are a few points to consider:
- Determine whether the quality indicators that appear in a library’s bylaws or other documentation are actually used to evaluate applications. If not, update the documents so they align with practice. Or, if changing bylaws regularly is too onerous, consider creating a less formal guide for all librarians that outlines local expectations in greater detail, or with more nuance, than the bylaws allow. Such a guide would need to be regularly maintained to ensure it complies with the bylaws.
- If documentation is short on (or missing) evaluative criteria, consider adding them.
- Use clear, precise language to define any concepts and standards that are relied upon for decision making. Don’t assume those new to the field or to an institution understand what is meant by research metrics or “well-regarded” publication venues, or know what kind of web content counts as scholarship.
- Encouraging open access publications and interdisciplinary research is a good idea, given the current scholarly publishing landscape, and is in keeping with best practices.
- Use positive, supportive language to highlight the qualities an institution wants to see in P&T applications.
Conclusion
This study aimed to better understand how R1 academic libraries define and rank scholarship expectations for promotion and tenure. Overall, peer review and conference materials were the most heavily mentioned when discussing quality and type of scholarship, respectively. While similarities existed in the structure and elements of the documents—including lists of scholarship outputs and procedural matters—details on how candidates’ scholarship would be evaluated were slim. Additionally, documents seemed to be outdated, overly formal, and hard to understand, leading the researchers to question both how these institutions were applying the criteria in practice and whether candidates at these institutions really understood how they were being evaluated. Achieving tenure or continuing appointment is a significant accomplishment for any librarian; clarifying the quality of scholarship required would benefit candidates as well as evaluators.
Scholarship expectations were the focus of this study; however, other elements of the process could be evaluated in further research, such as full professor criteria, criteria specific to land-grant institutions, or a deeper dive into how open access scholarship is treated. Additional research could also utilize interviews or focus groups to obtain personal accounts of how scholarship criteria in formal documentation are being applied or understood.
Author Contributions
All authors contributed equally throughout all steps of this research project.
Data Availability Statement
Because of copyright concerns, the actual documentation analyzed in this study will not be shared. However, the appendices, which contain the list of the study population and the codebook, including definitions for each code, can be found at https://doi.org/10.5281/zenodo.18686283.
References
Ackerman, E., Hunter, J., & Wilkinson, Z. T. (2018). The availability and effectiveness of research supports for early career academic librarians. The Journal of Academic Librarianship, 44(5), 553–568. https://doi.org/10.1016/j.acalib.2018.06.001
ACRL Board of Directors. (2012). Joint statement on faculty status of college and university librarians. College & Research Libraries News, 73(11), 669–670. https://doi.org/10.5860/crln.73.11.8869
Alperin, J. P., Muñoz Nieves, C., Schimanski, L. A., Fischman, G. E., Niles, M. T., & McKiernan, E. C. (2019). How significant are the public dimensions of faculty work in review, promotion and tenure documents? eLife, 8, e42254. https://doi.org/10.7554/eLife.42254
American Association of University Professors. (2006, June 30). Tenure. AAUP. https://www.aaup.org/issues/tenure
Best, R. D., & Kneip, J. (2010). Library school programs and the successful training of academic librarians to meet promotion and tenure requirements in the academy. College & Research Libraries, 71(2). https://doi.org/10.5860/0710097
Bolin, M. K. (2014). The language of academic librarianship: The discourse of promotion and tenure. In Advances in library administration and organization (Vol. 32, pp. 213–264). Emerald Group Publishing. https://doi.org/10.1108/S0732-067120140000032004
Bradigan, P. S., & Mularski, C. A. (1996). Evaluation of academic librarians’ publications for tenure and initial promotion. The Journal of Academic Librarianship, 22(5), 360–365. https://doi.org/10.1016/S0099-1333(96)90085-3
Cameron, L., Pierce, S., & Conroy, J. (2021). Occupational stress measures of tenure-track librarians. Journal of Librarianship and Information Science, 53(4), 551–558. https://doi.org/10.1177/0961000620967736
Cate, L., Ward, L. W. M., & Ford, K. S. (2022). Strategic ambiguity: How pre-tenure faculty negotiate the hidden rules of academia. Innovative Higher Education, 47(5), 795–812. https://doi.org/10.1007/s10755-022-09604-x
Connell, R. S. (2018). Promotion & tenure procedures: A study of U.S. academic libraries. Library Leadership & Management, 32(4), Article 4. https://doi.org/10.5860/llm.v32i4.7296
Damasco, I. T., & Hodges, D. (2012). Tenure and promotion experiences of academic librarians of color. College & Research Libraries, 73(3). https://doi.org/10.5860/crl-244
Declaration on Research Assessment. (n.d.). San Francisco declaration on research assessment. DORA. Retrieved March 22, 2024, from https://sfdora.org/read/
Duffy, M. A., & Webb, P. L. (2017). Do southeastern public universities adhere to the ACRL tenure and promotion standards? Journal of Library Administration, 57(3), 327–345. https://doi.org/10.1080/01930826.2016.1269536
Hecker, P., & Smith, L. (2012). Tenure and promotion: Criteria and procedures used by University of Louisiana System Libraries. Codex: The Journal of the Louisiana Chapter of the ACRL, 2(2), 17–45. https://journal.acrlla.org/index.php/codex/article/view/71
Hendricks, A. (2010). Bloggership, or is publishing a blog scholarship? A survey of academic librarians. Library Hi Tech, 28(3), 470–477. https://doi.org/10.1108/07378831011076701
Leysen, J. M., & Black, W. K. (1998). Peer review in Carnegie research libraries. College & Research Libraries, 59(6), 512–522. https://doi.org/10.5860/crl.59.6.511
Lo, L. S., Coleman, J., & Pankl, L. (2022). Exploring collegiality as an evaluation factor in librarian promotion and tenure documents. Journal of Library Administration, 62(1), 85–100. https://doi.org/10.1080/01930826.2021.2006987
Meregaglia, A., Keyes, K., Vecchione, A., Armstrong, M., & Ruppel, M. (2021). Creating an annual evaluation framework for library faculty. The Journal of Academic Librarianship, 47(5), 102426. https://doi.org/10.1016/j.acalib.2021.102426
Nixon, J. M. (2017). Core journals in library and information science: Developing a methodology for ranking LIS journals. College & Research Libraries, 75(1). https://doi.org/10.5860/crl12-387
Novara, E. A., & Novara, V. J. (2017). Exhibits as scholarship: Strategies for acceptance, documentation, and evaluation in academic libraries. American Archivist, 80(2), 355–372. https://doi.org/10.17723/0360-9081-80.2.355
Sassen, C., & Wahl, D. (2017). Fostering research and publication in academic libraries. College & Research Libraries, 75(4). https://doi.org/10.5860/crl.75.4.458
Shropshire, S., Semenza, J. L., & Kearns, K. (2015). Promotion and tenure: Carnegie reclassification triggers a revision. Library Management, 36(4/5), 340–350. https://doi.org/10.1108/LM-09-2014-0113
Smith, K. F., & DeVinney, G. (1984). Peer review for academic librarians. Journal of Academic Librarianship, 10(2), 87.
Walters, W. H. (2016). Faculty status of librarians at U.S. research universities. The Journal of Academic Librarianship, 42(2), 161–171. https://doi.org/10.1016/j.acalib.2015.11.002
Wirth, A. A., Kelly, M., & Webster, J. (2010). Assessing library scholarship: Experience at a land grant university. College & Research Libraries, 71(6), 510–524. https://doi.org/10.5860/crl-51r1

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.