Librarians and Administrators on Academic Library Impact Research: Characteristics and Perspectives

This study surveyed librarians, researchers, administrators, and others engaged in research on the impact of academic libraries on student success. Sponsored by an ACRL Impact Grant, it specifically sought to expand on the ACRL Academic Library Impact report, which defined strategic directions for library impact research largely from the perspective of high-level administrators. This study addressed that limitation by surveying and interviewing professional librarians who are directly conducting library impact research, asking about their research experience, their attitudes toward impact research, and their responses to the ACRL report. Notable findings include differences between librarians and library administrators in attitudes about the helpfulness of impact research, administrators’ stronger agreement with the ACRL report’s themes, and some pushback against quantitative impact research among librarians, who were more interested in qualitative research and in findings that lead to actionable improvement.


Research on the impact of academic libraries is not new, as noted by Megan Oakleaf in The Value of Academic Libraries, but specifically examining library impact on student success is a more recent and growing theme in this body of work.1 Oakleaf’s report, which suggested an agenda for future research, ultimately led to a report on the state of this research seven years later, the Association of College & Research Libraries (ACRL) report Academic Library Impact: Improving Practice and Essential Areas to Research.2 This report defined and explored recommendations from research in six strategic directions related to academic library impact. These strategic priority areas were developed by examining themes in three data sources: 1) a selected literature review of 535 papers in LIS and higher education; 2) focus group interviews with 14 library administrators; and 3) semistructured individual interviews of 14 provosts. However, as stated in the limitations of the Academic Library Impact (ALI) report, this data collection largely represented the perspective of high-level administrators rather than front-line librarians.

The current research project described in this article addressed this limitation by surveying and interviewing professional librarians who were directly conducting research on the impact of academic libraries on student success. We focused on three research questions:

  • RQ1) What are the characteristics of the population of professional librarians who conduct research on academic library impact?
  • RQ2) What are their beliefs about their own research and about LIS research broadly regarding academic library impact?
  • RQ3) What are the thoughts and perspectives of this group in response to the ALI report?


Background

Research on the impact of academic libraries on outcomes such as student success and retention has steadily increased over the past decade, particularly since the publication of The Value of Academic Libraries, Oakleaf’s foundational report that proposed a research agenda for demonstrating value in academic libraries.3 An impetus behind Oakleaf’s report and this research has been higher education’s accountability movement and its emphasis on outcomes-based metrics. As accountability becomes standard practice and university budgets decline, and amidst some perceptions that libraries are less relevant when so much content is readily available online, academic libraries increasingly feel the pressure to demonstrate their relevance to higher education administrators and specifically to show their impact on students.

This focus on proving academic library value to an external audience runs throughout ACRL’s Academic Library Impact (ALI) report, which was designed both as a review of the current field of library impact research and as a research agenda to guide future work, with suggested effective practices for libraries in the six priority areas identified. These areas are primarily external-facing strategies: communicate the library’s contributions, match library assessment to institutional mission, include library data in institutional data collection, quantify the library’s impact on student success, enhance teaching and learning, and collaborate with educational stakeholders. When coding the literature and interviews to identify these themes, the ALI report authors found that, in articulating library value, the library literature and academic library administrators focused on service and collaboration, library administrators and provosts focused on communication, and provosts alone focused on mission alignment and strategy.4 The authors identified communication as the most common theme overall and emphasized that the other five priority areas support effective communication.

While many of the papers reviewed for the report were written by librarians, their perspective was neither solicited nor represented in the report in the direct way that the library administrators’ and provosts’ were. The present study was driven by curiosity about how the librarian-practitioner perspective might further inform research on academic library impact and how the ALI report might currently inform their work.

Literature Review

In the near decade since Oakleaf’s initial report, literature on the value of academic libraries has increasingly focused on linking library use (defined in many ways) to institutional outcomes, in particular to student success and retention. Oakleaf’s research agenda encouraged the field to rely less on anecdotal data and more on statistical analysis such as correlation studies, and such data-driven studies did increase. In 2016, Ashlynn Kogut reviewed 38 articles published since Oakleaf’s 2010 report and found that every article used quantitative statistical analysis; seven used a mixed methods approach.5 However, Kogut noted critically that many of the studies used very simple statistical methods and that only four provided a theoretical framework, and she concluded that outcome measures beyond student retention, graduation rates, and GPA should be used. Kogut encouraged researchers in the field to expand their methods, not only by using more advanced quantitative techniques but also by including more qualitative designs, grounding the research in theory, and publishing in the field of higher education to expand the readership of academic library value research beyond librarianship.

Adam Murray and Ashley Ireland approached the value of academic libraries by surveying college and university provosts on their opinions about libraries, the types of data they find compelling, and their preferred methods of receiving value messages.6 Overall, respondents indicated that correlational data related to retention and success was highly influential and much preferred to library use or user satisfaction data that was not tied to outcomes. Qualitative evidence was less meaningful for provosts unless it was paired with quantified evidence. The study indicated that tightly linking library services and resources to institutional mission and outcomes is important when communicating with campus administrators, something also noted in the ALI report.

Lise Doucette studied the motivations of researchers examining topics in library assessment, including library impact and value research.7 She classified those who were externally focused on showing libraries’ relevance and value to their broader institutions as “proving the library,” in contrast to researchers internally focused on “improving the library” who were driven by the desire to change something specific, often to improve the user experience. Doucette found that, of the conference papers that she examined, 92 percent focused on “improving” and 46 percent on “proving.” Thirty-three percent contained both themes, driven by the dual desires to improve library services or resources for users and to prove the value of the library. Doucette noted that the papers focused on “improving” were often directly focused on how to benefit library users.

As this study also considers the research experiences of those who research library impact, it is relevant to consider literature that examines the characteristics and preparation of library and information practitioners as researchers. Marie Kennedy and Kristine Brancolini surveyed academic librarians in 2010 and again in 2015 about their research practices, their confidence with different research steps (including planning, data, and reporting), their research methods education, and their experiences with institutional or administrative research support.8 These surveys built on and furthered the results of a similar 2000 survey by Ronald Powell, Lynda Baker, and Joseph Mika.9

Kennedy and Brancolini noted that, in the 2015 survey, confidence across all research steps was an effective predictor of an academic librarian’s likelihood of conducting research.10 Respondents rated their confidence in analyzing data lowest among the research steps, which is perhaps not surprising given that only 53 percent reported having taken at least one course in statistical analysis, 46 percent of whom took the course during their undergraduate education. In a related question, 55 percent of academic librarians rated their MLIS degrees as adequate preparation for consuming research literature, but only 17 percent believed the degree prepared them to conduct original research. Kennedy and Brancolini further noted a statistically significant relationship between having conducted research and having completed another graduate degree in addition to the MLIS. Their 2015 findings show librarians’ decreased confidence in their ability to conduct original research compared to both the 2010 and 2000 surveys.

Rebecca Watson-Boone examined research articles authored by academic librarians to identify the methodologies used by, and characteristics of, librarians as practitioner-researchers.11 Watson-Boone described many librarian practitioner-researchers as driven by the desire to improve their practice, provide evidence for decision making, and solve specific problems. Virginia Wilson built on Watson-Boone’s study, comparing librarian practitioner-researchers to practitioner-researchers in nursing, social work, and education.12 Wilson described these researchers as bridging the cultures of scholar and practitioner. She noted concerns among some in the LIS field about the prevalence of what she called “how we done it good” papers: superficial descriptions of practice without systematic investigation, reflection, or contextualization. In her next steps section, she called for more dissemination of LIS practitioner research and for LIS curricula to incorporate more practitioner research, so that students become familiar with a broader range of research in the field and learn how to integrate research with practical problem-solving.


Methods

The current research project employed mixed methods, combining a survey with follow-up semistructured interviews, to address the research questions: examining the characteristics of librarians who perform impact research, how that research affects libraries, and their responses to the ALI report. The survey, with its ability to reach a larger population, gathered a broad but shallow range of responses and primarily addressed RQ1, while the semistructured interviews gathered in-depth responses and primarily addressed RQ2 and RQ3, since the interviewers could follow up on responses or explore related topics. Surveys and interviews were chosen because our primary aim was direct insight into librarians’ experiences with impact research and their impressions of the ALI report.

The 34 survey questions and 16 interview questions were created for this research project and were not piloted or pretested. However, the topics and response options were primarily adapted from the ALI report itself, particularly from the Research Questions Requiring Further Study sections of each agenda item and from Appendix G: Library Administrator Focus Group Interview Protocol. For example, for the survey question “I believe my research has helped…” we added response options related to funding and accreditation because the agenda item Match Library Assessment to Institution’s Mission posed the following research questions requiring further study: “4. How are budget constraints affecting the support by library administrators and staff of the institution’s mission and specific goals related to student learning and success outcomes?” and “5. How do library administrators and staff support accreditation efforts, and are these efforts recognized by the institution?” Though these questions were perhaps too complex to answer fully through a survey, we wanted a general sense of what the answers might be. See appendix A for the survey instrument and appendix B for the interview questions.

The population was limited to a judgment sample of members of library listservs who indicated they were planning, were currently conducting, or had already conducted research on the impact of academic libraries on student success. Luo described a judgment sample as “a type of non-probability sampling in which the study units are selected on the basis of the researcher’s judgement about which ones will be the most useful or representative.”13 The listservs were chosen for their relevance to library assessment, instruction, and other areas of academic librarianship connected with impact research. While we were primarily interested in the perspectives of librarians, our invitation to take the survey was kept broad to gather a larger pool of respondents, and because the survey included a question asking respondents to self-identify their role, on which we could filter responses during data analysis.

The research project proposal was submitted to the UNLV Office of Research and Economic Development’s Social/Behavioral Institutional Review Board on July 27, 2018, and was determined to be exempt on August 27, 2018. The survey was developed in Qualtrics and was conducted from October 5, 2018, to November 14, 2018, with the survey link distributed through ARL-ASSESS, acrlassessdg@lists.ala.org, assessment@lists.ala.org, ACRL Forum, ILI-L, ULS-L, EBSS-L, RUSA-L, Infolit, and LIRT-L once each.

There were 43 complete surveys and 65 partially completed surveys. A true response rate could not be calculated because the survey was sent to multiple listservs rather than to individuals. The small number of responses compared to similar studies is not surprising given the specificity of the chosen population: the prior studies by Kennedy and Brancolini and by Powell, Baker, and Mika covered the academic librarian population broadly, while this study was limited to academic librarians who reported having conducted original research, specifically in the area of academic library impact.

Those who completed the survey took a median of 17.58 minutes, though the mean completion time was far higher (M = 105.24 minutes, SD = 451.32) because some respondents took well over an hour. The mean far exceeded our original estimate of 30 minutes and likely accounts for the large number of partially completed surveys. The partially completed surveys showed three particular points where respondents stopped: 1) right after the demographic questions; 2) at the questions about confidence in their own research; and 3) at the questions about the ALI report. Forty-three of the 65 partial respondents made it through the demographic questions about themselves and their institutions, but only one reached the ALI report questions toward the end. This drop-off beyond the demographic questions led us to use only complete responses for data analysis.

The last question of the survey asked respondents whether they wished to volunteer for follow-up interviews. Of the 43 complete surveys, 16 (37%) indicated interest in participating. These 16 librarians were contacted once via an email script and invited to sign up for one-hour interview sessions through Google Calendar’s appointment slot function in a calendar set up specifically for this project. Seven of the 16 contacted volunteers (44%) participated in semistructured interviews through the online videoconferencing software Zoom. All interviews except one were attended by both researchers on this project, with one researcher primarily asking the questions and the other primarily taking notes. The interviews were audio-recorded both through Zoom and on an iPhone using the Voice Memos app. The mp3s generated by Zoom were selected for their audio quality and submitted to Rev, a third-party human transcription service. Transcripts were returned as Microsoft Word files and used as the primary data source.

Survey data were downloaded from Qualtrics as a .csv file, then cleaned and transformed using Python 3 and pandas, an open source library for data manipulation and analysis. Further data analysis and visualization were performed in Tableau Desktop. This research project was proposed as an exploratory descriptive research project; as such, there were no plans to conduct comparative statistical analyses or generate correlations, as the study’s purpose was not to formulate hypotheses about differences between librarians and administrators or any other groups. We analyzed the content of the interview transcripts and survey comments qualitatively using analytic memos and values coding.14 (Analytic memos are reflective notes written by the researcher during the coding process to narratively interpret the data.) We coded the interview transcripts using values coding, pulling out statements that indicated a belief, value, or attitude. We then went through the transcripts and survey comments again, coding them more closely for concepts in our study, including confidence, rigor, and meaningfulness. Survey data, interview audio recordings, transcripts, and Tableau files were kept on password-protected servers accessible primarily to the researchers and managed by the UNLV Libraries IT department. Disaggregated data and audio recordings are retained indefinitely on these servers.
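The cleaning and grouping steps described above can be sketched in pandas. This is a minimal illustration, not the authors’ actual script: the column names (`Finished`, `Role`, `DurationMinutes`) and the tiny inline dataset are assumptions standing in for the real Qualtrics export.

```python
import io
import pandas as pd

# A tiny stand-in for the Qualtrics .csv export (hypothetical columns).
raw_csv = """ResponseId,Finished,Role,DurationMinutes
R1,True,Front-line position: Faculty member,17.5
R2,False,Administrator,3.2
R3,True,Manager,22.0
R4,True,Other,41.0
"""
df = pd.read_csv(io.StringIO(raw_csv))

# Keep only complete responses, mirroring the decision to analyze
# completed surveys only.
complete = df[df["Finished"]].copy()

# Collapse self-identified roles into the librarian/administrator
# groups used for comparisons; "Other" maps to NaN and is dropped,
# mirroring its exclusion from the analysis.
role_map = {
    "Front-line position: Professional staff": "Librarian",
    "Front-line position: Faculty member": "Librarian",
    "Administrator": "Administrator",
    "Manager": "Administrator",
}
complete["Group"] = complete["Role"].map(role_map)
grouped = complete.dropna(subset=["Group"])

print(len(complete))                                 # → 3
print(grouped["Group"].value_counts().to_dict())
print(complete["DurationMinutes"].median())          # → 22.0
```

The same kind of cleaned table could then be exported for visualization in Tableau Desktop, as the study describes.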


Results

The results are organized by research question, addressing the topical areas that emerged from the survey and interview data.

RQ1: Characteristics of librarians and administrators who conduct academic library impact research

Demographic and Institutional Characteristics

Overall, respondent demographics were similar to the 2017 ALA Demographic Study with respect to race, ethnicity, and age.15 One potential difference was gender: the current study had a higher proportion of male respondents. Respondents were also highly educated. Thirty-three respondents (77%) held a Master’s in Library and Information Science (MLIS), and 13 respondents (30%) had been involved with academic libraries for 11–20 years. Those who had earned other degrees, whether master’s or PhD, typically listed degrees in areas of the social sciences such as education/higher education and history, though there was a wide variety of fields, including psychology, statistics, business, and philosophy.

Job Titles and Responsibilities

Survey respondents were asked to write their job title and a short description of their job responsibilities. Job titles were coded both by title and by functional area: the majority included the title “Librarian” (52%, n = 24), followed by “Director” (14%, n = 6), “Head” (9%, n = 4), and “Dean” (9%, n = 4), with various other titles making up the remainder. Responses about where respondents sat in their organization were used to categorize them as either librarians (combining “Front-line position: Professional staff” and “Front-line position: Faculty member”; 56%, n = 24) or administrators (combining “Administrator” and “Manager”; 35%, n = 15), while the remaining four categorized themselves as “Other.” The “Other” category was not included in subsequent comparisons. Functional areas included in job titles varied substantially and overlapped little except for “Assessment,” which appeared in 13 of 43 (30%) job titles. Otherwise, respondents served in functional areas such as “Reference,” “Scholarly Communications,” “E-resources,” and “Health Sciences.” Seven (16%) job titles paired “Assessment” with one or more other functional areas, while five (12%) had “Assessment” as their sole functional area.

Job responsibilities were also coded by function. In describing their typical responsibilities, 18 of 43 survey respondents (42%) described instruction and/or teaching-related responsibilities, 16 (37%) assessment-related responsibilities, 15 (35%) managerial responsibilities, 13 (30%) reference-related responsibilities, 9 (21%) discipline or subject liaison responsibilities, 7 (16%) collections-related responsibilities, and 5 (12%) research-related responsibilities. As with job titles, the remaining responsibilities varied substantially; examples included providing copyright support, creating learning objects, marketing, and managing an ILS. There was no particular consensus in how respondents with assessment-related responsibilities described their work in this area, but the most common description was coordinating or managing assessment activities, typically through project-based work. Other descriptions of assessment-related responsibilities included conducting user experience or usability testing, conducting research in general, gathering data and statistics, and consulting with colleagues.

Prior Experiences in Research Preparation

Research is a skill developed at some point in each individual’s education or career. Understanding where this skill was developed, and how well librarians and administrators feel their experiences prepared them to conduct research, is important to understanding their views of their own research and of the field’s research in general. While both librarians and administrators felt prepared by their previous experiences overall, administrators generally felt less prepared than librarians.

Figure 1. Ratings of experiences in preparation for conducting research related to academic library impact. Responses to the question “Please rate how the following experiences have prepared you to conduct research related to academic library impact.”

Of particular note are librarians’ and administrators’ perceptions of their previous library or information science education. More librarians felt their LIS education prepared them to conduct original research (83%, n = 20) than administrators (57%, n = 8). Six of 14 administrators (43%) indicated they felt unprepared to conduct research based on their LIS education, the highest proportion of “Unprepared” ratings across all prior experiences.

Most interviewees had undergraduate degrees in the humanities or social sciences, and they typically characterized these degrees as unrelated to their preparedness to conduct research. All interviewees held the MLIS degree, and though some mentioned being required to take a research methods class for it, they indicated that the course still did not prepare them to conduct research. However, those who spoke of their research methods course believed it helped create an awareness of what academic research is. One administrator explained that their research methods course provided an awareness that, “This is something I will need to do as a librarian…” but that their “…hands-on experience helped [them] better understand” the process of research. A librarian simply stated, “My library school training did not prepare me for this.” Another administrator, who taught research methods at a library school for more than a decade, had similarly negative views of the research preparation of future librarians: “I just felt in my experience in teaching research methods to library school students, they were afraid of statistics and they then weren’t ready to do assessment when they went out into their jobs.”

Three of the administrators interviewed mentioned they had a doctoral degree in education, and one librarian mentioned a second master’s degree, also in education. Each pointed to these experiences as the time when they became prepared to conduct rigorous research, mentioning specific skills gained during their doctoral or master’s degree such as research design and methodology, survey design, and statistical analysis. Another librarian explained:

I’m definitely feeling that I need to grow to get some more education, and I don’t know that I can do a PhD right now… So I wonder, am I good enough?

Many survey comments, and all interviewees, recognized the importance of professional development in being prepared to conduct research, some possibly because of the previously mentioned gaps in their research education and some possibly because of enculturation through their doctoral or second master’s programs. They mentioned activities such as reading the literature, going to conferences, keeping tabs on colleagues’ work through social media, and taking short courses and workshops. The importance of engagement with the library impact or library assessment community was apparent; one interviewee said, “I think just being tapped into those conversations has been influential for me,” while another said, “I go to conferences pretty regularly. I mean, I’m not ashamed to say I steal good ideas.” Professional development was seen as crucial both for maintaining a current understanding of research activities in the field and as a source of inspiration for activities to implement at their own institutions. However, while workshops and short courses were mentioned as ways to improve research skills, they were not seen as the most effective route; as discussed previously, formal education through a second master’s or PhD was prized as the primary way to become prepared to conduct research.

Partnering with Others on Research or Expertise

Those who conduct research typically do not do it by themselves. Whether to share the workload, to supplement their own skills, or simply for the social aspect of it, librarians and administrators partner with others for help. Overall, 28 out of 43 survey respondents (65%) partnered with others for more specialized statistical analysis or help with research. Librarians were less likely to partner with others (Yes: 58%, n = 14; No: 38%, n = 9; Unsure: 4%, n = 1) when compared with administrators (Yes: 73%, n = 11; No: 27%, n = 4). Most partnerships were sought for help with quantitative or statistical analysis or design, though others were geared toward assistance using specific statistical software such as SPSS, survey design, data visualization, data management, and manuscript review. One administrator wrote:

When conducting statistical analysis, my research partner and I typically consult with a statistician on campus to make sure our data is set up correctly and we’re running appropriate tests for our hypotheses.

By partnering with experts, librarians and administrators were able to improve the rigor of their research and bolster their confidence.

The partnerships took many forms: working with faculty members outside the library, statisticians, other campus offices or units such as institutional research, graduate or undergraduate students in disciplines strong in statistics or research methods, and peers in their own libraries. For the most part, the relationships were consultative, with survey respondents seeking advice for short periods of time. Others were true partnerships in which the research was divided and conducted across phases of the project, or in which people were hired for a specific skill set. Interviewees described partnerships as sometimes having a social dimension. An administrator who partnered with a statistician said:

We spent a lot of time just bouncing ideas back and forth when it came to creating my research questions … we spent a lot of time talking about whether something was measurable and how it would be measured.

Working with others brought different perspectives and ideas to their research, but it also provided informal opportunities to discuss their research.

RQ2: Beliefs about their own research and about LIS research

Confidence and Rigor of Own Research

Overall, librarians were less confident than administrators in all aspects of their own research (see figure 2). Along with feeling less confident with their statistical results and their research data management skills, librarians were less confident in the level of rigor of their own research compared to administrators; 14 out of 23 librarians (61%) agreed (combined strongly agree and somewhat agree) they were confident in their rigor in comparison to 15 out of 15 administrators (100%).

Figure 2. Confidence in aspects of research related to academic library impact. Responses to the question “I am confident in the … of my research.”

This theme of confidence and rigor arose during the interviews as well. When asked about confidence in the results of their own research, all but one interviewee affirmed that they were confident. However, there were different interpretations of confidence, and confidence was often placed in contrast with rigor.

A common view of confidence was whether their research results had utility in their local context. A librarian said:

We see this work as demonstrating value and impact in ways that do align with our goals … if someone were to say, “You need to make this the most replicable, representative study and publish it in a peer review [journal],” would I feel confident in that realm? No.

Other interviewees found the applicability of the results to be the primary determinant of whether they were confident in their research. Issues about rigor or transferability did not seem to matter as much.

Another view of confidence was whether their research was externally validated, such as by someone with more expertise or through comparison with other data sources. This view was typically expressed by the administrators interviewed. One explained, “I got quite a bit of positive feedback from quite a few individuals,” while another said, “It’s not a statistical representative sample that we used or anything. But like I said, it triangulates well with other data that the campus has gathered. So I feel fairly confident in what it tells us.” In their view, confidence in their research results was primarily determined by outside factors.

When specifically asked whether they found their own research rigorous, all interviewees felt they could have done more to improve this aspect of their research. Overall, they described rigor as something their research possessed only minimally, a standard they were always striving toward but never reaching.

Rigor took on different dimensions for the interviewees depending on whether they were librarians or administrators. As with confidence, librarians typically felt a tension between the rigor of their research and its value. One librarian said:

It’s such a hard question because sometimes I worry that rigor can be, I don’t know, dismissive, and I do think that a lot of the work is really valuable, like whether or not a colleague, research, went through a validity reliability for their in-house survey process. The survey results that came out and the reflection that impacted the practice, that, to me, seems really valuable. But whether or not you could consider it rigorous, I guess, would be another question.

Another librarian said:

I’m not as worried about the rigor, and more about: Is it helping the community? Is it enabling our partners to trust us and see that we take our work seriously?

Once again, the function of academic library impact research for librarians was to drive practical, action-oriented results.

In contrast, administrators spoke about their research in terms of external standards. One administrator described placing more importance on particular aspects of rigor due to criticism from an audience member at a conference and from their partners on campus. They went on to say, “So I think rigor is really important if our work is going to be meaningful… for our research to be accepted, it must meet the standards of other professions.” Another administrator, who did not think their research was particularly rigorous, said, “Whether it would meet the standards of research, I don’t know. I don’t know how well it would reach the standards of a PhD faculty member doing. I’m not sure about that.” While a couple of administrators were interested in the practical value of research, they still seemed more concerned about the acceptance of their research by others, a concern librarians did not share to the same degree.

Meaningfulness of Own Research

Ratings of confidence in the meaningfulness of their research were more even, with 19 librarians (79%) agreeing compared to 12 administrators (86%) (see figure 2). Librarians and administrators both believed their research helped show the value of the library, but librarians were more likely than administrators to believe their research improved student academic success (see figure 3). On the other hand, administrators were more likely to believe their research helped with accreditation. Belief that their own research showed the value of the library was rated highly by both groups, with 14 out of 21 librarians (67%) and 12 out of 15 administrators (80%) agreeing with the statement. However, while librarians maintained similar rates of agreement for improving student academic success (Agree: 65%, n = 11; Unsure: 18%, n = 3; Disagree: 18%, n = 3), fewer administrators agreed (Agree: 40%, n = 6; Unsure: 20%, n = 3; Disagree: 40%, n = 6).

Figure 3. Beliefs in the helpfulness of the survey respondent’s own academic library impact research. Responses to the question “I believe my research has helped…”

All of the interviewees said their work was meaningful, regardless of their assessments of their own confidence or rigor. Notably, all described the meaningfulness of their research in terms of its practical value, particularly for the student population at their institution. An administrator said:

To me, it’s most meaningful when it is really practical, and that you can do something with the results. So not just doing assessment for assessment’s sake. I think ours have been meaningful, because we really do follow up on the results.

Another administrator said:

I want what we do in the library to have an impact and to have positive force on students in their education career … Whatever we can learn that allows us to have that, have a bigger impact, play a positive role, to me, that’s important.

A librarian echoed these statements, saying:

I personally think that the work that we’re doing is really meaningful as well, because I think that I see the way that it benefits our students.

Overall, the purpose of their research was clearly linked to impacting students at their institution, regardless of their confidence in or the rigor of their own research, and regardless of whether they were an administrator or librarian.

Confidence, Rigor, and Meaningfulness of LIS Research

Librarians and administrators were generally less positive about aspects of LIS research overall than about similar aspects of their own research; when comparing between the two groups, librarians were more uncertain about the helpfulness of LIS research, while administrators disagreed more about its helpfulness (see figure 4). Both librarians (Agree: 43%, n = 10; Unsure: 5%, n = 1; Disagree: 52%, n = 12) and administrators (Agree: 47%, n = 7; Unsure: 13%, n = 2; Disagree: 40%, n = 6) were split about their beliefs in the methodological rigor of LIS research. Beliefs about the meaningfulness of LIS research, though lower than those about their own research, still differed between librarians and administrators, with librarians less likely to find LIS research meaningful than administrators. Librarians and administrators agreed that LIS research showed the value of academic libraries. However, both groups were more uncertain about whether LIS research has improved student academic success and student retention.

Figure 4. Beliefs about LIS research on academic library impact. Responses to the question “I believe that LIS research on academic library impact…”

Interviewees responded that the quality of LIS research was mixed, particularly in terms of rigor. Often their responses were initially positive but were followed by critical comments about rigor. An administrator said:

I think there’s some really good research that’s been done, but I think there’s a lot of stuff that’s not very good … Recently there was an article on library instruction and its effect on GPA. I was like, but, and you’re doing these t-tests and this and that. I had some questions about how this was set up and analysis was done.

Another administrator commented:

I have come across some really fantastic research, but I’ve also come across some really dreadful research just from my own dissertation … I cited them in my dissertation, but they were basically worthless because the methodology was bad.

Their perceptions of impact research in the field were generally more negative than about their own research, but interviewees were also quick to say that the field as a whole was improving how it conducts its research and will continue to improve in the future. An administrator shared:

Well, I think it’s still in its infancy as far as rigor is concerned. I mean, it’s really improved lately because we’ve gotten more people who are really serious about it, and I do think in what we do, so much of it is people becoming more sophisticated.

The improvement mentioned by this administrator could be due to the increasing prioritization of assessment and library impact as strategic directions for the field. Reports such as Oakleaf’s The Value of Academic Libraries and ACRL’s ALI report emphasized the importance of using rigorous statistical methods to demonstrate value, but they also mentioned the importance of professional development to acquire these skills.

RQ3: What are the thoughts and perspectives of this group in response to the ALI report?

How ALI Informed Their Work

The ACRL ALI report was published in 2017, and this project and its focus provided an opportunity to understand the influence of that work on researchers who conduct research related to academic library impact. One survey respondent, a library administrator, commented:

I have never heard of the Academic Library Impact report. I generally do not read things like this since it tends to be more blind propaganda (Libraries are great!) than anything else.

No other survey respondents directly commented about the report in the open-ended questions. However, the interviewees indicated a mix of familiarity with the report, from little (“I think I skimmed it”) to in-depth experience (“Yeah, I found it very useful and use it quite a bit”).

Interviewees also varied in how the report related to their work and/or research. One interviewee, a librarian, said:

It’s helping me frame what are the things that we need to focus on…. My new boss, how do I clue her in on what’s important in libraries; she doesn’t come from a library background. My job is to give her readings and help…. And so that’s where we’re going to start the conversation, it’s through some of those readings.

From this librarian’s perspective, the ALI report was useful as a way of quickly getting their administrator up to speed on current issues and topics involving academic library impact, so that everyone had a common baseline of knowledge for further discussions. In contrast, a library administrator expressed frustration with the report and the research it is influencing:

Yeah, I don’t know that it has [informed my work or research] too much. I think, maybe this doesn’t answer your question, but looking at a lot of this stuff, I think one of my frustrations to you was if you look at a lot of what assessment is in libraries, we’ve kind of picked all the low-hanging fruit…. How do we dig deeper and how do we go in and really answer really more complex questions? So I don’t see a lot of that out there or being reported on or being talked about. It’s a lot of the surface level stuff that a lot of libraries have done or are easy to do, but let’s move on to the more technical stuff. I kind of see us get stalled right here.

This administrator’s general perception of research in the field as too basic or simplistic shaped their view of the ALI report as unhelpful, though it reinforces the previous librarian’s use of the report as a starting place for understanding academic library impact. Another librarian explained how the ALI report can be used as a framework to adhere to:

One of the reasons I want to go back and look at it [the report] again, is trying to think about some of the research I’ve been doing and then how to maybe align it a little better…. I want to study this and look at this particular thing because it’s an institutional concern, but then …how can we think about that in a way that might help tell a larger story in making impact in libraries.

Six ALI Report Priority Areas

The ALI report’s six priority areas form ACRL’s research agenda for student learning and success. Figure 5 is a barbell chart that visualizes survey respondents’ perceptions of the importance of each ALI priority area and their own involvement in it, from very important/involved to not important/involved. The gap between librarians (green dots) and administrators (purple diamonds) is represented by a black line; the distance between the two points suggests how similar or different the groups’ perceptions were. Librarians reported being less involved in all of these areas than administrators, shown by the placement of green dots to the left of purple diamonds in the very important/involved column. In areas that may have involved conversations with campus administrators, such as “Communicate the Library’s Contribution” and “Including Library Data in Institutional Data Collection,” this was unsurprising. It was more surprising in practice-based areas such as “Enhancing Teaching and Learning” or “Quantifying the Library’s Impact on Student Success,” in which librarians would be more likely to participate. However, the data may reflect levels of involvement divided by librarians’ primary functional area: 18 out of 43 survey respondents (42%) reported instructional responsibilities, potentially corresponding to the 29 percent (7 out of 24) that reported being “Very involved” in “Enhancing Teaching and Learning.”

Figure 5. Ratings of the importance and the involvement of survey respondents themselves with the six Academic Library Impact report priority areas.

Librarians ranked the importance of nearly all priority areas lower than administrators, which is also shown by the placement of green dots to the left of the purple diamonds in the very important/involved column. An exception was “Enhance Teaching and Learning,” which 19 librarians (79%) ranked as very important versus 8 administrators (57%) (see figure 5); administrators ranked this area as very important less often than any other area. Administrators consistently ranked both “Communicate the Library’s Contributions” and “Match Library Assessment to Institutions’ Mission” as very important (93%, 13 administrators for both). Librarians also rated “Communicate the Library’s Contributions” as very important more often than any other area (88%, 21 librarians). These high ratings of importance for communication correspond with the ALI report, which ranked communication as the most important priority area overall. In the ALI analysis, communication was the third-most common theme identified in the literature and was the top theme among both library administrators and provosts.16

When asked about communicating library contributions, administrator interviewees mentioned the difficulty of competing with the mass of information already in front of provosts and faculty:

It’s really hard once you take it [ALI report] out and you try to talk to a dean of another college or the provost or the director of this or that, because they have so much going on, so much competing [messaging] it would be difficult for that message to sink in.

Another administrator said, “I think one of the challenges is just, people are so overwhelmed with information that, are they going to read one more report or read one more study? …It’s hard to find new ways and really impactful ways to get to tell your story.” These frustrations highlighted that disseminating their research was difficult simply because information overload leaves little attention to capture. However, there was a note of optimism in their accounts: if there were some way, any way, to garner attention, then the findings of their research would be understood and useful.

In contrast, librarian interviewees focused on internal aspects of communication. One librarian mentioned frustration that their data was not being used for decision making or other internal library purposes:

I thought that the results of that study would inspire people and leadership to do things. Like to make changes to use that data to make an argument for a new building, for example, I don’t see the connections being made internally. But it could be happening. I just, I don’t see it. It’s frustrating.

Another librarian expressed frustration that their library administrators’ traditional view of library services did not align with impact research:

To be frank, I do think there’s maybe some barriers when we are trying to show the value of this [ALI report] to our administrative group. I think that’s the main barrier that I experience. And yet, they’re still giving us funding and supporting the work…. So, I’m not saying they’re not supportive people. But honestly, those folks, I don’t think they fully, fully understand, or fully value going beyond the kind of subject liaison world.

The frustrations experienced by librarians highlighted a more dire situation: that even when their research was heard, others did not care to act on it.


Academic Library Impact Report and the Direction It Is Setting for the Profession

The ALI report and the earlier Value of Academic Libraries report were responses to higher education’s growing accountability culture. The reporting and campus politicking that this culture engenders are likely more a part of daily life for library administrators than for most librarians. Thus, it is perhaps unsurprising that the report’s themes seemed to resonate most with the administrator respondents in this study. Library administrators rated all of the ALI report priority areas, the majority of which are externally focused, as more important, and rated themselves as more involved in them, than did the librarians in this study. The lone exception was “Enhance Teaching and Learning,” the most internal and improvement-focused area, which librarians ranked as more important than did administrators. The interviews provided further evidence of this division. For librarians, the value of impact research was clearly tied to producing practical, action-oriented results more than to rigor or to the ability to quantitatively communicate value to an external audience, which administrative interviewees mentioned. Administrators appeared more externally motivated throughout the interviews, not only in communicating library value to their campus but also in valuing external standards of rigor and in validating their research with external sources (such as comparing data or seeking feedback). These findings corresponded with Doucette’s study, which found that researchers in library assessment (many of whom are librarians) tended to focus on “improving” services more often than “proving” library value and, in particular, to be concerned with directly benefiting their users.17 In contrast, the ALI report seems to be setting a direction for library impact research that is externally focused and largely quantitative in nature.

Some survey respondents, both librarians and administrators, indicated skepticism regarding library impact research in general. One librarian said, “I also believe it is impossible to show the value of a library in any meaningful way.” A library administrator commented:

I believe in my ability to do this work, I just don’t believe in the work itself and how I’m currently practicing it, and how it is being practiced within the library assessment community…. I am very leery of the overall “showing impact of” xx program. We need to resist the businessification of higher ed.

Others mentioned that the number of factors contributing to student success makes it difficult to show library impact; one respondent said “it’s mostly superficial.” An administrator commented:

Library impact is so dependent on factors outside the library’s control that I’m never fully confident that we’ve found that one piece of truly meaningful measurement. It’s good to talk about and keep trying. It’s also smart to be realistic about what “impact” means and the limitations inherent in measuring it.

Both librarians and administrators indicated these misgivings or mixed feelings, which coalesced around the concern that academic library impact may not be measurable at all. It is unclear from these survey comments whether this skepticism relates primarily to the rigor of such research, the feasibility of measuring impact, or the purpose of the research. That is, is the skepticism focused on impact research’s ability to communicate library value effectively to campus administrators, or on its ability to effectively inform the improvement of library practices?

Several participants emphasized the value of qualitative research, which the ALI report minimized but which appears among Kogut’s suggestions for improvement, alongside looking at alternative metrics and connecting to theory.18 A librarian commented in the survey that their “qualitative research background helped prepare me to conduct library impact research, using a mixed qualitative methods approach.” A librarian interviewee articulated the value of qualitative methods: “I want to feel confident that, when we’re representing the impact on students, they are also more involved in that piece of the work.” Related to alternative metrics, one librarian commented that their research involves “supporting the ‘whole student’ and is not tied to student academic success/GPA.” Another respondent echoed Kogut’s call for theory, commenting, “I think we need theories to support how we demonstrate impact and value”; an interviewee said, “I do like the process of asking the question, having a theoretical framework.” These ways of expanding impact research are largely undiscussed in the ALI report.

Some interviewees questioned the common reliance on correlations in impact research. An administrator acknowledged the meaningfulness of impact research while being skeptical of the methods, saying this:

I think it’s all really important work… I think there’s probably been too much emphasis on some of these things like how the library impacts grades and retention. I just think it’s way too hard to draw specific correlations and certainly not causations, that using the library… I think there are just too many factors that go into grades and retention to say the library has a direct impact.

A librarian provided a mixed view: “I think there’s meaning in correlation, I think there’s meaning in showing relationships between things. But you control for variables.” Views on quantitative methods generally were mixed. An administrator, who later in the interview said that quantitative data was useful for communicating with the provost, framed quantitative methods as a way to improve their research rigor: “we were moving from a lot of qualitative data to more quantitative, and so we are working kind of diligently to improve what we do.” The same administrator later tempered this comment by acknowledging the value of qualitative methods:

We can’t totally abandon the qualitative, because we’re helping to develop people’s skills to survive in life in a lot of what we’re doing and there’s this important piece…. Still have the human piece to it, the qualitative piece to it and marry that together to make it more powerful.

Overall, opinions varied about the direction of and the methodologies used in impact research. Many, particularly administrators, see value in performing quantitative studies such as those suggested in the Value and ALI reports. There are concerns about confidence and rigor in research across the field, although individuals expressed greater confidence in their own research. However, a sizable group believes qualitative research is important to the field, and another group is skeptical about impact research overall.

The Purpose and Meaning of Research for Librarians versus Administrators (Improving for Students vs Proving Value for Funding)

As seen throughout the results, librarians tended to describe the purpose of their impact research as improving the student experience, whereas priorities differed for administrators, who more often mentioned the need to demonstrate library value externally. The ALI report examined the administrator view, positioning the purpose of academic library impact research as strategically communicating library value to campus administrators (and, to a lesser extent, faculty). The report’s introduction stated that a related goal for this research was to “investigate how libraries can increase student learning and success”; however, this goal was addressed less directly throughout the majority of the report. One of the six priority areas, “Quantify the Library’s Impact on Student Success,” was related, but, as the title suggests, it emphasized quantitative impact data nearly to the exclusion of qualitative data, making it clear that this area, too, was externally focused. (As seen in the section above, several interviewees mentioned the value of qualitative methods of assessing impact.) This underlying tension in impact research, exhibited in the results of the current survey, echoed Doucette’s observed tension in LIS literature between “proving” and “improving” the library.19

Interviewees’ beliefs about the purpose of their research, whether “proving” value externally or “improving” for students, were often revealed in conversation around the meaningfulness of their research. For some interviewees, the meaningfulness of their research was its ability to inform improvement in the library and in particular to benefit students. Both administrators and librarians commented about the value of seeing specific benefits and changes for students in the library that resulted from their impact research.

Interviewee opinions about the meaningfulness of such research for “proving” value, communicating impact to campus administrators, were mixed. A librarian said, “one study that we did, we shared with the provost and the Provost took note of it…. Does that stay in the back of your mind that libraries play a role or student success will support the library? I’ll never know where the connections are.” This simultaneously communicated the usefulness of putting such research in front of campus administrators and also uncertainty whether this practice was effective in persuading administrators of library value. An administrator articulated the external motivation behind their research: “There was a lot of pressure to demonstrate our impact because of moving to responsibility center budgeting,” mentioning that this evidence specifically needed to be quantitative because “I didn’t feel in my conversations with library deans and provost, I didn’t feel that qualitative captured … They were very data-oriented.”

While the externally focused, “proving” direction that the ALI report set for library impact research clearly resonated with many administrators (and a few librarians) in this study, there was a sizable group of librarians, and some library administrators, who saw the primary purpose of this research as its ability to inform library improvements and thus impact students. While these differing purposes of research are related (improving library services and resources for students may in turn provide measurable evidence of that impact that may be communicated to campus administrators), the methodologies and philosophies of such research often diverged. It is perhaps better to consider the ALI report a useful guide for impact research in a “proving” context rather than a document that should inform the direction of impact research as a whole.

Librarians’ Preparation for and Confidence in Their Research

The experiences that participants described around preparing for and conducting research largely corresponded to earlier studies in the literature, in particular to the findings of Kennedy and Brancolini, strengthening confidence in this area. This study’s finding that librarians were less confident in research stages such as statistical results and data analysis echoed the 2015 study’s finding that librarians rated themselves as least confident in the data analysis stage of research (see figure 2).20 Educational experiences were also similar: this study found that librarians who reported having conducted original research often had additional graduate degrees beyond the MLIS, and Kennedy and Brancolini found an additional graduate degree to be a statistically significant predictor for conducting research.21 Findings related to the MLIS as preparation to conduct research were mixed in this study, with more librarians feeling prepared for research than administrators; in contrast, the 2015 study’s finding was dramatically lower, with only 17 percent of respondents feeling that their MLIS prepared them to conduct research.22 However, as the Kennedy and Brancolini study also included participants who had never conducted original research, it is perhaps not surprising that this response was more negative.

An interesting finding emerged when comparing responses about preparation for research with those about confidence in research. Across all aspects of experiences preparing for research, librarians responded that they felt more prepared for conducting impact research than did administrators (see figure 1). However, librarians rated themselves less confident in all areas of research than did administrators (see figure 2).

Poor preparation for research across the field was a clear concern. For both librarians and administrators, fewer than half of survey respondents agreed that LIS research on library impact was methodologically rigorous. Further, many survey and interview comments mentioned concerns about lack of research preparation either for themselves or in the field broadly. An administrator said:

I think rigor, for our research to be accepted, it must meet the standards of other professions, I think that in libraries, we came late to research other than qualitative because we felt we were a public good… particularly if you’re in the academic arena, that it has to be consumable by serious researchers in other fields to be meaningful, and it has to be compared to other data that’s being collected, not just library data.

To further the field, and to better prepare future academic librarians for rigorous research and future library administrators for communicating library value to campus administrators, LIS programs may need to change their approach to research methods education. Since the literature indicates that only a little more than half of accredited LIS programs require even one research methods course (61% in 2010), making such a course a minimum requirement in all programs, at least for the academic librarian track, would be a good start.23 Other worthwhile suggestions from the literature include adding more in-depth methods courses to LIS programs, allowing students to take similar courses in disciplines such as psychology as electives, and offering a thesis option in more programs.24

In the meantime, respondents indicated they are dealing with rigor in their own research by educating themselves through other professional development opportunities, through informal learning, and by partnering with experienced researchers (often in fields other than LIS). Currently, there appears to be a healthy skepticism around methodological rigor in LIS; perhaps in the longer term this attitude will help provide further impetus behind strengthening research preparation in LIS curricula.


Limitations

The small number of survey respondents and interviewees made it difficult to draw firm conclusions. We allowed the population to self-define their characteristics and whether their work constituted “research.” The length of the survey led many respondents to drop out. The questions were neither pretested nor validated, and two of them were interpreted as confusing by respondents.

Survey respondents also had concerns about the survey design of the current research project, with one administrator saying, “I don’t think my research itself CAUSED student retention to improve, but I know through my research that there is a relationship between student retention and library use. I’m concerned about how this question may be interpreted.” Another administrator critiqued the survey’s measure of research preparedness because it is self-reported.


Conclusion

Research on academic library impact has come a long way in the near-decade since Oakleaf’s Value of Academic Libraries, with many studies published in the area, as evidenced by the literature reviews by Kogut and in the 2017 ALI report.25 Librarians and administrators engaged in this research largely reported finding it meaningful and helpful in specific areas of their work, such as communicating library value and improving services for students. However, despite their perceptions of its value, there were concerns about methodological rigor across research in the field, as well as a few notable concerns about whether library impact on student success can be measured at all, given the library’s indirect role and the multitude of factors at play in student success. Additionally, respondents emphasized expanding the methods used across LIS research, both quantitative and qualitative, and increasing the use of theory. Given that several interviewees voiced concerns that LIS education does not currently provide adequate research preparation, the practices cited in this study, professional development opportunities and partnering with experienced researchers, may continue to be the best path forward for current library professionals who seek to research in this area.

It is worth mentioning that, although the ALI report is largely designed to respond to the motivation of “proving” library value externally, the common librarian motivation of “improving” library services is key to the success of the former motive. That is, demonstrating library impact is only sustainable if work is also being done to maximize that impact through library improvement. It is perhaps most useful to view the ALI report as one direction for academic library impact research, rather than as a broad agenda for all research in this area. Future studies should consider how research that seeks to “prove” library value and research that seeks to “improve” its services could become more tightly linked. It may also be helpful for researchers in library impact to be aware of their distinct motivations, communicate them clearly, and remain mindful of the contributions that each perspective offers.

APPENDIX A. Survey Questions

Please select the educational degree(s) you have achieved.

  • Bachelor’s (1)
  • Master’s, library and information science (LIS) (2)
  • Master’s, other (3) ________________________________________________
  • PhD, LIS (4)
  • PhD, other (5) ________________________________________________
  • Other (6) ________________________________________________
  • Choose not to answer (7)

How long have you been involved with academic libraries?

  • Less than one year (1)
  • 1–3 years (2)
  • 4–6 years (3)
  • 7–10 years (4)
  • 11–20 years (5)
  • More than 20 years (6)
  • Choose not to answer (7)

What is your age?

  • Under 30 (1)
  • 30–39 (2)
  • 40–49 (3)
  • 50–59 (4)
  • 60–69 (5)
  • 70 or older (6)
  • Choose not to answer (7)

What is your racial and/or ethnic identification?

  • Asian (1)
  • Black, non-Hispanic (2)
  • Latino/Latina/Hispanic (3)
  • Mixed race (4)
  • Native American/Alaska Native (5)
  • Pacific Islander (6)
  • White, non-Hispanic (7)
  • Other (8) ________________________________________________
  • Choose not to answer (9)

What is your gender identity?

  • Female (1)
  • Male (2)
  • Transgender (3)
  • Other (4) ________________________________________________
  • Choose not to answer (5)

Please provide your job title.

Please provide a short description of your job’s main responsibilities.

Where do you sit in your organization?

  • Front-line position: Classified staff (1)
  • Front-line position: Professional staff (2)
  • Front-line position: Faculty member (3)
  • Manager (4)
  • Administrator (5)
  • Other (6) ________________________________________________
  • Choose not to answer (7)

What is the faculty status of librarians at your current institution?

  • Professional or classified staff (1)
  • Faculty, with eligibility for tenure (2)
  • Faculty, not eligible for tenure (3)
  • Other (4) ________________________________________________
  • Choose not to answer (5)

What is your institution’s basic Carnegie classification?

  • R1: Doctoral Universities – Highest research activity (1)
  • R2: Doctoral Universities – Higher research activity (2)
  • R3: Doctoral Universities – Moderate research activity (3)
  • M1: Master’s Colleges and Universities – Larger programs (4)
  • M2: Master’s Colleges and Universities – Medium programs (5)
  • M3: Master’s Colleges and Universities – Smaller programs (6)
  • Other (7) ________________________________________________
  • Choose not to answer (8)

Please rate how the following experiences have prepared you to conduct research related to academic library impact.

Very unprepared (1)

Somewhat unprepared (2)

Unsure (3)

Somewhat prepared (4)

Very prepared (5)



Previous library or information science educational experience (1)

Previous other educational experience (2)

Previous work experience (3)

Current work experience (4)

Professional development experience (5)

Other (6)

Please provide any comments on how your experiences have prepared you to conduct research related to academic library impact.

Please rate the following statements about your skills and knowledge and how they have prepared you to conduct research related to academic library impact.

My skills and knowledge in… prepared me to conduct research related to academic library impact.

Strongly disagree (1)

Somewhat disagree (2)

Unsure (3)

Somewhat agree (4)

Strongly agree (5)

N/A (6)

Communication (1)

Data visualization (2)

IRB process (3)

Qualitative data analysis (4)

Quantitative data analysis (5)

Research data management (6)

Research design/methodology (7)

Survey design (8)

Other (9)

Please provide any comments on how your skills and knowledge have prepared you to conduct research related to academic library impact.


What is your primary research methodology when researching academic library impact?

  • Quantitative methods (1)
  • Qualitative methods (2)
  • Mixed methods (3)
  • Varies depending on the specific project (4)
  • Other (5) ________________________________________________

Please rate the following statements based on your own confidence of your research related to academic library impact.

I am confident in the…of my research.

Strongly disagree (1)

Somewhat disagree (2)

Unsure (3)

Somewhat agree (4)

Strongly agree (5)

N/A (6)

Communication (1)

Data analysis (2)

Level of rigor (3)

Meaningfulness (4)

Research data management (5)

Research design/methodology (6)

Statistical results (7)

Survey design (8)

Other (9)

Please rate the following statements based on your research related to academic library impact.

I believe my research has helped…

Strongly disagree (1)

Somewhat disagree (2)

Unsure (3)

Somewhat agree (4)

Strongly agree (5)

N/A (6)

Influence decisions related to funding from administrators (1)

Show the value of the library (2)

Improve student retention (3)

Improve student academic success (4)

Improve faculty research productivity (5)

Accreditation (6)

Other (7)

Please provide any comments about your own confidence of your research related to academic library impact.

Please rate the following statements based on overall LIS research related to academic library impact.

I believe that LIS research on academic library impact…

Strongly disagree (1)

Somewhat disagree (2)

Unsure (3)

Somewhat agree (4)

Strongly agree (5)

N/A (6)

Is methodologically rigorous (1)

Is meaningful (2)

Has broadly influenced decisions related to funding from administrators (3)

Shows the value of academic libraries (4)

Has improved student retention (5)

Has improved student academic success (6)

Has improved faculty research productivity (7)

Has positively impacted accreditation (8)

Other (9)

Please provide any comments about your own confidence in the field of LIS research related to academic library impact.

Did you partner with others for more specialized statistical analysis or help on research?

  • Yes (1)
  • No (2)
  • Unsure (3)
  • Choose not to answer (4)

If yes, please describe the nature of the partnership.

Please estimate for the following question:

What percentage of your total workload consisted of your research related to academic library impact?

[slider]

Please provide any comments about the effect of your research (related to academic library impact) on your workload.

What is your institution’s current undergraduate enrollment?

  • Fewer than 2,500 (1)
  • 2,500–6,000 (2)
  • 6,001–12,000 (3)
  • 12,001–18,000 (4)
  • More than 18,000 (5)
  • Choose not to answer (6)

Is your institution public or private?

  • Public (1)
  • Private (2)
  • Other (3) ________________________________________________
  • Choose not to answer (4)

Please rate the following based on the importance to you and your involvement with the six Academic Library Impact report priority areas (listed below).

Importance to you: Not important (1), Marginally important (2), Somewhat important (3), Very important (4), N/A (5)

Involvement by you: Not involved (1), Marginally involved (2), Somewhat involved (3), Very involved (4), N/A (5)

Communicate the Library’s Contributions (1)

Match Library Assessment to Institution’s Mission (2)

Include Library Data in Institutional Data Collection (3)

Quantify the Library’s Impact on Student Success (4)

Enhance Teaching and Learning (5)

Collaborate with Educational Stakeholders (6)

Please provide any comments on your views on the importance of and your involvement with the six Academic Library Impact report priority areas.

Please rate the following based on the importance to your organization and your organization’s involvement with the six Academic Library Impact report priority areas.

Importance to your organization: Not important (1), Marginally important (2), Somewhat important (3), Very important (4), N/A (5)

Involvement of your organization: Not involved (1), Marginally involved (2), Somewhat involved (3), Very involved (4), N/A (5)

Communicate the Library’s Contributions (1)

Match Library Assessment to Institution’s Mission (2)

Include Library Data in Institutional Data Collection (3)

Quantify the Library’s Impact on Student Success (4)

Enhance Teaching and Learning (5)

Collaborate with Educational Stakeholders (6)

Please provide any comments on your organization’s views on the importance of and your organization’s involvement with the six Academic Library Impact report priority areas.

Which of the following ways have you disseminated your research related to academic library impact?

  • Presentation within your library (1)
  • Presentation limited to your institution (outside the library) (2)
  • Presentation at a LIS conference (3)
  • Presentation at a non-LIS conference (4)
  • Inclusion in an internal library report (5)
  • Inclusion in a report circulated institutionwide (6)
  • Publication in a LIS journal (7)
  • Publication in a non-LIS journal (8)
  • Publication in a book or as a book chapter (9)
  • Social media (such as Twitter or Facebook) (10)
  • Blog post (11)
  • Podcast (12)
  • Video (like YouTube) (13)
  • Slide share (14)
  • Data visualization (for instance, infographic) (15)
  • Press release (16)
  • Other (17) ________________________________________________
  • Choose not to answer (18)

Please provide any comments about how you have disseminated your research related to academic library impact.


Which stakeholders have you targeted when disseminating your research related to academic library impact?

  • Librarians within my library (1)
  • Administrators in my library (2)
  • Staff in my library (3)
  • Administrators at my institution (4)
  • Faculty at my institution (5)
  • Students at my institution (6)
  • Librarians outside my institution (7)
  • Administrators outside my institution (8)
  • Faculty outside my institution (9)
  • Students outside my institution (10)
  • The general public (11)
  • Other (12) ________________________________________________
  • Choose not to answer (13)

Please provide any comments about how you have disseminated your research related to academic library impact.

Please enter your email if you are interested in participating in follow-up interviews.

APPENDIX B. Interview Questions

  1. Please tell us your current institution, your job title, and what responsibilities are involved with your role.
  2. Can you describe your research related to academic library impact?
    1. What was the purpose or motivation for conducting this research?
    2. What were some of your previous work experiences and how did they prepare or not prepare you to conduct research?
    3. What is your educational background and how has it prepared or not prepared you to conduct research?
    4. Beyond your previous work experience and education, how did you prepare to conduct research about academic library impact if at all?
    5. Did you partner with other people or organizations for help on your research?
    6. If so, can you describe the nature of the partnership? If not, why not?
    7. How did your research about academic library impact affect your day-to-day workload?
  3. Were you pleased with how your research turned out (or is currently progressing)?
    1. Why or why not?
  4. Are you confident in the results of your research? Follow-up: Are you confident in the results of research in the field overall?
    1. Do you find your research is rigorous enough? Follow-up: Do you find the field’s research rigorous enough?
    2. Do you find your research meaningful? Follow-up: Do you find the field’s research meaningful?
  5. How have you disseminated your research either formally or informally?
    1. Which audiences have you targeted?
  6. What information do you or your institution use to measure the effectiveness/impact of the library?
    1. In your view, do you think these measures are sufficient and appropriate in measuring success/impact?
    2. How involved is your institution’s academic library with each of the following high-impact practices: first-year seminars and first-year experiences; common intellectual (curricular or co-curricular) experiences; learning communities; writing-intensive courses; collaborative assignments and projects; undergraduate research; diversity and global learning; service learning and community-based learning; internships; capstone courses and projects.
    3. Are there specific library services, resources, or practices that stand out as evidence of involvement with the high-impact practices we just discussed?
  7. Have you read the ACRL Academic Library Impact Report? Follow-up: How has the ACRL Academic Library Impact report informed either your work or your research, if at all?

    Depending on how the interview progresses and the direction it takes after the previous question, any of the following may be addressed:

  8. One of the six priorities of the ACRL Academic Library Impact report is the communication of the library’s contributions.
    1. In your view, are there barriers or challenges to communicating your library’s contributions? Why or why not?
  9. One of the six priorities of the ACRL Academic Library Impact report is matching library assessment to the institution’s missions.
    1. Can you explain or tell us how your academic library/libraries has/have succeeded in supporting the mission and goals of your institution?
  10. One of the six priorities of the ACRL Academic Library Impact report is including library data in institutional data collection.
    1. Can you describe your institution’s data collection strategies? How has your library contributed if at all?
    2. Have there been any discussions about the confidentiality or privacy of this data?
  11. One of the six priorities of the ACRL Academic Library Impact report is quantifying the library’s impact on student success.
    1. What measures do you use to determine student success and how do they relate to your library?
  12. One of the six priorities of the ACRL Academic Library Impact report is enhancing teaching and learning.
    1. How do you measure the impact of library instruction on student learning outcomes?
  13. One of the six priorities of the ACRL Academic Library Impact report is collaborating with education stakeholders.
    1. How do you collaborate with other academic institutions to increase student learning and success?
  14. Where are you going in the future with your research or work in this area?
  15. Based on your knowledge of our project and the topics we have just covered, is there anything I did not ask you that you think I should have asked?
  16. Is there anything else you would like to tell us?


1. Association of College and Research Libraries (ACRL), The Value of Academic Libraries: A Comprehensive Research Review and Report, prepared by Megan Oakleaf (Chicago, IL: ACRL, 2010).

2. Association of College and Research Libraries, Academic Library Impact: Improving Practice and Essential Areas to Research, prepared by Lynn Silipigni Connaway, William Harvey, Vanessa Kitzie, and Stephanie Mikitish of OCLC Research (Chicago, IL: ACRL, 2017).

3. Oakleaf, The Value of Academic Libraries.

4. Connaway, Academic Library Impact, 20–21.

5. Ashlynn Kogut, “Academic Library Services and Undergraduate Academic Success: Trends in Research Literature,” in Proceedings of the 2016 Library Assessment Conference, Arlington, Virginia (Washington, DC: Association of Research Libraries, 2016), 365–79.

6. Adam Murray and Ashley Ireland, “Provosts’ Perceptions of Academic Library Value & Preferences for Communication: A National Study,” College & Research Libraries 79, no. 3 (2018): 336–65, https://doi.org/10.5860/crl.79.3.336.

7. Lise Doucette, “Acknowledging the Political, Economic, and Values-Based Motivators of Assessment Work: An Analysis of Publications on Academic Library Assessment,” in Proceedings of the 2016 Library Assessment Conference, Arlington, Virginia (Washington, DC: Association of Research Libraries, 2016), 288–98.

8. Marie R. Kennedy and Kristine R. Brancolini, “Academic Librarian Research: A Survey of Attitudes, Involvement, and Perceived Capabilities,” College & Research Libraries 73, no. 5 (2012): 431–48, https://doi.org/10.5860/crl-276; Marie R. Kennedy and Kristine R. Brancolini, “Academic Librarian Research: An Update to a Survey of Attitudes, Involvement, and Perceived Capabilities,” College & Research Libraries 79, no. 6 (2018): 822–51, https://doi.org/10.5860/crl.79.6.822.

9. Ronald R. Powell, Lynda M. Baker, and Joseph J. Mika, “Library and Information Science Practitioners and Research,” Library & Information Science Research 24, no. 1 (2002): 49–72, https://doi.org/10.1016/s0740-8188(01)00104-9.

10. Kennedy and Brancolini, “Academic Librarian Research: An Update.”

11. Rebecca Watson-Boone, “Academic Librarians as Practitioner-Researchers,” Journal of Academic Librarianship 26, no. 2 (2000): 85–93, https://doi.org/10.1016/s0099-1333(99)00144-5.

12. Virginia Wilson, “Formalized Curiosity: Reflecting on the Librarian Practitioner-Researcher,” Evidence Based Library and Information Practice 8, no. 1 (2013): 111, https://doi.org/10.18438/b8zk6k.

13. Lili Luo, “Fusing Research into Practice: The Role of Research Methods Education,” Library and Information Science Research 33, no. 3 (2011): 191–201, https://doi.org/10.1016/j.lisr.2010.12.001.

14. Johnny Saldaña, The Coding Manual for Qualitative Researchers, 3rd ed. (Los Angeles, CA: SAGE, 2016).

15. Kathy Rosa and Kelsey Henk, 2017 ALA Demographic Study (Chicago, IL: ALA Office for Research and Statistics, 2017).

16. Connaway, Academic Library Impact, 43.

17. Doucette, “Acknowledging the Political, Economic, and Values-Based Motivators of Assessment Work.”

18. Kogut, “Academic Library Services and Undergraduate Academic Success.”

19. Doucette, “Acknowledging the Political, Economic, and Values-Based Motivators of Assessment Work.”

20. Kennedy and Brancolini, “Academic Librarian Research: An Update,” 826, 832, 834.

21. Kennedy and Brancolini, “Academic Librarian Research: An Update,” 835.

22. Kennedy and Brancolini, “Academic Librarian Research: An Update,” 834.

23. Luo, “Fusing Research into Practice.”

24. Luo, “Fusing Research into Practice”; Kennedy and Brancolini, “Academic Librarian Research: An Update”; Soyeon Park, “Research Methods as a Core Competency,” Journal of Education for Library and Information Science 44, no. 1 (2003): 17, https://doi.org/10.2307/40323939.

25. Oakleaf, The Value of Academic Libraries; Connaway, Academic Library Impact; Kogut, “Academic Library Services and Undergraduate Academic Success.”

* James Cheng is the Library Data Analyst in the University Libraries at the University of Nevada–Las Vegas; email: james.cheng@unlv.edu. Starr Hoffman is Director of Planning and Assessment in the University Libraries at the University of Nevada–Las Vegas; email: starr.hoffman@unlv.edu. This project was supported by an Academic Library Impact Research Grant from ACRL’s Value of Academic Libraries committee. ©2020 James Cheng and Starr Hoffman, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC


