
Aligning Library Assessment with Institutional Priorities: A Study of Student Academic Performance and Use of Five Library Services

This large-scale study was conducted to determine how representative library users are of the whole student population, to explore how library services contribute to student success, and to position the library for inclusion in the institution’s learning analytics landscape. To that end, data were collected as students at the University of Central Florida (n = 25,336) interacted with five library service points over four semesters. Analysis revealed a positive association between use of one or more library services and higher end-of-semester GPAs. The article emphasizes how results were disseminated and describes ongoing work to build an interactive learning analytics library dashboard that complements existing institutional dashboards.

Introduction

Strategic plans generally reflect the priorities and goals of the academic institution, in turn providing clear direction for institutional stakeholders and a basis for funding allocations. Often those priorities and goals are driven by external pressures. Two factors influencing the University of Central Florida (UCF) Collective Impact Strategic Plan are the bid for preeminent status1 and the state performance-funding model.2 While commonalities exist, each contains unique criteria: for example, preeminence metrics include, among others, the number of patents awarded and the institution’s research rank, while performance-funding metrics address topics such as Pell grant recipients and median wages of graduates. Metrics that the factors have in common are indicators of student success and are designed to evaluate the efficacy of the institution’s teaching mission. First-year retention rates, time to graduation and graduation rates, and the number of degrees awarded relative to enrollments reside at the intersection of preeminence and performance-funding. It is therefore not surprising that they have found their way into the strategic plan.

In Florida, as in many other states, performance-funding allocations can comprise a substantial part of an institution’s total budget. Universities are striving to reach and surpass benchmarks to secure funding and status within their state. This is the case at UCF, where there is a renewed focus on student success—and associated metrics. At a 2017 campus forum3 dedicated to the topic, the provost presented a shared philosophy that student success must inform everything from the way we teach to the units we work with and that our individual efforts must be aligned to strengthen our impact. He also identified several goals, including increasing retention rates of first-year students, and went on to mention that retaining even as few as 40 more first-year students (at a campus of 64,000 at the time) could bump UCF’s scores on the retention metric as defined by the Florida Board of Governors. With $39.3M in performance-based funding received by the institution in 2016–2017,4 it is easy to see that retaining even a relatively small number of students can have large financial implications for the institution as well as enhanced success for students.5

The library had already begun an initiative to collect student interaction data at five service points, primarily for program improvement. However, when the provost asked what measurable impact the library had on student success, we were challenged to further align services with institutional markers of student success. Librarians had already discussed the steps necessary to match the student interaction data with demographic and academic information, and the provost’s comment provided the impetus to begin.

It was within this environment that librarians, statisticians, and members of the institutional effectiveness unit at the University of Central Florida undertook a large-scale study to investigate whether use of library services was positively associated with academic success. Goals of the project included determining which groups of students were using identified library services compared to which groups were not, investigating whether library use contributes to student academic success, and positioning the library for inclusion in institutionwide learning analytics efforts. And, while a positive association with student grades was found (comparable to other studies reported in the literature), this article focuses on how the results were acted upon for program improvement, disseminated to campus administrators, and used to build an interactive learning analytics dashboard with institutional partners.

Literature Review

The move to objective performance indicators for institutions of higher education is not a new trend; many academic librarians are being asked to demonstrate how library use supports student learning. The 2010 Value of Academic Libraries Report prepared by Megan Oakleaf and others for the Association of College and Research Libraries calls on librarians to demonstrate “the value of academic libraries in clear, measurable ways” by investigating the impact of library services on student grades and retention.6 However, despite this call to action and the growing body of published research on the topic, there still appears to be a disconnect between correlating library use with academic performance and subsequently acting upon those results.

The ACRL Environmental Scan 2017 anticipates stagnant budgets across all of academia and predicts that pressures to contain costs may come to bear heavily on libraries.7 Responses to the 2016 Ithaka Survey confirm that library deans and directors see decreasing institutional support overall, as well.8 Further, eight out of ten library directors indicated the most important priority for their library is supporting student success and recognized the need to communicate this value; however, only about half acknowledged articulating how their libraries do this.9 In this climate of decreasing financial support and increasing accountability, academic libraries are well advised to proactively establish programs that align to the mission of the institution and demonstrate their value to stakeholders.

This climate is reason enough to embark on research that attempts to demonstrate connections between library use and student learning and success, but the work does not end there. To be meaningful, assessment results need to be acted upon. These activities could include using results to evaluate and improve library programs, describing how academic libraries support student success to campus administrators, and leveraging library data into broader campus learning analytics efforts.

The remainder of this section summarizes 46 library impact research studies, examining them by interaction point, findings, and how results were used or acted upon. Reports included in the literature review were published between 1995 and 2018, are primarily composed of articles in peer-reviewed journals, and describe findings of research conducted at higher education institutions located in Australia (4), Chile (1), China (2), Jordan (1), South Africa (1), Turkey (1), United Kingdom (6), and United States (30). A table summarizing each study by author, date of publication, institutional affiliation, goals of the study, library interaction points included in the study, the number and unit of analysis, study findings, and how results were disseminated or acted upon is included in the appendix.

The studies reviewed investigate academic libraries’ impact on student success by measuring library use in many ways. Libraries have traditionally measured outputs such as the number of books circulated or reference questions answered. To quantify library use and tie these outputs to academic achievement, data must be collected at interaction points and for the library services and resources students are using. The number of interaction points studied in these 46 reports ranges from one to 21, with an average of 3.6 interaction points per study. The research report listing 21 interaction points is an outlier; the rest have 13 or fewer, with 19 studies measuring student use of only one service or resource, most often library instruction (10 studies) or electronic resource login (five studies).

The interaction points most commonly studied were circulation (the number of times a user checked out books or multimedia from the library) and electronic resource access, each with 24 studies using data from these interaction points. Eighteen studies collected user data from library instruction sessions and three included workshops or research clinics.10 Ten studies used computer workstation logins as a measure of library use,11 and another 10 recorded data from in-person or virtual reference transactions.12 Library visits (typically measured by gate or turnstile access) were recorded in nine of the research reports.13 Other interaction points include study room use (in four studies)14 and off-campus access15 and peer-research consultations16 (each in three studies). Additional interaction points, which were each mentioned in only one study, include a library assignment,17 PDF downloads,18 technology lending,19 3D printer use,20 a credit course,21 and use of a writing center.22

Seven studies measured library use at eight or more interaction points.23 Two of these studies relied on surveys to determine which services or resources students were interacting with.24 More interaction points may mean a more complete picture of library use.

Data for these interactions were collected in a number of different ways, including accessing existing data logs (computer workstation, electronic resource, or off-campus logins), recording student card swipes at the interaction point (library gate entrance, instruction, or reference transactions), and taking class roll (students who attended library instruction). In several studies, data were gathered via surveys asking students how and how frequently they use the library.25 Two of the studies did not look at traditional library service points but rather at expenses per FTE on services at more than 1,000 institutions26 or the number of staff, collections, and services across almost 100 institutions.27

Of the studies reviewed, the majority (26 studies) used GPA as the sole dependent variable.28 Retention was used as the sole dependent variable in three studies,29 and eight studies used both GPA and retention.30 Only a few other dependent variables were reported, and these included one study that looked at retention with credit load,31 two that used graduation rates with retention,32 two that relied exclusively on graduation rates,33 and one that analyzed by degree results.34 Degree results is a classification system used in the UK to reflect overall academic performance.

Researchers reported positive associations between library use and GPA in 19 (73%) of the 26 studies using GPA only.35 Among those who reported strength of relationship, seven were considered weak36 and two were strong.37 Five studies reported mixed results, such as higher grades with library use for upperclassmen, but not freshmen;38 by type of interaction, notably higher grades associated with book loans, but not other materials;39 and higher grades if a variety of library resources was used or if staff assistance was sought, but not with other interaction points;40 by level of exposure, such as if a student attended at least three instruction sessions over the course of their program;41 and one attributing mixed results to potentially mitigating factors.42 Two studies reported no relationship between library use and GPA.43

Similarly, five of the eight (62.5%) studies that analyzed by GPA and retention reported positive associations for library users compared to nonusers on both dependent variables,44 and three found mixed results.45 Of the mixed results, one study found no relationship between library instruction and grades but did find a positive correlation for reference desk visits, along with higher retention, in general.46 Thorpe, Lukes, Bever, and He reported that students who used library services had a higher GPA in one semester and better retention rates than average for the institution; however, the GPA differential was not seen in the second semester of the study.47 And Vance, Kirk, and Gardner discovered a positive association with GPA and library use, but not with retention.48 All three studies that used retention as a success marker found positive associations for library users.49

Six studies analyzed library use with student success markers other than or in different combinations with GPA and retention. Of them, five (83%) found positive associations between library use and student outcomes50 and one was mixed.51 Two studies reported increased graduation rates for library users,52 two studies looked at graduation rates and retention and found library users performed better on each marker,53 and one study reported higher retention rates and heavier credit loads for library users.54 A study out of the UK noted a positive relationship between some library interaction points (book loans and use of e-resources) and degree results, but not gate entries.55

Although 46 articles were reviewed, several reported on the same study, leaving 43 unique studies that investigated associations between library use and academic performance indicators. Of the unique studies reviewed, 31 (72%) reported higher academic performance of library users compared to nonusers on one or more markers,56 10 (23%) found mixed results,57 and two (5%) found no association.58 Several studies noted the potential for mitigating factors and urged caution in interpreting results. Factors that might have a confounding effect on study results include lack of random assignment of students to groups, grading differences across instructors, the amount of student contact time, complexities associated with student classification and previous academic experience, and the fact that not all variables can be controlled for.59

As noted earlier, a primary focus of this article is how results were acted upon for program improvement, disseminated to campus administrators, or used to build an interactive learning analytics dashboard with institutional partners. Of the research reports included herein, 11 (24%) of the 46 studies did not mention how results could or would be used,60 while several noted that the study was exploratory,61 baseline,62 or intended to be ongoing or expanded upon.63 One report suggested that results be used to refine library use/student success studies using GPA as a performance indicator,64 and others noted how conducting their respective studies resulted in elevating the profile of the library with campus partners and administrators.65

Over one-third of the reports mentioned using results for internal decision-making or program improvement, including making staffing decisions, allocating resources, informing library renovations and strategic plans, and serving as a basis to make improvements to or expanding services or resources.66 Also common were plans to share study outcomes with faculty, advisors, and other academic support units to advocate for use of library services; to students or targeted student groups to share how the library supports their academic success and market services and resources; and to campus administrators to demonstrate how the library contributes to student success, to increase interest in the library, and to position for favorable budget allocations.67 Only one study expressly mentioned tying ongoing data collection and analysis to campuswide learning analytics. Early on, seeing the value of connecting library usage data to broader campuswide student success assessment efforts, Cox and Jantti describe how University of Wollongong Library (UWL) Cube development began in 2009 and became part of the enterprise reporting system in 2012.68

Mention of library participation in campus learning analytics efforts was limited in the reviewed research studies. Describing efforts to merge library interactions into enterprise learning analytics may be beyond the scope of many of the articles, but it does illustrate the disconnect between ongoing efforts to understand and optimize library contributions to student success and being part of the institutional learning analytics landscape. Oakleaf, Whyte, Lynema, and Brown note a variety of challenges that libraries face when undertaking correlation studies or attempting to leverage library interactions into institutional learning analytics efforts, among them data availability and detail, data silos, data storage, and interoperability standards.69 Yet interest in endeavors of this type continues to grow, with several libraries at the beginning stages of integrating library data into enterprise initiatives, or having already done so.70 The remainder of this report describes a large-scale effort to correlate library interaction data with student end-of-semester GPA and reports on subsequent efforts to create an interactive web form that can be leveraged into institutional learning analytics work.

Methodology

One of the most important—and lively—discussions among the research team occurred early on: deciding which interaction points to include in the study. The team wanted to capture undergraduate and graduate students’ relevant interactions at the library’s service points, but some interaction points proved more difficult to collect than others, raised patron privacy concerns, or were deemed not substantive enough to justify collection, particularly where the act of collecting data could itself become a barrier to service.

The service points considered were circulations and course reserves, course-integrated instruction (requested by faculty and usually taught in person by a librarian), online information literacy modules and a library research strategies course embedded in the learning management system (LMS), interlibrary loan usage, reference desk and virtual reference service statistics, research consultations, and study room reservations. Each of these points was debated in relation to whether data were authenticated (requiring login with a student identifier) or if student identifiers were manually collected, whether there were privacy concerns or if requesting student information could be a barrier to accessing the service, and whether the research team thought the interaction would be meaningful enough to actually impact student performance.

After deliberating, the library team decided that the challenge of collecting student information from the state library cooperative, along with privacy concerns, outweighed the probable impact of including circulation and course reserves at the time. Circulation and course reserves will be revisited for inclusion when data on the number of transactions a student made can be collected while maintaining confidentiality about what the student borrowed, and when the data can be more easily obtained from the state cooperative. Interlibrary loan data are pulled at the institution and will be included after migration to a new platform; again, only the transaction will be captured, with no item-specific information. Reference statistics are collected annually; a review revealed that the majority of questions asked were directional, such as restroom or campus locations. Based on this information, it was agreed that in-person and virtual reference transactions generally were too brief to warrant data collection. The authors also were concerned that asking students for their identification numbers could pose a barrier to service.

Ultimately, the five interaction points included in the study were attending course-integrated instruction (usually taught in person), completing the online Information Literacy modules, completing the library research strategies course that resides in the learning management system (LMS), using study rooms, and meeting with a librarian for an in-depth research consultation. Table 1 lists the service points that were considered and the inclusion criteria associated with each, and indicates which were included in the study.

TABLE 1

Service Points Considered (* = included in study)

Service Points                 | Data Collection     | Patron Concerns    | Impact
Circulation                    | Authenticated Login | Privacy Concerns   | Probable Impact
Course Reserves                | Authenticated Login | Privacy Concerns   | Probable Impact
Course-integrated Instruction* | Manually Collected  | Low Concerns       | Probable High Impact
Info Literacy Modules*         | Authenticated Login | Low Concerns       | Probable High Impact
LMS/Canvas Library Course*     | Authenticated Login | Low Concerns       | Probable High Impact
Interlibrary Loan              | Authenticated Login | Privacy Concerns   | Probable Impact
Reference Desk                 | Manually Collected  | Barrier to Service | Probable Low Impact
Virtual Reference              | Manually Collected  | Barrier to Service | Probable Low Impact
Research Consultations*        | Manually Collected  | Low Concerns       | Probable High Impact
Study Room Reservations*       | Authenticated Login | Low Concerns       | Uncertain Impact

Course-integrated instruction. Course-integrated instruction is library instruction requested by the course instructor and usually offered in person. It may consist of a single instructional session (generally 50 minutes or longer) or multiple sessions spaced throughout the semester. Course-integrated instruction can be assigned in conjunction with online instruction options. Initial attempts to collect student identifiers at the time of the class yielded inconsistent and incorrect ID numbers and took an inordinate amount of class time. The investigators requested access to the institution’s Reporting Database Service (RDS), which is a dynamic database that includes course enrollment information. After a mandated training session, the researchers began pulling course rosters of all students enrolled in classes that had scheduled a library instruction session. While there may be a slight overcounting of students who attend course-integrated instruction (as some may be absent from class on the day of the session), it is most likely a small fraction of the number of students who were undercounted previously. Collecting student attendance by course roll has been used in prior research of this type.71

Online library instruction. Online library instruction includes both the Information Literacy (IL) Modules and the Library Research Strategies course that is embedded in the learning management system (LMS). Each of the twelve IL modules, as well as the LMS course, has learning objectives, content, and formative and summative assessments. Course instructors have the option of counting scores on the summative assessments toward the overall course grade. The online instruction options require students to log in, so data are authenticated. Student identifiers were pulled from the platforms the online options reside upon: either a system developed at the institution (for the IL modules) or Canvas (the LMS platform).

Research consultations. Research consultations consist of intensive, one-on-one meetings with a Subject Librarian. These consultations can take hours to prepare and usually last an hour. Sometimes multiple consultations are warranted. Students were asked to submit their ID numbers when requesting a session in the online form.

Study rooms. A final interaction point, study room reservations, was included. Study rooms in the library can be reserved by students for up to four hours and they can accommodate quiet, individual study or hold groups up to 12 people. Only one student identifier is required to schedule a study room, so students who use the rooms with other students are undercounted.

Data Collection and Handling

Students are assigned two identification numbers at UCF. One is the network ID, which links to private student information, and the other is the more broadly used UCF ID, which is commonly used for interoffice communication and does not allow access to restricted information. However, the UCF ID can be used by units that have the authority to access restricted data to obtain student information. Student UCF IDs were collected over four semesters at the five interaction points. IRB approval was not contingent upon administering informed consent, given that data handling protocols met institutional requirements and that key faculty and units who handle student data on a routine basis were involved. These included a professor in the Statistics department, a Statistics teaching assistant and PhD student, and staff in the institutional effectiveness unit. However, per IRB stipulations, any students under the age of 18 were to be identified and excluded from the study. No students were identified who met that criterion.

As noted earlier, some of the library service points under consideration automatically collected student information via authenticated login, while others required manual input of the student UCF ID. As the research team had moved from manually collecting course-integrated instruction student ID numbers to pulling course rosters from the RDS, only research consultations remained as nonauthenticated data. Further, all data collection was done by staff who had completed university FERPA training. Data were extracted from five different systems: institutional effectiveness’ RDS for course-integrated class rosters, LibCal for the study room reservation system, Canvas for online library instruction embedded in the LMS, a local platform called Obojobo that hosts the information literacy modules, and a Springshare form used to collect research consultation information.
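Because records were extracted from five different systems, each with its own export format, they would need to be normalized before matching. The sketch below is purely illustrative; the file layouts and column names are hypothetical (the article does not describe the actual export formats), and it shows only one plausible way such records could be mapped to a common interaction log:

```python
# Hypothetical normalization of interaction exports from multiple systems
# into a common (ucf_id, service, date) log. Column names are invented.
import csv
import io

def normalize(rows, id_field, service_name, date_field):
    """Map one system's export rows to a common record shape."""
    return [
        {"ucf_id": r[id_field], "service": service_name, "date": r[date_field]}
        for r in rows
    ]

# Invented sample exports standing in for systems like LibCal and Obojobo.
libcal_csv = "patron_id,room,booked_on\n1234,RM201,2018-01-15\n"
obojobo_csv = "student_id,module,completed\n5678,IL-03,2018-01-16\n"

libcal = list(csv.DictReader(io.StringIO(libcal_csv)))
obojobo = list(csv.DictReader(io.StringIO(obojobo_csv)))

interactions = (
    normalize(libcal, "patron_id", "Study Room Reservations", "booked_on")
    + normalize(obojobo, "student_id", "Information Literacy Modules", "completed")
)
```

Once all five sources share one shape, counting unique users and interactions per service becomes a simple aggregation.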

The IRB Approval of Human Research for this study stipulated that all data be retained and secured for a minimum of five years and that access to data be limited to authorized individuals involved in the study. Data were stored on a shared network drive accessible only by the research team. Data were compiled monthly during a period of four semesters and saved to the shared drive. File naming conventions suggested by the analysts were used.

The collected student identifiers (UCF ID) were delivered to the institutional effectiveness unit, which pulled demographic and academic information for library users, including end-of-semester course grades. Data supplied by institutional effectiveness included gender, ethnicity, and first time in college or transfer status, among others. UCF is consistently ranked by US News and World Report as an institution with quality online programs,72 and web-based, online instruction accounts for about one-third of student credit hours by modality.73 This compares to 58 percent of courses taught in person and 10 percent mixed mode or reduced seat time. As such, information regarding instructional modality also was requested for analysis.

To strengthen any claims of library impact, a propensity scoring model was used, which entailed pulling information for library nonusers enrolled in the same courses. For example, if there were 20 sections of a course and eight had a library instruction component, then students in those eight sections were compared to students in the sections not assigning library instruction. This practice made an exception for students in nonlibrary instruction sections who interacted with the library of their own volition. The institutional effectiveness analyst also de-identified the data to protect student anonymity, leaving proxy identifiers in place. The files were collected from the analyst by the project lead, who backed up and stored the data on the network drive and then shared them with the statisticians. Statistics faculty analyzed the data using SAS statistical analysis software.
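As a rough illustration of the grouping and de-identification steps just described (the actual analysis used institutional systems and SAS; the sample data and proxy-ID scheme here are invented), the workflow might look like:

```python
# Illustrative sketch: split students enrolled in the same sections into
# library users and nonusers, then replace real UCF IDs with opaque proxy
# identifiers before handing the file to analysts. All data are invented.
enrollments = [  # (ucf_id, course_section)
    ("A1", "ENC1101-08"), ("A2", "ENC1101-08"), ("A3", "ENC1101-12"),
]
library_users = {"A1", "A3"}  # IDs seen at any of the five service points

# Assign each real ID a proxy identifier so analysis never sees UCF IDs.
proxy = {uid: f"S{n:04d}"
         for n, uid in enumerate(sorted({u for u, _ in enrollments}), 1)}

deidentified = [
    {"proxy_id": proxy[uid], "section": sec, "library_user": uid in library_users}
    for uid, sec in enrollments
]
```

Keeping the proxy mapping with the institutional effectiveness unit, rather than the analysts, preserves the ability to re-link records later while protecting anonymity during analysis.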

Results

Analysis of student interactions revealed that 25,336 unique students (~40% of the student population) used one or more library services 66,860 times over a period of four semesters, an average of 2.64 interactions per student user. One striking finding was that the majority of students (83.54%) used only one library service, such as completing four information literacy modules or reserving a study room six times; students who used multiple services were much less typical. Fewer than 15 percent of students who used any library service used two services, fewer than 2 percent used three, and fewer than one-tenth of 1 percent used four. No students used all five library services. It appears that students tended to engage with the library at one service point but did not venture beyond it to explore additional services. Table 2 summarizes use of the library service points included in the study. The two most heavily used interaction points were the assigned Information Literacy Modules and study rooms.

TABLE 2

Library Service Use (by interactions and percent of overall use)

Library Service Points        | Total Number of Interactions | Percent of Overall Use | Number of Unique Users, by Service | Average Number of Transactions, by User
Information Literacy Modules  | 29,824                       | 44.61%                 | 9,585                              | 3.11
Study Room Reservations       | 24,591                       | 36.78%                 | 9,979                              | 2.46
Course-integrated Instruction | 7,735                        | 11.57%                 | 6,205                              | 1.25
LMS-embedded Library Course   | 4,327                        | 6.47%                  | 3,917                              | 1.10
Research Consultations        | 383                          | .57%                   | 352                                | 1.09
TOTAL                         | 66,860                       | 100.00%                |                                    |
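The Table 2 figures can be cross-checked with a few lines of arithmetic. The sketch below (values copied from the table) reproduces the total, the overall per-student average, and the per-service averages:

```python
# Cross-check of Table 2: per-service (interactions, unique users) pairs.
table2 = {
    "Information Literacy Modules": (29_824, 9_585),
    "Study Room Reservations": (24_591, 9_979),
    "Course-integrated Instruction": (7_735, 6_205),
    "LMS-embedded Library Course": (4_327, 3_917),
    "Research Consultations": (383, 352),
}

total = sum(n for n, _ in table2.values())   # 66,860 interactions
overall_avg = total / 25_336                 # ≈ 2.64 per unique student user
per_service = {name: round(n / users, 2) for name, (n, users) in table2.items()}
```

Note that the per-service user counts sum to more than 25,336 unique students because a student who used two services appears once under each.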

The first goal of the study was to determine how representative library users were of the full student population. Descriptive statistics revealed that the ratio of library users to nonusers was comparable by gender and ethnicity. Female students accounted for 54.9 percent of enrollments at the time of the study, and 38.6 percent of them used one or more library services during the study period. By comparison, 41.5 percent of male students, who accounted for 45.1 percent of enrollments, used one or more library services, making males slightly heavier users than females. By ethnicity, students who identified as Asian were the heaviest users (48.0%), followed by Black/African American (42.7%), Hispanic/Latino and Other (both at 40.7%), White (39.4%), and Multiracial (38.3%). Table 3 provides a summary of student users by gender and ethnicity.

TABLE 3

Library Users by Gender and Ethnicity (compared to total student population)

Gender                 | Library Users | UCF Enrollments | % of Population Using Library Services
Female                 | 13,622        | 35,290 (54.9%)  | 38.6%
Male                   | 12,047        | 29,028 (45.1%)  | 41.5%

Ethnicity              | Library Users | UCF Enrollments | % of Population Using Library Services
White                  | 12,885        | 32,718 (50.9%)  | 39.4%
Hispanic/Latino        | 6,231         | 15,325 (23.8%)  | 40.7%
Black/African American | 3,045         | 7,130 (11.1%)   | 42.7%
Asian                  | 1,878         | 3,915 (6.1%)    | 48.0%
Multiracial            | 884           | 2,308 (3.6%)    | 38.3%
Other                  | 1,189         | 2,922 (4.5%)    | 40.7%

A major focus of the institution at the time of the study was involvement in the John N. Gardner Institute’s Foundations of Excellence initiative (now referred to as the Transfer Alliance), a multiyear program to identify and enhance transfer students’ experiences and academic achievement.74 Transfer students comprise the majority of enrollments at the institution and thus are considered a high-profile group of the student population.75 For this reason, analysis compared users who started at UCF (first time in college, or FTIC) with students who transferred from another institution. Results suggested that transfer students were not using library services at the same rate as those who started at the institution. Table 4 provides the percentage of the undergraduate student population using library services compared to the total population of FTIC and transfer students.

TABLE 4

Undergraduate Library Users by Enrollment Status (compared to total student population)

Enrollment Status*    | Library Users | UCF Enrollments  | % of Population Using Library Services
Started at UCF (FTIC) | 11,508        | 24,689 (41.18%)  | 46.6%
Transfer students     | 10,137        | 26,415 (44.06%)  | 38.3%
TOTAL                 |               | 52,539* (85.24%) |

*Does not include post-bac, graduate, or early admit students.

Comparisons were made between students who used the library and their counterparts who were enrolled in the same courses but did not use any of the five library services during the data collection period. Analysis revealed that library users earned an average end-of-semester GPA of 3.20 (N = 273,137, SD = 0.95), compared to 3.05 (N = 376,713, SD = 1.05) for nonusers. Note that the unit of analysis here is course grades over four semesters rather than individual students, hence the large N. Students who used the library earned higher course grades and average GPAs than students who did not.

FIGURE 1

End-of-Semester Average GPA, Library Users and Nonusers
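The article reports only group means and standard deviations for this comparison. As an illustration (not an analysis the authors performed), a standardized mean difference can be computed directly from those summary statistics:

```python
import math

# Summary statistics as reported in the text
# (unit of analysis: course grades over four semesters).
n_user, mean_user, sd_user = 273_137, 3.20, 0.95
n_non, mean_non, sd_non = 376_713, 3.05, 1.05

# Pooled standard deviation and Cohen's d. This effect size is an
# illustration added here; it does not appear in the article.
pooled_var = ((n_user - 1) * sd_user**2 + (n_non - 1) * sd_non**2) / (
    n_user + n_non - 2
)
d = (mean_user - mean_non) / math.sqrt(pooled_var)
print(f"Cohen's d = {d:.2f}")
```

By conventional benchmarks a d of roughly 0.15 is a small effect, which is consistent with the article's framing of an association rather than a causal claim.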

A trend in the distribution of grades also was found: 48.2 percent of library users received A grades in individual classes compared to 42.6 percent of nonusers; users and nonusers received B grades in similar proportions; and nonusers received more C, D, and F grades. This pattern is what might be expected given the association of higher GPAs with library use.

FIGURE 2

Grade Distribution, Library Users and Nonusers, by Percent

Due to the variety of course modalities offered at the institution and the institution's reputation for quality online programs, scores were further analyzed by modality: face-to-face/in person, mixed mode, or wholly online. The same trend was found: students who used one or more library services were more likely to receive A grades than those who did not, regardless of modality. Library services appear to support online/distance students as well as they support students who use services in person. Table 5 summarizes grade distributions by modality as a percentage of all library users and nonusers.

TABLE 5

Grade Distribution by Instructional Modality, Library Users and Nonusers

End-of-Semester     Face to Face              Mixed Mode                Web/Online
Course Grades       User       Nonuser       User       Nonuser       User       Nonuser
A                   32.1%      30.6%         5.3%       4.5%          11.6%      10.4%
B                   23.5%      23.7%         2.7%       3.3%          5.5%       5.3%
C                   11.0%      11.6%         0.9%       1.3%          1.8%       2.0%
D                   2.4%       2.8%          0.2%       0.3%          0.3%       0.4%
F                   2.0%       2.9%          0.1%       0.3%          0.6%       0.7%

Percentages add up to 100% for library users across modality and 100% for nonusers across modality. Library users: 71.0% face to face, 9.2% mixed mode, and 19.8% web/online. Library non-users: 71.6% face to face, 9.7% mixed mode, and 18.8% web/online.

Discussion

Three questions drove this project. The first was exploring who used library services, to determine the library's reach or "segment penetration." Addressing this question required comparing library user demographics to the full student population. The second was investigating whether library use contributed to student success, defined in this study as end-of-semester GPA. Students who interacted with the library through one or more of the five interaction points were matched with nonusers who were enrolled in the same courses. Finally, it was hoped that results would serve as a proof of concept demonstrating that library use has the potential to impact student success and should be another data point in institutional learning analytics initiatives. The remainder of this section describes how results were acted upon for each question, followed by next steps and limitations of the study.

Goal 1: Determine segment penetration. Who are library users and what are they using? Equally important, which student groups are not using the library?

When comparing library users to nonusers, the data revealed that users were representative of the whole student population in most respects, but a marked difference (more than 8 percentage points) was found between the proportions of FTIC and transfer students who used the library. Transfer students do not appear to use the library at the same rate as students who started their academic careers at the institution. The library offers orientations targeted to FTIC students but had no systematic way to reach transfer students. To address this programmatically, a portion of a librarian position has been dedicated to transfer student outreach and programming. The transfer engagement librarian works closely with the campus Office of Transfer and Transition Services; provides workshops and programs designed to increase transfer students' use of the library and instruct on library-related concepts; and assesses transfer student needs and perceptions through surveys and focus groups. Over time, it is expected that the percentage of transfer students who use the library will increase.

Further, when it was discovered that most students were interacting with the library at only one service point rather than taking advantage of the range of services offered, a marketing campaign was implemented: posts promoting library resources and workshops were made to social media; a student brochure was developed with assistance from a graphic artist and refined through student feedback; and information about programs and services was sent directly to faculty and campus support units with the expectation that they would encourage students to use library services. The library will continue to assess which services students use, and whether some are no longer needed, to continually refine service models.

Goal 2: Investigate whether use of library resources and services is associated with student GPA.

Similar to prior studies reported in the literature, this study found a positive association between library use and academic performance. Students who used the library tended to have better academic outcomes, in the form of end-of-semester course grades, than their counterparts who did not. This trend was further seen in the distribution of course grades, both overall and by modality, offering some evidence that the library supports online students as well as it supports those who use services in person.

When looking at the results, one could conclude that successful students are simply more likely to be engaged and knowledgeable about the academic support available on campus. However, the results also provide compelling evidence of the library's role in supporting student success, and there is opportunity to "push" students to the library at various points in their academic careers. This may occur early on, when first entering the university, or at critical points during their coursework. To that end, results have been shared widely through library publications sent directly to faculty, institutional newsletters distributed by the faculty development office, and newsletters distributed by a senior vice president's office.

Institutional response to this study has been overwhelmingly positive. The president of UCF included study results in his report to the Board of Governors on the status of a library renovation project and the return on investment in libraries. This information also was requested by the Vice President for Information Technologies and Resources, who had heard the report at a presentation given to a contingent visiting from the Gates Foundation. Results also were shared with the Vice President for Institutional Effectiveness, who oversees the collection and reporting of performance funding metrics; the Vice President for the Division of Digital Learning, especially modality metrics and their relationship to web-delivered courses; the Dean for Teaching and Learning, who requested that a librarian sit on the General Education Program (GEP) curriculum redesign project to promote library services and resources; and the Vice President for Student Development and Enrollment Services, whose office coordinates almost all of the undergraduate student success programs, including transfer student initiatives.

In general, these reports have provided an opportunity to further communicate how the library can support students, and they have led to a reenergized atmosphere conducive to working more closely with our campus partners on student initiatives. Another positive outcome was the invitation to present to the board of the institution’s Student Success Investment Model (SSIM). The SSIM was formed and funded by the provost to hear requests for funding projects that could directly impact student academic performance. Results of the study were shared at the presentation and a Student Success librarian position was funded to continue this work.

Goal 3: Position the library for inclusion in institutional learning analytics efforts.

There is continued interest from university administrators, and we are now seeing the impact of this dissemination across campus. The vice president to whom library staff reports has requested that the project continue and that the number of interaction points be expanded. Library interaction data are being added to the Education Advisory Board’s Student Success Collaborative platform for predictive analytics, and the library is now working with the institutional effectiveness office to develop, implement, and populate an analytics dashboard that matches student library interactions with user and nonuser demographic and academic information.

Over time the number of library interaction points will increase as more capabilities come to pass (for example, card swipe at entrances or implementation of OpenAthens for electronic resource use). Already the number of interaction points has grown to include interlibrary loan requests and computer logins within the library. As these and other interaction points are added, every precaution will be taken to only collect the information that the student used the service or resource and not what was accessed or borrowed. Ultimately the library dashboard will complement existing learning analytics dashboards at the institution.

To date, a secure portal for uploading student identifiers has been created, file naming conventions established, semester timetables determined, and responsibility for collecting, cleaning, and uploading data assigned. As noted earlier, the number of library interaction points has been expanded. Deciding how the library wanted to analyze the data led to grouping interaction points into three types: services, space, and resources. Services are defined as interactions that generally require human mediation, including procurement of materials from other libraries, in-person or online instruction, and research consultations, among others. Space interaction points are characterized by use of the physical library, including building access, study room use, and computer logins in the building. Use of databases and circulations, including reserves and loaned technology, constitutes the third category, Resources.

Privacy concerns associated with circulations and electronic resource logins remain central to the decision to collect only student identifiers at these interaction points. The investigators will collect only the student identifier and the date of the interaction from the host management system; data identifying what the student accessed or checked out will not be used for analysis or stored apart from the integrated library system (ILS) or other host system. The interactive web form also allows for more specific analysis by individual service point. Table 6 summarizes current and planned interaction types and points.

TABLE 6

Services and Resources Currently Being Collected or Planned (planned entries are so marked)

Interaction Type   Interaction Point                Notes
Services           InfoLit Modules                  Information Literacy modules hosted on a UCF-developed platform.
                   LMS/Canvas Module                An introductory library course embedded in the Canvas shell.
                   Course-integrated Instruction    Course content tailored to a specific assignment or learning objectives, generally offered in person.
                   Workshops                        Library programming designed to promote awareness, build skills, and educate on issues associated with research and learning.
                   Consultations                    Intensive, one-on-one research assistance with a subject librarian, usually lasting an hour.
                   Interlibrary Loan                Items not held by the UCF Libraries that are requested from other libraries.
Space              Study Rooms                      Reserved by students for up to four hours; smaller rooms accommodate quiet, individual study, while larger rooms hold groups of up to 12 people.
                   Computer Logins                  All library computers: public PC desktops, collaboration workstations, and study room PCs.
                   Card-swipe Entrance (planned)    Not currently collected; will be added as the library adds card-swipe entrances to the building and moves to 24/5 status.
Resources          Electronic Resources (planned)   Not currently collected; will be added when OpenAthens is implemented. Data at the article level will not be collected, to protect patron privacy.
                   Reserves (planned)               Not currently collected. Reserves will be added when/if data are provided at the state level. Individual title information will not be collected, just the number of circulations.
                   Circulations (planned)           Not currently collected. Circulations will be added when/if data are provided at the state level. Individual title information will not be collected, just the number of circulations.
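The three-way grouping of interaction points, together with the rule that only who used a service and when is retained, can be sketched as a minimal record schema. All names here are illustrative assumptions, not the library's actual implementation:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative mapping of interaction points to the three types in Table 6.
INTERACTION_TYPES = {
    "consultation": "Services",
    "interlibrary_loan": "Services",
    "study_room": "Space",
    "computer_login": "Space",
    "electronic_resource": "Resources",
}

@dataclass(frozen=True)
class InteractionRecord:
    """Retains only the identifier, date, and service point.

    What the student accessed or borrowed is deliberately excluded,
    mirroring the privacy rule described in the text.
    """
    student_id: str
    interaction_point: str
    when: date

    @property
    def interaction_type(self) -> str:
        return INTERACTION_TYPES[self.interaction_point]

rec = InteractionRecord("ab123", "study_room", date(2019, 9, 3))
print(rec.interaction_type)  # Space
```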

This project relied on each unit’s respective strengths. The institutional effectiveness unit is well versed in maintaining student privacy and has in-depth knowledge of learning analytics and platforms. Involvement of the centralized Information Technology division was essential to setting up secure drives and mediating technology issues associated with data collection. The library has identified multiple interaction points and is committed to ensuring that student interaction data will continue to be collected and contributed. Successful “institutionalization” of the project will be characterized by a commitment to ongoing population and use of the dashboard to evaluate library effectiveness and reach. Library interactions will be available for inclusion in campuswide learning analytics initiatives.

Next Steps

More remains to be done. Data have not yet been analyzed by persistence; we would like to know whether students who used library services tended to return to school the following major semester at a greater rate than those who did not use library services. Also, many of the student success initiatives at the institution focus on a single group of students with a shared characteristic, like the previously mentioned transfer students. STEM students and “murky middle” students—those who complete their first year with a GPA between 2.0 and 2.59 but are still at risk of dropping out before completing their degree—are other high-profile populations that we would like to analyze separately. If the same trend holds true for these students, results will be shared with STEM faculty and grants investigators who are exploring how to retain STEM students, as well as with administrators who oversee performance funding metrics. Programmatically, the library and student support units could provide intense support for students who fall into the murky middle with the goal of increasing their academic success. Student outcomes also have not been analyzed by various interaction points to see which potentially have the greatest impact. This would inform how to allocate resources to best support student success.

Limitations

Several limitations are associated with this study. First, only five service points were included. These constitute a minority of the services and resources available to students, so it is likely that some students who used the library during the data collection period simply were not counted. For example, a student who only checked out books would not appear at all, while a student who checked out books and attended instruction would be counted only for instruction. Going forward, with the larger number of interaction points now included or planned, a more robust picture of library users will be available.

Further, interaction points with authenticated data (those requiring login) are generally considered "clean" in that the data collected are accurate, but manually input student ID numbers (such as on the research consultation request form) are often incorrect and require verification against a student information database. Requests that could not be resolved were pulled from the report. Additional over- or undercounting is possible. The time it took to collect student ID numbers during in-person instruction sessions, and the disruption to the class, called for a different solution: class rosters were pulled and input, which may have led to slight overcounting if students enrolled in the course were absent from the instruction session. Similarly, study rooms require only one student ID to reserve but are usually used by more than one student, so that interaction point is undercounted.
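The verification step described above, checking manually entered ID numbers against a student information database and dropping entries that cannot be resolved, might look like the following sketch (names and data are invented for illustration):

```python
def verify_ids(entered_ids, student_directory):
    """Split manually entered IDs into verified and unresolved lists.

    `student_directory` stands in for the student information database;
    unresolved entries are dropped from analysis, as in the study.
    """
    verified = [sid for sid in entered_ids if sid in student_directory]
    unresolved = [sid for sid in entered_ids if sid not in student_directory]
    return verified, unresolved

directory = {"1001", "1002", "1003"}
# "10O2" simulates a typo: the letter O entered in place of zero.
ok, dropped = verify_ids(["1001", "10O2", "1003"], directory)
print(ok, dropped)  # ['1001', '1003'] ['10O2']
```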

Perhaps most fundamental is the caveat that quantitative analysis applied to human studies is by nature a limitation. As such, it is recommended that results of any quantitative analysis be supplemented with interpretive assessment, such as focus groups, surveys, individual interviews, or observations. This is necessary to more fully understand the role of the library in the student experience. An example of this is Zaugg and Rackman, who developed a series of library personas using interpretive measures to further connect library services to undergraduate student needs.76 The personas, or user groups, can provide a deeper understanding of patron use patterns and needs, thereby not only informing what services people use at various points in their academic careers, but also the refinement and development of services. Results of this study suggest that some student populations are not using library services to the same degree as other groups, and developing personas around these user groups may inform both the provision of needed services and outreach or marketing that resonates with the user group.

Another limitation associated with studies designed to tease out factors that may contribute to student success is the very complexity of the student environment. The library is one small part of any student’s ecosystem, and positive associations may be more of an indication of engagement as a characteristic of successful students than actual library use. Certainly, putting quantitative studies such as the one described here in the context of students’ broader academic lives may provide a more complete understanding of the library’s role. Asher et al. examined the academic and personal lives of students through an ethnographic study of more than 200 students across eight institutions in the United States.77 Reviewing data collected on spatial patterns, such as travel and commute times, and activities from across the range of the social and academic landscape, the researchers concluded that understanding the complexity of student experiences is prerequisite to understanding their needs and priorities and providing library services and spaces that are sensitive to those realities.

Conclusion

An early contribution to the literature is Soria, Fransen, and Nackerud’s article on library use and undergraduate student outcomes, which recommends that libraries identify outcomes important to their institutions.78 We concur that libraries should use the language and the arguments that resonate with our respective audiences. Our president has stated that UCF is a university of access, so our narrative connects the dots between access to information and services from day one and student success. It also is important to understand the language of assessment, student success, and learning analytics.

The recommendation to tie assessment efforts to institutional goals leads to another suggestion: ensure that the resources necessary to design and conduct the study and to analyze the data have been secured. Some libraries may have a statistician or assessment officer on staff; if not, consider scanning the institution for opportunities to build partnerships or collaborations. A study designed by people familiar with library interaction points; data collection procedures and the handling of sensitive data to meet privacy mandates; matching of student identifiers to demographic and academic information; analysis and interpretation of data; and dissemination and acting upon results will be stronger than one planned by people without a solid background in these areas. Rarely would a library employ staff able to fill all of these roles. The team for this study included people from the library, the statistics department, and the information technology and institutional effectiveness units.

Further, if campus partners are brought in, consider who has a stake in the data and what their expectations are. For example: Who owns the data? How are workflows determined? How long is the commitment? How will results be shared? Are grant submissions and publications expected? If so, consider negotiating them from the outset. Also designate a project manager and define the duties associated with that role. Designing the study and collecting data are critical components, but analysis and dissemination, along with communicating and maintaining relationships with campus partners, may constitute an even larger part. An estimate of the time and resources needed, internal to the library, should be brought to the attention of the dean or director with the goal of securing a commitment to provide the resources necessary for successful implementation.

Although there is a growing body of evidence that library use positively correlates with student success, academic libraries typically do not contribute student interaction data to campuswide learning analytics initiatives.79 Certainly, constraints abound. It may be difficult to be included in learning analytics at the institutional level, especially given that many libraries do not fall under the same organizational umbrella as academic advising, tutoring, and other student support services. Despite these challenges, a final recommendation is that academic libraries that collect student data move beyond correlation results and position their libraries to contribute to institutional learning analytics initiatives. At UCF the library is doing this in phases, with the first phase the interactive web form.

The library had long reported output data for various reports and rose to the challenge when asked by the provost to provide evidence of measurable impact on student academic performance. We now have hard evidence that our services and resources are positively associated with student success. Currently we are developing an interactive web form connecting library use on an expanded set of interaction points with GPA and persistence. The web form will allow us to analyze by student demographic variables (FTIC/transfer, STEM/nonSTEM, gender, ethnicity, and others) as well as by library interaction type (services, space, and resources) or by specific interaction points (for example, consultations or study room use).
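The cross-tabulations the web form is meant to support, such as average GPA by student group and interaction type, reduce to grouped means. A toy sketch with invented rows (the real form draws on institutional data, not these values):

```python
from collections import defaultdict

# Toy rows: (student group, interaction type, end-of-semester GPA).
# All values are invented for illustration only.
rows = [
    ("FTIC", "Services", 3.4),
    ("FTIC", "Space", 3.1),
    ("Transfer", "Services", 3.2),
    ("Transfer", "Services", 3.0),
]

totals = defaultdict(lambda: [0.0, 0])  # (sum of GPAs, count) per group/type
for group, itype, gpa in rows:
    acc = totals[(group, itype)]
    acc[0] += gpa
    acc[1] += 1

means = {key: s / n for key, (s, n) in totals.items()}
print(means[("Transfer", "Services")])  # 3.1
```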

But the interactive dashboard is not the culmination of the project; it is simply the next step. Sending library interaction data to a centralized unit positions the library to be part of the larger learning analytics initiative on campus. The library will continue to build the case that use of its resources plays a part in student success and that library interactions could constitute a significant piece of the learning analytics puzzle. Additionally, the data will be used to evaluate existing services and resources or to create new ones. One possibility is collaborating with campus partners to provide intensive training on topics such as effective studying and test-taking, writing for academic purposes, and using library resources for course assignments. Assuming activities like these are impactful, we will reallocate resources—and advocate for additional support—to further align library services and resources to student success goals defined by the institution.

In 2016, the majority of library deans and directors indicated that the most important priority for their library was supporting student success, yet they acknowledged that they had not articulated or demonstrated how their libraries do this.80 The research team therefore anticipated that few of the studies reviewed for this article would have shared their results beyond the library. However, we were encouraged to find that more than half of the research reports mentioned that results had been shared with students, faculty, campus administrators, and/or campus academic support units. Still, only one study explicitly described how the library is now part of its institutional learning analytics initiative. With the growing body of evidence that students who engage with library services and resources enjoy better academic outcomes, we believe that libraries should strongly consider placing library interaction data into an enterprise data warehouse and advocating for inclusion in institutional learning analytics efforts. This is predicated, of course, on stewardship of private student information and a commitment to holistic assessment that improves library programs and student outcomes.

Notes

1. UCF Institutional Knowledge Management, “UCF Preeminence,” available online at https://analytics.ucf.edu/performance/preeminence/ [accessed 4 March 2020].

2. Board of Governors, State University System of Florida, “Performance Based Funding,” available online at https://www.flbog.edu/finance/performance-based-funding/ [accessed 12 October 2019].

3. Dale Whittaker, Provost Forum, Student Success, presentation slides, February 13, 2017, https://ucf.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=0b408d62-fc2a-434c-87de-5ee1a0a05bf5 [accessed 14 December 2019].

4. Board of Governors, State University System of Florida, “Florida Board of Governors Performance Funding Allocation, 2016–2017,” available online at https://www.flbog.edu/wp-content/uploads/Allocation-Year-3-2016-17-Revised-6_30_16.pdf [accessed 20 June 2019].

5. UCF Institutional Knowledge Management, “Student Success Rates by Cohort Year: FTIC,” available online at https://ikm.ucf.edu/facts-and-reports/interactive-facts/retention-graduation-2/ [accessed 13 October 2019]. This graph indicates that 89.6 percent of 2016/2017 first-time college students (those who enter as freshmen) were retained, and 84.2 percent of those students were still active students at the end of the second year, for a 5.4 percent attrition rate.

6. Association of College and Research Libraries (ACRL), The Value of Academic Libraries: A Comprehensive Research Review and Report, researched by Megan Oakleaf (Chicago, IL: ACRL, 2010), available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf [accessed 15 September 2019].

7. Association of College and Research Libraries, ACRL Research Planning and Review Committee, Environmental Scan 2017, available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/whitepapers/EnvironmentalScan2017.pdf [accessed 13 June 2019].

8. Christine Wolff-Eisenberg, Ithaka S+R US Library Survey 2016, available online at https://sr.ithaka.org/wp-content/uploads/2017/03/SR_Report_Library_Survey_2016_04032017.pdf [accessed 13 June 2019].

9. Wolff-Eisenberg, Ithaka S+R US Library Survey 2016, 3–4.

10. Andrew Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success (Bloomington, 2017), 1, available online at https://scholarworks.iu.edu/dspace/handle/2022/21277 [accessed 13 June 2019]; Elizabeth L. Black and Sarah Anne Murphy, “The Out Loud Assignment: Articulating Library Contributions to First-Year Student Success,” Journal of Academic Librarianship 43, no. 5 (2017): 410, https://doi.org/10.1016/j.acalib.2017.06.008; Joni Blake et al., “The Impact of Information Literacy Instruction on Student Success: A Multi-Institutional Investigation and Analysis,” Central University Libraries Research 13 (2017): 6, available online at https://scholar.smu.edu/libraries_cul_research/13 [accessed 7 October 2019]; Melissa Bowles-Terry, “Library Instruction and Academic Success: A Mixed-Methods Assessment of a Library Instruction Program,” Evidence Based Library and Information Practice 7, no. 1 (2012): 83, https://doi.org/10.18438/B8PS4D; Felly Chiteng Kot and Jennifer L. Jones, “The Impact of Library Resource Utilization on Undergraduate Students’ Academic Performance: A Propensity Score Matching Design,” College & Research Libraries 76, no. 5 (2015): 570, https://doi.org/10.5860/crl.76.5.566; Jean Marie Cook, “A Library Credit Course and Student Success Rates: A Longitudinal Study,” College & Research Libraries 75, no. 3 (2014): 274, https://doi.org/10.5860/crl12-424; Priscilla Coulter, Susan Clarke, and Carol Scamman, “Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction,” Public Services Quarterly 3, no. 1 (2007): 148, https://doi.org/10.1300/J295v03n01_08; Ula Gaha, Suzanne Hinnefeld, and Catherine Pellegrino, “The Academic Library’s Contribution to Student Success: Library Instruction and GPA,” College & Research Libraries 79, no. 6 (2018): 741, https://doi.org/10.5860/crl.79.6.737; Laura W. Gariepy, Bettina Peacemaker, and Valeriana Colon, “Stop Chasing Unicorns: Setting Reasonable Expectations for the Impact of Library Instruction Programs (and Other Library Services) on Student Success,” Performance Measurement and Metrics 18, no. 2 (2017): 104, https://doi.org/10.1108/PMM-05-2017-0025; Dennis Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” Evidence Based Library and Information Practice 13, no. 2 (2018): 6, https://doi.org/10.18438/eblip29402; Lisa Massengale, Pattie Piotrowski, and Devin Savage, “Identifying and Articulating Library Connections to Student Success,” College & Research Libraries 77, no. 2 (2016): 230, https://doi.org/10.5860/crl.77.2.227; Adam Murray, Ashley Ireland, and Jana Hackathorn, “The Value of Academic Libraries: Library Services as a Predictor of Student Retention,” College & Research Libraries 77, no. 5 (2016): 636, https://doi.org/10.5860/crl.77.5.631; Shane Nackerud et al., “Analyzing Demographics: Assessing Library Use Across the Institution,” portal: Libraries and the Academy 13, no. 2 (2013): 135, https://doi.org/10.1353/pla.2013.0017; Krista M. Soria, Jan Fransen, and Shane Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” College & Research Libraries 78, no. 6 (2017): 815, https://doi.org/10.5860/crl.78.6.812; Krista M. Soria, Jan Fransen, and Shane Nackerud, “Library Use and Undergraduate Student Outcomes: New Evidence for Students’ Retention and Academic Success,” portal: Libraries and the Academy 13, no. 2 (2013): 151, https://doi.org/10.1353/pla.2013.0010; Sara Davidson Squibb and Susan Mikkelsen, “Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement,” College & Research Libraries 77, no. 2 (2016): 165, https://doi.org/10.5860/crl.77.2.164; Angie Thorpe et al., “The Impact of the Academic Library on Student Success: Connecting the Dots,” portal: Libraries and the Academy 16, no. 2 (2016): 377, https://doi.org/10.1353/pla.2016.0027; Jason M. Vance, Rachel Kirk, and Justin G. Gardner, “Measuring the Impact of Library Instruction on Freshmen Success and Persistence: A Quantitative Analysis,” Communications in Information Literacy 6, no. 1 (2012): 52; Shun Han Rebekah Wong and Dianne Cmor, “Measuring Association between Library Instruction and Graduation GPA,” College & Research Libraries 72, no. 5 (2011): 464, https://doi.org/10.5860/crl-151.

11. Ellen Collins and Graham Stone, “Understanding Patterns of Library Use among Undergraduate Students From Different Disciplines,” Evidence Based Library and Information Practice 9, no. 3 (2014): 54, https://doi.org/10.18438/B8930K; Gaby Haddow and Jayanthi Joseph, “Loans, Logins, and Lasting the Course: Academic Library Use and Student Retention,” Australian Academic & Research Libraries 41, no. 4 (2010): 236, https://doi.org/10.1080/00048623.2010.10721478; Kot and Jones, “The Impact of Library Resource Utilization on Undergraduate Students’ Academic Performance,” 570; Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 636; Nackerud et al., “Analyzing Demographics,” 135; Krista M. Soria, Jan Fransen, and Shane Nackerud, “Beyond Books: The Extended Academic Benefits of Library Use for First-Year College Students,” College & Research Libraries 78, no. 1 (2017): 12, https://doi.org/10.5860/crl.78.1.8; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 151; Krista M. Soria, Jan Fransen, and Shane Nackerud, “Stacks, Serials, Search Engines, and Students’ Success: First-Year Undergraduate Students’ Library Use, Academic Achievement, and Retention,” Journal of Academic Librarianship 40, no. 1 (2014): 86, https://doi.org/10.1016/j.acalib.2013.12.002; John K. Stemmer and David M. Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” College & Research Libraries 77, no. 3 (2016): 373, https://doi.org/10.5860/crl.77.3.359.

12. Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 6; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 230; Nackerud et al., “Analyzing Demographics,” 136; Soria, Fransen, and Nackerud, “Beyond Books,” 12; Soria, Fransen, and Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” 815; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 152; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines, and Students’ Success,” 86; Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 373; Thorpe et al., “The Impact of the Academic Library on Student Success,” 377; Jennifer Wells, “The Influence of Library Usage on Undergraduate Academic Success,” Australian Academic & Research Libraries 26, no. 2 (1995): 124, https://doi.org/10.1080/00048623.1995.10754923.

13. Collins and Stone, “Understanding Patterns of Library Use among Undergraduate Students From Different Disciplines,” 56; Karin de Jager et al., “The Use of Academic Libraries in Turbulent Times,” Performance Measurement and Metrics 19, no. 1 (2018): 44, https://doi.org/10.1108/PMM-09-2017-0037; Deborah Goodall and David Pattern, “Academic Library Non/Low Use and Undergraduate Student Achievement,” Library Management 32, no. 3 (2011): 163, https://doi.org/10.1108/01435121111112871; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 230; John Renaud, Scott Britton, Dingding Wang, and Mitsunori Ogihara, “Mining Library and University Data to Understand Library Use Patterns,” The Electronic Library 33, no. 3 (2015): 357, https://doi.org/10.1108/EL-07-2013-0136; Graham Stone, David Pattern, and Bryony Ramsden, “Library Impact Data Project,” SCONUL Focus 54 (2012): 25, available online at https://www.sconul.ac.uk/sites/default/files/documents/8_0.pdf [accessed 12 September 2019]; Graham Stone and Bryony Ramsden, “Library Impact Data Project: Looking for the Link between Library Usage and Student Attainment,” College & Research Libraries 74, no. 6 (2012): 548, https://doi.org/10.5860/crl12-406; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 124; Sue White and Graham Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” Serials: The Journal for the Serials Community 23, no. 2 (2010), 84, https://doi.org/10.1629/2383.

14. Kot and Jones, “The Impact of Library Resource Utilization on Undergraduate Students’ Academic Performance,” 570; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 230; Maximiliano Montenegro et al., “Library Resources and Students’ Learning Outcomes: Do All the Resources Have the Same Impact on Learning?” Journal of Academic Librarianship 42, no. 5 (2016): 552, https://doi.org/10.1016/j.acalib.2016.06.020; Stemmer and Mahan, “Investigating the Relationship of Library Usage,” 373.

15. DeeAnn Allison, “Measuring the Academic Impact of Libraries,” portal: Libraries and the Academy 15, no. 1 (2015): 32, https://doi.org/10.1353/pla.2015.0001; Collins and Stone, “Understanding Patterns of Library Use among Undergraduate Students from Different Disciplines,” 56; Thorpe et al., “The Impact of the Academic Library on Student Success,” 377.

16. Nackerud et al., “Analyzing Demographics,” 135; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 151; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 86.

17. Black and Murphy, “The Out Loud Assignment,” 410.

18. Collins and Stone, “Understanding Patterns of Library Use among Undergraduate Students From Different Disciplines,” 56.

19. Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 230.

20. Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 230.

21. Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 631–42.

22. Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 631–42.

23. Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 230; Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 636–37; Nackerud et al., “Analyzing Demographics,” 134–36; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 151–52; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 86; Stemmer and Mahan, “Investigating the Relationship of Library Usage,” 373; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 124.

24. Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 363; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 124.

25. Atif Yousef Odeh, “Use of Information Resources by Undergraduate Students and Its Relationship with Academic Achievement,” Libri 62, no. 3 (2012): 225, https://doi.org/10.1515/libri-2012-0018; Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 363; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 124.

26. Gregory A. Crawford, “The Academic Library and Student Retention and Graduation: An Exploratory Study,” portal: Libraries and the Academy 15, no. 1 (2015): 46, https://doi.org/10.1353/pla.2015.0003.

27. Mark Emmons and Frances C. Wilkinson, “The Academic Library Impact on Student Persistence,” College & Research Libraries 72, no. 2 (2011): 131, https://doi.org/10.5860/crl-74r1.

28. Allison, “Measuring the Academic Impact of Libraries,” 35; Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 1; Bowles-Terry, “Library Instruction and Academic Success,” 88; Yakup Çetin and Vivian Howard, “An Exploration of the Relationship between Undergraduate Students’ Library Book Borrowing and Academic Achievement,” Journal of Librarianship & Information Science 48, no. 4 (2016): 384, https://doi.org/10.1177/0961000615572404; Ed Cherry, Stephanie Havron Rollins, and Toner Evans, “Proving Our Worth: The Impact of Electronic Resource Usage on Academic Achievement,” College & Undergraduate Libraries 20, no. 3/4 (2013): 389, https://doi.org/10.1080/10691316.2013.829378; Coulter, Clarke, and Scamman, “Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction,” 154; Brian Cox and Margie Jantti, “Discovering the Impact of Library Use and Student Performance,” EDUCAUSE Review Online (July 2012), available online at www.educause.edu/ero/article/discovering-impact-library-use-and-student-performance [accessed 12 September 2019]; Karen S. Davidson, Stephanie Havron Rollins, and Ed Cherry, “Demonstrating Our Value: Tying Use of Electronic Resources to Academic Success,” Serials Librarian 65, no. 
1 (2013): 75, http://doi.org/10.1080/0361526X.2013.800630; de Jager et al., “The Use of Academic Libraries in Turbulent Times,” 44; Gaha, Hinnefeld, and Pellegrino, “The Academic Library’s Contribution to Success,” 738; Gariepy, Peacemaker, and Colon, “Stop Chasing Unicorns,” 104; Goodall and Pattern, “Academic Library Non/Low Use and Undergraduate Student Achievement,” 166; Kot and Jones, “The Impact of Library Resource Utilization on Undergraduate Students’ Academic Performance,” 572; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 231; Montenegro et al., “Library Resources and Students’ Learning Outcomes,” 553; Nackerud et al., “Analyzing Demographics,” 140; Richard Nurse, Kirsty Baker, and Anne Gambles, “Library Resources, Student Success and the Distance-Learning University,” Information and Learning Sciences 119, no. 1/2 (2018): 81, https://doi.org/10.1108/ILS-03-2017-0022; Odeh, “Use of Information Resources by Undergraduate Students and Its Relationship with Academic Achievement,” 226; Renaud et al., “Mining Library and University Data to Understand Library Use Patterns,” 366; Sue Samson, “Usage of E-Resources: Virtual Value of Demographics,” Journal of Academic Librarianship 40, no. 6 (2014): 622, https://doi.org/10.1016/j.acalib.2014.10.005; Mitchell Scott, “Interlibrary Loan Article Use and User GPA: Findings and Implications for Library Services,” Journal of Access Services 11, no. 4 (2014): 230, https://doi.org/10.1080/15367967.2014.945116; Squibb and Mikkelsen, “Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement,” 167; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 125; White and Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” 85; Wong and Cmor, “Measuring Association between Library Instruction and Graduation GPA,” 464; Shun Han Rebekah Wong and T.D. 
Webb, “Uncovering Meaningful Correlation between Student Academic Performance and Library Material Usage,” College & Research Libraries 72, no. 4 (2011): 363, https://doi.org/10.5860/crl-129.

29. Gaby Haddow, “Academic Library Use and Student Retention: A Quantitative Analysis,” Library & Information Science Research 35, no. 2 (2013): 129, https://doi.org/10.1016/j.lisr.2012.12.002; Haddow and Joseph, “Loans, Logins, and Lasting the Course,” 238; Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 638.

30. Black and Murphy, “The Out Loud Assignment,” 412; Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 6; Tiffany LeMaistre, Qingmin Shi, and Sandip Thanki, “Connecting Library Use to Student Success,” portal: Libraries and the Academy 18, no. 1 (2018): 118, https://doi.org/10.1353/pla.2018.0006; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 152; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 87; Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 363; Thorpe et al., “The Impact of the Academic Library on Student Success,” 379–82; Vance, Kirk, and Gardner, “Measuring the Impact of Library Instruction on Freshmen Success and Persistence,” 50.

31. Blake et al., “The Impact of Information Literacy Instruction on Student Success,” 14.

32. Crawford, “The Academic Library and Student Retention and Graduation,” 48; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 129.

33. Cook, “A Library Credit Course and Student Success Rates,” 274–75; Soria, Fransen, and Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” 815.

34. Stone, Pattern, and Ramsden, “Library Impact Data Project,” 25; Stone and Ramsden, “Library Impact Data Project,” 552.

35. Allison, “Measuring the Academic Impact of Libraries,” 36; Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 3; Çetin and Howard, “An Exploration of the Relationship between Undergraduate Students’ Library Book Borrowing and Academic Achievement,” 385; Cherry, Rollins, and Evans, “Proving Our Worth,” 392; Cox and Jantti, “Discovering the Impact of Library Use and Student Performance”; Davidson, Rollins, and Cherry, “Demonstrating Our Value,” 77; de Jager et al., “The Use of Academic Libraries in Turbulent Times,” 45–46; Gaha, Hinnefeld, and Pellegrino, “The Academic Library’s Contribution to Success,” 743; Goodall and Pattern, “Academic Library Non/Low Use and Undergraduate Student Achievement,” 166; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 231–32; Montenegro et al., “Library Resources and Students’ Learning Outcomes,” 554; Nackerud et al., “Analyzing Demographics,” 140; Nurse, Baker, and Gambles, “Library Resources, Student Success and the Distance-Learning University,” 84; Renaud et al., “Mining Library and University Data to Understand Library Use Patterns,” 368; Samson, “Usage of E-Resources,” 624; Scott, “Interlibrary Loan Article Use and User GPA,” 232; White and Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” 85; Wong and Cmor, “Measuring Association between Library Instruction and Graduation GPA,” 469; Wong and Webb, “Uncovering Meaningful Correlation between Student Academic Performance and Library Material Usage,” 366.

36. Allison, “Measuring the Academic Impact of Libraries,” 37; Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 5; Cherry, Rollins, and Evans, “Proving Our Worth,” 394; Cox and Jantti, “Discovering the Impact of Library Use and Student Performance”; Davidson, Rollins, and Cherry, “Demonstrating Our Value,” 77; Gaha, Hinnefeld, and Pellegrino, “The Academic Library’s Contribution to Success,” 744; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 231–32.

37. Cox and Jantti, “Discovering the Impact of Library Use and Student Performance”; White and Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” 85.

38. Bowles-Terry, “Library Instruction and Academic Success,” 88.

39. Odeh, “Use of Information Resources by Undergraduate Students and Its Relationship with Academic Achievement,” 229.

40. Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 126.

41. Wong and Cmor, “Measuring Association between Library Instruction and Graduation GPA,” 469–70.

42. Coulter, Clarke, and Scamman, “Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction,” 159.

43. Gariepy, Peacemaker, and Colon, “Stop Chasing Unicorns,” 106; Squibb and Mikkelsen, “Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement,” 173.

44. Black and Murphy, “The Out Loud Assignment,” 412; LeMaistre, Shi, and Thanki, “Connecting Library Use to Student Success,” 125–29; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 154; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 88–89; Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 369–70.

45. Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 7–8; Thorpe et al., “The Impact of the Academic Library on Student Success,” 384; Vance, Kirk, and Gardner, “Measuring the Impact of Library Instruction on Freshmen Success and Persistence,” 54–56.

46. Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 8.

47. Thorpe et al., “The Impact of the Academic Library on Student Success,” 384.

48. Vance, Kirk, and Gardner, “Measuring the Impact of Library Instruction on Freshmen Success and Persistence,” 56.

49. Haddow, “Academic Library Use and Student Retention,” 131; Haddow and Joseph, “Loans, Logins, and Lasting the Course,” 240; Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 639.

50. Blake et al., “The Impact of Information Literacy Instruction on Student Success,” 18; Cook, “A Library Credit Course and Student Success Rates,” 276; Crawford, “The Academic Library and Student Retention and Graduation,” 53; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 143; Soria, Fransen, and Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” 817.

51. Stone, Pattern, and Ramsden, “Library Impact Data Project,” 25.

52. Cook, “A Library Credit Course and Student Success Rates,” 276; Soria, Fransen, and Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” 817–19.

53. Crawford, “The Academic Library and Student Retention and Graduation,” 55; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 145.

54. Blake et al., “The Impact of Information Literacy Instruction on Student Success,” 18.

55. Stone, Pattern, and Ramsden, “Library Impact Data Project,” 26.

56. Allison, “Measuring the Academic Impact of Libraries,” 37; Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 3; Black and Murphy, “The Out Loud Assignment,” 412; Blake et al., “The Impact of Information Literacy Instruction on Student Success,” 18; Çetin and Howard, “An Exploration of the Relationship between Undergraduate Students’ Library Book Borrowing and Academic Achievement,” 385; Cherry, Rollins, and Evans, “Proving Our Worth,” 392; Cox and Jantti, “Discovering the Impact of Library Use and Student Performance”; Crawford, “The Academic Library and Student Retention and Graduation,” 53; Davidson, Rollins, and Cherry, “Demonstrating Our Value,” 77; de Jager et al., “The Use of Academic Libraries in Turbulent Times,” 45–46; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 145; Gaha, Hinnefeld, and Pellegrino, “The Academic Library’s Contribution to Success,” 743; Goodall and Pattern, “Academic Library Non/Low Use and Undergraduate Student Achievement,” 166; Haddow, “Academic Library Use and Student Retention,” 131; Haddow and Joseph, “Loans, Logins, and Lasting the Course,” 240; LeMaistre, Shi, and Thanki, “Connecting Library Use to Student Success,” 125–29; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 231–32; Montenegro et al., “Library Resources and Students’ Learning Outcomes,” 555; Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 639; Nackerud et al., “Analyzing Demographics,” 140; Nurse, Baker, and Gambles, “Library Resources, Student Success and the Distance-Learning University,” 84; Renaud et al., “Mining Library and University Data to Understand Library Use Patterns,” 368; Samson, “Usage of E-Resources,” 624; Scott, “Interlibrary Loan Article Use and User GPA,” 233; Soria, Fransen, and Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” 817; Soria, Fransen, and 
Nackerud, “Library Use and Undergraduate Student Outcomes,” 154; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 88–89; Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 369–70; White and Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” 85; Wong and Cmor, “Measuring Association between Library Instruction and Graduation GPA,” 469; Wong and Webb, “Uncovering Meaningful Correlation between Student Academic Performance and Library Material Usage,” 366.

57. Bowles-Terry, “Library Instruction and Academic Success,” 88; Kot and Jones, “The Impact of Library Resource Utilization on Undergraduate Students’ Academic Performance,” 582–83; Coulter, Clarke, and Scamman, “Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction,” 159; Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 7–8; Odeh, “Use of Information Resources by Undergraduate Students and Its Relationship with Academic Achievement,” 229; Stone, Pattern, and Ramsden, “Library Impact Data Project,” 26; Stone and Ramsden, “Library Impact Data Project,” 554; Thorpe et al., “The Impact of the Academic Library on Student Success,” 384; Vance, Kirk, and Gardner, “Measuring the Impact of Library Instruction on Freshmen Success and Persistence,” 56; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 125–26.

58. Gariepy, Peacemaker, and Colon, “Stop Chasing Unicorns,” 106; Squibb and Mikkelsen, “Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement,” 175.

59. Black and Murphy, “The Out Loud Assignment,” 413; Coulter, Clarke, and Scamman, “Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction,” 159–60; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 145–46; Gariepy, Peacemaker, and Colon, “Stop Chasing Unicorns,” 106–07.

60. Allison, “Measuring the Academic Impact of Libraries,” 39; Çetin and Howard, “An Exploration of the Relationship between Undergraduate Students’ Library Book Borrowing and Academic Achievement,” 387; Cook, “A Library Credit Course and Student Success Rates,” 282; Coulter, Clarke, and Scamman, “Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction,” 162; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 146; Nurse, Baker, and Gambles, “Library Resources, Student Success and the Distance-Learning University,” 85; Renaud et al., “Mining Library and University Data to Understand Library Use Patterns,” 370; Soria, Fransen, and Nackerud, “Beyond Books,” 21; Vance, Kirk, and Gardner, “Measuring the Impact of Library Instruction on Freshmen Success and Persistence,” 58; Wells, “The Influence of Library Usage on Undergraduate Academic Success,” 128; Wong and Cmor, “Measuring Association between Library Instruction and Graduation GPA,” 472.

61. Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 5; Çetin and Howard, “An Exploration of the Relationship between Undergraduate Students’ Library Book Borrowing and Academic Achievement,” 387.

62. Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 5; Crawford, “The Academic Library and Student Retention and Graduation,” 56; Emmons and Wilkinson, “The Academic Library Impact on Student Persistence,” 146.

63. Blake et al., “The Impact of Information Literacy Instruction on Student Success,” 18; Goodall and Pattern, “Academic Library Non/Low Use and Undergraduate Student Achievement,” 167–68; Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 11; LeMaistre, Shi, and Thanki, “Connecting Library Use to Student Success,” 137; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 234.

64. Gariepy, Peacemaker, and Colon, “Stop Chasing Unicorns,” 107.

65. Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 11; Squibb and Mikkelsen, “Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement,” 176.

66. Bowles-Terry, “Library Instruction and Academic Success,” 91; Cherry, Rollins, and Evans, “Proving Our Worth,” 395–96; Collins and Stone, “Understanding Patterns of Library Use among Undergraduate Students from Different Disciplines,” 63; Davidson, Rollins, and Cherry, “Demonstrating Our Value,” 78; Goodall and Pattern, “Academic Library Non/Low Use and Undergraduate Student Achievement,” 168; Haddow, “Academic Library Use and Student Retention,” 135; Haddow and Joseph, “Loans, Logins, and Lasting the Course,” 241–42; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 234; Murray, Ireland, and Hackathorn, “The Value of Academic Libraries,” 641; Nackerud et al., “Analyzing Demographics,” 143; Samson, “Usage of E-Resources,” 624; Scott, “Interlibrary Loan Article Use and User GPA,” 234; Soria, Fransen, and Nackerud, “The Impact of Academic Library Resources on Undergraduates’ Degree Completion,” 821; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 90–91; Stemmer and Mahan, “Investigating the Relationship of Library Usage to Student Outcomes,” 372; Thorpe et al., “The Impact of the Academic Library on Student Success,” 388; White and Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” 89.

67. Asher, Evaluating the Effect of Course-Specific Library Instruction on Student Success, 6; Black and Murphy, “The Out Loud Assignment,” 415; Crawford, “The Academic Library and Student Retention and Graduation,” 55; Davidson, Rollins, and Cherry, “Demonstrating Our Value,” 78; de Jager et al., “The Use of Academic Libraries in Turbulent Times,” 50–51; Gaha, Hinnefeld, and Pellegrino, “The Academic Library’s Contribution to Success,” 745; Kot and Jones, “The Impact of Library Resource Utilization on Undergraduate Students’ Academic Performance,” 583; Krieb, “Assessing the Impact of Reference Assistance and Library Instruction on Retention and Grades Using Student Tracking Technology,” 11; LeMaistre, Shi, and Thanki, “Connecting Library Use to Student Success,” 137; Massengale, Piotrowski, and Savage, “Identifying and Articulating Library Connections to Student Success,” 234; Montenegro et al., “Library Resources and Students’ Learning Outcomes,” 556; Nackerud et al., “Analyzing Demographics,” 143; Odeh, “Use of Information Resources by Undergraduate Students and Its Relationship with Academic Achievement,” 231; Samson, “Usage of E-Resources,” 624; Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 161; Soria, Fransen, and Nackerud, “Stacks, Serials, Search Engines and Students’ Success,” 90–91; Squibb and Mikkelsen, “Assessing the Value of Course-Embedded Information Literacy on Student Learning and Achievement,” 177; Stone and Ramsden, “Library Impact Data Project,” 557; Thorpe et al., “The Impact of the Academic Library on Student Success,” 388; White and Stone, “Maximizing Use of Library Resources at the University of Huddersfield,” 89–90; Wong and Webb, “Uncovering Meaningful Correlation between Student Academic Performance and Library Material Usage,” 368.

68. Cox and Jantti, “Discovering the Impact of Library Use and Student Performance.”

69. Megan Oakleaf et al., “Academic Libraries & Institutional Learning Analytics: One Path to Integration,” Journal of Academic Librarianship 43, no. 5 (2017): 454–61, https://doi.org/10.1016/j.acalib.2017.08.008.

70. Megan Oakleaf, “The Problems and Promise of Learning Analytics for Increasing and Demonstrating Library Value and Impact,” Information and Learning Sciences 119, no. 1/2 (2018): 16–24, https://doi.org/10.1108/ILS-08-2017-0080; Megan Oakleaf, Scott Walter, and Malcolm Brown, “The Academic Library and the Promise of NGDLE,” Educause Review (Aug. 14, 2017), available online at http://er.educause.edu/articles/2017/8/the-academic-library-and-the-promise-of-ngdle [accessed 12 September 2019].

71. Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 162.

72. UCF Online, “A Top 15 Institution,” available online at https://www.ucf.edu/online/ [accessed 12 July 2019].

73. UCF Institutional Knowledge Management, “Interactive Facts: SCH & FTE” (web form requires login).

74. Gene Kruckmeyer, “UCF Recognized for Programs Benefitting Transfer Students,” UCF Today (Nov. 1, 2018), available online at https://www.ucf.edu/news/ucf-recognized-programs-benefiting-transition-students/ [accessed 12 July 2019].

75. Alina Tugend, “Colleges and Universities Woo Once-Overlooked Transfer Students,” The New York Times (Aug. 2, 2018), available online at https://www.nytimes.com/2018/08/02/education/learning/transfer-students-colleges-universities.html [accessed 10 July 2019].

76. Holt Zaugg and Scott Rackham, “Identification and Development of Patron Personas for an Academic Library,” Performance Measurement and Metrics 17, no. 2 (2016): 124–33.

77. Andrew Asher et al., “Mapping Student Days: Collaborative Ethnography and the Student Experience,” Collaborative Librarianship 9, no. 4 (2017), available online at https://digitalcommons.du.edu/collaborativelibrarianship/vol9/iss4/7 [accessed 10 October 2019].

78. Soria, Fransen, and Nackerud, “Library Use and Undergraduate Student Outcomes,” 160.

79. Oakleaf, Walter, and Brown, “The Academic Library and the Promise of NGDLE.”

80. Wolff-Eisenberg, Ithaka S+R US Library Survey 2016.

* Penny Beile is Associate Director of Research, Education, & Engagement in the UCF Libraries at the University of Central Florida; email: pbeile@ucf.edu. Kanak Choudhury is a PhD Candidate and Teaching Assistant in the Department of Statistics at Iowa State University; email: kanakc@iastate.edu. Rachel Mulvihill is Head of UCF Downtown Library at the University of Central Florida; email: Rachel.mulvihill@ucf.edu. Morgan Wang is Professor and Director of Data Mining, Statistics and Data Science at the University of Central Florida; email: Chung-ching.wang@ucf.edu. For their many contributions and assistance, we thank Meghal Parikh, UCF Institutional Knowledge Management (now Director of Institutional Analytics at Rollins College, Orlando, FL); Terri Gotschall, UCF College of Medicine Library; Megan Haught, UCF Libraries; and Joel Lavoie, UCF Information Technology. ©2020 Penny Beile, Kanak Choudhury, Rachel Mulvihill, and Morgan Wang, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.
