
Effectiveness of Academic Library Research Guides for Building College Students’ Information Literacy Skills: A Scoping Review

Academic librarians invest significant time and effort in developing and maintaining research guides, yet the extent to which these tools effectively support college students’ information literacy development remains uncertain. This scoping review aimed to comprehensively examine the existing literature on the effectiveness of academic library research guides in building students’ information literacy skills. Following a rigorous screening process of 1,724 publications, 61 studies met the inclusion criteria for analysis. The review reveals that much of the research in this area stems from usability studies and exploratory single-site case studies, many of which are characterized by limited methodological transparency and a lack of clearly defined outcomes related to student learning. These findings highlight both the growing interest in evaluating research guides and the need for more robust, outcome-based research that directly examines their impact on information literacy. This review provides a foundation for future studies that seek to assess and improve the pedagogical value of research guides in academic settings.

Introduction

The overwhelming information landscape has presented myriad challenges for society; information overload and increased exposure to mis- and disinformation have made it more important than ever to ensure that universities equip students with information literacy (IL) skills. Ensuring students’ information literacy has been a longtime concern for academic librarians; however, the need to develop effective IL practices and programs has become increasingly important due to a number of factors, including the damaging persistence of anti-intellectualism (Stewart, 2022); students’ rapid evaluative heuristics, which often fail to detect misleading and false information (Wineburg et al., 2022); and increased pressure from employers to align new graduates’ critical thinking abilities with workplace and workforce expectations (Head & Eisenberg, 2010; Taylor et al., 2022).

Over the past few decades, academic library research guides have become one of the most widely adopted tools through which librarians and other information professionals strive to teach students to navigate, select, locate, and use relevant sources of information for their academic and learning needs (Gardois et al., 2012; Hemmig, 2005; Hennesy & Adams, 2021). Variously referred to as pathfinders, finding aids, subject guides, course guides, and topic guides (henceforth, guides), these tools are typically created for “a subject area, a type of user, a tool, or a class and contain links, videos, and handouts that are intended to help a user access a resource or learn something” (German et al., 2017, p. 162). Born from traditional bibliographic approaches to compiling information, in which librarians presented carefully curated topical collections to guide researchers (Dunsmore, 2002), the first guides were viewed as efforts toward scaling reference services, as “the librarian cannot always help and is not always asked” (Harbeson, 1972, p. 111). Today’s guides continue to promise this scalability of researcher support to an ostensibly global audience. In addition to their potential to educate en masse, numerous presumed benefits have helped to drive and sustain this approach, including beliefs that guides attract a user base largely reluctant to seek help from librarians; that they train students in fundamental information-seeking skills and help introduce them to navigating academic libraries; and that they provide training in engaging with scholarly resources (Jackson & Stacy-Bates, 2016). Additionally, guides are considered an efficient and practical means of collaborating with instructors and incorporating IL into a course that might already be full of content (Kline et al., 2017).

Historically, research guides have enjoyed widespread acceptance as beneficial to learning (Dalton & Pan, 2014). Early proponents lauded their ability to teach information-seeking strategies and support disciplinary research practices, emphasizing the “immediate feedback” provided in real-world searches (Harbeson, 1972, p. 113). Despite this long-held belief in their effectiveness, critical research examining their actual impact has lagged significantly. While extensive best-practices literature exists on guide design (Goodsett, 2020), these recommendations lack strong underpinnings from actual research on student use. In 2005, Hemmig described a “continuity of pathfinder theory” upholding consistent design and evaluation criteria but could find “no published studies of actual research guide use, using actual research guide users” (p. 84).

This disconnect between long-held assumptions about guide effectiveness and the limited research available calls for a more critical approach to understanding how students interact with research guides and how these interactions impact their learning. Without a comprehensive overview of guide effectiveness studies, assertions surrounding best practices cannot be validated, as there is little to no consensus about content, audience, user engagement, placement, or the effectiveness of these guides for meeting established IL learning outcomes (Hemmig, 2005; J. Lee et al., 2021; Paschke-Wood et al., 2020). As we were unable to locate any other published or in-progress reviews on the effectiveness of guides for learning, the aim of this scoping review was to provide a comprehensive overview of the study design characteristics and the evaluation and assessment methods used, along with a summary of findings regarding the effectiveness of guides in developing or improving the IL skills of college students. Our review was guided by the following research questions: (1) What are the IL-related learning outcomes that are associated with guides? (2) How are guides evaluated or assessed? and (3) What does the existing evidence say regarding their effectiveness at developing or improving the IL skills of college students?

Methods

This scoping review adheres to the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) (Tricco et al., 2018). Following the a priori protocol development guidance from members of the JBI Scoping Review Methodology Group (Peters et al., 2022), we preregistered our review protocol on November 3, 2022, with the Open Science Framework (DeFrain et al., 2022). In our review, we adhered to Arksey and O’Malley’s (2005) five-stage framework for conducting a scoping study: research question identification; collection of relevant studies; study selection; data charting; and summarizing results.

Eligibility Criteria

The full inclusion and exclusion criteria (see Appendix A) were structured around the PICOS (Population, Intervention, Comparison, Outcomes, and Study Characteristics) framework (Thomas et al., 2023). Studies were eligible for inclusion if they were guided by an explicit or implied research question regarding the effectiveness of guides for developing college students’ IL. Our definition of research was intentionally broad and inclusive: with no expectation that guides be examined in clinical or controlled environments, we sought to consider the full spectrum of “real-world practice” approaches characteristic of learning effectiveness studies (Singal et al., 2014, p. 1). Therefore, we considered any study whose authors presented the work as research or assessment. Our definition of IL was similarly broad. As we were interested in understanding the role that guides play in student learning, rather than any specific model of IL associated with a given set of guides, we included conceptualizations of IL that were current or historic; individually, institutionally, or professionally generated; and locally or globally defined.

The study population must have included college students, and studies must have gathered empirical data from or about this population as part of their assessment of research guide effectiveness. No publication date limiters were used, as pedagogical interest in and critiques of library guides go back decades (Vileno, 2007), and the purpose of guides as providing introductory academic research training has been a historically consistent objective (Dalton & Pan, 2014). Although the scalability of online dissemination can remove barriers to access, whether content is delivered physically or virtually does not inherently alter its effectiveness for learning (Bowen, 2014); therefore, we included studies of both online and print-based guides. We did not actively limit results to any language; however, the publications indexed within the included databases are predominantly written in English, and, as we explain later, we ultimately decided to exclude the few non-English-language studies found due to our own language limitations.

Information Sources

We searched five scholarly databases for comprehensive coverage and broad disciplinary representation: Academic Search Premier (EBSCO, multidisciplinary); APA PsycINFO (EBSCO, psychological and behavioral sciences); ERIC (ProQuest, educational research); LISTA (EBSCO, library and information science); and Web of Science Social Sciences Citation Index (Clarivate). We searched three additional databases to capture relevant grey literature and in-progress works: Dissertations & Theses Abstracts & Indexes (ProQuest); EdArXiv; and LIS Scholarship Archive (LISSA). Full electronic search strategies for each of the included databases can be viewed in the preregistered protocol (DeFrain et al., 2022). The first search was conducted on January 4, 2023, and rerun on January 12, 2024.

Selection of Sources

All citations were imported into Zotero, and citation metadata were manually checked by a student research assistant for accuracy and completeness. Duplicates were automatically removed upon import into Covidence systematic review software, with an additional 19 removed manually during subsequent screening stages.

Two screeners worked in duplicate during both the title-and-abstract and full-text review stages, applying the predetermined inclusion and exclusion criteria. Disagreements or discrepancies between screeners were resolved by discussion with the full research team. Once the initial corpus of literature was reviewed, the citations of included studies were scanned for additional literature that may not have been captured in the initial searches. Although this snowball search practice has been critiqued as a possible source of introduced bias (Vassar et al., 2016), when conducted carefully, hand searching can still be a valuable method for locating literature outside a review’s named databases (Craane et al., 2012). This hand search yielded an additional 65 possible studies after duplicates were removed. These studies were then screened using the same multi-stage review techniques with two independent reviewers, adding a total of 12 studies to the final data extraction stage.
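Covidence’s internal matching logic is not documented here; purely as an illustration of how automated duplicate detection of this kind can work, the following is a minimal sketch that collapses citations sharing a normalized title and year. The record structure and matching rule are our assumptions, not Covidence’s actual algorithm.

```python
import re

def normalize(title: str) -> str:
    """Lowercase a title and collapse punctuation/whitespace so that
    trivially different metadata for the same work still match."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each (normalized title, year) key."""
    seen = set()
    unique = []
    for rec in records:
        key = (normalize(rec["title"]), str(rec.get("year", "")))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Two metadata variants of the same (hypothetical) citation collapse to one record.
citations = [
    {"title": "Was This Guide Helpful?", "year": 2005},
    {"title": "Was this guide helpful", "year": 2005},
]
assert len(deduplicate(citations)) == 1
```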

Data Charting Process

Through several iterations, we developed a data charting table in Covidence to gather study characteristics aligned with our original research questions. We used the Template for Intervention Description and Replication (TIDieR) checklist to improve completeness in the reporting of interventions in research studies (Hoffmann et al., 2014). Table 1 presents our approach to data charting and the features we considered necessary for identifying, summarizing, and mapping the outcomes, evidence, and effectiveness findings from the entire body of literature analyzed in this review. Two independent screeners charted study characteristics for each item meeting the inclusion criteria, and we worked as a team to resolve discrepancies.

Table 1

Explanation of Data Charting Process Aligned with Research Questions

| Research question | Field | Definition | Field input options |
| --- | --- | --- | --- |
| What are the IL-related learning outcomes associated with research guides? | Study purpose | Overall goal or reason for the study or publication | Open text |
| | Theory or framework | Knowledge systems or beliefs held by authors that assumed the validity of their study | Open text |
| | Outcomes measured | IL-related behaviors, attitudes, goals measured by authors | Open text |
| How are research guides evaluated or assessed? | Study location | Country where study was conducted | Open text |
| | Investigatory foci | Subject of study associating guides with learning | Usability; usage; satisfaction; utility; evidence of learning |
| | Guide integration | Type of guide and its use as intervention / within educational setting | Subject guide; course guide; embedded into LMS; supplemental to library instruction; print-based; other |
| | (N) Population | Study sample / participant characteristics | Open text |
| | Data sources | Data gathered or provided as evidence; marked if used as pre/post | Survey; web stats; test performance; usability testing; assignment performance; interviews; citation analysis; focus group; content analysis; other |
| | Study funding | Grants, awards, or internal funds supporting study | Yes; No; N/A |
| What does the existing evidence say regarding their effectiveness at developing or improving IL skills of college students? | Findings | Directionality of findings re. learning effectiveness | Positive; neutral; negative; mixed |
| | Explanation | Authors’ explanation of findings | Open text |
| | Limitations | Study weaknesses per study authors | Open text |
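To make the charting scheme concrete, one charted study could be represented as a single structured record. The following is a minimal sketch mirroring the fields in Table 1; the field names, types, and defaults are our own illustration, not the actual Covidence extraction form.

```python
from dataclasses import dataclass, field

@dataclass
class ChartingRecord:
    """One charted study, mirroring the Table 1 fields (illustrative only)."""
    study_purpose: str        # open text: overall goal of the study
    theory_or_framework: str  # open text: guiding knowledge systems or beliefs
    outcomes_measured: str    # open text: IL-related behaviors, attitudes, goals
    study_location: str       # country where the study was conducted
    # Any of: usability, usage, satisfaction, utility, evidence of learning
    investigatory_foci: list[str] = field(default_factory=list)
    # e.g., subject guide, course guide, embedded into LMS, print-based
    guide_integration: list[str] = field(default_factory=list)
    population: str = ""      # sample / participant characteristics
    data_sources: list[str] = field(default_factory=list)
    study_funding: str = "N/A"    # "Yes" / "No" / "N/A"
    findings_direction: str = ""  # positive / neutral / negative / mixed
    explanation: str = ""         # authors' explanation of findings
    limitations: str = ""         # study weaknesses per study authors
```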

Summary of Results

We followed a narrative review approach to describing and summarizing the body of studies in this review (Arksey & O’Malley, 2005). By gathering standard information from each individual study in a uniform way, we were able to identify dominant practices, novel approaches, and significant gaps. Our summary also includes basic numerical distributions of the studies aligned with our original research questions.

Critical Appraisal

As this scoping review sought to identify and compile the entire body of guide evaluation literature, we did not critically appraise individual sources of evidence for methodological rigor, nor did we evaluate their claims. Accordingly, the effectiveness findings reported by study authors should not be assumed to constitute valid evidence of the overall effectiveness of guides for learning.

Results

The PRISMA flow diagram (see Figure 1) illustrates the search results and study selection process for each stage of screening.

Figure 1

PRISMA Flow Diagram

A total of 1,724 records were located through database and hand citation searching, 563 of which were identified as duplicates and removed. The review team screened the titles and abstracts of 1,161 records, excluding 934 as irrelevant. During full-text screening, the study team sought 227 publications for consideration, although it was not able to retrieve the full text for two articles. The team excluded an additional 164 studies during this stage, including 69 removed because no relevant research questions were expressed and another 64 deemed non-research. A total of 61 studies met the criteria for inclusion in the review (see Appendix B).
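As a quick consistency check on these counts, the following sketch simply reproduces the screening arithmetic reported above:

```python
# Reproduce the screening arithmetic reported in the paragraph above.
records_identified = 1_724
duplicates_removed = 563
screened = records_identified - duplicates_removed        # 1,161 titles/abstracts
excluded_on_title_abstract = 934
sought_full_text = screened - excluded_on_title_abstract  # 227 publications sought
not_retrieved = 2
assessed_full_text = sought_full_text - not_retrieved     # 225 assessed in full
excluded_full_text = 164
included = assessed_full_text - excluded_full_text        # 61 studies included

assert (screened, sought_full_text, included) == (1_161, 227, 61)
```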

Table 2

Publication Characteristics of Included Studies (N = 61)

| Characteristic | Category | n | % |
| --- | --- | --- | --- |
| Publication decade | 1970s | 1 | 1.6% |
| | 1980s | 2 | 3.3% |
| | 1990s | 1 | 1.6% |
| | 2000s | 7 | 11.5% |
| | 2010s | 40 | 65.6% |
| | 2020–January 2024 | 10 | 16.4% |
| Publication type | Journal article | 58 | 95.1% |
| | Encyclopedia | 1 | 1.6% |
| | Report | 1 | 1.6% |
| | Thesis or dissertation | 1 | 1.6% |
| Study location | Canada | 5 | 8.2% |
| | Ireland | 1 | 1.6% |
| | South Africa | 3 | 4.9% |
| | Tanzania | 1 | 1.6% |
| | United States | 51 | 83.6% |
| | N/A | 1 | 1.6% |
| Funding | Yes | 4 | 6.6% |
| | N/A | 57 | 93.4% |

Characteristics of Sources of Evidence

As shown in Table 2, the studies in this review were published between 1977 and 2023, with the first investigation of guides’ helpfulness to their users reported within the entry “Pathfinders, Library” in the Encyclopedia of Library and Information Science (Gardner, 1977). Most of the studies located were published since 2010, conducted in the United States, and published as journal articles. Only four studies attributed any source of funding in support of the research.

Study Purpose

Thirteen (21.3%) of the studies were conducted specifically to investigate guides as tools for learning (Bisalski et al., 2017; Bowen, 2014; Greenwell, 2016; Hansen, 2014; Hsieh et al., 2014; Lauseng et al., 2021; L. Lee et al., 2003; Y. Y. Lee & Lowe, 2018; Magi, 2003; Miner & Alexander, 2010; Paul et al., 2020; Pickens-French & Mcdonald, 2013; Rothstein, 1989; Stone et al., 2018). In most studies, however, research into the learning effectiveness of guides was a smaller component of a larger investigation. Several studies in this larger group focused more broadly on the use and perceptions of guides as one element contributing to the overall value of the library and its services to its users (D. Becker et al., 2017; D. A. Becker et al., 2022; Bowen, 2012; Brewer et al., 2017; Carey et al., 2020; Chiware, 2014; Gerrish & Martin, 2023; Li, 2016; Mubofu & Malekani, 2021; Mussell & Croft, 2013; Tang & Tseng, 2014; Tomlin et al., 2017). Much of the remaining research focused more generally on the creation, use, usability, satisfaction, and preferences for guide design as a means of identifying and justifying guides’ value as tools for learning.

Guiding Theories and Frameworks

Despite the importance of contextualizing and structuring research according to a methodological foundation, thirteen (21.3%) of the studies did not explicitly situate their examinations within any identifiable theory or guiding framework (Almeida & Tidal, 2017; Archer et al., 2009; Barker & Hoffman, 2021; D. Becker et al., 2017; Carey et al., 2020; Daly, 2010; Hsieh et al., 2014; Lauseng et al., 2021; Pickens-French & Mcdonald, 2013; Rafferty, 2013; Rothstein, 1989; Stone et al., 2018; Wharton & Pritchard, 2020). Though IL and other library-generated professional standards are central to evaluating the effectiveness of library guides as learning tools, only seven (11.5%) of the studies explicitly discussed such disciplinarily derived frameworks (D. A. Becker et al., 2022; Bowen, 2012; Gilman et al., 2017; Y. Y. Lee & Lowe, 2018; Little, 2010; Mubofu & Malekani, 2021; Scoulas, 2021). Of the studies published after the 2016 release of the ACRL Information Literacy Framework, only one (Y. Y. Lee & Lowe, 2018) discussed how the Framework was used to shape and inform the study.

Several theories and frameworks external to library science were also referenced, echoing Lee and Lowe’s (2018) drawing upon “decades of research on how students learn and impediments to learning … [especially] cognitive load theory, how students learn new ideas, and impediments to learning, specifically research anxiety” (p. 207). Eight studies (Bowen et al., 2018; Fagerheim et al., 2017; Gibbons, 2003; Lierman et al., 2019; Miles & Bergstrom, 2009; Mussell & Croft, 2013; Slemons, 2013; Thorngate & Hoden, 2017) framed their investigations around use and usability. This was seen in Thorngate and Hoden (2017), who wrote, “If these guides are to support student learning well, it is critical that they provide an effective user experience” (p. 844). Several referenced constructivist theories (Bowen et al., 2018; Brewer et al., 2017; Hansen, 2014); three considered student mental models (Y. Y. Lee & Lowe, 2018; Leighton & May, 2013; Sinkinson et al., 2012); two applied the Technology Acceptance Model (D. A. Becker et al., 2022; Sharrar, 2017); and six were informed by cognitive load theory (Baker, 2014; Bowen et al., 2018; Y. Y. Lee & Lowe, 2018; Metter & Willis, 1993; Miner & Alexander, 2010; Paul et al., 2020).

Outcomes Measured

Most of the studies measured outcomes regarding student satisfaction, preferences, engagement, and other affective states. Fifty-four (88.5%) of the 61 total studies measured such outcomes, 48 of which focused solely on these affective outcomes. Forty-one (67.2%) included a question asking students whether they found guides helpful to their research needs. Fourteen (23.0%) studies explored knowledge and skills more directly related to IL outcomes (Archer et al., 2009; Bisalski et al., 2017; Bowen, 2014; Bowen et al., 2018; Greenwell, 2016; Hansen, 2014; Hsieh et al., 2014; Lauseng et al., 2021; L. Lee et al., 2003; Y. Y. Lee & Lowe, 2018; Miner & Alexander, 2010; Rafferty, 2013; Soskin & Eldblom, 1984; Stone et al., 2018). These studies generally sought to associate guide use with test performance and course grades, where outcomes included students’ ability to find and use primary resources (Archer et al., 2009), students’ self-reported skills on an exam (Bisalski et al., 2017), and knowledge checks testing students’ advanced search techniques, such as understanding of Boolean searching (Bowen, 2014; Greenwell, 2016; Hsieh et al., 2014; Lauseng et al., 2021; L. Lee et al., 2003; Soskin & Eldblom, 1984).

At least one study reported challenges in setting measurable outcomes. Archer et al. (2009) began their study as an evaluation of a guide’s effectiveness for developing primary source research skills but ultimately shifted course when they struggled to operationalize relevant learning outcomes: “As we interacted with the students and analyzed the results over the following months, it became clear that the most important outcome of the study was not so much what it told us about the effectiveness of the guide but rather how it helped clarify our understanding of what constitutes primary source literacy” (p. 411).

Investigatory Foci

We found that guide investigations could be characterized according to five central foci: usability (can students use the guides?); usage (do students use the guides?); satisfaction (do students like the guides?); utility (do students consider the guides useful?); and evidence of learning (are the guides effective tools for learning?). Though the latter two categories are most explicitly relevant to the scope of this review, the preceding foci were included when study authors directly tied their approaches to findings associated with learning effectiveness. For example, Almeida and Tidal (2017) equated usability with learning by explicitly connecting “design features with cognitive practices” (p. 64); Barker and Hoffman (2021) concluded their review of the literature on usability studies by stating, “How well students are able to use guides has a direct impact on their ability to learn” (p. 76); Smith (2007) suggested his meta-assessment model made it possible to associate web usage statistics with student learning engagement, stating, “Ideally, it would be nice if everyone became fully engaged in each guide’s content each time they visited, but the analysis model is still applicable even if they do not” (p. 91); and Hansen (2014) called students’ perceptions “vital for developing [guides] into a successful learning tool” (p. 16).

Fourteen (23.0%) of the studies had a singular focus (Baker, 2014; Barker & Hoffman, 2021; Cobus-Kuo et al., 2013; Courtois et al., 2005; Dotson, 2021; Griffin & Taylor, 2018; Hsieh et al., 2014; Lierman et al., 2019; Miles & Bergstrom, 2009; Miller, 2014; Rafferty, 2013; Slemons, 2013; Soskin & Eldblom, 1984; Thorngate & Hoden, 2017), while the remainder employed two or more, including one that integrated all five (Bowen, 2014). Investigations focusing on guide usage were the most common (n = 37), followed by utility (n = 35), satisfaction (n = 31), usability (n = 17), and evidence of learning (n = 15).

Though the mixing of investigatory foci is frequent throughout the included studies, not all areas of study are valued by all authors, and skepticism toward other approaches is common. Griffin and Taylor (2018), for example, seem to argue against the controlled environment of usability studies in favor of gathering analytics data to understand “actual user patterns rather than idealized or hypothetical users” (p. 157). Similarly, Lee and Lowe (2018) criticized usability studies of guides as gauging only a student’s ability to navigate while ignoring learning, writing:

students can apply filters in databases for scholarly sources by checking a box without knowing what a scholarly source is … the findings of this study demonstrate that database navigability alone is not sufficient to improve students’ learning experience as well as their interaction with the guide and resources linked from the guide (p. 223).

Library Guide Educational Integrations

Throughout the studies we reviewed, guides were introduced into educational settings in several ways. Most studies investigated guides created and delivered as online subject or course guides. Only five studies considered students’ use of print-based guides, two of which (Magi, 2003; Mahaffy, 2012) looked at differences between the two formats. The use of guides to supplement library instruction was examined by several researchers (Archer et al., 2009; Hansen, 2014; Hsieh et al., 2014; L. Lee et al., 2003; Leighton & May, 2013; Magi, 2003; Miller, 2014; Olshausen, 2018; Rafferty, 2013; Soskin & Eldblom, 1984; Wharton & Pritchard, 2020). Soskin and Eldblom (1984) conducted a study to determine the effectiveness of a “Guide to Writing the Term Paper” sheet that was designed to “partially fulfill the bibliographic instructional objective [of helping] students locate sufficient quality information on their industries” (p. 13). After concluding from their literature search that embedded guides were more likely to be used, Leighton and May (2013) developed a survey instrument to determine the helpfulness of a guide created to support students in a business class.

In tandem with research into the effectiveness of guides as supplements to instruction, many researchers devoted time to assessing how the placement of guides impacts students’ learning and use of library resources. Several studies (Daly, 2010; Dotson, 2021; Gibbons, 2003; Gilman et al., 2017; Murphy & Black, 2013; Pickens-French & Mcdonald, 2013; Wharton & Pritchard, 2020) explored the function and effectiveness of guides embedded into campus learning management systems. In response to survey results suggesting library resources were underused, Duke University librarians looked to embedding guides into the campus learning management system in part because it “was obvious to librarians that students enrolled in courses with a research component could benefit from increased collaboration with librarians” (Daly, 2010, p. 209). In another study, Bowen (2012) used student survey responses to argue that placing guides within the campus learning management system makes connections that “include improved learning and quality-of-research benefits to students, higher quality coursework turned in to instructors, and a maximized return on the investments a university makes in its library resources and its LMS” (p. 461).

Participants and Populations

Sample characteristics, including sample size, age, gender, and other demographic details of the participating populations, were inconsistently documented. Most studies offered only that their data came from “students,” or perhaps a mix of groups, such as undergraduates, graduates, and distance students. Fifteen studies involved students enrolled in specific courses or programs (Baker, 2014; Brewer et al., 2017; Chiware, 2014; Hansen, 2014; Hsieh et al., 2014; L. Lee et al., 2003; Leighton & May, 2013; Magi, 2003; Miller, 2014; Miner & Alexander, 2010; Mussell & Croft, 2013; Rafferty, 2013; Soskin & Eldblom, 1984; Stone et al., 2018; Tang & Tseng, 2014). Additional demographic characteristics were equally underreported. Eight studies (D. A. Becker et al., 2022; Bisalski et al., 2017; Bowen, 2014; Carey et al., 2020; Greenwell, 2016; Mussell & Croft, 2013; Scoulas, 2021; Soskin & Eldblom, 1984) offered details on the gender makeup of their participants, and two offered sample information regarding race or ethnicity (Carey et al., 2020; Scoulas, 2021). Several others purposely opted not to gather such details, deeming them irrelevant (Hansen, 2014; Lauseng et al., 2021; Y. Y. Lee & Lowe, 2018), and one did not summarize sample demographics despite gathering them via its survey (Thorngate & Hoden, 2017).

When sample sizes were provided, they ranged from five to 1,303; smaller samples more often came from usability and qualitative studies involving interviews or focus groups, while larger samples captured data from student surveys. Eight of the 61 studies did not include any details on the number of participants; however, four of those were examinations of website traffic in which the populations were more generally associated with the college student population at large (Dotson, 2021; Griffin & Taylor, 2018; Slemons, 2013; Smith, 2007).

Table 3

Data Sources Identified in Library Guide Effectiveness Studies

| Data source | Total studies | Single data source | Pre/post |
| --- | --- | --- | --- |
| Survey | 40 | 16 | 5 |
| Website traffic | 22 | 5 | 1 |
| Test performance | 17 | 2 | 5 |
| Usability testing | 10 | 5 | 1 |
| Assignment performance | 7 | 0 | 1 |
| Interviews | 6 | 1 | 0 |
| Citation analysis | 4 | 1 | 0 |
| Focus group | 4 | 0 | 0 |
| Content analysis | 1 | 0 | 0 |
| Total | 111 | 30 | 13 |

Note. Total studies value exceeds N = 61 as most studies used multiple data sources.

Data Sources

There were nine sources of data gathered or evaluated in the included studies (see Table 3). Most studies relied upon results from survey data (40 of 61, 65.6%), either solely or in combination with other data sources. Quantitative data, such as website traffic and test performance, were frequently considered alongside qualitative data from interviews and focus groups, indicating an overall preference for data triangulation and mixed methods.

Data were primarily gathered using self-developed instruments; only three studies reported on validation or reliability measures (Almeida & Tidal, 2017; Greenwell, 2016; Stone et al., 2018), and five referred to using commercially developed or standardized instruments (Bowen et al., 2018; Gilman et al., 2017; Murphy & Black, 2013; Sharrar, 2017; Tang & Tseng, 2014). Ten studies used data sources to gather pre/post measures (Archer et al., 2009; Barker & Hoffman, 2021; Bowen, 2014; Dalton & Pan, 2014; Hansen, 2014; Hsieh et al., 2014; L. Lee et al., 2003; Magi, 2003; Sinkinson et al., 2012; Stone et al., 2018).
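As a quick consistency check on Table 3, the following sketch reproduces its tallies; the per-source figures are copied directly from the table:

```python
# (data source, total studies, single data source, pre/post) rows from Table 3.
rows = [
    ("Survey", 40, 16, 5),
    ("Website traffic", 22, 5, 1),
    ("Test performance", 17, 2, 5),
    ("Usability testing", 10, 5, 1),
    ("Assignment performance", 7, 0, 1),
    ("Interviews", 6, 1, 0),
    ("Citation analysis", 4, 1, 0),
    ("Focus group", 4, 0, 0),
    ("Content analysis", 1, 0, 0),
]
column_totals = [sum(row[i] for row in rows) for i in (1, 2, 3)]
assert column_totals == [111, 30, 13]  # matches the Total row in Table 3
print(f"Surveys appear in {40 / 61:.1%} of the 61 studies")  # 65.6%
```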

Effectiveness Interpretations

Study authors’ conclusions on the effectiveness of guides for learning varied, falling into four categories: positive, neutral, negative, or mixed (see Table 4).

Table 4

Overall Findings Relating to Guide Effectiveness

| Directionality | n (%) |
| --- | --- |
| Positive | 23 (37.7%) |
| Neutral | 9 (14.8%) |
| Negative | 3 (4.9%) |
| Mixed | 26 (42.6%) |
| Total | 61 |

However, deciphering their interpretations of “effectiveness” proved challenging due to the broad scope of most investigations. Notably, few studies explicitly outlined their expectations for how guides might influence student learning, or the potential benefits they might offer. Only six studies (9.8%) employed a priori hypotheses or assumptions to guide their inquiry (Brewer et al., 2017; Greenwell, 2016; Griffin & Taylor, 2018; Hsieh et al., 2014; Magi, 2003; Sharrar, 2017), while the remainder lacked clear benchmarks against which to assess impact.

Of the 23 studies reporting positive findings, 17 were at least partially derived from affective measures gathered via student surveys (Baker, 2014; D. A. Becker et al., 2022; Bowen, 2012; Daly, 2010; Gardner, 1977; Gibbons, 2003; Gilman et al., 2017; Greenwell, 2016; Lauseng et al., 2021; Li, 2016; Little, 2010; Metter & Willis, 1993; Paul et al., 2020; Rothstein, 1989; Sharrar, 2017; Stone et al., 2018; Wharton & Pritchard, 2020). When asked, students in these studies reported high satisfaction with guide content or indicated that guides were helpful, relevant, or useful for their academic needs. In these studies, rates of satisfaction were resoundingly high. For example, Rothstein’s (1989) study reported that 90% of the 77 survey respondents were satisfied with the research guides developed for their specific topics, and Daly (2010) reported survey results in which “89 percent of the 106 respondents reported that course-specific guides were ‘somewhat useful’ or ‘very useful’ for their research” (p. 212). In Greenwell’s (2016) study, the pre/post testing data yielded no significant differences, and these results were not considered in the discussion section. Rather, the author pointed to student survey results as evidence of guide effectiveness, with 83.9% of the 112 students surveyed reporting that the guide was valuable and made it easier for them to locate resources for their assignments.

Not all studies of student perceptions reported such positive results, however (Courtois et al., 2005; Mubofu & Malekani, 2021; Mussell & Croft, 2013; Ouellette, 2011; Pickens-French & Mcdonald, 2013; Scoulas, 2021). Courtois et al. (2005), for example, embedded a single question, “was this guide helpful?”, into all library guides for one semester. Of the 210 anonymous responses gathered, 52% rated guides as “Somewhat” to “Very Helpful,” while 40% rated them as “Not Helpful” or “A Little Helpful.” Some differentiation in satisfaction levels according to student characteristics was also revealed: survey results from Scoulas (2021) suggested that STEM students valued guides significantly less than non-STEM students, and nearly 70% of the 33 distance students surveyed by Mubofu and Malekani (2021) expressed feeling neutral or dissatisfied with research guides overall.

In examining the data presented regarding user perceptions, we found that across several studies, students frequently expressed high satisfaction with the guides while simultaneously indicating their own limited engagement with or need for them (Bisalski et al., 2017; Chiware, 2014; Leighton & May, 2013; Magi, 2003; Ouellette, 2011; Rothstein, 1989; Sharrar, 2017; Tomlin et al., 2017). In Chiware’s (2014) study, for example, though guide ratings were generally positive, a “significant number of students reported that they simply felt they did not need them” (p. 31). Likewise, in Sharrar’s (2017) summative usability study of 47 undergraduate survey responses, the statement “It would be a wonderful idea for undergraduates to use library course pages” received the highest overall mean of 5.96 on a seven-point Likert scale, whereas questions regarding students’ own intent to use guides received the lowest mean score of 4.49. Similarly, in Rothstein’s (1989) survey, the students who responded negatively to research guides developed for them through a Term Paper Clinic still advocated for the service: “even those few students who had some doubts or denials about its value to themselves felt that the Clinic should be continued on behalf of others” (p. 279).

Usage reports led three study authors to reconsider the effectiveness and overall purpose of their guides (Griffin & Taylor, 2018; Mahaffy, 2012; Mussell & Croft, 2013). Despite early assumptions that student researchers were independently discovering and engaging with guide content, Griffin and Taylor (2018) failed to find evidence of this when exploring use. Interpreting high bounce rates as students hurrying to accomplish specific tasks, they advocated against “verbose, exhaustive library guides harkening back to the pathfinders of old” (p. 158). Four additional studies shared similar guidance in advocating against the type of pathfinder guide that points students toward lengthy lists of resources (Baker, 2014; Hansen, 2014; Hintz et al., 2010; Leighton & May, 2013). In a comparative study of pathfinder guides versus more instructional ones, Baker (2014) was surprised to find that most of the students enrolled in two First-Year Experience courses “reported a more positive learning experience with the tutorial guide and they were able to complete the assignment more quickly and with better results” (p. 114). This was echoed in Hintz et al.’s (2010) findings, where their survey of 55 students indicated “that they did not want to simply be pointed to a resource; they wanted to be told how best to make use of it” (p. 46).

Low evidence of use or engagement was not always interpreted as a need to change. Although the earliest study included in this review discontinued its pathfinder program due to low use (Gardner, 1977), several remained optimistic that an audience would be found (Dotson, 2021; Hsieh et al., 2014; Leighton & May, 2013; Magi, 2003; Miner & Alexander, 2010; Murphy & Black, 2013). This hope that students’ curiosity could someday be piqued by guide content was relied upon as justification to continue investing tremendous amounts of time in developing and maintaining large numbers of guides. For example, despite much lower use than anticipated of the library guides created for 460 courses, Dotson (2021) concluded, “the hope is students will see specific items relevant to their course and explore more. They will use the ebooks and/or videos to better understand concepts and to explore search tools to go beyond these sources … Perhaps students will even bring up these sources with their instructor” (p. 256).

Students’ struggle with or resistance to effectively using, applying, or transferring guide-based content was documented in several studies (Bisalski et al., 2017; Griffin & Taylor, 2018; Hansen, 2014; Magi, 2003; Mahaffy, 2012; Ouellelte, 2011; Soskin & Eldblom, 1984). In one study (Hansen, 2014), post-test data showed the international student participants were aware of expectations surrounding use of scholarly sources and could easily locate them, but unintuitive database interfaces and cumbersome search practices, including the use of Boolean logic, created frustrating barriers. In the words of one student, “‘Before I [did] the library research, I only use the Google to do the research because it is very comfortable and convenient, especially using the Wikipedia. But after I knew how to use the library research, our teacher just ask us to use the library research and it’s too difficult for an international student’” (p. 66). In another study, despite substantial time spent training students on course guide resources, when analyzing the number of sources cited in their subsequent research projects, Magi (2003) discovered that most students “relied heavily on free World Wide Web sites not demonstrated or recommended by the librarian” (p. 683). Soskin and Eldblom (1984), in their examination of 23 economics students’ papers gathered during one fall semester, concluded that while the papers receiving higher scores cited more resources, it was the students’ ability to analyze the information that influenced their overall score (p. 18). They also expressed concern that the students’ skills transfer would be inhibited by the search strategies outlined in the guides, writing, “Although the flow-chart type of guide has the advantage of being economical of students’ time, it has the potential disadvantage of prescribing a search strategy so narrow that generalization to future information seeking may be difficult” (p. 20).

Limitations Identified in the Studies

Twenty (32.8%) of the 61 studies did not identify any limitations or weaknesses regarding their research design or conduct that could influence outcomes and interpretations of the research. Thirty-three (54.1%) expressed limitations relating to the sample used for the research, with 16 studies identifying limitations due to a small participant pool (D. Becker et al., 2017; D. A. Becker et al., 2022; Bisalski et al., 2017; Bowen, 2014; Bowen et al., 2018; Brewer et al., 2017; Carey et al., 2020; Cobus-Kuo et al., 2013; Gerrish & Martin, 2023; Hintz et al., 2010; Lauseng et al., 2021; L. Lee et al., 2003; Little, 2010; Mahaffy, 2012; Slemons, 2013; Stone et al., 2018). Other limitations included experimenter effect (Lierman et al., 2019), poor study design (Courtois et al., 2005), participants failing to follow instructions (Hsieh et al., 2014), and results being non-generalizable due to various circumstances (Bowen, 2014; Mubofu & Malekani, 2021; Ouellette, 2011; Rothstein, 1989; Thorngate & Hoden, 2017).

Discussion

What are the IL Related Learning Outcomes Associated with Guides?

When we began this study, we expected that most learning outcomes associated with guides would be directly aligned with guide objectives, and therefore reflect traditional IL behaviors, skills, and dispositions around information acquisition and use. For example, for subject guides introducing students to disciplinary research practices, we expected to see learning outcomes surrounding dispositional knowledge acquisition. For course guides created to support completion of research assignments, we anticipated learning outcomes indicating how well guides assisted students in this work, including details on specific resources and strategies. While a smaller but noteworthy group of studies did present learning outcomes on knowledge and skills development related to IL, the majority focused instead on student satisfaction, preferences, and engagement.

Although understanding students’ experiences remains crucial, it should be complemented by assessments of how guides translate into tangible learning outcomes more directly relevant to the learning goals of guide creators. This could involve incorporating IL frameworks, utilizing learning objectives aligned with specific courses, or employing knowledge-based assessments beyond simple satisfaction surveys. That nearly a quarter of the studies lack an explicit theoretical foundation, and that even fewer point to professional frameworks such as the ACRL Information Literacy Framework, is striking, and points to the difficulties practitioners continue to face in applying and assessing IL concepts. Ultimately, a richer understanding of guides’ influence on both immediate user experiences and long-term learning can be achieved through a more nuanced approach to outcome evaluation, one that embraces both affective and knowledge-based measures.

How are Guides Evaluated and Assessed?

There is no one way to evaluate learning, and the broad spectrum of approaches to guide assessment featured in this review reflects that. For the most part, guide evaluations are exploratory and open-ended. While study authors value mixed methods, often triangulating qualitative student feedback with quantitative website traffic statistics, very few control groups or baseline measures are used as comparators. Data are most often gathered to help practitioners quickly assess guide use and usefulness to students, and are then used to identify areas needing improvement. As such, evaluation practices are most often quick and simple and rely on data that are easy to access, obtain, and understand: surveys capture learner preferences and attitudes, web statistics reveal use and interaction, and usability observations are largely used to refine guide design. That most studies were published in 2010 or later aligns with the transition to online technologies, including the 2007 release of Springshare’s LibGuides platform (Lilly, 2022). Where assessment of physical pathfinders was limited to observational and circulation data, web traffic data made it easy to gauge site visits, resource selection, and user engagement.

Of note is that guide evaluation often does not require participation or support from course instructors. This pattern appears in the practice and implementation of guides within educational settings across these studies: although several studies provide details demonstrating highly participatory collaborations with course instructors, most indicate practices that occur with little to no instructor support, or even instructor awareness of the study. Though we did not gather enough information during our charting to fully characterize the nature and depth of librarian/instructor partnerships, the invisibility of guide assessment paints an uncomfortable picture, one in which librarians are also kept at arm’s length from data that could otherwise be used to measure higher-order thinking skills.

Given the small number of studies that identified any source of funding, this lack of financial support likely signals other resource barriers inhibiting more rigorous investigations. This is not a limitation unique to studies of library guides, but rather a common barrier experienced by librarian practitioners (Clapton, 2010; Smigielski et al., 2014). In Oakleaf’s (2010) critique of library assessment research that formed the basis of the Value of Academic Libraries project, she acknowledged that while conducting rigorous research is out of reach for many practitioners, rigorous assessment is still critical and “should be well planned, be based on clear outcomes …, and use appropriate methods” (p. 31). Assessment activities are clearly valued within the profession, yet without funds, time, resources, and methodological training, it is difficult to conduct this work. Even a small amount of funding could help offset barriers to conducting research aimed at enhancing pedagogical successes.

What Does the Evidence Say?

This scoping review paints a complex picture of the effectiveness of library research guides in supporting student learning. While a significant number of studies highlight positive user perceptions, with students expressing satisfaction and finding guides helpful or relevant, the interpretation of “effectiveness” remains ambiguous due to the lack of clearly defined expectations or benchmarks for impact assessment. Notably, only a small portion of the studies employed specific hypotheses or assumptions, leaving the majority without clear measures for evaluating the guides’ influence. This ambiguity is further compounded by the fact that nearly a third of study authors did not disclose any limitations affecting their studies.

Though guide evaluations are primarily conducted to understand students’ learning experiences in highly specific circumstances, effectiveness findings are often shared in ways that suggest broad applicability. Unfortunately, underreporting of sample demographics and study conditions poses a significant challenge to the robustness and generalizability of these studies. Without details on the participants in the study, it becomes difficult to understand whether the findings are being associated with all student populations or only specific subgroups, such as first-year undergraduates or graduate students. Without this crucial information, the findings remain incomplete and their applicability uncertain. To understand the impact of guides, researchers must strive for more comprehensive reporting of sample demographics, allowing for more nuanced interpretations and targeted interventions to cater to the diverse needs of student learners.

Limitations

Although we did not exclude non-English-language publications in our search queries, our search terms and the sources of information searched disproportionately privileged English publications. Two non-English-language documents provided English abstracts that we identified as potentially relevant; due to our research team’s own language limitations, we decided to exclude these articles rather than pursue translation services. We did not want to misrepresent this study’s scope given our own capabilities and the vastly incomplete representation of global literature that could therefore be discovered or considered. Additional limitations stem from the nature of scoping review methods, especially the possibility that relevant publications were missed or omitted, and the fact that critical appraisal of studies and more focused analysis of study findings are still necessary to understand the effectiveness of guides for learning.

Future Directions

Focused Assessment of Learning Outcomes

While it is evident from these studies that guides are used to scale, supplement, and even substitute for librarian instruction, it is unclear what learning outcomes are best supported through these tools. Many studies in this review gathered students’ feedback regarding guide helpfulness and satisfaction, but given how individualized the guides are in these studies, more work is needed to determine what is or is not particularly helpful or satisfying about guides. Without in-depth exploration, it is challenging to understand what elements of research guides are especially beneficial in most contexts. If a student found a guide helpful, what exactly was helpful? If students report being satisfied with a library guide that was created with an instructional goal of increasing students’ critical evaluation skills, is their satisfaction enough to conclude that the goal was achieved?

Interrogation of What Constitutes Best Practices

Without clarity, assertions surrounding best practices cannot be validated, as there is little to no consensus regarding the effectiveness of these guides for meeting their established learning outcomes. Though we emphasize the need for improved assessment practices and greater attention to the use and impact of learning outcomes in this work, caution is also needed against developing cultures of bean counting, self-surveillance, and perpetual audit. Profession-wide decreed value agendas turn our energy toward anxiously, and often individually, demonstrating value rather than collectively contributing to student learning and uplifting librarian labor (Pagowsky, 2021). Nicholson provides an astute critique of value agendas in librarianship, stating that “Audit culture creates a misalignment or a gap between our aspirations and our approaches. For example, we continue to rely heavily on quantitative methods, even when these may not be the most appropriate, because they are the most expedient” (2017, p. 17). Instead, Nicholson encourages library professionals to spend “more time inquiring into how students are learning and changing as a result of the time they spend with us and less into their customer satisfaction with these interactions” (2017, p. 19).

Deeper Examination of the Role of Guide Integration in Educational Settings

While this review did identify how guides were integrated, such as those embedded within learning management systems or used as supplements to librarian instruction, it did not examine the relationship between educational integration and learning effectiveness. While guides do offer libraries value in terms of scaling and reach, future research should focus on understanding the limitations of guides as standalone learning tools and whether, or in which circumstances, librarian instructional presence makes a difference.

Conclusions

The findings from this scoping review of guide effectiveness studies underscore the enormous presence these tools continue to have within academic libraries. The broad range of instructional applications, subjects covered, content included, and design features tested reveals the many and varied ways that practitioners have relied upon these guides in their teaching. The data sources relied upon in these studies indicate a valuing of student perspectives and experiences but restrict much of what we can know regarding the effectiveness of guides for deeper learning. More work is needed to identify and understand the factors contributing to students’ learning, especially regarding specific populations and user groups and their engagement with and application of the information provided within the guides.

Funding

This project was funded by the University of Nebraska Foundation Layman New Directions award.

Acknowledgements

The authors would like to especially thank: the Evidence Synthesis Institute for giving us inspiration and methodological facility; the ACRL Evidence Synthesis Methods Interest Group for pairing us with Amy Riegelman who mentored us through the project’s framework development; our undergraduate student intern, Cecan Porter, for her tireless efforts, enthusiasm for learning, and impeccable attention to detail throughout several stages of the review; the ACRL Education and Behavioral Sciences Section Virtual Research Forum for giving us an opportunity to share our process and progress to a national audience; Fernando Rios, Leslie Delserone, and Joan Konecky, for providing guidance in support of our protocol registration; and Amy Riegelman, Margy MacMillan, and Elizabeth Kline for their expert review of our article draft.

References

Almeida, N., & Tidal, J. (2017). Mixed methods not mixed messages: Improving LibGuides with student usability data. Evidence Based Library & Information Practice, 12(4), 62–77. https://doi.org/10.18438/B8CD4T

Archer, J., Hanlon, A. M., & Levine, J. A. (2009). Investigating primary source literacy. The Journal of Academic Librarianship, 35(5), 410–420. https://doi.org/10.1016/j.acalib.2009.06.017

Arksey, H., & O’Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19–32. https://doi.org/10.1080/1364557032000119616

Baker, R. L. (2014). Designing LibGuides as instructional tools for critical thinking and effective online learning. Journal of Library & Information Services in Distance Learning, 8(3/4), 107–117. https://doi.org/10.1080/1533290X.2014.944423

Barker, A. E. G., & Hoffman, A. T. (2021). Student-centered design: Creating LibGuides students can actually use. College & Research Libraries, 82(1), 75–91. https://doi.org/10.5860/crl.82.1.75

Becker, D. A., Arendse, J., Tshetsha, V., Davids, Z., & Kiva-Johnson, V. (2022). The development of LibGuides at Cape Peninsula University of Technology Libraries and the impact of the COVID-19 lockdown on their usage. IFLA Journal, 48(1), 57–68. https://doi.org/10.1177/03400352211046025

Becker, D., Hartle, H., & Mhlauli, G. (2017). Assessment of use and quality of library services, accessibility and facilities by students at Cape Peninsula University of Technology. South African Journal of Libraries & Information Science, 83(1), 11–25. https://doi.org/10.7553/83-1-1642

Bisalski, H. C., Helms, M. M., & Whitesell, M. (2017). Preparing undergraduate students for the major field test in business. Journal of Education for Business, 92(1), 9–15. https://doi.org/10.1080/08832323.2016.1261791

Bowen, A. (2012). A LibGuides presence in a Blackboard environment. Reference Services Review, 40(3), 449–468. https://doi.org/10.1108/00907321211254698

Bowen, A. (2014). LibGuides and web-based library guides in comparison: Is there a pedagogical advantage? Journal of Web Librarianship, 8(2), 147–171. https://doi.org/10.1080/19322909.2014.903709

Bowen, A., Ellis, J., & Chaparro, B. (2018). Long nav or short nav? Student responses to two different navigational interface designs in LibGuides version 2. Journal of Academic Librarianship, 44(3), 391–403. https://doi.org/10.1016/j.acalib.2018.03.002

Brewer, L., Rick, H., & Grondin, K. A. (2017). Improving digital library experiences and support with online research guides. Online Learning, 21(3), 135–150. https://doi.org/10.24059/olj.v21i3.1237

Carey, J., Pathak, A., & Johnson, S. C. (2020). Use, perceptions, and awareness of LibGuides among undergraduate and graduate health professions students. Evidence Based Library & Information Practice, 15(3), 157–172. https://doi.org/10.18438/eblip29653

Chiware, M. (2014). The efficacy of course-specific library guides to support essay writing at the University of Cape Town. South African Journal of Libraries & Information Science, 80(2), 27–35. https://doi.org/10.7553/80-2-1522

Cobus-Kuo, L., Gilmour, R., & Dickson, P. (2013). Bringing in the experts: Library research guide usability testing in a computer science class. Evidence Based Library & Information Practice, 8(4), 43–59. https://doi.org/10.18438/B8GP5W

Courtois, M. P., Higgins, M. E., & Kapur, A. (2005). Was this guide helpful? Users’ perceptions of subject guides. Reference Services Review, 33(2), 188–196. https://doi.org/10.1108/00907320510597381

Craane, B., Dijkstra, P. U., Stappaerts, K., & De Laat, A. (2012). Methodological quality of a systematic review on physical therapy for temporomandibular disorders: Influence of hand search and quality scales. Clinical Oral Investigations, 16(1), 295–303. https://doi.org/10.1007/s00784-010-0490-y

Dalton, M., & Pan, R. (2014). Snakes or ladders? Evaluating a LibGuides pilot at UCD Library. Journal of Academic Librarianship, 40(5), 515–520. https://doi.org/10.1016/j.acalib.2014.05.006

Daly, E. (2010). Embedding library resources into learning management systems: A way to reach Duke undergrads at their points of need. College & Research Libraries News, 71(4), 208–212. https://doi.org/10.5860/crln.71.4.8358

DeFrain, E., Sult, L., & Pagowsky, N. (2022). Effectiveness of academic library research guides for building college students’ information literacy skills: A scoping review protocol. https://doi.org/10.17605/OSF.IO/2SMQ4

Dotson, D. S. (2021). LibGuides gone viral: A giant LibGuides project during remote working. Science & Technology Libraries, 40(3), 243–259. https://doi.org/10.1080/0194262X.2021.1884169

Dunsmore, C. (2002). A qualitative study of web-mounted pathfinders created by academic business libraries. Libri, 52(3). https://doi.org/10.1515/LIBR.2002.137

Fagerheim, B., Lundstrom, K., Davis, E., & Cochran, D. (2017). Extending our reach: Automatic integration of course and subject guides. Reference & User Services Quarterly, 56(3), 180–188. https://doi.org/10.5860/rusq.56n3.180

Gardner, J. J. (1977). Pathfinders, Library. In A. Kent, H. Lancour, J. E. Daily, & W. Z. Nasri (Eds.), Encyclopedia of library and information science (Vol. 21, pp. 468–473). Dekker.

Gardois, P., Colombi, N., Grillo, G., & Villanacci, M. C. (2012). Implementation of Web 2.0 services in academic, medical and research libraries: A scoping review. Health Information & Libraries Journal, 29(2), 90–109. https://doi.org/10.1111/j.1471-1842.2012.00984.x

German, E., Grassian, E., & LeMire, S. (2017). LibGuides for instruction: A service design point of view from an academic library. Reference & User Services Quarterly, 56(3), 162–167.

Gerrish, T., & Martin, S. (2023). Student preferences for reference services at a remote biological station library. Portal: Libraries and the Academy, 23(4), 637–653. https://doi.org/10.1353/pla.2023.a908695

Gibbons, S. (2003). Building upon the MyLibrary concept to better meet the information needs of college students. D-Lib Magazine, 9(3). https://doi.org/10.1045/march2003-gibbons

Gilman, N. V., Sagàs, J., Camper, M., & Norton, A. P. (2017). A faculty-librarian collaboration success story: Implementing a teach-the-teacher library and information literacy instruction model in a first-year agricultural science course. Library Trends, 65(3), 339–358. https://doi.org/10.1353/lib.2017.0005

Goodsett, M. (2020). Best practices for teaching and assessing critical thinking in information literacy online learning objects. The Journal of Academic Librarianship, 46(5), 102163. https://doi.org/10.1016/j.acalib.2020.102163

Greenwell, S. (2016). Using the I-LEARN model for information literacy instruction. Journal of Information Literacy, 10(1), 67–85.

Griffin, M., & Taylor, T. I. (2018). Employing analytics to guide a data-driven review of LibGuides. Journal of Web Librarianship, 12(3), 147–159.

Hansen, L. A. (2014). Second-language writer and instructor perceptions of the effectiveness of a curriculum-integrated research skills library guide [Master's thesis, The University of Utah].

Harbeson, E. L. (1972). Teaching reference and bibliography: The pathfinder approach. Journal of Education for Librarianship, 13(2), 111. https://doi.org/10.2307/40322211

Head, A., & Eisenberg, M. (2010). Assigning inquiry: How handouts for research assignments guide today’s college students (SSRN Scholarly Paper ID 2281494). Social Science Research Network. https://doi.org/10.2139/ssrn.2281494

Hemmig, W. (2005). Online pathfinders: Toward an experience-centered model. Reference Services Review, 33(1), 66–87.

Hennesy, C., & Adams, A. L. (2021). Measuring actual practices: A computational analysis of LibGuides in academic libraries. Journal of Web Librarianship, 15(4), 219–242. https://doi.org/10.1080/19322909.2021.1964014

Hintz, K., Farrar, P., Eshghi, S., Sobol, B., Naslund, J.-A., Lee, T., Stephens, T., & McCauley, A. (2010). Letting students take the lead: A user-centered approach to evaluating subject guides. Evidence Based Library & Information Practice, 5(4), 39–52. https://doi.org/10.18438/B87C94

Hoffmann, T. C., Glasziou, P. P., Boutron, I., Milne, R., Perera, R., Moher, D., Altman, D. G., Barbour, V., Macdonald, H., Johnston, M., Lamb, S. E., Dixon-Woods, M., McCulloch, P., Wyatt, J. C., Chan, A.-W., & Michie, S. (2014). Better reporting of interventions: Template for intervention description and replication (TIDieR) checklist and guide. BMJ (Clinical Research Ed.), 348, g1687. https://doi.org/10.1136/bmj.g1687

Hsieh, M. L., Dawson, P. H., Hofmann, M. A., Titus, M. L., & Carlin, M. T. (2014). Four pedagogical approaches in helping students learn information literacy skills. Journal of Academic Librarianship, 40(3/4), 234–246. https://doi.org/10.1016/j.acalib.2014.03.012

Jackson, R., & Stacy-Bates, K. K. (2016). The enduring landscape of online subject research guides. Reference & User Services Quarterly, 55(3), 219. https://doi.org/10.5860/rusq.55n3.219

Kline, E., Wallace, N., Sult, L., & Hagedon, M. (2017). Embedding the library in the LMS: Is it a good investment for your organization’s information literacy program? In T. Maddison & M. Kumaran (Eds.), Distributed Learning (pp. 255–269). Chandos Publishing. https://doi.org/10.1016/B978-0-08-100598-9.00014-3

Lauseng, D. L., Howard, C., Scoulas, J. M., & Berry, A. (2021). Assessing online library guide use and open educational resource (OER) potential: An evidence-based decision-making approach. Journal of Web Librarianship, 15(3), 128–153. https://doi.org/10.1080/19322909.2021.1935396

Lee, J., Hayden, K. A., Ganshorn, H., & Pethrick, H. (2021). A content analysis of systematic review online library guides. Evidence Based Library and Information Practice, 16(1), 60–77. https://doi.org/10.18438/eblip29819

Lee, L., Hime, L., & Dominicis, E. (2003). Just-in-time course guides. Florida Libraries, 46(2), 8–10.

Lee, Y. Y., & Lowe, M. S. (2018). Building positive learning experiences through pedagogical research guide design. Journal of Web Librarianship, 12(4), 205–231. https://doi.org/10.1080/19322909.2018.1499453

Leighton, H. V., & May, D. (2013). The library course page and instruction: Perceived helpfulness and use among students. Internet Reference Services Quarterly, 18(2), 127–138. https://doi.org/10.1080/10875301.2013.804019

Li, Y. (2016). Using new venture competitions to link the library and business students. Qualitative & Quantitative Methods in Libraries, 5(3), 551–559.

Lierman, A., Scott, B., Warren, M., & Turner, C. (2019). Testing for transition: Evaluating the usability of research guides around a platform migration. Information Technology & Libraries, 38(4), 76–97. https://doi.org/10.6017/ital.v38i4.11169

Lilly. (2022, May 2). 15 Years of Springshare! The Springy Share. https://blog.springshare.com/2022/05/02/15-years-of-springshare/

Little, J. (2010). Cognitive load theory and library research guides. Internet Reference Services Quarterly, 15(1), 53–63.

Magi, T. J. (2003). What’s best for students? Comparing the effectiveness of a traditional print pathfinder and a web-based research tool. Portal: Libraries and the Academy, 3(4), 671–686. https://doi.org/10.1353/pla.2003.0090

Mahaffy, M. (2012). Student use of library research guides following library instruction. Communications in Information Literacy, 6(2), 202–213. https://doi.org/10.15760/comminfolit.2013.6.2.129

Metter, E., & Willis, E. (1993). Creating a handbook for an academic library: Rationale and process. Research Strategies, 11(4), 32–220.

Miles, M. J., & Bergstrom, S. J. (2009). Classification of library resources by subject on the library website: Is there an optimal number of subject labels? Information Technology & Libraries, 28(1), 16–20. https://doi.org/10.6017/ital.v28i1.3167

Miller, L. N. (2014). First year medical students use library resources emphasized during instruction sessions. Evidence Based Library & Information Practice, 9(1), 48–50. https://doi.org/10.18438/B8F316

Miner, J., & Alexander, R. (2010). LibGuides in political science: Improving student access, research, and information literacy. Journal of Information Literacy, 4(1), 40–54. https://doi.org/10.11645/4.1.1467

Mubofu, C., & Malekani, A. (2021). Accessibility of library resources and support services by distance learners. Journal of Library & Information Services in Distance Learning, 15(4), 267–279. https://doi.org/10.1080/1533290X.2021.2021345

Murphy, S. A., & Black, E. L. (2013). Embedding guides where students learn: Do design choices and librarian behavior make a difference? Journal of Academic Librarianship, 39(6), 528–534. https://doi.org/10.1016/j.acalib.2013.06.007

Mussell, J., & Croft, R. (2013). Discovery layers and the distance student: Online search habits of students. Journal of Library & Information Services in Distance Learning, 7(1/2), 18–39. https://doi.org/10.1080/1533290X.2012.705561

Nicholson, K. P. (2017, October 26). The “Value Agenda”: Negotiating a Path Between Compliance and Critical Practice [Keynote]. Canadian Libraries Assessment Workshop (CLAW), University of Victoria.

Oakleaf, M. J. (2010). The value of academic libraries: A comprehensive research review and report. Association of College and Research Libraries, American Library Association.

Olshausen, M. (2018). A statistical approach to assessing research guide use at Central Washington University. PNLA Quarterly, 82(3/4), 23–34.

Ouellette, D. (2011). Subject guides in academic libraries: A user-centered study of uses and perceptions. Canadian Journal of Information and Library Science, 35(4), 436–451.

Pagowsky, N. (2021). The contested one-shot: Deconstructing power structures to imagine new futures. College & Research Libraries. https://doi.org/10.5860/crl.82.3.300

Paschke-Wood, J., Dubinsky, E., & Sult, L. (2020). Creating a student-centered alternative to research guides: Developing the infrastructure to support novice learners. In the Library with the Lead Pipe.

Paul, S., Wright, L. B., & Clevenger-Schmertzing, L. (2020). Going the distance: Flipped classrooms and the research appointment. Southeastern Librarian, 68(3), 3–13.

Peters, M. D. J., Godfrey, C., McInerney, P., Khalil, H., Larsen, P., Marnie, C., Pollock, D., Tricco, A. C., & Munn, Z. (2022). Best practice guidance and reporting items for the development of scoping review protocols. JBI Evidence Synthesis, 20(4), 953–968. https://doi.org/10.11124/JBIES-21-00242

Pickens-French, K., & McDonald, K. (2013). Changing trenches, changing tactics: A library’s frontline redesign in a new CMS. Journal of Library & Information Services in Distance Learning, 7(1/2), 53–72. https://doi.org/10.1080/1533290X.2012.705613

Rafferty, R. S. (2013). The impact of library instruction: Do first-year medical students use library resources specifically highlighted during instructional sessions? Journal of the Medical Library Association, 101(3), 213–217. https://doi.org/10.3163/1536-5050.101.3.011

Rothstein, S. (1989). Point of need/maximum service: An experiment in library instruction. Reference Librarian, 25–26, 253–284.

Scoulas, J. M. (2021). STEM undergraduate students: Library use, perceptions and GPA. Performance Measurement & Metrics, 22(2), 137–148. https://doi.org/10.1108/PMM-04-2020-0021

Sharrar, G. S. (2017). Are course pages useful? Getting beyond implementation and usability [Master's paper, University of North Carolina at Chapel Hill].

Singal, A. G., Higgins, P. D. R., & Waljee, A. K. (2014). A primer on effectiveness and efficacy trials. Clinical and Translational Gastroenterology, 5(1), e45. https://doi.org/10.1038/ctg.2013.13

Sinkinson, C., Alexander, S., Hicks, A., & Kahn, M. (2012). Guiding design: Exposing librarian and student mental models of research guides. Portal: Libraries and the Academy, 12(1), 63–84. https://doi.org/10.1353/pla.2012.0008

Slemons, M. H. (2013). Design standards for LibGuides: Does better design lead to greater use? [Master's paper, University of North Carolina at Chapel Hill].

Smith, C. H. (2007). Meta-assessment of online research guides usage. Reference Librarian, 47(97), 79–93. https://doi.org/10.1300/J120v47n97_08

Soskin, M. D., & Eldblom, N. (1984). Integrating the term paper into economics courses at liberal arts colleges: Industry case studies papers at SUNY-Potsdam (ED246732). ERIC.

Stewart, C. (2022). Future states of the research library. Research Library Issues, 303, 3–11. https://doi.org/10.29242/rli.303.1

Stone, S. M., Lowe, M. S., & Maxson, B. K. (2018). Does course guide design impact student learning? College & Undergraduate Libraries, 25(3), 280–296. https://doi.org/10.1080/10691316.2018.1482808

Tang, Y., & Tseng, H. W. (2014). Distance students’ attitude toward library help seeking. Journal of Academic Librarianship, 40(3/4), 307–312. https://doi.org/10.1016/j.acalib.2014.04.008

Taylor, A., Nelson, J., O’Donnell, S., Davies, E., & Hillary, J. (2022). The skills imperative 2035: What does the literature tell us about essential skills most needed for work? National Foundation for Educational Research.

Thomas, J., Kneale, D., McKenzie, J. E., Brennan, S. E., & Bhaumik, S. (2023). Determining the scope of the review and the questions it will address. In J. Higgins, J. Thomas, J. Chandler, M. Cumpston, T. Li, M. Page, & V. Welch (Eds.), Cochrane handbook for systematic reviews of interventions version 6.4. Cochrane. https://www.training.cochrane.org/handbook

Thorngate, S., & Hoden, A. (2017). Exploratory usability testing of user interface options in LibGuides 2. College & Research Libraries, 78(6), 844–861. https://doi.org/10.5860/crl.78.6.844

Tomlin, N., Tewell, E., Mullins, K., & Dent, V. (2017). In their own voices: An ethnographic perspective on student use of library information sources. Journal of Library Administration, 57(6), 631–650. https://doi.org/10.1080/01930826.2017.1340776

Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K. K., Colquhoun, H., Levac, D., Moher, D., Peters, M. D. J., Horsley, T., Weeks, L., Hempel, S., Akl, E. A., Chang, C., McGowan, J., Stewart, L., Hartling, L., Aldcroft, A., Wilson, M. G., Garritty, C., … Straus, S. E. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Annals of Internal Medicine, 169(7), 467–473. https://doi.org/10.7326/M18-0850

Vassar, M., Atakpo, P., & Kash, M. J. (2016). Manual search approaches used by systematic reviewers in dermatology. Journal of the Medical Library Association: JMLA, 104(4), 302–304. https://doi.org/10.3163/1536-5050.104.4.009

Vileno, L. (2007). From paper to electronic, the evolution of pathfinders: A review of the literature. Reference Services Review, 35(3), 434–451. https://doi.org/10.1108/00907320710774300

Wharton, L., & Pritchard, M. (2020). A successful long-term relationship: Three years of LTI integrations in Canvas. Journal of Electronic Resources Librarianship, 32(4), 286–303. https://doi.org/10.1080/1941126X.2020.1821993

Wineburg, S., Breakstone, J., McGrew, S., Smith, M. D., & Ortega, T. (2022). Lateral reading on the open Internet: A district-wide field study in high school government classes. Journal of Educational Psychology. https://doi.org/10.1037/edu0000740

Appendix A

Eligibility Criteria

Include:

  1. Study includes an explicit or implied research question regarding the effectiveness of academic library research guides for college student learning.
  2. The research guide must have been directly developed or compiled by an academic librarian or under the oversight of an academic library program or initiative.
  3. Empirical data must have been gathered as part of the study’s assessment of research guide efficacy or effectiveness.
  4. The study population must include college students and provide learning outcomes-related data drawn from or about this population.
  5. We are interested in all studies regardless of publication date.
  6. The study includes explicit or implied learning outcomes relating to any model or operationalization of information literacy.
  7. There are no limitations on study design or study type. Eligible designs include experimental and observational primary studies (quasi-experimental, case studies, and non-experimental survey-based designs), published as peer-reviewed articles or as high-quality grey literature (e.g., dissertations, white papers, reports, conference proceedings, posters).
  8. We will not actively limit results to any language.

Exclude:

  1. The research guide cannot be identified as the primary intervention. Excluded are studies in which a research guide is implemented or assessed as part of a broader suite of educational offerings, such that the guide’s individual impact cannot be isolated.
  2. Excluded from this review are studies investigating the usability or user experience of research guides as related to their functional design, in which no measures relating to student learning are provided.
  3. No student-related data are gathered or analyzed. Excluded from this review are studies in which librarians or instructors comprise the sample population and student data were not gathered or assessed.
  4. Non-empirical research, such as reflections, perspectives, editorials, opinion pieces, best practices, or professional guidance materials.
  5. Insufficient information is provided to understand how the research guide was implemented as an intervention, or how its effectiveness for learning was defined and assessed.

Appendix B

Extraction Table Aligned With Research Questions


The extraction fields align with the review’s research questions: RQ 1 (IL learning outcomes associated with guides), RQ 2 (how guides are evaluated), and RQ 3 (evidence shared). Each entry below reports the following fields: Purpose; Outcomes measured; Investigatory foci; Guide integration; N/Population; Data sources; and Findings, prefaced by the overall direction of the results (Positive, Negative, Neutral, or Mixed).

Almeida & Tidal, 2017. Purpose: identify student design and organizational preferences for guides. Outcomes measured: design and learning modality preferences. Investigatory foci: Usability; Satisfaction. Guide integration: print-based; subject guide. N/Population: 10 students in two- and four-year programs. Data sources: usability testing; interviews. Findings (Neutral): no best layout identified by users.

Archer et al., 2009. Purpose: evaluate the utility of a research guide for primary source literacy. Outcomes measured: knowledge of primary source literacy. Investigatory foci: Usability; Evidence of learning. Guide integration: supplemental to library instruction; subject guide. N/Population: 17 undergraduates from different departments. Data sources: pre/post survey; usability testing. Findings (Neutral): minimal improvement in students’ pre/post-questionnaire definitions of primary sources; students seemed confused about the purpose of guides.

Baker, 2014. Purpose: compare student preferences for pathfinder- or tutorial-style guides. Outcomes measured: design, content, and organizational preferences. Investigatory foci: Satisfaction. Guide integration: course guide. N/Population: N/A; undergraduate students from two first-year experience sections. Data sources: survey. Findings (Positive): students preferred the tutorial guide and self-reported an improved learning experience.

Barker & Hoffman, 2021. Purpose: identify student content and design preferences for guides. Outcomes measured: design, content, and organizational preferences. Investigatory foci: Usability. Guide integration: subject guide. N/Population: 18–40 undergraduate students. Data sources: pre/post usability testing. Findings (Positive): design updates based on the first card sort showed improvements.

Becker et al., 2017. Purpose: determine if and how students engage with the library as part of their studies, and how well the library supports students’ academic activities. Outcomes measured: use and awareness of resources; frequency of use. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: subject guide. N/Population: 394 faculty, graduate students, and (mostly) undergraduate students. Data sources: web stats; interviews; survey. Findings (Mixed): survey respondents were unaware of guides, but usage data show the guides were being accessed.

Becker et al., 2022. Purpose: overview of an institutional LibGuides implementation; assess whether creating LibGuides supported students’ information needs. Outcomes measured: students’ perceptions and reported use of the guide. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: subject guide. N/Population: 28 completed an online questionnaire; 13 participated in follow-up interviews. Data sources: survey; focus group. Findings (Positive): most students reported the library guide to be useful.

Bisalski et al., 2017. Purpose: present a case study of pedagogy for implementing online study materials for the ETS MFT-B. Outcomes measured: self-reported test scores; students’ perceptions of the effectiveness and usefulness of the guide. Investigatory foci: Usage; Utility; Evidence of learning. Guide integration: course guide. N/Population: 55 students enrolled in a strategic management course. Data sources: survey. Findings (Mixed): about half of the students used the guide; most preferred internet resources.

Bowen et al., 2018. Purpose: measure and compare students’ use of and satisfaction with different guide navigation designs. Outcomes measured: design, content, and organizational preferences. Investigatory foci: Usability; Satisfaction; Utility. Guide integration: course guide. N/Population: 10 (stage 1) and 14 (stage 2) undergraduate students enrolled in a COMM 430 class. Data sources: usability testing; standardized survey. Findings (Mixed): greater preference shown for the longer version of the guide.

Bowen, 2012. Purpose: describe current approaches to, and assess the value of, placing course-level research guides into an LMS. Outcomes measured: students’ perceptions of the effectiveness and usefulness of the guide. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: embedded into LMS; course guide. N/Population: 63 undergraduates in a communications course. Data sources: survey. Findings (Positive): most students reported that the assignment guide was beneficial.

Bowen, 2014. Purpose: compare students’ performance between a LibGuide and a website guide. Outcomes measured: knowledge-based test and affective measurement survey. Investigatory foci: Usability; Usage; Satisfaction; Utility; Evidence of learning. Guide integration: embedded into LMS; course guide. N/Population: 89 undergraduate students enrolled in COMM 132. Data sources: pre/post survey; pre/post-test performance. Findings (Mixed): students were able to access materials; both sets of students were confused when answering knowledge-based questions.

Brewer et al., 2017. Purpose: examine how program level and the timing of the introduction of a literature review library guide influenced online business students’ perceived value of the resource. Outcomes measured: reported use of and satisfaction with guides; usability and relevance of content. Investigatory foci: Usage; Satisfaction. Guide integration: course guide. N/Population: 24 online undergraduate business students and online MBA students. Data sources: survey. Findings (Mixed): students were satisfied and able to use the guide; usability could be enhanced; earlier introduction desired.

Carey et al., 2020. Purpose: examine students’ use, perceptions, and awareness of library guides. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: course guide; subject guide. N/Population: 100 undergraduate and graduate health sciences students. Data sources: survey. Findings (Mixed): limited general awareness and use; guides perceived as valuable.

Chiware, 2014. Purpose: evaluate students’ perceptions of a guide and determine how effective guides were in supporting students. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: course guide. N/Population: 1,303 undergraduate ECON students. Data sources: survey. Findings (Mixed): half of the students used the guide; most expressed appreciation for it.

Cobus-Kuo et al., 2013. Purpose: investigate student preferences regarding guide layout, organization, internal navigation, hierarchy, images and video, and content. Outcomes measured: design, content, and organizational preferences. Investigatory foci: Usability. Guide integration: course guide; subject guide. N/Population: 20 students in a user interface design and development course. Data sources: usability testing. Findings (Neutral): when shown guides, students expected to find library resources, found databases most useful, and valued design consistency across guides, but held differing opinions overall.

Courtois et al., 2005. Purpose: gather information on students’ satisfaction with guides. Outcomes measured: single-question survey (“Was this guide helpful?”) with four possible responses. Investigatory foci: Utility. Guide integration: course guide; subject guide. N/Population: 210 students. Data sources: survey. Findings (Mixed): 40% of respondents rated a guide as Not Helpful or A Little Helpful.

Dalton & Pan, 2014. Purpose: outline the overall project management process involved in implementing LibGuides at UCD Library. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Usage; Utility. Guide integration: course guide; subject guide. N/Population: 58 students in the main Arts building. Data sources: pre/post survey; pre/post web stats; pre/post interviews. Findings (Mixed): low guide use overall.

Daly, 2010. Purpose: assess the use of automatically and manually linked library guides in the LMS; determine whether guides are useful to students’ research and should be embedded. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Usage; Utility. Guide integration: embedded into LMS; course guide. N/Population: 106 students who accessed the Library Guides menu item. Data sources: survey. Findings (Positive): the majority reported that course-specific guides were somewhat or very useful for their research and should be in the LMS.

Dotson, 2021. Purpose: process article on using pandemic time to create 460 course guides for the author’s STEM liaison areas, with a look at guide use stats. Outcomes measured: use. Investigatory foci: Usage. Guide integration: embedded into LMS; course guide. N/Population: N/A (use stats only). Data sources: web stats. Findings (Negative): data show low use overall.

Fagerheim et al., 2017. Purpose: gather student feedback on library guide design updates. Outcomes measured: use; design, content, and organizational preferences. Investigatory foci: Usability; Usage. Guide integration: subject guide. N/Population: 16 undergraduate students. Data sources: web stats; focus group. Findings (Mixed): students liked a clean layout with a consistent template; home tabs had the highest use stats.

Gardner, 1977. Purpose: encyclopedia entry describing the history of pathfinder development out of the Project Intrex Model Library Program at M.I.T. Outcomes measured: perceptions of usefulness. Investigatory foci: Usage; Utility; Satisfaction. Guide integration: subject guide. N/Population: 71 users of MIT’s Barker Engineering Library. Data sources: survey. Findings (Positive): 48% used pathfinders for course paper research, and all sections of the pathfinders were used; 90% found pathfinders very or fairly helpful, 10% not helpful.

Gerrish & Martin, 2023. Purpose: measure the success of changes to a remote field station library service in response to COVID-19. Outcomes measured: student willingness to use virtual library services. Investigatory foci: Usage. Guide integration: embedded into LMS; subject guide. N/Population: N/A (annual stats of undergraduate guide use, 2017–2022). Data sources: web stats; instructor interviews. Findings (Positive): guide visits spiked during the pandemic despite fewer research assignments, fewer students, and a decrease in reference questions asked.

Gibbons, 2003. Purpose: pilot study evaluating course guides embedded into the LMS. Outcomes measured: perceptions of usefulness; use. Investigatory foci: Usability; Utility. Guide integration: embedded into LMS; course guide. N/Population: 53 students enrolled in 12 pilot classes. Data sources: survey; web stats. Findings (Positive): students reported guides as highly useful; web stats showed repeat usage and lengthy engagement.

Gilman et al., 2017. Purpose: overview of a faculty/librarian partnership for developing IL support for first-year agricultural science students. Outcomes measured: perceptions of usefulness; use; task completion. Investigatory foci: Usage; Satisfaction; Utility; Evidence of learning. Guide integration: embedded into LMS; course guide. N/Population: N/A; first-year agricultural science students in AGRI 116. Data sources: standardized survey; web stats; assignment analysis. Findings (Positive): students reported guides as highly useful, though there was no association with assignment completion rates.

Greenwell, 2016. Purpose: test an instructional design model by comparing students’ performance after using a guide designed with a systems approach and IL Standards as outcomes versus a guide designed using the I-LEARN process as a framework. Outcomes measured: use; information searching behaviors and pathways; source use. Investigatory foci: Usage; Evidence of learning. Guide integration: course guide. N/Population: 112 first-year undergraduate students enrolled in seven sections of the same composition and communications course. Data sources: survey; IL skills test; web stats; citation analysis. Findings (Positive): students find online library research guides valuable for finding sources.

Griffin & Taylor, 2018. Purpose: offer a methodology for using quantitative analytics data to evaluate guide usefulness and use. Outcomes measured: use. Investigatory foci: Usage. Guide integration: course guide; subject guide. N/Population: N/A; primary user population of undergraduate and graduate students. Data sources: web stats. Findings (Negative): limited engagement with content overall, with little use beyond the home page.

Hansen, 2014. Purpose: examine the effectiveness of an ESL library guide. Outcomes measured: IL skills; academic language proficiency; academic research process; perceptions of usefulness. Investigatory foci: Utility; Evidence of learning. Guide integration: course guide; supplemental to library instruction. N/Population: 142 ESL undergraduates enrolled in two sections of an ESL class. Data sources: survey; pre/post-test performance; focus group; pre/post assignment analysis. Findings (Mixed): increased awareness of library resources and scholarly source types; no increase in students’ ability to use academic research effectively.

Hintz et al., 2010. Purpose: identify what students want from subject guides. Outcomes measured: ratings of guide comprehension, visual appearance, and content usefulness; reported intention to use a guide. Investigatory foci: Satisfaction; Utility. Guide integration: subject guide. N/Population: 55 students. Data sources: survey. Findings (Neutral): students want authoritative information and think guide design matters.

Hsieh et al., 2014. Purpose: quasi-experimental study assessing the effectiveness of four approaches to teaching IL skills, one of which required students to preview a librarian-created research guide. Outcomes measured: test scores and performance measures. Investigatory foci: Evidence of learning. Guide integration: supplemental to library instruction; subject guide. N/Population: 107 undergraduate students in required first-year writing courses. Data sources: pre/post-test performance. Findings (Neutral): no significant gains for the research guide group.

Lauseng et al., 2021. Purpose: measure the impact of an evidence-based medicine guide on user learning experience and outcomes, and gather evidence for staffing allocations and for conversion to an OER. Outcomes measured: use; knowledge; confidence; perceptions, satisfaction level, recommendations, and future intention of referral. Investigatory foci: Usage; Satisfaction; Utility; Evidence of learning. Guide integration: subject guide. N/Population: 119 respondents, comprising students (64%) and practicing health professionals (23%). Data sources: survey; web stats. Findings (Positive): participants reported finding what they needed and high satisfaction with guide content.

Lee & Lowe, 2018. Purpose: observe students’ unmediated, outside-of-class interactions and learning with either pedagogical- or pathfinder-style library guides during a simulated research assignment. Outcomes measured: assignment performance; perceived learning experience; guide interaction and use; IL skills based on the Framework. Investigatory foci: Usability; Evidence of learning. Guide integration: course guide. N/Population: 22 students, first-year through graduate, in various majors. Data sources: survey; test performance; assignment analysis; usability testing. Findings (Mixed): no difference on the IL skills test; the pedagogical guide was preferred over the pathfinder design.

Lee et al., 2003. Purpose: evaluate course guides’ effectiveness for students’ immediate information needs. Outcomes measured: knowledge of library resources. Investigatory foci: Satisfaction; Evidence of learning. Guide integration: supplemental to library instruction; course guide. N/Population: 89 students enrolled in three basic courses. Data sources: pre/post-test performance. Findings (Positive): the experimental group outperformed the control group on all questions.

Leighton & May, 2013. Purpose: describe the effectiveness of library instruction and a course guide in preparing students for a mock appellate exercise. Outcomes measured: use; perceptions of usefulness. Investigatory foci: Usage; Utility. Guide integration: supplemental to library instruction; course guide. N/Population: 24 undergraduate international business students. Data sources: survey; web stats. Findings (Mixed): few students used guide resources; most would recommend the guide to a friend.

Li, 2016. Purpose: evaluate how students use library resources and services to complete their projects. Outcomes measured: use of library resources and services for completing projects. Investigatory foci: Usage; Utility. Guide integration: course guide; subject guide. N/Population: N/A; undergraduate business students. Data sources: survey. Findings (Positive): the majority of students used library resources to complete their projects, including databases (80%), course guides (63.3%), articles (33.3%), subject guides (23.3%), archives (16.7%), and books (10%).

Lierman et al., 2019. Purpose: describe a multi-stage usability testing process used during and after a migration to LibGuides v2. Outcomes measured: design, content, and organizational preferences. Investigatory foci: Usability. Guide integration: course guide; subject guide. N/Population: 6 students (mixed levels). Data sources: usability testing; survey. Findings (Neutral): students grouped content according to type of task (e.g., citing sources) rather than type of user (e.g., undergraduates, athletes).

Little et al., 2010. Purpose: share information about a faculty learning community and its instructional methods for teaching research skills. Outcomes measured: self-perceptions of ease of navigation and usefulness of information and resources. Investigatory foci: Usability; Satisfaction; Utility. Guide integration: course guide. N/Population: 18 graduate students. Data sources: survey. Findings (Positive): the authors conclude that the survey findings reveal the “overwhelming success” of the library guide as a tool to support student research.

Magi, 2003. Purpose: quasi-experimental study comparing students’ use of a print pathfinder versus a web-based research guide in library instruction. Outcomes measured: self-perceptions of guide usefulness; feelings, opinions, and attitudes; source use. Investigatory foci: Usability; Satisfaction; Utility; Evidence of learning. Guide integration: supplemental to library instruction; print-based; course guide; subject guide. N/Population: 84 undergraduate students enrolled in two sections of a first-year business course. Data sources: pre/post survey; citation evaluation. Findings (Mixed): high satisfaction; low use; no difference in resources used.

Mahaffy, 2012. Purpose: explore students’ independent interactions with research guides. Outcomes measured: use; design, content, and organizational preferences. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: print-based; course guide. N/Population: 10 undergraduates in an ART 101 course. Data sources: focus groups; web stats. Findings (Mixed): limited use reported; little familiarity with content.

Metter & Willis, 1993. Purpose: overview of a library handbook project to replace library instruction. Outcomes measured: student-perceived usability, utility, and satisfaction. Investigatory foci: Usability; Satisfaction; Utility. Guide integration: print-based. N/Population: 85 students. Data sources: survey. Findings (Positive): most students reported greater comfort in using the library and would recommend the handbook to a friend.

Miles & Bergstrom, 2009. Purpose: usability study of the effect of the number of subject labels on research question response times. Outcomes measured: response time to research questions and total number of subject headings. Investigatory foci: Usability. Guide integration: other (participants selected a subject label in response to research questions). N/Population: 120 students and staff. Data sources: usability testing. Findings (Neutral): no association between response time and number of subject categories.

Miller, 2014. Purpose: examine custom library guide creation and use of library resources. Outcomes measured: course guide resource use and assignment performance. Investigatory foci: Usage. Guide integration: supplemental to library instruction; course guide. N/Population: 318 technical college students in English and psychology classes. Data sources: web stats. Findings (Positive): relationship found between course guide creation and use stats.

Miner & Alexander, 2010. Purpose: investigate use of library guides for broad and narrow topics in lower- and upper-division political science classes. Outcomes measured: students’ performance on theory papers and current events assignments; guide use. Investigatory foci: Usage; Evidence of learning. Guide integration: course guide. N/Population: 75 students in an international affairs and political science course. Data sources: web stats; assignment analysis. Findings (Positive): relationship between overall guide use and assignment performance.

Mubofu & Malekani, 2021. Purpose: explore the accessibility of library resources and services for distance learners. Outcomes measured: satisfaction, use, and access challenges regarding library resources. Investigatory foci: Usage; Satisfaction. Guide integration: course guide; subject guide. N/Population: 33 distance students. Data sources: survey. Findings (Mixed): most students reported using the guides but were neutral regarding satisfaction with them.

Murphy & Black, 2013. Purpose: examine the use and design characteristics of library guides embedded in the LMS. Outcomes measured: promotion, design characteristics, and student preferences for library guides. Investigatory foci: Usage; Utility. Guide integration: embedded into LMS; course guide; subject guide. N/Population: 100 students. Data sources: standardized survey; web stats; content analysis. Findings (Mixed): more students were aware of guides than used them; most students described guides as helpful.

Mussell & Croft, 2013. Purpose: evaluate library resource use to aid resource allocation. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Satisfaction; Utility. Guide integration: course guide; subject guide. N/Population: 1,038 undergraduate and graduate students. Data sources: survey; web stats. Findings (Mixed): limited use of guides; clear preference for Google; fewer than half of those who had used guides found them helpful to essential.

Olshausen, 2018. Purpose: examine use of course guides outside the classroom. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: supplemental to library instruction; course guide; subject guide. N/Population: 5 students. Data sources: web stats; interviews. Findings (Mixed): little consistency in responses, but most said guides seemed valuable.

Ouellette, 2011. Purpose: qualitative project investigating students’ use of and satisfaction with subject guides. Outcomes measured: use, perceptions, and awareness. Investigatory foci: Usage; Satisfaction. Guide integration: subject guide. N/Population: 11 students from different class levels and disciplines. Data sources: interviews. Findings (Negative): students do not use guides because they are unaware of them, prefer Google, or already have information strategies in place.

Paul et al., 2020. Purpose: case study examining whether online library guides helped prepare students to meet with a reference librarian. Outcomes measured: student survey on guide usefulness; quiz and discussion post about guide content. Investigatory foci: Satisfaction; Utility; Evidence of learning. Guide integration: course guide. N/Population: 30 online graduate students in an education doctoral program. Data sources: survey; test; assignment analysis. Findings (Positive): positive responses to design and content; content viewed as valuable.

Pickens-French & McDonald, 2013. Purpose: study the effectiveness of library guides embedded into a CMS. Outcomes measured: surveyed students on guide usability and overall satisfaction. Investigatory foci: Usability; Satisfaction. Guide integration: embedded into LMS; course guide. N/Population: 34 undergraduate students in an English class. Data sources: survey; web stats. Findings (Neutral): low interest in instructional content; preference for fewer resources listed.

Rafferty, 2013. Purpose: evaluate whether students used resources recommended in library instruction. Outcomes measured: sources cited in students’ research assignments. Investigatory foci: Usage. Guide integration: supplemental to library instruction; course guide. N/Population: 118 first-year medical students enrolled in the course across three years. Data sources: citation analysis. Findings (Positive): students heavily cited library resources, with 22% citing sources shared on the course guide.

Rothstein, 1989. Purpose: reflection on the effectiveness of a library school project in which students created customized research guides for undergraduates. Outcomes measured: questionnaire given to student recipients of custom research guides. Investigatory foci: Usage; Satisfaction; Utility; Evidence of learning. Guide integration: subject guide. N/Population: 77 questionnaires from the 260 undergraduate student recipients of custom research guides. Data sources: survey. Findings (Positive): 90% of users reported being satisfied with the custom research guides.

Scoulas, 2021. Purpose: examine the relationship between STEM and non-STEM students’ library use, perceptions, and GPA. Outcomes measured: students’ overall experience with library use; frequency of visits and resource use; perceptions of resources; satisfaction with physical spaces. Investigatory foci: Usage; Satisfaction. Guide integration: course guide; subject guide. N/Population: 1,265 undergraduate students responding to a library use survey. Data sources: survey. Findings (Mixed): STEM students valued course/subject guides less than non-STEM students, though with a small effect size.

Sharrar, 2017. Purpose: understand how students’ perceptions of library course guides affect their intent to use them. Outcomes measured: students’ stated intentions to use a guide. Investigatory foci: Usage; Utility. Guide integration: course guide. N/Population: 47 undergraduate students who use course pages. Data sources: standardized survey. Findings (Positive): most found guides useful and relevant to their needs.

Sinkinson et al., 2012. Purpose: open card sort study comparing undergraduate, graduate, and librarian perceptions and expectations of library guide content. Outcomes measured: user content expectations. Investigatory foci: Usability; Utility. Guide integration: subject guide. N/Population: 30 participants in three groups: undergraduates, graduate students, and librarians. Data sources: pre/post survey; usability testing. Findings (Mixed): differences detected between undergraduate and graduate student users.

Slemons, 2013. Purpose: regress guide use against design and usability standards to understand the relationship. Outcomes measured: average guide page hits per month, per page. Investigatory foci: Usage. Guide integration: course guide; subject guide. N/Population: N/A; usage stats for two years. Data sources: web stats. Findings (Mixed): more content was associated with less use; adherence to design standards was associated with use.

Smith, 2007. Purpose: overview of using meta-assessment to evaluate annual LibGuides use. Outcomes measured: results from multiple regression analysis of guide use stats. Investigatory foci: Usage; Utility. Guide integration: course guide; subject guide. N/Population: N/A; examined annual guide use stats per month. Data sources: web stats. Findings (Mixed): identified significant differences in use for some subject areas over others.

Soskin & Eldblom, 1984. Purpose: examine the problems and potential benefits of a term paper in an upper-division economics course using three years of data. Outcomes measured: informal assessment of the effectiveness of library instruction and the guide. Investigatory foci: Usage. Guide integration: supplemental to library instruction; course guide; print-based. N/Population: N/A; students enrolled in an economics class. Data sources: citation analysis; assignment analysis. Findings (Neutral): small relationship between number of sources cited and grade; no relationship between number of source types and assignment grade.

Stone et al., 2018. Purpose: comparative investigation of pedagogical and pathfinder guide designs and their impact on student learning. Outcomes measured: retention of learning; student perceptions of guide effectiveness. Investigatory foci: Satisfaction; Utility; Evidence of learning. Guide integration: supplemental to library instruction; course guide; subject guide. N/Population: 43 dental hygiene students. Data sources: survey; pre/post-test performance; web stats; assignment analysis. Findings (Positive): students using the pedagogical guide showed increases in perceptions, use, and grade performance over the pathfinder.

Tang & Tseng, 2014. Purpose: examine distance students’ attitudes toward library help services. Outcomes measured: preferences and attitudes for receiving help; self-efficacy for online learning. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: subject guide; course guide. N/Population: 220 distance students. Data sources: standardized survey. Findings (Mixed): library guides were the most commonly used library assistance tool, but overall use was low.

Thorngate & Hoden, 2017. Purpose: compare students’ use of three different guides to understand how guide layout and the spatial distribution of components affect interaction. Outcomes measured: student understanding of the guide’s purpose; task completion; satisfaction and preferences regarding content and layout. Investigatory foci: Usability. Guide integration: subject guide. N/Population: 30 students representing a wide range of demographic characteristics. Data sources: test performance; usability testing. Findings (Mixed): students had design and layout preferences.

Tomlin et al., 2017. Purpose: understand students’ use of library resources. Outcomes measured: students’ use and perceived usefulness of library guides. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: course guide; subject guide. N/Population: 182 surveyed and 30 interviewed graduate and undergraduate students at two campuses. Data sources: survey; interviews. Findings (Mixed): most students did not use library guides, but those who did reported strong satisfaction with them.

Wharton & Pritchard, 2020. Purpose: assessment of LTI integration after three years of Canvas course integration. Outcomes measured: perceived usefulness of, satisfaction with, and use of library guides integrated in the LMS. Investigatory foci: Usage; Satisfaction; Utility. Guide integration: embedded into LMS; supplemental to library instruction; course guide; subject guide. N/Population: more than 500 fully online students surveyed. Data sources: survey; web stats. Findings (Positive): nearly half of students surveyed reported using guides; most found them helpful.

Copyright Erica Lynn DeFrain, Leslie Sult, Nicole Pagowsky


This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
