
Academic Librarians’ Contribution to Information Literacy Instruction and Learning

Using data from a learning module embedded in all first-year seminars, researchers found evidence suggesting that librarians are uniquely qualified, relative to other campus faculty, to deliver information literacy instruction. The study analyzes writing assignments from first-year modules taught by either librarians or campus faculty across two academic years. The data indicate that students met the learning objectives more often in modules taught by librarians. The outcome demonstrates the centrality of the librarian’s role in information literacy instruction and student learning and helps substantiate the value of academic libraries.

Introduction

In keeping with Ranganathan’s theory that the library is a growing organism, library instruction continues to evolve, adjusting to a changing environment to ensure its survival.1 To that end, there has been a shift from teaching bibliographic sessions about library resources and services to facilitating student learning focused on thinking critically about information and engaging in reasoned processes to evaluate its reliability. Moreover, in response to an era marred by social, political, and economic upheavals, the discipline of information literacy is no longer a library-centric topic but a critical competency that applies to all academic content areas.

Higher education often views academic librarians as subject matter experts within the information literacy landscape. Historically, they have been at the forefront as teachers and curricular consultants who work with campus faculty on all issues related to information literacy.2 However, despite these endeavors, there appears to be a lack of sufficient research to substantiate the academic librarian’s contribution to student learning in the classroom.

The research reported here strongly indicates that librarians are distinctively qualified, compared to other campus faculty, to deliver information literacy instruction in the classroom. The study is unique because it reports on information that surfaced when working with data from two different cohorts enrolled in the same program over two different academic years. It expands on what is known about librarians leading information literacy instruction.

The study utilizes qualitative assessments, including constructivist grounded theory, to report on information derived from direct measures of student learning from a module embedded in every section of a required first-year seminar class over two years. The module addresses the importance of evaluating information sources on the Internet and the role fake news plays in undermining news consumers’ right to be informed in a democratic society.

Initially, the researchers focused on assessing whether students met the learning objectives, while remaining open to any other outcomes that surfaced. After a close review of emerging information, the results indicate the librarians’ positive impact on student learning when measured against similar instruction delivered by campus faculty. The results contribute to the steadily increasing research on the impact of academic libraries over the past decade and further substantiate the value librarians bring to information literacy instruction and the colleges and universities they serve.

Literature Review

The Value of Academic Libraries

Doomsday discussions about the demise of libraries are nothing new. Over the years, shrinking budgets, evolving technology, and ubiquitous access to information have threatened their existence. While academic libraries assumed they were protected by the infrastructure of higher education, Sullivan’s Academic Library Autopsy Report: 2050 sardonically projected their bleak mortality.3 Sullivan’s piece triggered alarms, skepticism, and mobilization.

Too often, stakeholders fail to see the merit of academic libraries beyond “underutilized, expensive storehouses.”4 As colleges and universities viewed other academic units as more impactful, they began shifting their resources. As a result, libraries experienced reduced brick-and-mortar real estate, declining budgets, and a shrinking workforce.5 During this time, the Association of College and Research Libraries’ (ACRL) 2010 Value of Academic Libraries: A Comprehensive Research Review and Report responded by explaining the plight of academic libraries. The Value of Academic Libraries report suggested recovery required libraries to move beyond defending “knowledge for knowledge’s sake” and instead prove their value.6 ACRL, along with accrediting agencies and academic stakeholders, petitioned academic libraries to define their contributions to institutional worth with supporting evidence explicitly linking libraries and librarians to student learning and academic success, as well as to enrollment and retention improvements and graduation rates.

In 2017, ACRL issued the report Academic Library Impact: Improving Practice and Essential Areas to Research, addressing the lack of consensus on how academic libraries could best demonstrate their value. The document was the product of an extensive review of the library and information science and higher education literature, focus group interviews with library administrators, and interviews with campus provosts. As a result, Academic Library Impact identified six “priority areas” for research and practice, including “quantifying the library’s impact on student success” and “enhancing teaching and learning” as two of the “action-oriented” ways that “libraries could increase student learning and success while communicating their value to higher education stakeholders.”7 The report also identified actions for developing programs, collections, and spaces, further specifying that campus provosts expressed particular interest in libraries establishing value through quantification rather than qualification, mission alignment, and strategy.

A subsequent paper by Cheng and Hoffman amplified the range of perspectives represented in the Academic Library Impact report. Their study, investigating the library’s impact on student success, involved practicing academic librarians, researchers, administrators, and others. It found that librarians’ views differed vastly from those of library administrators and provosts. Cheng and Hoffman remarked that professional librarians valued “practical, action-oriented results” over a desire to “quantitatively communicate value” to external audiences, while library deans and provosts were data-driven, less receptive to qualitative inquiries on the library’s impact, and more invested in research that could prove value and persuade campus administrators on a myriad of issues, including budgeting.8

Doucette performed a content analysis of 2006–2014 library papers on assessment, published as part of the biennial Library Assessment Conference proceedings. This work sought to uncover the factors influencing the push for library assessment and identify whose values these assessments represented.9 Doucette analyzed 39 assessment papers, noting that 92 percent of these studies contained at least one motivation for improving the library.10 Meanwhile, 46 percent of these papers were explicitly driven by requirements to strategically prove or demonstrate value by justifying, establishing, or illustrating something to higher-level stakeholders.11

Taken together, the Academic Library Impact report and the work of Cheng and Hoffman and of Doucette emphasize the tensions that arise when library values are juxtaposed with institutional aims and when the inquiry rests on ‘proving’ rather than ‘improving.’ Additionally, these papers reveal that libraries and librarians have difficulty adopting business-driven practices in a service-based, not-for-profit environment. Understanding the motivation behind the need to link academic libraries with student learning is essential because provosts and other upper-level administrators make the critical financial decisions that affect library operations.

Demonstrating value and contributing to student success is increasingly essential for academic libraries. They are now required to participate in the push to provide quantitative evidence on their role in student learning and success. Moreover, establishing a library’s influence on learning gives the library an edge in institutional decision-making, particularly when vying for resources, personnel, and funding.

As early as 2007, Lynch et al. reported that the library had been displaced in its symbolic role as the “heart of an academic institution.”12 The authors found that university leaders were less inclined to reduce library budgets when library administrators employed strategies that connected the “functional role of the library in service to the university’s [values and] mission,” observing that this was the information leaders were looking for to provide ongoing levels of budgetary support.13 Some ten years later, Murray and Ireland’s 2018 study surveyed provosts and chief academic officers about their perceptions of academic libraries and value. Their findings echoed much of Lynch et al.’s earlier work. They reported that 72 percent of their respondents looked favorably on continuing library budgetary support when accompanied by data that demonstrated correlations linking the use of library resources and services with student academic success.14 The research reported here confirms that one of the primary ways librarians can quantitatively communicate value to the institutions they serve is by documenting how information literacy instruction and other collaborative work impact student success and learning.

Academic libraries continue to build evidence correlating information literacy instruction with student research and learning.15 Much of the literature suggests a relationship between library instruction and student success indicators, such as GPA, retention, and course grades.16 While formal assessments of information literacy learning objectives are crucial, student success measures are a starting point for proving the value of information literacy instruction. One study found a statistically significant increase in GPA among graduating students who took library classes (n = 1,265) over students who received no library instruction (n = 115).17 Often, research focuses on collaborations between libraries and first-year programs as a means to instill early opportunities for impacting student success.18 However, a growing number of studies indicate that a scaffolded approach, with increasingly advanced information literacy instruction throughout students’ academic careers, significantly impacts learning and success.19

Constructivist Grounded Theory

The current study employs a constructivist grounded theory approach, described by its originator, Charmaz, as a more “contemporary version” of Glaser and Strauss’s initial grounded theory work.20 The researchers employed this theory because it more closely aligns with their philosophical view that one cannot escape prior knowledge and should instead examine and understand how this knowledge might influence one’s perspective. In agreement with Charmaz, the constructivist version also “fosters asking probing questions about the data and scrutinizing the researcher and the research process.”21

Grounded theory seeks to develop theory grounded in data rather than in preconceived hypotheses. Established in 1967 by Glaser and Strauss, it has become a well-known method of inquiry in social research. Grounded theory is an inductive research methodology that bridges the gap between research and theory development by “discovering theory from data that has been systematically obtained.”22 However, grounded theory has evolved and now includes several distinct “genres” within the larger framework.23

Charmaz further states that constructivist grounded theory allows one to position theory based on “historical, social, and situational conditions.”24 Such is the case in this initiative, where the researchers wanted to understand the disparate outcomes associated with administering these modules. Consequently, constructivist grounded theory lends visibility and gives voice to data that may otherwise have gone undetected. The approach provides opportunities for learning that can expand efforts to best support the information literacy needs of students enrolled in the First-Year Seminar program, given the specific circumstances. Priya argued in 2016 that constructivist grounded theory is instrumental in building middle-range theories, those that help people describe, understand, and construct meaning from problems or phenomena that occur in everyday practice,25 very much like those experienced in the first-year seminar modules.

Background

In 2016, the director of first-year seminars at a private university in New York State asked librarians to replace a standardized information literacy exam with a one-session learning module embedded in all first-year seminar classes. The director requested that the learning module provide an information literacy foundation by incorporating the first-year seminar reading chosen annually for all incoming classes. While the intention was to have librarians teach the information literacy module, first-year seminar instructors representing campus faculty had the academic freedom to teach it themselves.

An instructional design librarian (IDL) worked with other librarians to create the information literacy module that evaluated resources in the context of fake news related to the themes of the first-year reading. In general, the module’s purpose was to teach students who “remain unprepared to navigate the digital landscape”26 to distinguish between alternative facts and legitimate online information sources. The librarians also hoped that this essential aspect of information literacy would become a foundational springboard for future collaborations with faculty in subsequent academic years.

The IDL designed the one-shot module using a flipped-classroom approach requiring students to interact with materials before attending an in-person class session. The librarians believed that this pedagogical method led to more efficient use of class time during the quintessential one-shot library class.27 The module’s materials, including activities, readings, videos, and assignments, were embedded in the learning management system (LMS) across the forty-five sections of the first-year seminar course.

During the in-person class session, the students worked in small groups to assess the validity of online news and information sources centered on the first-year reading themes. After the activity, the groups shared their findings with the class. The instructor’s role, whether campus faculty or a librarian, was to guide and amplify this discussion to reinforce best practices for managing fake news within the context of the assignment’s objectives. After the session, the course’s first-year instructor was expected to assign the following short reflective writing assignment, graded on a pass/fail basis:

  • What best practices and media literacy tools do you plan to use when consuming news and Internet information?
  • What issues concern you the most moving forward as a news consumer with the right to be informed of the truth?

Data about learning outcomes were collected for 2017 and 2019. The instructional module was identical in both years except for the framing of the assignments around each year’s first-year seminar reading and the role academic librarians played in its implementation. In 2017, librarians offered to teach the in-person class at the instructor’s request and to provide any additional support. As a result, librarians taught 84 percent of the fake news modules in first-year seminar classes. The library also provided teach-the-teacher instruction and a lesson plan outlining the best pedagogical approaches to teaching the module. By 2019, the departure of seven full-time librarians due to attrition made it impossible to provide the same level of support. The library offered limited assistance when available, but librarians were not tapped to deliver any information literacy modules or to provide teach-the-teacher instruction to the first-year instructors delivering the module in 2019.

Methodology

Overview

The study involved analyzing reflective writing assignments related to a fake news module. While the assessment focused on whether the students met the learning objectives, the researchers remained open to whatever other information might be unearthed. In this type of inductive approach, the investigators do not seek to prove a hypothesis when analyzing the data but instead let concepts and patterns emerge from the data itself. As the analysis progressed, the researchers constructed tentative ideas about the data and contextualized them further by looking at the data’s properties, such as the year, the instructor’s qualifications, library involvement, and the reading choice. In the context of this study, the reflective writing scores representing how well students met the objectives were mapped to whether a librarian or other faculty member taught the module. The data suggest a relationship between improved student learning and instruction delivered by librarians rather than by other instructors.

Data collection and analysis occurred from spring 2019 to winter 2020. To keep the number of writing samples manageable, the researchers randomly chose one writing sample per class for 2017 (n = 31) and 2019 (n = 28). Of note, writing samples were unavailable for analysis from 16 percent of the 2017 sections and 30 percent of the 2019 sections because the assignment was either not given by the first-year instructor or not submitted electronically via the LMS. Even so, the sample size proved large enough to reach data saturation, as no new data emerged to warrant additional thematic codes during the analysis. The investigators randomly assigned alphanumeric labels to each sample to ensure a blind review and separately recorded the label, year, section, and module’s instructor in a spreadsheet for future reference.
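
A small script makes the sampling and blinding procedure concrete. This is a minimal sketch, not the researchers’ actual tooling: the data layout, function name, and label scheme (a lowercase letter plus two digits, echoing the identifiers in Appendix D) are assumptions.

```python
import random
import string

def sample_and_blind(sections, seed=None):
    """Pick one writing sample per section and assign a blind label.

    `sections` maps a section ID to that section's submitted samples.
    Returns the blinded samples and a separate label-to-section key.
    """
    rng = random.Random(seed)
    used, blinded, key = set(), [], []
    for section_id, samples in sections.items():
        text = rng.choice(samples)  # one randomly chosen sample per class
        while True:                 # letter-plus-digits label, e.g., "a38"
            label = rng.choice(string.ascii_lowercase) + str(rng.randint(30, 49))
            if label not in used:
                used.add(label)
                break
        blinded.append({"label": label, "text": text})
        key.append({"label": label, "section": section_id})  # kept separately
    return blinded, key

# Hypothetical usage with two 2017 sections:
blinded, key = sample_and_blind(
    {"FYS-01": ["essay A", "essay B"], "FYS-02": ["essay C"]}, seed=7)
```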

In general, the analysis included 1) an initial coding phase of deconstructing and coding the written samples, 2) a focused coding phase of inductively organizing the codes into themes as they relate to the objectives, 3) an objective assessment phase of revisiting and scoring each sample using a rubric representing how well the student responses met the module’s learning objectives, and 4) comparing the objective scores to whether a librarian or other instructor taught the module. Throughout the initial and focused coding phases, the IDL applied Charmaz’s grounded theory phases described below in more detail. The analysis was iterative within and among the phases; when new information emerged during coding and categorizing, the investigator revisited, reviewed, and revised previous codes and themes.

The researchers decided that as the course developer, the IDL was best suited to analyze and rate the students’ written responses. They used this approach because they believed the IDL’s intimacy with the content would elicit greater insight into the open-ended writing prompts. Glaser refers to this tactic as furthering “theoretical sensitivity,”28 because it brings “analytic precision to the work.”29 This study also used an analytic rubric and Cronbach’s alpha to measure a single observer’s reliability.

Initial Coding Phase

The initial coding phase aimed to deconstruct student responses into distinct descriptions to dig deeper into their meaning. During this phase, the IDL read each response line-by-line and assigned short descriptions in the form of actions using gerunds, not topics. This tactic allowed movement through the data to answer the question, “What is the student trying to communicate here?” The answer was applied in conjunction with the iterative process of reviewing prior codes as new data and patterns emerged. Coordinating these strategies ensured that the coding remained organic and unforced. Examples of initial codes during this phase included “verifying information using a secondary source,” “using skepticism when reading news,” “believing that news media greatly affects personal ideals,” and “recognizing the right to be informed with the truth.”
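
A record structure as simple as the one below is enough to capture this phase: each entry ties one gerund-phrase code to a sample and line so that codes can be revisited as new patterns emerge. The structure is hypothetical; the codes are the examples quoted above.

```python
from collections import defaultdict

# One record per coded line of a blinded writing sample (structure assumed).
initial_codes = [
    {"sample": "c37", "line": 3, "code": "verifying information using a secondary source"},
    {"sample": "c37", "line": 7, "code": "using skepticism when reading news"},
    {"sample": "k35", "line": 2, "code": "believing that news media greatly affects personal ideals"},
    {"sample": "k35", "line": 5, "code": "recognizing the right to be informed with the truth"},
]

# Group codes by sample for iterative review and revision.
by_sample = defaultdict(list)
for rec in initial_codes:
    by_sample[rec["sample"]].append(rec["code"])
```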

Focused Coding Phase

Focused coding further defined emerging themes and subthemes from the initial coding. Here, the information was organized into logical buckets, with some buckets fractured further into more nuanced subcategories to ensure greater consistency among the codes. For instance, the theme “trustworthy/authoritative sources” contained subthemes naming specific sources students identified as trustworthy (e.g., CNN, research databases, Google Scholar, peer-reviewed journals). Next, the themes were organized according to the module’s learning objectives to form a codebook (Appendix A). Finally, the IDL used the codebook to re-analyze the samples and assign the objectives and themes to the written content (Appendix B).
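
A codebook of this kind maps naturally onto a nested structure. The sketch below encodes several Objective 3 themes and subthemes from Appendix A and shows one possible lookup from an initial code to its focused themes; the helper function is illustrative, not the study’s tooling.

```python
# Excerpt of the Objective 3 codebook (themes and subthemes from Appendix A).
codebook_obj3 = {
    "Corroboration": ["finds two or more agreeing sources", "lateral reading",
                      "using reliable/primary sources"],
    "Trustworthy/authoritative sources": ["uses journals", "mainstream news sources"],
    "Sourcing": ["checks links", "verifies original sources"],
    "Sniff Test": ["sounds too good to be true", "why do I want it to be true"],
    "Author credibility": ["background check"],
    "Verify images": ["Google image search", "Tineye"],
}

def themes_for(initial_code: str) -> list[str]:
    """Return the Objective 3 themes whose subthemes appear in an initial code."""
    text = initial_code.lower()
    return [theme for theme, subs in codebook_obj3.items()
            if any(sub.lower() in text for sub in subs)]

assert themes_for("verifying information using lateral reading") == ["Corroboration"]
```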

Objective Assessment Phase

Because writing assessments are subjective and prone to reliability issues, the researchers developed an analytical rubric to assess the samples according to the learning objectives, since “any assessment without a scale is based on subjective judgments and general impressions.”30 Educators and researchers commonly accept that rubrics add to the consistency of single raters.31

The rubric explicitly defined the criteria for assessing how well the writing sample met the module’s learning objectives:

Objective One: Discuss objectivity, fairness, and balance in the context of fake news, disinformation, and misinformation.
Objective Two: Identify personal concerns as news consumers with the right to be informed.
Objective Three: Define the best practices and tools for evaluating news and information.

The ratings included proficient (3 pts), emerging (2 pts), beginning (1 pt), and not met (0 pts) (Appendix C). Before applying the rubric, the researchers agreed on how to apply the categories, then scored the same set of writings and discussed the outcome of these scores until they reached a consensus. The IDL used the consensus as a framework to score each writing sample.
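
For Objective 3, the rubric ties the rating directly to how many distinct verification practices a response names (see Appendix C). Below is a minimal sketch of that scoring rule; the function name is illustrative.

```python
def rate_objective_three(num_practices: int) -> int:
    """Map a count of distinct verification practices to a rubric score,
    following the Appendix C criteria for Objective 3."""
    if num_practices >= 5:
        return 3  # proficient: clear and varied best practices (5+)
    if num_practices >= 3:
        return 2  # emerging: adequate best practices (3-4)
    if num_practices >= 1:
        return 1  # beginning: minimally adequate best practices (1-2)
    return 0      # not met: lacks best practices

assert rate_objective_three(6) == 3 and rate_objective_three(2) == 1
```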

Intra-rater Reliability

This study verified intra-rater reliability using Cronbach’s alpha as a measure of internal consistency, that is, how closely the single rater re-assessed the same writing samples. Measuring alpha, technically not a statistical test but a coefficient of reliability, was vital because it evaluated the accuracy of the interpretation of the writing samples by the IDL.32

While intra-rater reliability is most often reported in the medical literature, it is seldom reported in social science or educational research despite its importance.33 Ideally, intra-rater reliability is estimated by having the rater read and evaluate each paper more than once. In practice, this approach is infrequently used due to time constraints and because two readings of the same essay by the same rater are not considered genuinely independent.34

To measure alpha, the researchers randomly selected ten writing samples, assigned new identification numbers, and mixed them into the existing data. The rater blindly assessed the samples a second time using all analysis phases—coding the responses, mapping the course objectives, and scoring the samples according to the rubric. Next, SPSS was used to calculate the internal consistency of the rubric scores assigned to identical writing samples. Consistency measures of 0.70 or greater are deemed acceptable in the literature.35 This study showed acceptable levels of consistency with alpha reliability coefficients ranging from .853 to .942 (table 1).
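
For readers who want to reproduce the reliability check outside SPSS, the sketch below computes Cronbach’s alpha over first- and second-pass rubric scores using the standard formula. The score vectors are hypothetical, since per-sample scores for the re-rated subset are not published.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_samples x k_ratings) matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of row totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each rating pass
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of per-sample totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical first- and second-pass scores for the ten re-rated samples
# on one objective (actual per-sample scores are not published):
first_pass  = np.array([2, 3, 1, 0, 2, 1, 3, 2, 1, 0])
second_pass = np.array([2, 3, 1, 1, 2, 1, 3, 2, 0, 0])
print(round(cronbach_alpha(np.column_stack([first_pass, second_pass])), 3))  # 0.95
```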

Table 1. Consistency Measures

Objective    Cronbach's alpha (n = 10)
1            .853
2            .875
3            .942

Comparing Objective Ratings to Instructor

During this phase, researchers investigated whether the student writing samples met the module’s objectives when taught by a librarian versus campus faculty. This was done in two steps. First, they recorded each student’s objective ratings of not met (0), beginning (1), emerging (2), or proficient (3) and whether a librarian taught the module (Appendix D). Next, the researchers calculated the rating percentage for each objective according to whether the module was librarian-taught versus campus faculty-taught (Appendix E).
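
A few lines of pandas can produce percentage breakdowns like those in Appendix E from a tidy table of ratings; the toy data and column names here are hypothetical.

```python
import pandas as pd

# One row per (sample, objective) rating, flagged by instructor type (toy data).
ratings = pd.DataFrame({
    "objective": [1, 1, 1, 1, 2, 2, 2, 2],
    "rating":    [2, 0, 1, 3, 1, 2, 1, 3],
    "librarian": ["Y", "N", "N", "Y", "N", "Y", "N", "Y"],
})

# Percentage of each rating within instructor type, per objective,
# mirroring the layout of Appendix E.
for obj, group in ratings.groupby("objective"):
    pct = pd.crosstab(group["rating"], group["librarian"], normalize="columns") * 100
    print(f"Objective {obj}\n{pct.round(2)}\n")
```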

Results

For objective one, about 29 percent of the writing samples from librarian-taught modules received ratings of emerging (2) or proficient (3), and 7 percent of the writing samples from campus faculty-taught modules received the same ratings. In addition, about 71 percent of the writing samples from librarian-taught modules received ratings of not met (0) or beginning (1), and 93 percent of the writing samples from campus faculty-taught modules received the same rating.

Regarding objective two, approximately 51 percent of the writing samples from librarian-taught modules received a rating of emerging (2) or proficient (3), and 18 percent of the writing samples from campus faculty-taught modules received a similar rating. About 48 percent of the writing samples from librarian-taught modules received a rating of not met (0) or beginning (1), whereas 82 percent of the writing samples from campus faculty-taught modules received the same rating.

For objective three, roughly 80 percent of the writing samples from librarian-taught modules received a rating of emerging (2) or proficient (3), and 43 percent of the writing samples from campus faculty-taught modules received the same rating. Also, about 19 percent of the writing samples from librarian-taught modules received a rating of not met (0) or beginning (1), whereas 57 percent of the writing samples from campus faculty-taught modules were rated the same.

Finally, the cumulative calculations for all three objectives showed that 53 percent of the writing samples from librarian-taught modules received a rating of emerging (2) or proficient (3), compared to 23 percent of the writing samples from campus faculty-taught modules. About 46 percent of the writing samples from librarian-taught modules received an overall rating of not met (0) or beginning (1), whereas 77 percent of the writing samples from campus faculty-taught modules received the same rating. In other words, the share of samples rated emerging or proficient was 30 percentage points higher in librarian-taught modules, and the share rated not met or beginning was 31 percentage points higher in campus faculty-taught modules.

Discussion

The data show a strong relationship between information literacy modules taught by librarians and improved student learning. That said, a discussion of the factors that may have affected student learning outcomes other than the instructor’s knowledge or background provides essential insight.

The module’s content was consistent across the years and sections. The flipped-class approach helped level the knowledge playing field before students participated in the class activities. During class, students accessed almost identical content and assignments. The minor difference between the two years was that the class examples aligned with the first-year reading titles chosen for each year. In addition, all instructors used the same content to teach the module as prescribed by the IDL. Finally, the reflective assignments were essentially identical.

There are variables between the modules to consider. A notable difference was the involvement of the library and librarians in 2017 and 2019. In 2017, librarians played a significant role in the module’s implementation, delivered the majority of instruction, and provided training to faculty instructors who chose to deliver the module independently. In 2019, by contrast, the library had minimal involvement. Not coincidentally, the LMS statistics from 2019 also reflected a decrease in student engagement with the materials and assignments. The decline in engagement may also have affected the learning outcomes.

Limitations

Unaddressed contributing factors may also explain the differences between student performance in 2017 and 2019. First, the researchers speculate that the content of each first-year reading may have affected student engagement and perspective on evaluating sources and recognizing fake news. Classroom instructors’ anecdotal feedback indicated that students seemed more motivated to learn about the 2017 first-year reading, which was supported by numerous campus events including the author’s visit, than the 2019 novel, which had no associated events. A second potential factor could be the cognitive capabilities of the students themselves; however, as the institution’s recruitment and enrollment were relatively stable during this period, this is unlikely to be a contributing factor. It is also essential to recognize that the data reflect a sample of student work, and caution is needed when generalizing the results to the larger population. Finally, the instructors administering the course could also affect outcomes. There were some differences in who taught the course; however, most instructors remained the same.

Conclusions

Academic librarianship is changing. Higher education is increasingly asking libraries to prove their value, and librarians are playing an ever more significant role in facilitating student learning, particularly in evaluating resources. However, measuring and assessing the outcomes of this work continues to be the discipline’s Achilles heel. This insufficiency is often the most significant reason teaching librarians and library faculty cannot quantitatively demonstrate their contribution to student learning.

This study used data from a learning module embedded in a required first-year seminar to posit that librarians are uniquely qualified, compared to other campus faculty, to deliver information literacy instruction in classroom settings. The analysis of student written work demonstrated that learning improved when librarians taught information literacy classes. While a causal relationship can only be inferred, the evidence substantiates academic librarians’ role in student performance. Future research must continue to find ways to explicitly link libraries to student learning and academic success.

A vital lesson learned from this project was the importance of gaining campus faculty buy-in when implementing a large-scale library instruction module. While most first-year instructors understood the value of information literacy instruction early in students’ academic careers, some campus faculty viewed the module as “extra work.” Informal follow-up discussions revealed that faculty who supported information literacy had more students complete the pre- and post-class assignments. Conversely, a few campus faculty members who stated that the library module was too work-intensive for a first-year seminar class chose not to assign the pre-class work. As a result, the librarians who taught the module’s in-class portions indicated that they could not complete the instruction because of the time spent bringing students “up to speed.”

Librarians reflected on ways to promote the value of information literacy instruction to campus faculty. One librarian suggested explaining how to explicitly build upon and scaffold the module into other classes. Another suggested discussing how librarians can support campus faculty, already burdened by classroom demands, in future information literacy endeavors. Finally, all librarians agreed that they must remind faculty that information literacy is not just a “library” topic but manifests itself in all content disciplines and is critical to lifelong learning.

The strategies used here should be a call for other librarians to develop ways to meaningfully and measurably position and advocate for themselves within their universities. Using their institution’s LMS as a frame for devising content, modules, and other learning objects that include opportunities for authentic assessment will take ingenuity and planning, but it will almost certainly help establish quantitative mechanisms for documenting their worth according to administrative standards. Such validation is crucial to offset Sullivan’s doomsday predictions of libraries’ demise, particularly in the higher education landscape.36

Appendix A. Sample Codebook

Objective 3: Define the best practices and tools for evaluating news and information.

Theme                                Subtheme
Corroboration                        finds two or more agreeing sources, lateral reading, using reliable/primary sources
Trustworthy/authoritative sources    uses journals, science, mainstream news sources; avoids unknown/uncredible sites
Sourcing                             checks links, verifies original sources
Sniff Test                           sounds too good to be true; why do I want it to be true
Reliable URLs                        .org, .edu, .com, .gov, other
Grammar/spelling/punctuation         all caps, misspellings, poor grammar
Mechanical errors                    dead links, site not loading
Clickbait                            ads, selling something, persuasion
Author credibility                   background check
Verify images                        Google image search, Tineye

Appendix B. Sample Initial and Focused Coding

Student Writing Sample

In order to be sure that the content that I read on the Internet is reliable there are multiple precautions I can choose to take. The first step I can take is to research the author and the article to see whether or not the author himself has a credible knowledge base on whatever subject it is that I am researching. Secondly, I would look through the authors cited works to make sure that he or she is using sources that are credible and up to date. If the sources are not up to date or if the article itself is not up to date the article and\or sources may have information that has since been proven to be false. Finally, I can look for skewed information within the article I am reading that the author may have used to sway to readers to think a certain way that is bias and opinionated rather than factual.

My main concern in regards to being a news consumer is that it is especially evident today that authors journalist etc… often put their own opinions into their writing. This makes it difficult to find information that is credible and without bias. Society as a whole is negatively impacted by this because not only is there many examples of bias in news and reporting but the topics that are reported in a bias way are often topics that need facts and honest reporting the most. This also makes it especially hard for individuals to make their own conclusions and educate themselves on particular topics. In today’s society, it is essential that individuals have access to non-bias information so that they may create their own their own ideas without them being disrupted by the bias of another individual.

Initial Coding

  • Researching the author to verify they have credible knowledge on a subject.
  • Looking at authors’ cited works to be sure they are credible and up to date.
  • Recognizing that if sources are not up to date that current information may discredit the sources.
  • Looking for skewed information that may sway readers toward bias and opinionated rather than factual information.
  • Worrying that as a news consumer authors and journalists often include personal opinions in their writing.
  • Making it difficult to find credible information.
  • Identifying the negative impact bias news reporting has on society because such topics need facts and honest reporting.
  • Referencing the difficulties individuals have in drawing conclusions and educating themselves on topics.
  • Stating that it is essential for individuals to access unbiased information so they can create their own opinions.

Focused Coding

Obj1: Discuss objectivity, fairness, and balance
  • Biases/subjective reporting
  • Education and awareness are important to inform public

Obj2: Identify personal concerns
  • Expresses personal concern

Obj3: Define the best practices and tools
  • Author credibility
  • Currency
  • Looks for bias
  • Seek facts/stats over opinions

Appendix C. Rubric

Obj1: Discuss objectivity, fairness, and balance in the context of fake news, disinformation, and misinformation.

Criteria: The argument should include obstacles such as the spreadability of content for one’s own purposes, biased reporting through agenda setting, monetary incentives through clickbait and advertisements, news algorithms, drawing readership through slander, sensationalism, and a rush to report, and the public’s lack of awareness and education on the subject. The discussion should reflect the importance of verifying information, neutral and unbiased reporting, and freedom of speech.

  • Proficient (3): The response reflects a sophisticated, analytical understanding of what impedes and promotes objective, fair, and balanced news and information. The discussion includes clear and varied examples and evidence, as well as the importance of objective, fair, and balanced news and information.
  • Emerging (2): The response reflects an adequate analytical understanding of what impedes and promotes objective, fair, and balanced news and information. The discussion includes some examples and evidence regarding the importance of objective, fair, and balanced news and information.
  • Beginning (1): The response reflects a minimally adequate analytical understanding of what impedes and promotes objective, fair, and balanced news and information. The discussion includes few examples and lacks insight and a high level of sophistication regarding the importance of objective, fair, and balanced news and information.
  • Not Met (0): The response fails to recognize the obstacles to objective, fair, and balanced news and information. The discussion is unsupported by examples or evidence, nor does it reflect the importance of objective, fair, and balanced news and information.

Obj2: Identify personal concerns as news consumers with the right to be informed.

Criteria: The argument should reflect personal awareness and concerns about how fake news affects the self or individuals and the importance of questioning or validating information we use in our daily lives. The discussion should reflect the negative repercussions fake news may have on one’s personal life (deciding where to live, who to vote for, reputation among friends and family), academic life (disseminating misinformation as fact), or professional life (career choices, professional standing, or performance).

  • Proficient (3): The response reflects a sophisticated, analytical awareness of how fake news affects the self or individuals and includes a discussion about the importance of validating information. The discussion includes clear examples of how fake news may negatively affect one’s personal, academic, or professional lives.
  • Emerging (2): The response reflects an adequate analytical awareness of how fake news affects the self or individuals and the importance of validating information. The discussion includes some examples and critical insight regarding how fake news may negatively affect one’s personal, academic, or professional lives.
  • Beginning (1): The response reflects a minimally adequate analytical awareness of how fake news affects the self or individuals and the importance of validating information. The discussion includes few examples and lacks insight into how fake news may negatively affect one’s personal, academic, or professional lives.
  • Not Met (0): The response fails to recognize how fake news affects the self or individuals and the importance of validating information. The discussion is unsupported by examples or evidence, nor does it reflect how fake news may negatively affect one’s personal, academic, or professional lives.

Obj3: Define the best practices and tools for evaluating news and information.

Criteria: The response focuses on the specific ways in which students will verify information going forward. The discussion may include the following practices and tools: 1) performing lateral reading using fact-checking sites (i.e., factcheck.org), corroboration using authoritative resources (peer-reviewed journals, mainstream news outlets, science-based resources), “going upstream” by verifying the original source where the data or information originated, and verifying the credibility of the author/organization and of images or graphics (i.e., Tineye, Google reverse image search); 2) attending to elements internal to the information source, including the URL, mechanical errors/functional links, about/contact/author sections, monetary incentives such as clickbait/ads, trigger words (rumor, donate), site purpose (entertain, persuade, sell, inform), and close reading for bias or agendas; 3) applying general approaches, including using intuition or applying the “sniff test” (if it sounds too good to be true, then it probably is), becoming knowledgeable on a subject before reading (such as gathering background information or seeking multiple viewpoints), and recognizing personal beliefs/bias/motivations.

  • Proficient (3): The response includes clear and varied best practices (5+) for verifying information (i.e., lateral reading, internal elements, general approaches). The discussion includes examples of specific tools (i.e., Tineye).
  • Emerging (2): The response includes adequate best practices (3–4) for verifying information. The discussion may include examples of specific tools (i.e., Tineye).
  • Beginning (1): The response includes minimally adequate best practices (1–2) for verifying information. The discussion may include examples of specific tools (i.e., Tineye).
  • Not Met (0): The response lacks best practices for verifying information or examples of specific tools (i.e., Tineye).

Appendix D. Objective Ratings and Instructor Per Student (Sample)*

Student Identifier   OBJ 1 Rating   OBJ 2 Rating   OBJ 3 Rating   Librarian-taught Module? (Y/N)
a38                  2              3              2              N
b42                  1              2              2              N
c37                  1              2              2              Y
d34                  1              1              1              N
e35                  0              1              0              N
e47                  1              1              1              N
f38                  0              1              1              N
f45                  1              1              1              N
g39                  1              1              3              N
g44                  0              1              0              N
j33                  0              0              0              N
j37                  1              1              2              N
k33                  1              1              3              N
k35                  1              2              2              Y
k39                  0              1              1              N
k47                  1              2              3              Y
l35                  1              1              3              N
l46                  1              1              2              Y
m40                  1              3              3              Y
m44                  2              1              0              Y
m45                  0              0              1              N
n44                  1              1              3              Y
n45                  3              2              3              Y
o35                  1              1              3              N
o44                  2              2              2              Y
p37                  0              0              1              Y
p39                  1              1              1              N

*ratings = not met (0), beginning (1), emerging (2), or proficient (3)

Appendix E. Overall Objective Percentage Ratings by Instructor Type

OBJECTIVE 1

Rating Scale     Module Taught by Librarian   Module Taught by Campus Faculty
not met (0)      9.67%                        32.14%
beginning (1)    61.29%                       60.71%
emerging (2)     16.13%                       7.14%
proficient (3)   12.90%                       0%

OBJECTIVE 2

Rating Scale     Module Taught by Librarian   Module Taught by Campus Faculty
not met (0)      3.22%                        14.28%
beginning (1)    45.16%                       67.85%
emerging (2)     38.70%                       14.28%
proficient (3)   12.90%                       3.57%

OBJECTIVE 3

Rating Scale     Module Taught by Librarian   Module Taught by Campus Faculty
not met (0)      6.45%                        14.28%
beginning (1)    12.90%                       42.85%
emerging (2)     32.25%                       21.42%
proficient (3)   48.38%                       21.42%

OVERALL FOR ALL OBJECTIVES (1, 2, 3)

Rating Scale     Module Taught by Librarian   Module Taught by Campus Faculty
not met (0)      6.45%                        20.23%
beginning (1)    39.78%                       57.14%
emerging (2)     29.03%                       14.28%
proficient (3)   24.73%                       8.33%

Notes

1. Shiyali R. Ranganathan, The Five Laws of Library Science (Bombay: Asia Publishing House, 1964), 326.

2. Robert Farrell and William Badke, “Situating Information Literacy in the Disciplines,” Reference Services Review 43, no. 2 (June 2015): 329, https://doi.org/10.1108/RSR-11-2014-0052.

3. Brian T. Sullivan, “Academic Library Autopsy Report: 2050,” The Chronicle of Higher Education, January 2, 2011, https://www.chronicle.com/article/academic-library-autopsy-report-2050.

4. Carla J. Stoffle, Alan E. Guskin, and Joseph A. Boisse, “Teaching, Research, and Service: The Academic Library’s Role,” in Increasing the Teaching Role of Academic Libraries, ed. Thomas G. Kirk (San Francisco: Jossey-Bass, 1984), 3.

5. Mohammad Aslam, “Current Trends and Issues Affecting Academic Libraries and Leadership Skills,” Library Management 39, no. 1 (January 2018): 78–92, https://doi.org/10.1108/LM-10-2016-0076.

6. Association of College and Research Libraries (ACRL), The Value of Academic Libraries: A Comprehensive Research Review and Report, researched by Megan Oakleaf (Chicago: ACRL, 2010), 7.

7. Association of College and Research Libraries (ACRL), Academic Library Impact: Improving Practice and Essential Areas to Research, prepared by Lynn Silipigni Connaway et al. of OCLC Research (Chicago: ACRL, 2017).

8. James Cheng and Starr Hoffman, “Libraries and Administrators on Academic Library Impact Research: Characteristics and Perspectives,” College & Research Libraries 81, no. 3 (April 2020): 538–69, https://doi.org/10.5860/crl.81.3.538.

9. Lise Doucette, “Acknowledging the Political, Economic, and Values-Based Motivators of Assessment Work: An Analysis of Publications on Academic Library Assessment,” (paper, Library Assessment Conference, Arlington, VA, October 31-November 2, 2016), 288, https://perma.cc/HX6Y-J65A.

10. Ibid., 292.

11. Ibid., 293.

12. Beverly P. Lynch et al., “Attitudes of Presidents and Provosts on the University Library,” College & Research Libraries 68, no. 3 (May 2007): 213.

13. Ibid., 213–214.

14. Adam Murray and Ashley Ireland, “Provosts’ Perceptions of Academic Library Value & Preferences for Communication: A National Study,” College & Research Libraries 79, no. 3 (April 2018): 345–46.

15. Jacline L. Contrino, “Instructional Learning Objects in the Digital Classroom: Effectively Measuring Impact on Student Success,” Journal of Library & Information Services in Distance Learning 10, no. 3–4 (2016): 186–98, https://doi.org/10.1080/1533290X.2016.1206786; Brooke M. Robertshaw and Andrew Asher, “Unethical Numbers? A Meta-Analysis of Library Learning Analytics Studies,” Library Trends 68, no. 1 (2019): 76–101, https://doi.org/10.1353/lib.2019.0031.

16. Jennifer Rowe et al., “The Impact of Library Instruction on Undergraduate Student Success: A Four-Year Study,” College & Research Libraries 82, no. 1 (2021): 7, https://doi.org/10.5860/crl.82.1.7; Rebecca Croxton and Anne Moore, “Quantifying Library Engagement: Aligning Library, Institutional, and Student Success Data,” College & Research Libraries 81, no. 3 (2020), https://doi.org/10.5860/crl.81.3.399; Lucinda R. Wittkower, Joleen Westerdale McInnis, and David R. Pope, “An Examination of Relationships Between Library Instruction and Student Academic Achievement,” Journal of Library Administration 62, no. 7 (October 2022): 887–898, https://doi.org/10.1080/01930826.2022.2117953.

17. Ula Gaha, Suzanne Hinnefeld, and Catherine Pellegrino, “The Academic Library’s Contribution to Student Success: Library Instruction and GPA,” College & Research Libraries 79, no. 6 (2018): 737–46, https://doi.org/10.5860/crl.79.6.737.

18. Francesca Marineo and Qingmin Shi, “Supporting Student Success in the First-Year Experience: Library Instruction in the Learning Management System,” Journal of Library & Information Services in Distance Learning 13, no. 1–2 (April 2019): 40–55, https://doi.org/10.1080/1533290X.2018.1499235.

19. Melissa Bowles-Terry, “Library Instruction and Academic Success: A Mixed-Methods Assessment of a Library Instruction Program,” Evidence Based Library and Information Practice 7, no. 1 (March 9, 2012): 82, https://doi.org/10.18438/B8PS4D.

20. Kathy Charmaz, “The Power of Constructivist Grounded Theory for Critical Inquiry,” Qualitative Inquiry 23, no. 1 (January 2017): 34, https://doi.org/10.1177/1077800416657105.

21. Ibid.

22. Barney G. Glaser and Anselm L. Strauss, The Discovery of Grounded Theory: Strategies for Qualitative Research (New Brunswick, NJ: AldineTransaction, 1967), 1.

23. Ylona Chun Tie, Melanie Birks, and Karen Francis, “Grounded Theory Research: A Design Framework for Novice Researchers,” SAGE Open Medicine 7 (January 2019): 2–3, https://doi.org/10.1177/2050312118822927.

24. Charmaz, “The Power of Constructivist Grounded Theory for Critical Inquiry,” 34.

25. Arya Priya, “Grounded Theory as a Strategy of Qualitative Research: An Attempt at Demystifying its Intricacies,” Sociological Bulletin 65, no. 1 (January/April 2016): 50–68, http://www.jstor.org/stable/26368064.

26. Joel Breakstone et al., “Students’ Civic Online Reasoning: A National Portrait” (executive summary, Stanford History Education Group & Gibson Consulting, Stanford Digital Repository, November 14, 2019): 26, https://purl.stanford.edu/gf151tb4868.

27. Jacqueline O’Flaherty and Craig Phillips, “The Use of Flipped Classrooms in Higher Education: A Scoping Review,” The Internet and Higher Education 25 (April 2015): 85–95, https://doi.org/10.1016/j.iheduc.2015.02.002.

28. Barney G. Glaser, Theoretical Sensitivity: Advances in the Methodology of Grounded Theory (Mill Valley, CA: Sociology Press, 1978).

29. Kathy Charmaz, Constructing Grounded Theory, 2nd ed. (Thousand Oaks, CA: Sage Publications, 2014), 160.

30. Ulaş Kayapinar, “Measuring Essay Assessment: Intra-Rater and Inter-Rater Reliability,” Eurasian Journal of Educational Research 14, no. 57 (October 2014): 114, https://doi.org/10.14689/ejer.2014.57.2.

31. Anders Jonsson and Gunilla Svingby, “The Use of Scoring Rubrics: Reliability, Validity and Educational Consequences,” Educational Research Review 2, no. 2 (January 2007): 130–44, https://doi.org/10.1016/j.edurev.2007.05.002.

32. Mohsen Tavakol and Reg Dennick, “Making Sense of Cronbach’s Alpha,” International Journal of Medical Education 2 (June 2011): 53–55, https://doi.org/10.5116/ijme.4dfb.8dfd.

33. Emily Saxton, Secret Belanger, and William Becker, “The Critical Thinking Analytic Rubric (CTAR): Investigating Intra-Rater and Inter-Rater Reliability of a Scoring Mechanism for Critical Thinking Performance Assessments,” Assessing Writing 17, no. 4 (October 2012): 251–70, https://doi.org/10.1016/j.asw.2012.07.002.

34. Yoav Cohen, “Estimating the Intra-Rater Reliability of Essay Raters,” Frontiers in Education (September 2017): 49, https://doi.org/10.3389/feduc.2017.00049.

35. Anders Jonsson and Gunilla Svingby, “The Use of Scoring Rubrics: Reliability, Validity and Educational Consequences,” 133; Steven Stemler, “A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability,” Practical Assessment, Research, and Evaluation 9, no. 4 (2004): 1–11, https://doi.org/10.7275/96jp-xz07.

36. Sullivan, “Academic Library Autopsy Report: 2050.”

* Kimberly Mullins is an Associate Professor, First Year Experience and Social Sciences Librarian at Adelphi University, email: kmullins@adelphi.edu; Mary Kate Boyd-Byrnes is an Associate Professor and Reference and Instruction Librarian at Long Island University’s Post Campus, email: MaryKate.Boyd-Byrnes@liu.edu. ©2024 Kimberly Mullins and Mary-Kate Boyd-Byrnes, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.
