
When There’s No Information Literacy Requirement: Curriculum Mapping to Drive Engagement

Curriculum mapping provides valuable opportunities for internal reflection and external advocacy in academic libraries. Librarians at a small liberal arts college developed a curriculum mapping project designed to measure information literacy interventions with students, despite the lack of a standardized set of courses that all students take during their time at the institution. The project incorporated both quantitative scoring and qualitative reflections by liaison librarians to determine the extent of information literacy-focused engagements with students, and allowed librarians to target interventions in a way designed to reach as many students as possible during their undergraduate careers.

Introduction

Although assessment in academic libraries has often revolved around baseline data such as headcounts, in recent years the conversation has shifted to focus more on assessments of engagement and impact.1 These assessments look for evidence of interactions that make a meaningful difference in students’ academic careers and that are connected with, rather than incidental to, the curriculum. Ultimately, this means considering the reach of high-quality and/or impactful interactions across the student body and aligning librarians’ instructional activities with the learning outcomes and values of the library, institution, and/or profession.

At the College of the Holy Cross, all instruction is already tailored to the goals of each course and incorporates the values and priorities of the libraries and the broader institution. In this context, assessment efforts are primarily focused on the reach and scalability of the instruction program as measured by quantity (and, to some extent, depth) of interactions with students. Since Holy Cross does not require any single course or sequence of courses that librarians can visit, the ultimate goal is to engage with as great a percentage of the student body as possible; an understanding of the curriculum and of the opportunities for such engagement is therefore critical. This article describes the authors’ undertaking to develop that understanding through a mixed-methods curriculum mapping project.

Literature Review

Curriculum mapping is a widely accepted strategy both for obtaining an in-depth understanding of the academic curriculum and for other kinds of strategic planning around information literacy programming.2 The library literature reports many different strategic applications of traditional curriculum mapping, at the department3 and program levels4 as well as in related contexts such as outreach programming.5 In addition to informing internal library decision-making, curriculum mapping has been shown to be a useful tool for communicating with faculty; for example, librarians at Berkeley College actively involved faculty in their curriculum mapping process, which included the development of department-specific curriculum maps made available through their website.6 Additionally, librarians at Texas Tech’s Architecture Library turned to curriculum mapping when attempts to introduce information literacy-focused assignments failed, ultimately using their project to demonstrate the need for a scaffolded instruction program in place of one-shot sessions,7 and Ziegler made a similar case on the basis of a project at the University of West Florida.8

The term “curriculum mapping” is widely used to refer to activities which are designed to systematically align, and assess the alignment of, information literacy programs with curricula. In the literature, however, this terminology may stand in for a variety of different techniques depending on an institution’s current programs, local needs, and the resources (administrative or otherwise) available to librarians. One such method is syllabus analysis. When targeting a given program or department, there are many examples of librarians collecting syllabi and analyzing them to determine where library learning goals might fit into courses in an embedded, scaffolded way. Broadly, the literature is split into two types of syllabus analysis. The first type attempts to score or map syllabi against outside standards, such as the Association of College and Research Libraries’ (ACRL) Framework or Standards for Information Literacy or the Association of American Colleges and Universities’ (AAC&U) Information Literacy VALUE Rubric. Examples of this strategy include Beuoy and Boss’s use of a rubric to code syllabi based on the inclusion of elements from the ACRL Framework;9 Boss and Drabinski’s survey project that reviewed third- and fourth-year syllabi in a single department using a rubric based on the AAC&U VALUE Rubric, allowing librarians to determine where higher-level information literacy concepts might be introduced to students;10 and Buchanan et al.’s mapping of learning outcomes found in syllabi in different departments to the Information Literacy Competency Standards for Higher Education established by ACRL.11 The second strategy attempts to score or map syllabi based on learning goals and outcomes developed within the institution. Examples of this method include McGowan et al.’s analysis of the types of information literacy assignments included in syllabi;12 Smith et al.’s scoring of syllabi to determine the “degree of library use,” which included assigning materials found in the library as well as library instruction sessions;13 and Ziegler’s review of syllabi that analyzed use of program learning outcomes already developed by the home library and departments.14 Both strategies have their advantages. One of the primary benefits of using outside rubrics is that they are already validated; however, they can lack specificity or address needs that are not central to a given institution. Internally created rubrics can achieve specificity and address a program’s individual needs, but they are time-consuming to produce and validate.

Roadmapping is another technique which can be useful in situations where the current status and/or reach of a program is not fully understood. Roadmapping is mainly a means of gathering information on the layout and progression of curricula and identifying where information literacy already appears within that progression. Since, as Buchanan et al. aptly note, librarians do not typically have any authority over the curriculum,15 this process is less about curriculum design and more about identifying the progress that has already been made. The roadmapping technique can serve as a useful exercise for internal reflection and assessment. Detailed engagement with the institution’s course requirements may reveal nuances of an academic program or barriers to outreach which were not previously understood.16 It can also support communication with faculty, allowing librarians to target their specific needs, learning outcomes, and/or language,17 or to advocate for greater information literacy integration on the basis of specific skills and previous demand for instruction.18 More substantially, roadmapping can form a baseline for further work on information literacy integration in the curriculum, as at Cornell University.19 Gessner and Eldermire used a “retrospective teaching map” both to understand their teaching capacity (by inventorying their existing activities) and to identify where information literacy already fit into the academic curriculum.20 Like Holy Cross, Cornell does not have centralized requirements and allows for a number of paths through the undergraduate degree. Roadmapping, however, enabled librarians to quickly grasp (and reference as needed) the various major programs, core requirements, history of information literacy instruction, and more, facilitating both effective use of staff resources and higher-level planning for their information literacy program.21 The literature also offers ample evidence of roadmapping being used as a preliminary step in a more involved curriculum mapping project, or in conjunction with other techniques such as syllabus analysis.22

Strategies such as roadmapping and syllabus analysis are necessarily fairly qualitative, but it can be difficult to accurately assess a program’s reach, or to understand the program as a whole, without quantitative information. One means of reporting quantitative results from such projects is scoring. There are numerous examples in the literature of libraries using scoring formulas in conjunction with mapping projects to quantify the degree to which information literacy is present within particular programs, courses, etc. Specifically, scoring has been used to quantify such elements as the strength of a course’s candidacy for future information literacy integration23 or the sophistication of existing information literacy elements.24 At institutions where librarians have access to student-level data, scoring has also been used to calculate the instructional histories of individual students (i.e., whether they have had previous information literacy instruction), garnering a stronger sense of how students move through the course sequence and which courses might most effectively target the greatest number of students.25 Broadly, scoring- and rubric-type techniques are a common fixture in the library literature, and the authors have found scoring to be an effective method for understanding assessment data in previous projects. This was the primary basis for the authors’ choice of methodology in the present project.

Background

The College of the Holy Cross is a Jesuit, undergraduate-only, liberal arts college located in Worcester, Massachusetts (FTE approx. 3,000). The college’s current curriculum aims to provide students with maximum flexibility in their learning experiences. Students begin with the required first-year program, Montserrat, which consists of year-long seminars from across the college’s disciplinary departments, loosely grouped into themed clusters. Rather than a predetermined course sequence or required entry-level courses, students at the college select from the full range of the curriculum to fulfill 12 disciplinary requirements, in addition to requirements for the student’s selected major(s), minor(s) and/or concentration(s). Each academic program has its own set of requirements, which might consist of a standardized course sequence, a selection of electives from designated categories, and/or a minimum competency level (e.g., in some languages), among other combinations. The end result is that each student’s degree path is highly individualized.

Information literacy instruction, meanwhile, has been ad hoc, contingent largely on the rotation of courses and the strength of relationships (and overall communication) between individual faculty and librarians. While liaison librarians for the first-year communities were in place at the program’s inception in 2008, a teaching-focused liaison program for the major departments was not established until 2014. Most information literacy instruction at the college has historically been, and is still, driven by faculty request: faculty approach librarians with requests for instruction, and librarians tailor a session based on the syllabus, research assignment, and/or specific requests from the instructor. Thus, while librarians can point to strong relationships with specific faculty and may have an anecdotal sense of their level of engagement, the bigger picture is less clear. With no single class that all students are required to take, there is no information literacy module or session that all students are guaranteed to attend. All of these factors have limited librarians’ ability to accurately assess the extent of their reach or to ensure that all students receive appropriate and equal instruction in information literacy. The Research, Teaching & Learning (RTL) division in the college libraries mainly engages with students via two methods: course-tailored information literacy instruction sessions, and Personal Research Sessions (PRS)—30-minute, individual consultations typically focused on a single assignment or research project and designed both to teach information-seeking/evaluation skills and to provide students with supporting materials for their projects. Given the ad hoc nature of the existing instruction program, it was clear that the Holy Cross Libraries needed to conduct a curriculum mapping project to assess how many students were actually being reached and to what extent (e.g., whether students were reached evenly across all programs and departments).

Methodology

We felt strongly that we wanted to engage with each of our students at least once in the course of their college careers, but with no shared course across the curriculum and no attendance records for the majority of information literacy sessions, there was no good way to measure progress toward this goal. The purpose of the libraries’ curriculum mapping project was therefore not to determine how best to incorporate information literacy skills into the curriculum, but how to scale a customized instruction program to ensure librarians were reaching as many students as possible. In pursuit of this goal, the Libraries’ Teaching & Learning Team developed a formula to assign a single score to each course, which would allow RTL to succinctly and clearly convey their findings to non-library stakeholders.

Since it would be both logistically impossible and inefficient to interface with every single course at the college, the authors decided that it would be most effective to identify the required course sequences for each major-granting department, positing that each student would have to pass through at least one. These sequences, once identified, would guide our review of the results as well as future engagement efforts. As mentioned previously, each major-granting department at the college takes a slightly different approach to requirements. Thus, as the first step in our project, liaison librarians were asked to map out the requirements for their areas of responsibility. Liaisons created spreadsheets for their associated major-granting departments, listing specific required courses as well as upper-level requirements (e.g., the English department requires students to take one upper-level course for every major movement within English literature; all of the options for those requirements were listed). As part of this process, liaisons also identified the various course codes associated with requirements in their respective departments.
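To make the structure of these maps concrete, the sketch below shows one way such a requirements map might be represented in code. The department structure, course codes, and category names are hypothetical placeholders, not the actual Holy Cross requirements or the spreadsheets the liaisons produced.

```python
# A minimal sketch of a liaison's requirements map; all names are illustrative.
requirements_map = {
    "English": {
        "course_codes": ["ENGL"],          # course codes associated with the major
        "required_courses": ["ENGL 101"],  # specific courses every major must take
        "category_requirements": {
            # e.g., one upper-level course per major movement,
            # with every qualifying option listed
            "Movement A": ["ENGL 310", "ENGL 315"],
            "Movement B": ["ENGL 330", "ENGL 335"],
        },
    },
}

def critical_courses(dept: dict) -> set:
    """Flatten a department's map into the set of courses through which
    every major must pass in some combination."""
    courses = set(dept["required_courses"])
    for options in dept["category_requirements"].values():
        courses.update(options)
    return courses

print(sorted(critical_courses(requirements_map["English"])))
```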

The Assessment Librarian then aggregated and cleaned all consultation and information literacy instruction data from Fall 2013 through Spring 2018. This process involved requesting information about all courses from the Registrar’s Office; assigning a specific course number, section number, and faculty member to each instruction session and personal research session; and indicating the total number of students enrolled in each course that received library engagement. Once this information was entered and standardized, the Assessment Librarian calculated the engagement score for each course where at least one intervention (i.e., one research appointment or one instruction session) had occurred during this period, whether it was a “required” course or not. The score was designed to weigh the number of interventions/interactions against course enrollment and to weight information literacy sessions more heavily than individual research appointments with students. Since the goal of the project was to increase engagement with students, the Teaching & Learning Team agreed that it made sense to count each interaction with each individual student in the score, which led to the following formula:

Engagement Score = [(Number of Information Literacy Sessions × Course Enrollment) + Number of Individual Consultations] ÷ Course Enrollment × 100

For example, a course with 16 enrolled students, one information literacy session, and no individual consultations would receive a score of 100 ([(1 × 16) + 0] ÷ 16 × 100). The same course, if 3 of the 16 students had also booked individual consultations, would have a score of 118.75 ([(1 × 16) + 3] ÷ 16 × 100); conversely, if 3 of 16 students booked consultations but no information literacy session was held, the course would receive a score of 18.75 ([(0 × 16) + 3] ÷ 16 × 100).
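Expressed in code, the calculation is only a few lines. The following sketch simply restates the formula above; the function and variable names are ours, for illustration, and were not part of the project itself.

```python
def engagement_score(enrollment: int, sessions: int, consultations: int) -> float:
    """Each information literacy session counts as one interaction with every
    enrolled student; each individual consultation adds one interaction."""
    if enrollment == 0:
        return 0.0  # guard against missing enrollment data
    interactions = sessions * enrollment + consultations
    return interactions / enrollment * 100

# The worked examples from the text:
print(engagement_score(16, 1, 0))  # 100.0
print(engagement_score(16, 1, 3))  # 118.75
print(engagement_score(16, 0, 3))  # 18.75
```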

Detailed course data obtained from the Registrar’s Office was used to confirm enrollment numbers (which were not always available and/or provided accurately at the time instruction was scheduled) and to differentiate multiple sections of a single course, as well as to provide broader context (i.e., to understand how many courses had had library engagement in a particular area vs. how many courses were offered). RTL’s data collection procedures have changed substantially over the years, so, while data exists for AY2013–2014, the data for individual consultations was not granular enough to support robust analysis. As a result, our initial analysis ultimately only considered Fall 2014 through Spring 2018.
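The cleaning and matching steps described above lend themselves to a tabular workflow. The sketch below illustrates the general shape of such a step using pandas; the file and column names are hypothetical, since the project’s actual records were spread across differently formatted spreadsheets.

```python
import pandas as pd

# Hypothetical files and column names; the real records varied over the years.
interventions = pd.read_csv("interventions.csv")    # one row per session or PRS
registrar = pd.read_csv("registrar_courses.csv")    # one row per course section

# Keep only the semesters with sufficiently granular data (Fall 2014 onward).
interventions["date"] = pd.to_datetime(interventions["date"])
interventions = interventions[interventions["date"] >= "2014-09-01"]

# Attach confirmed enrollment and section information from the Registrar.
merged = interventions.merge(
    registrar[["course_number", "section", "semester", "enrollment"]],
    on=["course_number", "section", "semester"],
    how="left",
)

# Rows that cannot be matched to a real course section are excluded,
# mirroring the project's manual cleaning decisions.
merged = merged.dropna(subset=["enrollment"])
```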

Liaisons were provided with the engagement scores for all courses in their areas of responsibility during the period considered. Each liaison was responsible for reviewing their scores, comparing these scores against their list of major requirements/electives, and reflecting on the findings. As part of this process, liaisons generated a written narrative and reflection for each of their departments. While this took additional work and produced more subjective results, this step was important to account for anecdotal information that could contextualize the results. For example, one liaison identified inadequate and/or outdated collections as a likely cause of limited engagement in her area; another department had recently restructured their 100-level course sequence, where the bulk of instruction tended to occur; and in at least one other department, a drop in engagement levels correlated with the departure of certain heavily engaged faculty from the college. These reflections added nuance to the analysis and helped the authors differentiate permanent from temporary issues, as well as identify barriers that would require more systematic and creative effort to overcome.

Librarians also wanted to incorporate data about the first-year program, Montserrat; however, because Montserrat courses and faculty change every one or two years, it was not possible to definitively compare engagement scores across academic years. For these courses, the Assessment, Teaching & Learning Librarian calculated engagement scores for each individual course and tallied the total number of interventions at the program level, regardless of score; assessment of individual clusters, however, was based on reflection narratives from each cluster librarian, similar to those produced for the major departments.

All librarian narratives were reviewed and summarized by the authors as part of the data analysis process. The findings from these narratives were then incorporated into a report on the project, which was submitted to college administrators and used to inform subsequent efforts to improve engagement across academic programs.

Initial Results

When this curriculum mapping project was initially conceived, the intent was to follow engagement on a course-by-course basis over multiple academic years. Ultimately, this wasn’t possible due to multiple factors: changing course offerings and instructors made it impossible to guarantee that instruction would be provided in a particular course during any given year; instructors sometimes scheduled library instruction based on factors other than the curriculum (e.g., needing to travel but not wanting to cancel class); and librarians occasionally switched department affiliations, which altered relationships with faculty members. Instead of following engagement on a course-by-course basis, the results of this project allowed librarians to see how engagement with entire departments fluctuated on a yearly basis and identify departments that needed additional outreach and intervention to ensure that students majoring in those fields received adequate information literacy instruction.

Similarly, while engagement scores were calculated for individual courses in Montserrat, it was not possible to compare scores across years due to the constantly evolving course rotation. Long-term analysis of engagement with the first-year program was based on narratives composed by the librarians liaising with individual clusters within the broader program. Common themes across these narratives included: the strong influence of individual faculty members’ preferences and interests (both first-year program cluster directors and individual teaching faculty) on levels of engagement from year to year; the importance of opportunities to engage with faculty at the start of each academic year; the difficulty of advocating for library engagement in courses with little or no research component; the impact of varying qualities of communication between faculty and librarians; and the importance of buy-in and direct support from the director of the first-year program.

Of the four curricular areas identified (Arts, Humanities, Social Sciences, and Natural Sciences), the Social Sciences had the highest engagement with librarians in most semesters, followed by the Humanities (as figure 1 shows). However, even within high-scoring curricular areas, there were departments with robust engagement and departments with poor engagement. Additionally, the authors discovered department-specific engagement which fell outside the areas scored in this analysis and was thus excluded from the final results. For example, music department faculty frequently assign projects which students complete via more traditional, drop-in reference transactions; as these transactions fell outside the scope of the Personal Research Session program, they were not reflected in the final scores for that department.

Figure 1. Average Interventions per Course per Curricular Area, 2014–2019

In order to create meaningful visualizations, the authors analyzed the data by charting the number of courses engaged per department and per curricular area each semester, as well as the mean engagement scores within those courses during each semester. Figure 1 shows the average score per course across all curricular areas. For example, in Fall 2018 the average score for Humanities departments was just over 4, meaning that, on average, courses in the Humanities that had any kind of intervention during the Fall 2018 semester had an engagement score of just over 4. For clarity of visualization, courses which had no engagement at all were excluded from these figures. In the final report, the scoring information was combined with information from the narrative reflections to allow the authors to paint a complete picture of engagement within each academic department.
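The per-area averages charted in figure 1 amount to a simple group-and-average over the course-level scores. A minimal sketch of that aggregation, assuming a hypothetical table of per-course scores (with courses that had no engagement already excluded), might look like this:

```python
import pandas as pd

# Hypothetical per-course scores; only courses with at least one intervention.
scores = pd.DataFrame({
    "semester": ["F2018", "F2018", "F2018", "F2018"],
    "area": ["Humanities", "Humanities", "Social Sciences", "Arts"],
    "engagement_score": [3.5, 4.6, 7.0, 2.1],
})

# Mean engagement score per curricular area per semester.
per_area = (
    scores.groupby(["semester", "area"])["engagement_score"]
    .mean()
    .reset_index()
)
print(per_area)
```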

Due to the wide variation in engagement among departments and across semesters, the combination of these two metrics allowed for more effective evaluation than simply calculating the mean engagement score for all classes running in each department each semester. For the purpose of reporting, multiple heat map visualizations were developed, some of which showed null scores across semesters (e.g., if a course received library instruction once and then again three years later, with no engagement in between, that gap was indicated) and some of which did not. This allowed for visualization of weak and strong areas within and across departments, and for comparison of quantitative scores against the maps of required courses produced at the beginning of the project. An example of one of these heat maps can be seen in Appendix A.
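A heat map of this kind can be produced by pivoting the course-level scores into a course-by-semester grid, in which semesters with no engagement remain empty (the “null score” variant described above). The sketch below, with hypothetical course numbers and scores, shows the general approach:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Illustrative course-level scores; course numbers and values are hypothetical.
scores = pd.DataFrame({
    "course": ["HIST 101", "HIST 101", "ENGL 210", "ENGL 210"],
    "semester": ["F2015", "F2018", "F2015", "F2016"],
    "engagement_score": [100.0, 118.75, 18.75, 100.0],
})

# Pivot into a course-by-semester grid; unengaged semesters remain NaN
# and render as blank cells in the heat map.
grid = scores.pivot_table(index="course", columns="semester",
                          values="engagement_score")

fig, ax = plt.subplots()
im = ax.imshow(grid.to_numpy(), aspect="auto", cmap="YlOrRd")
ax.set_xticks(range(len(grid.columns)))
ax.set_xticklabels(grid.columns)
ax.set_yticks(range(len(grid.index)))
ax.set_yticklabels(grid.index)
fig.colorbar(im, ax=ax, label="Engagement score")
plt.tight_layout()
plt.show()
```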

While the final compilation and analysis of data, especially qualitative data, continued throughout 2018, the authors felt that the project’s initial results, particularly the maps of course requirements, suggested some changes that could be made immediately. It was decided that both authors would pilot new engagement strategies during AY2018–2019, with the goal of introducing them across the RTL division should they prove successful. Targeted emails were sent to faculty teaching required courses in the authors’ liaison departments. The authors additionally explored opportunities for increased extracurricular engagement, either in conjunction with or in lieu of in-class library instruction.26 Results from this initial pilot were promising: despite using slightly different strategies, both librarians were successful in increasing engagement with the targeted departments. Plans were made to implement a soft launch of these engagement strategies across RTL in the Spring 2020 semester. Unfortunately, the onset of the COVID-19 pandemic and the college’s subsequent transition to remote learning necessitated the cancellation or reconfiguring of many library instruction activities and put this plan on hold for the foreseeable future.

While the AY2018–2019 pilot saw an increase in engagement (the number of total instruction sessions increased by 8.39% over the previous year), a natural consequence of this success was that both librarians saw large increases in their instruction loads. Broadening this approach to the entire RTL division would significantly grow each librarian’s workload, so any future attempts to implement these strategies would need to consider scalability. It will also be difficult to ensure that librarians are engaging with every student unless they are able to teach in every required or potentially required course. It is likely that librarians will need to consider alternative solutions, perhaps several in combination, to reach the library’s goal of 100% student engagement while responsibly and effectively utilizing resources. However, the time-intensive process of identifying each major department’s required courses was an important step forward. Spreadsheet maps of major requirements will set the stage for future engagement efforts, including directly communicating with faculty about appropriate courses in which to integrate information literacy, and identifying courses that would specifically benefit from other types of support such as data literacy or visual literacy programming.

Discussion: Challenges & Limitations

Any major assessment project has its challenges, and these are multiplied when many parties and complicating factors are involved. The current project benefited from the decision to complete the analysis internally, which simplified the process and did not require collaboration with academic departments. However, each participating librarian was keenly aware of the challenges and limitations of their liaison departments, which affected how each person approached this project. One unresolved concern was the uneven distribution of labor in asking each liaison to develop a course list and reflection for each of their academic programs: some liaisons had many more academic programs to assess than others, and some liaisons had fewer but more complex programs for which it was more challenging to construct a list of critical courses. The authors considered the possibility of dividing departments evenly regardless of liaison, but ultimately, a liaison perspective was required to unearth the idiosyncrasies of each program’s past and present interactions with the libraries.

In a practical sense, this project was limited by the state of the existing data. Methods for recording instruction and consultation statistics have varied over time as the libraries’ programs and needs have changed. All data had to be cleaned and normalized manually before analysis could proceed, leaving room for human error and requiring that some data points (mostly individual consultations) be excluded from the final analysis when the referenced course was simply unidentifiable. Additionally, existing data collection methods did not account for the length of individual consultations, meaning that scoring could not differentiate between interventions of varying lengths—i.e., a 15-minute instruction session would be scored the same as a 75-minute session, and the same as an entire class coming in for required, individual consultations (15–20 consultations of roughly 30 minutes each). The highlighting of these limitations was, however, an unforeseen benefit of this project: once these issues became apparent, the RTL division was able to take steps to standardize data collection, as well as to more accurately track the actual length of interventions. Future iterations of this analysis will benefit from progressively more consistent and thorough data, which will allow for better accuracy. However, incorporating these new data collection methods would also necessitate the development of a more nuanced scoring system to account for additional factors such as session length.
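As a purely illustrative sketch of what such a more nuanced system might look like (this is our speculation, not a formula the division has adopted), session length could be folded into the existing score by scaling each intervention against a baseline duration:

```python
def weighted_engagement_score(enrollment, session_minutes, consult_minutes,
                              baseline_session=60, baseline_consult=30):
    """Hypothetical duration-weighted variant of the engagement score.
    Each session still counts as reaching every enrolled student but is
    scaled by its length relative to a baseline; likewise for consultations."""
    if enrollment == 0:
        return 0.0
    session_weight = sum(m / baseline_session for m in session_minutes)
    consult_weight = sum(m / baseline_consult for m in consult_minutes)
    return (session_weight * enrollment + consult_weight) / enrollment * 100

# A 75-minute session now counts for more than a 15-minute one:
print(weighted_engagement_score(16, [75], []))  # 125.0
print(weighted_engagement_score(16, [15], []))  # 25.0
```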

The complicated structure of academic departments at the college also created challenges, if not outright barriers, to accurate evaluation of library engagement data. As mentioned earlier, the track followed by each student is extremely individualized. Many major programs at the college, particularly in the humanities, do not follow a set track and/or offer a complicated set of course requirements, making it difficult to derive meaning from engagement scores. It is difficult to identify or target critical courses in a major program where students are asked to, for example, select one course from each of four categories (as in Religious Studies). Some academic departments, and most interdisciplinary programs such as Environmental Studies, draw heavily upon or at least accept coursework from other departments to fulfill major requirements. This is in addition to departments which themselves encompass multiple course codes. Additionally, the constantly rotating nature of the first-year curriculum made it impossible to quantitatively score or analyze the Montserrat program—evaluation of this program had to be based solely on qualitative assessment.

It is also worth noting that department requirements and course availability have changed over time, making it challenging to compare different years or accurately assess the success of library engagement with critical courses. This could be resolved by consulting previous enrollment data and course catalogs, but doing so is time-consuming, sometimes imprecise, and ideally requires institutional knowledge of programs that is not always available. While the authors chose not to directly engage academic departments on this project for a number of reasons, direct communication with the departments could be another, more effective means of resolving these issues in the future.

Finally, the relationship between the libraries and the various academic disciplines varies widely in ways that are sometimes outside the scope of this project. Some departments have a strong culture of library engagement which is reflected in research-heavy assignments (two examples being the History and Political Science departments), while others prefer that students engage with primary texts without secondary research, or focus on skill sets not necessarily supported by typical library engagement (for example, the Philosophy and Studio Art departments). Some programs also choose to engage with the library in other ways. For example, the Music Library had relatively low engagement scores in this assessment, but the Music Librarian tends to receive more walk-up extended reference questions than the other libraries—questions of an advanced nature that would likely surface during a research consultation in the main library. However, since the libraries firmly differentiate walk-up questions from pre-scheduled consultations via the PRS program, such questions were not included in this assessment.

Conclusion

The curriculum mapping project undertaken by the Research, Teaching & Learning division at the Holy Cross Libraries was a mixed-methods project that successfully used both qualitative and quantitative methods to assess the level of engagement that liaison librarians had with students in their major-granting departments. This project revealed that certain areas had much greater engagement than others, but that overall, increasing engagement was a fairly straightforward process. Librarians found the process of mapping out the required courses to be quite valuable and reported that it gave them a better understanding of their departments and a greater ability to interface with students in a meaningful way. Additionally, the process of developing the curriculum map highlighted important considerations about the ways in which the RTL division documents its engagements with academic departments and with individual students, considerations which have already changed the division’s data collection procedures. The current project will serve as a foundation for future efforts to embed information literacy instruction into the curriculum in a meaningful way.

Acknowledgements

The authors wish to thank the members of the College of the Holy Cross Libraries Research, Teaching & Learning division and the Libraries’ Teaching & Learning Team, including David Banville, Eileen Cravedi, Janis DesMarais, Alicia Hansen, Barbara Merolli, Jared Rex, Lisa Villa, and Laura Wilson, as well as Director of Library Services Mark Shelton, for their substantial contributions to this curriculum mapping project.

Appendix A. Sample from Heatmap, Humanities Departments, with Required Courses Highlighted

Bibliography

Archambault, Susan Gardner, and Jennifer Masunaga. “Curriculum Mapping as a Strategic Planning Tool.” Journal of Library Administration 55, no. 6 (2015): 503–519. https://doi.org/10.1080/01930826.2015.1054770.

Beuoy, Melissa, and Katherine Boss. “Revealing Instruction Opportunities: A Framework-Based Rubric for Syllabus Analysis.” Reference Services Review 47, no. 2 (2019): 151–168. https://doi.org/10.1108/RSR-11-2018-0072.

Boss, Katherine, and Emily Drabinski. “Evidence-Based Instruction Integration: A Syllabus Analysis Project.” Reference Services Review 42, no. 2 (2014): 263–276. https://doi.org/10.1108/RSR-07-2013-0038.

Buchanan, Heidi, Katy Kavanaugh Webb, Amy Harris Houk, and Catherine Tingelstad. “Curriculum Mapping in Academic Libraries.” New Review of Academic Librarianship 21, no. 1 (2015): 94–111. https://doi.org/10.1080/13614533.2014.1001413.

Bullard, Kristen A., and Diana H. Holden. “Hitting a Moving Target: Curriculum Mapping, Information Literacy and Academe.” In LOEX Conference Proceedings 2006, edited by Theresa Valko and Brad Sietz, 17–21. Ypsilanti, Michigan: Eastern Michigan University, 2006. https://commons.emich.edu/loexconf2006/29/.

Charles, Leslin H. “Using an Information Literacy Curriculum Map as a Means of Communication and Accountability for Stakeholders in Higher Education.” Journal of Information Literacy 9, no. 1 (2015): 47–61. https://doi.org/10.11645/9.1.1959.

Gessner, A. Gabriela Castro, and Erin Eldermire. “Laying the Groundwork for Information Literacy at a Research University.” Performance Measurement and Metrics 16, no. 1 (2015): 4–17. https://doi.org/10.1108/PMM-12-2014-0044.

LeMire, Sarah, and Stephanie J. Graves. “Mapping Out a Strategy: Curriculum Mapping Applied to Outreach and Instruction Programs.” College and Research Libraries 80, no. 2 (2019): 273–288. https://doi.org/10.5860/crl.80.2.273.

McGowan, Britt, Melissa Gonzalez, and Claudia J. Stanny. “What Do Undergraduate Course Syllabi Say About Information Literacy?” portal: Libraries and the Academy 16, no. 3 (2016): 599–617. https://doi.org/10.1353/pla.2016.0040.

Reed, Bonnie, Hillary B. Veeder, Sara Schumacher, and Brian C.R. Zugay. “Placing Research on Their Map: Curriculum Mapping as a Collaborative Tool for an Architecture Branch Library.” Art Documentation: Journal of the Art Libraries Society of North America 37, no. 2 (2018): 176–191. https://doi.org/10.1086/700012.

Schattle, Erica, Joshua Quan, and Megan Bresnahan. “Student Instructional Histories: An Approach to Assessing the Reach of an Information Literacy Program.” In Proceedings of the 2016 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, edited by Sue Baughman, Steve Hiller, Katie Monroe, and Angela Pappalardo, 136–144. Washington, DC: Association of Research Libraries, 2016. https://www.libraryassessment.org/wp-content/uploads/bm~doc/27-schattle-2016.pdf.

Smith, Cheri, Linda Doversberger, Sherri Jones, Parker Ladwig, Jennifer Parker, and Barbara Pietraszewski. “Using Course Syllabi to Uncover Opportunities for Curriculum-Integrated Instruction.” Reference and User Services Quarterly 51, no. 3 (2012): 263–271. http://www.jstor.org/stable/refuseserq.51.3.263.

Whelan, Jennifer L.A. “Extracurricular Engagement as an Alternative to Traditional Instruction.” In Liaison Engagement Success: A Practical Guide for Librarians, edited by Ellen Hampton Filgo and Sha Towers, online supplement. Lanham, MD: Rowman & Littlefield, 2021. https://rowman.com/WebDocs/Supplement_Stories_of_Liaison_Engagement_Success.pdf.

Ziegler, Amanda. “‘I Wanna Be in the Room Where It Happens’…: Using Curriculum Mapping to Support the Information Literacy Goals of Online Programs.” Journal of Library & Information Services in Distance Learning 13, no. 2 (2019): 226–234. https://doi.org/10.1080/1533290X.2018.1499260.

Notes

1. Erica Schattle, Joshua Quan, and Megan Bresnahan, “Student Instructional Histories: An Approach to Assessing the Reach of an Information Literacy Program,” in Proceedings of the 2016 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, ed. Sue Baughman, Steve Hiller, Katie Monroe, and Angela Pappalardo (Washington, DC: Association of Research Libraries, 2016), 137, https://www.libraryassessment.org/wp-content/uploads/bm~doc/27-schattle-2016.pdf.

2. For an overview of the history of this practice, see Susan Gardner Archambault and Jennifer Masunaga, “Curriculum Mapping as a Strategic Planning Tool,” Journal of Library Administration 55, no. 6 (2015): 503–519, https://doi.org/10.1080/01930826.2015.1054770.

3. Heidi Buchanan, Katy Kavanaugh Webb, Amy Harris Houk and Catherine Tingelstad, “Curriculum Mapping in Academic Libraries,” New Review of Academic Librarianship 21, no. 1 (2015): 94–111, https://doi.org/10.1080/13614533.2014.1001413.

4. Bonnie Reed, Hillary B. Veeder, Sara Schumacher and Brian C.R. Zugay, “Placing Research on Their Map: Curriculum Mapping as a Collaborative Tool for an Architecture Branch Library,” Art Documentation: Journal of the Art Libraries Society of North America 37, no. 2 (2018): 176–191, https://doi.org/10.1086/700012.

5. E.g., in Sarah LeMire and Stephanie J. Graves, “Mapping Out a Strategy: Curriculum Mapping Applied to Outreach and Instruction Programs,” College and Research Libraries 80, no. 2 (2019): 273–288, https://doi.org/10.5860/crl.80.2.273.

6. Leslin H. Charles, “Using an Information Literacy Curriculum Map as a Means of Communication and Accountability for Stakeholders in Higher Education,” Journal of Information Literacy 9, no. 1 (2015): 47–61, https://doi.org/10.11645/9.1.1959.

7. Reed et al., “Placing Research on Their Map.”

8. Amanda Ziegler, “‘I Wanna Be in the Room Where It Happens’…: Using Curriculum Mapping to Support the Information Literacy Goals of Online Programs,” Journal of Library & Information Services in Distance Learning 13, no. 2 (2019): 226–234, https://doi.org/10.1080/1533290X.2018.1499260.

9. Melissa Beuoy and Katherine Boss, “Revealing Instruction Opportunities: A Framework-Based Rubric for Syllabus Analysis,” Reference Services Review 47, no. 2 (2019): 151–168, https://doi.org/10.1108/RSR-11-2018-0072.

10. Katherine Boss and Emily Drabinski, “Evidence-Based Instruction Integration: A Syllabus Analysis Project,” Reference Services Review 42, no. 2 (2014): 263–276, https://doi.org/10.1108/RSR-07-2013-0038.

11. Buchanan et al., “Curriculum Mapping in Academic Libraries”; Kristen A. Bullard and Diana H. Holden, “Hitting a Moving Target: Curriculum Mapping, Information Literacy and Academe,” in LOEX Conference Proceedings 2006, ed. Theresa Valko and Brad Sietz (Ypsilanti, Michigan: Eastern Michigan University, 2006), 17–21, https://commons.emich.edu/loexconf2006/29/.

12. Britt McGowan, Melissa Gonzalez and Claudia J. Stanny, “What Do Undergraduate Course Syllabi Say About Information Literacy?,” portal: Libraries and the Academy 16, no. 3 (2016): 599–617, https://doi.org/10.1353/pla.2016.0040.

13. Cheri Smith, Linda Doversberger, Sherri Jones, Parker Ladwig, Jennifer Parker and Barbara Pietraszewski, “Using Course Syllabi to Uncover Opportunities for Curriculum-Integrated Instruction,” Reference and User Services Quarterly 51, no. 3 (2012): 263–271, http://www.jstor.org/stable/refuseserq.51.3.263.

14. Ziegler, “Room Where It Happens.”

15. Buchanan et al., “Curriculum Mapping in Academic Libraries,” 96.

16. Buchanan et al., Case Study 1, 98–100.

17. Buchanan et al., Case Study 2, 100–103.

18. Bullard and Holden, “Hitting a Moving Target.”

19. A. Gabriela Castro Gessner and Erin Eldermire, “Laying the Groundwork for Information Literacy at a Research University,” Performance Measurement and Metrics 16, no. 1 (2015): 4–17, https://doi.org/10.1108/PMM-12-2014-0044.

20. Gessner and Eldermire, “Laying the Groundwork,” 7–8.

21. Gessner and Eldermire, “Laying the Groundwork,” 15.

22. Bullard and Holden, “Hitting a Moving Target.”

23. Beuoy and Boss, “Revealing Instruction Opportunities.”

24. Smith et al., “Using Course Syllabi.”

25. Schattle, Quan, and Bresnahan, “Student Instructional Histories.”

26. For a description of one such opportunity, see Jennifer L.A. Whelan, “Extracurricular Engagement as an Alternative to Traditional Instruction,” in Liaison Engagement Success: A Practical Guide for Librarians, ed. Ellen Hampton Filgo and Sha Towers (Lanham, MD: Rowman & Littlefield, 2021), online supplement, https://rowman.com/WebDocs/Supplement_Stories_of_Liaison_Engagement_Success.pdf.

* Monica V. Locker was Assessment, Teaching & Learning Librarian, at College of the Holy Cross, email: mvlocker@gmail.com; Jennifer L.A. Whelan is Coordinator of Research, Teaching & Learning at College of the Holy Cross, email: jwhelan@holycross.edu. ©2024 Monica V. Locker and Jennifer L.A. Whelan, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

