Conducting a User Needs Assessment through the Consortia: Pooling Resources to Examine Student and Faculty Habits

From 2016 to 2020, ten smaller schools in one regional library consortium participated in a user needs assessment project. This article documents the process for implementing a collaborative user needs assessment by utilizing the shared interest and enthusiasm of a team of librarians to create a consortial toolkit. The toolkit supplied direction and leverage to conduct ethnographic research at consortial schools while providing clarity and consistency for testing across multiple sites.


Students at small to medium-sized private colleges and universities belong to a diverse array of close-knit campus communities. These students and the schools they attend represent an important sector of higher education often overlooked in the literature on ethnography in libraries. Many of these schools have distinctive missions and attract specific demographics of students. With limited resources, these schools often lack the opportunity to conduct student behavioral research on their own, particularly observational approaches such as ethnography, which can provide insights beyond mere numbers. While existing ethnographic research can help these libraries support students in general, local research is missing on how these libraries can best support their students within their own campus contexts. However, the regional library consortium Private Academic Library Network of Indiana (PALNI), working with a shared pool of resources, developed an efficient model of ethnographic research by creating a collaborative toolkit to investigate the research habits and needs of this group of users.

Literature Review

The use of ethnographic methods among libraries and librarians blossomed after anthropologist Nancy Fried Foster’s research at the University of Rochester on student study habits.1 Ethnography is “a scientific approach to discovering and investigating social and cultural patterns and meaning in communities, institutions, and other social settings.”2 Roger Sanjek describes it as both a product (written work) and process (methodology), rooted in an anthropological triangle with comparison and contextualization.3 This literature review focuses on ethnographic methods, such as the techniques of observation, interview, and questionnaire, developed by anthropologists to conduct ethnography.

The use of ethnographic methods and other forms of qualitative research “allows you to make fine distinctions and see ambiguities in your data… [and] facilitate in-depth and open-ended investigations into observed phenomena, often allowing the researcher(s) a great deal of flexibility in pursuing research questions.”4 Ethnography is not an easier way of collecting data but a way of collecting data chosen specifically for the viewpoint it allows researchers to take. It examines a phenomenon or process within a context or culture by asking participants to describe and demonstrate their lives. Ethnography looks for themes rather than proof.

An extensive literature review of ethnographic methods in libraries found eighty-one published studies, the majority published after 2005, most of which used observation or interview methods.5 A later literature review outlined three primary ways in which libraries were using ethnographic methods: to assess library spaces, to study student behavior, and to evaluate library resource usage.6 Other reviews focused on the use of visual research and spatial theory in library space studies.7 Many of these studies pull heavily from the Ethnographic Research in Illinois Academic Libraries (ERIAL) project’s guide to ethnographic research.8 Project Information Literacy has used ethnographic methods in its reports on student information usage habits since 2009, and since 2015 the professional community UXLibs has supported an annual conference, yearbook, and training events.9 While ethnographic methods have proliferated within libraries, such usage has not reached maturity, with challenges of buy-in and awareness across library leadership and staff limiting adoption.10 Likewise, anthropologists have criticized libraries for using ethnographic methods without an ethnographic mindset or settled ontological assumptions, arguing that measurable, short-term projects to fix problems miss the goals and purposes of ethnography and can yield incomplete results.11

Most published library ethnographic projects come from large universities with the staffing and support for such work, although scalable testing models suitable for smaller colleges are becoming more common.12 A handful of studies do exist from smaller liberal arts colleges, frequently as part of consortial collaborative efforts. Illinois Wesleyan University (IWU), with a 2009 enrollment of 2,066, was one of five members of the ERIAL project conducted in 2008–10, which utilized fourteen different interview and mapping methods to investigate student study habits. IWU implemented thirteen of the fourteen methods with 245 participants.13 Reed College, with a 2014 enrollment of 1,339, conducted a mapping activity and focus group in Spring 2014 with seven student participants, asking about their research process and how they obtained a print book from the collection. The project was developed using a service design approach, which employs an ongoing working group of participants.14 Gustavus Adolphus College (GAC), with a 2015 enrollment of 2,457, was one of eight members of the “A Day in the Life” (ADITL) project, where 19 GAC participants received text messages throughout the course of one day asking for their location, activity, and feeling. Results were geocoded and maps were created of each student’s day.15 GAC’s own study on student research experiences in spring 2018 included interviews, a mapping activity, and a photo diary.16

In the literature, collaborative ethnographic projects take one of two forms: studies conducted across a multicampus college or university system, and studies conducted across multiple independently functioning universities. There is currently a gap in the literature for consortia-led studies. Studies across multicampus settings include the 2009 to 2011 City University of New York (CUNY) system’s six-site Undergraduate Scholarly Habits Ethnography Project and the 2013 to 2016 Montgomery College three-site study.17 Multi-university collaborations came in all shapes and sizes. The ERIAL project focused on colleges and universities within a shared geographic area: the state of Illinois.18 A three-site study focused on a specific library type, performing arts libraries, with schools scattered across the country (Clark et al., 2020). The ADITL project’s participants came from diverse college and university types, selected “based on their libraries’ capacity and experience in undertaking ethnographic research.”19

The literature highlighted above illustrated the challenges that small and/or underresourced institutions face in conducting quality ethnographic research, including limited staffing levels, skill sets, and institutional support. To address these challenges, smaller colleges often leveraged collaborations and sought consultation from anthropologists outside the organization.

A Consortial Toolkit

PALNI’s 2014–16 strategic plan called for “strengthening libraries’ ability to demonstrate relevance and value” and “assessing user needs and behaviors.” To fulfill this strategic initiative, PALNI’s executive director asked that a user needs assessment study be done. We needed a model that would fit the diverse membership of our consortium, support a long-term project, and include, but not be dominated by, staff with some background in user research. To conduct our consortial-level user needs assessment, we designed a consortial toolkit for the project. While the ERIAL Project’s Practical Guide to Ethnographic Research in Academic Libraries provides helpful advice and documentation, we decided to go one step further and create a digital toolkit with all the documentation and information needed to perform the study, based on our group’s literature review.20 To our knowledge, ours is the only such consortial toolkit for conducting a user needs assessment. This article walks through our process of creating and using a consortial user needs assessment toolkit (see appendix A for a link to our toolkit).

Project Team

We started our user needs assessment by bringing together a project team. We have found that successful projects at this scale generally require at least one consortial staff member to head the project. Given that capacity at individual schools may be limited for ethnographic research, a centralized person dedicated to this task (even if the project is just one of their various job tasks) is key.

Our project team originally consisted of three consortial employees (two of whom worked part-time for the consortium alongside their other librarian commitments), five librarians employed by libraries in the consortium, and two LIS graduate student interns. Structured as a short-term task force, the team was open to any librarian employed in the consortium. As the project progressed, the team shrank to four members due to attrition. This did not cause a problem, as the bulk of the work of designing the toolkit was done up front, and the team became more efficient over time.

Our first task was to determine research questions and methods. As Arnold Arcolio, a user researcher for OCLC, advises: “Clearly define your research questions. Otherwise, you’ll spend the rest of your life reviewing the data and finding interesting things.”21 As a group, the team developed a large list of research question ideas derived from previous studies. We ranked and prioritized the questions, then removed questions outside the scope of our study, such as those focusing on information literacy or directly on individual libraries. Our final research questions focused on three areas: faculty, spaces, and study habits (appendix B). After our research questions were complete, we chose five methods from the dozens available that we believed would be manageable for anyone to implement. These methods, which included interviews, photo essays, photo collages, day mapping, and library mapping, were intended to elicit robust responses to our research questions as well as complement each other. We specifically excluded surveys from our method list because we wanted qualitative rather than quantitative data.

Table 1

Ethnographic Research Methods




Interviews

We asked students fourteen questions about their study habits. We asked faculty eight questions about their research and their students’ research. The interviews lasted ten to thirty minutes.

Pros: The most effective method for answering all our research questions.

Cons: Allowed for the least amount of creativity on the part of participants.

Photo Essay

We asked students to take twenty photos over a forty-eight hour period, each corresponding to a specific prompt about their study habits. We then asked them to participate in a short interview in which they explained why they chose each photo.

Pros: Provided an in-depth look at a participant.

Cons: This was the most challenging of our methods to schedule as it required two meetings with the participants.

Photo Collage

We chose 300 stock photos and printed them on heavy cardstock the size of a business card. We laid them across a table and asked students to pick five photos that answered a question. Each student responded to three questions, so they chose a total of fifteen photos. We then asked them to tell us why they chose each photo.

Pros: This was the method that attracted the most students. Students would stop by unprompted just to see what we were doing with all the shiny cards. It worked especially well in busy locations such as a coffee shop.

Cons: The need to store and share cards between team members and schools took extra time.

Day Mapping

We asked students to draw a map of their typical day around campus. We then asked them to explain the map to us.

Pros: This was a great method for learning about student traffic patterns that were not shared using other methods.

Cons: This was the most difficult method to visually represent in our reports.

Library Mapping

We asked students to draw us a map of their normal route through the library. We then asked them to explain the route to us.

Pros: Multiple methods identified popular study spaces in the library, but this was the method that related the spaces to one another most effectively.

Cons: The method was limited to the library and library users.

Toolkit Overview

Now that the basics of our study were determined, we decided to create our toolkit on Google’s Team Drive product, already in use by the consortium. We stored the toolkit on the Team Drive with public visibility, and all the project documents and data on a separate Team Drive only visible to the project team. (Alternative solutions could include Microsoft OneDrive or wiki software.) We chose the Team Drive approach because everyone in the consortium could access files stored there, and the files could easily be updated, but only by our project team.

In our collaborative toolkit, we included the following documents: an overview of the project, a methodology chart, a list of research questions, applicable information for Institutional Review Board (IRB) applications, demographic forms, descriptions of how to implement each method, and signage and recruitment templates.

In the project overview document, we chose to include the following information:

  • What is a user needs assessment? (Local definition based on the literature)
  • Why should I participate in a user needs assessment? (Local justification based on the literature)
  • A high-level overview of the research questions
  • A high-level overview of the methods being used
  • Information about the IRB process
  • Details about how coding would be performed
  • Information about incentives for participants

The methodology chart provided a clear overview of each of the methods used in the study and why a school might choose, or decline, to implement a given method.

Table 2

Example Method from Methodology Chart


Photo Collage

Library Needs Met

  • Physical spaces
  • Physical resources
  • General feedback

Research Questions Answered

What variety in study spaces do students need?

What would students consider their dream workspace?

What spaces on campus do students use for studying?

Why do students choose a study spot?


Description

Participants are asked to describe life, research, and spaces through pictures. Geared toward students.

Equipment Needs

Printed photos (available from the consortium office)

Since IRB approval is required before conducting ethnographic research, we included an IRB folder in our toolkit to help each school obtain approval on its campus. While IRB approval processes vary widely by school, we provided the standard required information. Additionally, we found that it was easier and faster to have the consortial employees from the project team fill out most of the IRB paperwork for each school, as they were already familiar with the project and with the variation in IRB forms. Sometimes schools needed to share the IRB forms with us, as they were hidden on intranets or in other locations accessible only to employees of the school. Many schools also required the project team to complete training covering research with human subjects. The most common site for this training is the Collaborative Institutional Training Initiative (CITI Program). We had each member of the team acquire CITI certification at the beginning of the project.

The next piece of our toolkit was the demographic forms used for the entire study. The data points we chose for students were school, academic status, age, gender, major, GPA, housing location, devices owned, enrollment status, first-generation status, employment status, affiliation with a fraternity or sorority, and international status. (A note about gender: We asked for gender but did not include it in any of our data analysis or reporting. We only did a quick check to ensure that we spoke to multiple genders on each campus as a representative sample of the population.) For our faculty participants, we asked for their department, research specialty, number of years spent teaching, and whether they teach online classes. For our study, we used JotForm’s free plan to create our demographic forms. In our investigation of form platforms, only JotForm gave us the ability to assign each of our participants a unique ID number; neither Google Forms, SurveyMonkey, nor any of the other products we investigated at the time could do this at a comparable price, or at all. JotForm also allowed files to be uploaded to the method forms, which we used for several methods.
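
JotForm assigned these unique IDs for us, but the underlying idea is simple and could be replicated on any platform. Below is a minimal sketch; the ID format and the school code "SCH" are hypothetical, not the scheme we actually used:

```python
import itertools

def make_id_generator(school_code: str):
    """Return a function that issues unique participant IDs.

    The format "SCH-S-001" (school code, participant type, running
    number) is illustrative only; JotForm generated our real IDs.
    """
    counter = itertools.count(1)

    def next_id(participant_type: str) -> str:
        # participant_type: "S" for student, "F" for faculty
        return f"{school_code}-{participant_type}-{next(counter):03d}"

    return next_id

next_id = make_id_generator("SCH")
record = {
    "participant_id": next_id("S"),  # e.g. "SCH-S-001"
    "academic_status": "Junior",
    "major": "Pharmacy",
}
```

The unique ID is what allows demographic records, interview transcripts, and uploaded artifacts such as maps to be linked together without storing participant names.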

The largest piece of our toolkit was a document for each of our methods that included a method description, the research questions covered by the method, and the steps to conduct it. (Appendix C contains an example method document.) Each document also listed the items necessary to perform the method, including consent forms, writing utensils, demographic and transcription forms or links, a fully charged audio recording device, and any items specific to that method, such as paper, markers, or photographs. These documents explain the methods and make the process easily reproducible by anyone, whether or not they are members of our project team.

The final piece of our toolkit was a detailed list of items needed for a campus visit. This included clear and consistent language for signage and recruitment templates that campuses could adjust to match institutional branding.

Toolkit Implementation

Once our toolkit was complete, we ran a beta test during the summer of 2016. Each member of the task force conducted a method with a student worker or faculty colleague at their own school to see what tweaks needed to be made. As we started the beta testing, we received promising feedback. One researcher still regrets that she could not formally write up a report on her student tester’s comments, which she found very insightful; similarly, she walked away from her faculty test with at least two action items based on the faculty member’s responses.

Then we moved forward with piloting the study, choosing a larger, centrally located school for the pilot test. We involved every member of our project team and performed every method in our toolkit. Along with finding stress points in the toolkit, we were able to share with one another the experience of conducting each method and its benefits and drawbacks. We continued holding debriefing meetings after each campus conducted the study to share initial insights with the whole team.

Our next step was for the consortial employees to work together to establish a code tree for the project (appendix D). We scanned numerous sources for this process, but the most practical source we found was Qualitative Data Analysis: A Methods Sourcebook, by Miles, Huberman, and Saldaña.22 For our coding we used Dedoose, web-based software for qualitative data analysis. We looked at other options, including CATMA and QCAmap, but found that Dedoose worked best for our purposes. We chose to code the data for each individual school as the study progressed, rather than coding all the data at the end of the study, and then shared the coded data with the members of the project team for further analysis.

Using the analyzed data, we worked together to produce a report for each school sharing the results of the study. Each report included:

  • A project overview
  • Demographic information on participants
  • Research questions
  • Research question findings
  • Highlighted findings
  • Most-used terms from participants
  • Questions aimed at starting insightful campus/library conversations
  • Specific method sections (such as maps of student paths around campus and/or their library)
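
As a rough illustration of how a code tree such as ours (appendix D) structures analysis, the sketch below suggests codes for a transcript excerpt by keyword matching. The codes and keywords are a toy subset, and our real coding in Dedoose was done by people reading excerpts in context, not by string matching:

```python
# A toy slice of a code tree: code name -> trigger keywords.
# Both are illustrative; see appendix D for the actual schema.
CODE_TREE = {
    "Study spaces": ["study", "studying", "homework"],
    "Late-night spaces": ["late night", "overnight", "midnight"],
    "Library resources": ["book", "charging cable", "printer"],
}

def suggest_codes(excerpt: str) -> list[str]:
    """Return codes whose keywords appear in the excerpt (first-pass suggestions only)."""
    text = excerpt.lower()
    return [code for code, keywords in CODE_TREE.items()
            if any(kw in text for kw in keywords)]

suggest_codes("Around midnight I went back to the library to finish homework.")
# -> ['Study spaces', 'Late-night spaces']
```

A first pass like this can surface candidate excerpts, but the judgment calls that make qualitative coding meaningful still require a human coder.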

We wrote our reports in Google Docs (visibility set to public) as the consortium was already using that product, but as mentioned earlier, Microsoft OneDrive or wiki software would also work well. We used ArcGIS and Carto to create the mapping pieces of our study. There are dozens of mapping programs available, but these were the two with the features we wanted to include.
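
One report element above, the most-used terms from participants, can be approximated with a simple frequency count over interview transcripts. The sketch below is illustrative; the stopword list is hypothetical, and our actual term lists came from the coded transcripts:

```python
from collections import Counter
import re

# Hypothetical minimal stopword list; a real one would be much longer.
STOPWORDS = {"the", "a", "i", "to", "and", "it", "in", "is", "at", "was", "because"}

def most_used_terms(transcripts: list[str], n: int = 5) -> list[tuple[str, int]]:
    """Count non-stopword terms across a set of interview transcripts."""
    words = re.findall(r"[a-z']+", " ".join(transcripts).lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)

most_used_terms([
    "I study in the library because it is quiet",
    "The library is quiet at night",
])
# -> [('library', 2), ('quiet', 2), ('study', 1), ('night', 1)]
```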

After piloting the study, our intention was for member libraries to take this toolkit, conduct the ethnographic tests themselves with the guidance of a team member, and then share the results with the team, who would code them and write a report of the findings. This transferability was the original rationale for developing a toolkit. The reality turned out to be much different. With limited staffing, the burden of becoming acquainted with the toolkit and making time to conduct a study was too much for most of the schools. Instead, each institution requested a visit from a project team member to conduct the study. The institutions worked with that project team member to determine the methods to be used at that campus and the timing of the study, and to complete the IRB paperwork. One site additionally provided librarians to facilitate some of the testing. Having dedicated interns helped with this process, as we could send them to several schools without sending another member of the project team. Our interns also did much of the coding for the project. While the toolkit did not become a transferable tool for individual schools to use on their own, it became increasingly helpful to the project team over time, as it provided clarity and consistency, with standard language and procedures to implement across campuses.

Study Completion

Our study ended earlier than planned, as the COVID-19 pandemic forced our last two schools to cancel midway through IRB approval. At that point, 194 students and 14 faculty members across ten schools had participated in the study. We finished the project by debriefing and writing our final report, working, as we had throughout, via web conference meetings and collaborative writing tools, another positive result of the collaborative approach. We reanalyzed the coded data in its entirety as well as the results from each school, and we made recommendations for future testing. In each individual school report, we gave answers to each of our research questions. In our final report, we generalized our findings to describe the study habits of PALNI students overall. We also included personas representing twelve of the most common types of students who participated in our study (appendix A).

Discussion

Creating a collaborative consortial toolkit was a successful approach for our team, even though our individual libraries did not have the capacity to implement the toolkit on their own. This approach helped build individual skill sets and added local expertise in using ethnographic methods. It developed collaborative relationships between consortial staff and members of the project team, along with the librarians at each school who participated in the project. It also focused the project with a clearly defined end in mind, keeping it consistent along the way. As librarians entering the world of ethnographic testing, we quickly realized there was a large gap between our areas of expertise and those of anthropologists. Our strengths as professional research librarians allowed us to develop a robust literature review on this subject area and dive deep into a topic with which we were otherwise unfamiliar. With the resources of the consortium backing us, we sought out collaborations among fellow librarians to complete this work, as well as the insight of a professional anthropologist. We also were able to easily organize and document our project and share it with others in the form of a toolkit. Thus, our inexperience in conducting ethnographic testing was an asset, not a liability.

However, it was a challenge to convey the value of ethnographic testing and of our project to stakeholders. After our project team presented the final toolkit to the consortial board, it was slow to gain traction. Only after repeated one-on-one conversations with stakeholders and successful site visits implementing the toolkit did word spread about the toolkit’s value and benefits, generating interest in the project. Learning to evaluate the coded qualitative data and make definitive statements from our research was also a challenge with a steep learning curve.

During our consultation with the professional anthropologist mentioned above, we were told: “At the end of your study, you will discover that all students are the same.” However, the benefit of using ethnographic methods was that these procedures helped us understand our individual students’ specific preferences. While all students may seek a place that they feel comfortable, our study allowed us to find the places and spaces in which they feel comfortable on each campus, allowing librarians to develop programs and spaces to meet their students’ needs more specifically. Every library director or dean who participated in the study appreciated both the process and the results. Each library rearranged or bought new furniture, redecorated spaces, and more. As Tonya Fawcett, Director of Library Services at Grace College and Seminary, wrote, “the experience was great (even without knowing the results) and my concerns about how much time it would require from us were unfounded.” Julie Miller, then Dean of Libraries at Butler University, shared that “We used [the report] to inform our strategic priorities. I also used it when I was part of the executive team that visited Wabash College as they were in the process of hiring a new library director. It helped the search committee made up primarily of faculty and administrators to understand the ways in which PALNI can provide support, especially for institutions with relatively few library staff.”

Conclusion

Using a shared pool of resources through our regional library consortium allowed us to develop an efficient model of ethnographic research to investigate our users’ research habits and needs. Working through the consortium gave us the resources we needed to complete this project: a dedicated team, a collaborative toolkit model that worked across all our schools, and a clearly defined scope and structure for our study. While we hit a few bumps and roadblocks along the way, the toolkit allowed ten small private institutions to conduct a thorough user needs assessment without requiring substantial local resources.

Acknowledgments

The authors would like to thank our initial task force members: Carrie Halquist, Kirsten Leonard, Edita Sicken, Amanda Starkel, Sarah Wagner, and Megan West, as well as our project team members James Bell, Lara Miller, and Sarah Newell.

APPENDIX A: Toolkit and Reports

Our complete toolkit and reports are accessible by visiting https://libguides.palni.edu/inap.

APPENDIX B: Research Questions

Faculty

  • How often are academic sources required for assignments?
  • How are faculty integrating information literacy into their classes?
  • Where do faculty members do research?
  • Where do faculty point their students when they ask for help?

Spaces

  • Where do students go to collaborate/work on projects with a group?
  • What spaces on campus do students use for studying?
  • Where do students study during the late night? (Is this different from other hours?)
  • Why do students choose a study spot?
  • What library resources (vs. computers, wifi, printers, etc.) are students using while they are in the library?
  • What variety in study spaces do students need?
  • What would students consider their dream workspace?

Study Habits

  • Where do students find their class readings?
  • What items help students study?
  • What distracts students from studying?
  • Whom do students ask for help with studying?

APPENDIX C: Example Method Document

Students—Day Mapping


Day mapping is used to visually illustrate a participant’s needs and the series of interactions, places, and resources necessary to fulfill those needs. In this activity, the participant is given a campus map and asked to track their movements over the course of a day. Afterward, the participant explains their map in a short interview. What makes this methodology unique is its ability to highlight the flow of the participant’s daily experience, including the critical pain points where our attention and focus will have the most impact.

For informational purposes, things we want to learn (research questions)

  • What spaces on campus do students use for studying?
  • Where do students study during the late night? (Is this different from other hours?)
  • Why do students choose a study spot?
  • Where do students go to collaborate/work on projects with a group?
  • What would students consider their dream workspace?

Before you start testing

  1. Have consent forms printed and writing utensils available.
  2. Have the following pages open in a web browser:
    1. Demographic form: [insert link here]
    2. Transcription form: [insert link here]
  3. Have blank paper or printed campus maps.
  4. Have recording device available and ready (if using).
  5. Request access to the Google folder the maps will be uploaded to. (This will be granted within one business day.)

Conducting the test

  1. Review and follow general guidelines in the Getting Started Document.
  2. Have the student sign a consent form.
  3. Have the student fill out the demographic form.
  4. Open this form (contains all questions): [insert link here]
  5. Give the student a blank piece of paper or a printed campus map. Then, ask the student to do the following:
    1. With this map, draw out your usual routes and activities during a typical day. After you finish, I’ll ask you a few questions about your map.
  6. Optional: If using a recording device (audio/visual), begin recording, ensuring the microphone is close to the student.
  7. Ask the student the questions on the form and transcribe the answers. Include the student’s participant ID number (see Getting Started Document). Submit the form.
  8. Thank the student for their time and allow them to leave.
  9. Add student participant ID number to the finished map and take a photo or scan it.
  10. Upload the digitized map to the Google folder.

Example Student Map

Debrief Interview

Q – Please provide an overview of your day.

A – It was a Tuesday, so I have class from 8:30 to 11:30 and then I work in the afternoon. I worked out in the morning and went to a movie at night.

Q – Let’s go into more detail. Can you please walk us through each part of your day, letting us know what specifically you were doing in each location?

A – I left my house at 7am and walked to the HRC to work out.

Q – What did you carry with you?

A – Backpack, change of clothes, books for my first two classes, water bottle, my phone.

Q – Okay, let’s go back to your map. You’re at the HRC…

A – I showered and rushed to class in the Pharmacy building by 8:30. I was in class from 8:30 to 11:30. I go back and forth between Pharmacy and the Science Building. Then I went to lunch in Atherton and home for an hour to change for work. I went to one more class in Lily and then to work for two hours at the library. I went home again for dinner and to relax a bit. My friends wanted me to go to a movie, so we drove off-campus. Then at night, starting around 9, I had to get homework done so I went back to the library.

Q – Was that the only time during the day when you were doing homework/studying?

A – Yes, on Tuesdays I don’t have time during the day.

Q – If applicable: Where did you go in the library? And why did you select this space?

A – I sat at the tables on the second floor. I like the first floor better, but it was crowded and I knew I needed someplace with no distractions.

APPENDIX D: Coding Schema

Category 1: Faculty

  • Code: Academic sources
    • » Use for any mention of academic sources or sources used in course work.
  • Code: Information literacy
    • » Use for references to finding, selecting, using, or evaluating sources. This includes plagiarism, citations, information literacy, or similar phrases.
      1. Child codes: Use when any of these terms are mentioned:
        1. Plagiarism
        2. Citations
        3. Evaluating Sources
  • Code: Research locations
    • » Use when faculty members specify a physical or virtual location in which they do research.
  • Code: Research guidance
    • » Use for any mention of faculty recommendations for research help.
Category 2: Spaces

    • Code: Study spaces
      • » Use for any mention of a space used for studying or course work.
        1. Code: Group space
          • Use for any mention of groups working together on an assignment.
        2. Code: Late-night spaces
          • Use for any mention of late-night studying.
        3. Code: Library
          • Use for any mention of the library as a study space.
        4. Code: Study space reason
          • Use when a reason is given for choosing a space for studying or doing course work.
            • Child codes: Use when any of these terms are mentioned:
              1. Beauty
              2. Comfy / Cozy
              3. Flexible space
              4. Food
              5. Natural light / Windows
              6. Not quiet
              7. Outlets
              8. People
              9. Personal space / Spacious
              10. Printers
              11. Projector
              12. Quiet
              13. Secluded
              14. TV
              15. Whiteboard
    • Code: Library resources
      • » Use when any library resource is mentioned.
        1. Child codes: Use when any of these terms are mentioned:
          1. Books
          2. Charging cables
          3. Library computers
          4. Online resources
          5. Printers
          6. Restrooms
          7. Vending
          8. Wi-fi
    • Code: Ideal space
      • » Use when any ideal space for studying is mentioned.
        1. Child codes: Use when any of these terms are mentioned:
          1. Beauty
          2. Books
          3. Candles
          4. Coffee
          5. Comfy / Cozy
          6. Food
          7. Good light
          8. Mentor
          9. Music
          10. Natural light
          11. Open
          12. Outdoors
          13. Outlets
          14. Quiet
          15. Screen
          16. Secluded
          17. Table
          18. White Noise
          19. Wine

Category 3: Study Habits

    • Code: Course readings
      • » Use for mentions of how students discover, locate, and use their course readings.
    • Code: Distractions
      • » Use for noting what items, people, or places distract students from studying or completing course work.
        1. Child codes: Use when any of these terms are mentioned:
          1. Doodling
          2. Food
          3. Friends
          4. Games
          5. Noise
          6. Phone
          7. Social media
          8. Videos / TV
    • Code: Research assistance
      • » Use for references to people (peers, faculty, librarians, family, etc.) who assist the student with course work, offer academic advice, and/or find resources or materials for them.
        1. Child codes: Use when any of these terms are mentioned:
          1. Classmates
          2. Faculty
          3. Online
          4. Tutors / Writing Center
    • Code: Study items
      • » Use for mention of items students directly or indirectly use when studying/completing course work or in a study space.
        1. Child codes: Use when any of these terms are mentioned:
          1. Backpack
          2. Binder / Folder
          3. Books / Textbooks
          4. Calculator
          5. Chapstick
          6. Chargers
          7. Clipboard
          8. Coffee
          9. Computer
          10. Food
          11. Headphones
          12. ID
          13. Keys
          14. Music
          15. Paper / Notebooks
          16. Phone
          17. Planner
          18. Wallet / Purse
          19. Water
          20. Writing utensil(s)
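The coding schema above lends itself to a machine-readable form for keyword-assisted first-pass tagging of transcripts, which a coder would then review by hand. The sketch below is illustrative only: the excerpted keyword lists and the `tag_line` helper are assumptions for demonstration, not part of the PALNI toolkit.

```python
# A minimal sketch of Appendix D's schema as nested dicts, with a helper
# that tags a transcript line when a child-code keyword appears in it.
CODING_SCHEMA = {
    "Spaces": {
        "Library resources": ["books", "charging cables", "library computers",
                              "online resources", "printers", "restrooms",
                              "vending", "wi-fi"],
        "Ideal space": ["beauty", "coffee", "comfy", "natural light",
                        "outlets", "quiet", "secluded", "white noise"],
    },
    "Study Habits": {
        "Distractions": ["doodling", "food", "friends", "games",
                         "noise", "phone", "social media", "videos"],
        "Study items": ["backpack", "calculator", "computer",
                        "headphones", "planner", "water"],
    },
}

def tag_line(line):
    """Return (category, code, child_code) triples whose child-code
    keyword occurs in the transcript line (case-insensitive)."""
    text = line.lower()
    tags = []
    for category, codes in CODING_SCHEMA.items():
        for code, keywords in codes.items():
            for keyword in keywords:
                if keyword in text:
                    tags.append((category, code, keyword))
    return tags

# Example: a line from the debrief interview above
print(tag_line("I knew I needed someplace with no distractions, somewhere quiet."))
# → [('Spaces', 'Ideal space', 'quiet')]
```

Substring matching like this over-tags (e.g., "books" inside "notebooks"), which is acceptable for a first pass precisely because the schema expects a human coder to confirm each tag.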


    1. Nancy Fried Foster, Studying Students: A Second Look (Chicago: Association of College and Research Libraries, a division of the American Library Association, 2013); Nancy Fried Foster and Susan Gibbons, Studying Students: The Undergraduate Research Project at the University of Rochester (Chicago: Association of College and Research Libraries, 2007), http://hdl.handle.net/1802/7520.

    2. Stephen L. Schensul et al., Essential Ethnographic Methods: Observations, Interviews, and Questionnaires (Walnut Creek, CA: AltaMira Press, 1999).

    3. Roger Sanjek, “Ethnography,” in Encyclopedia of Social and Cultural Anthropology, ed. Alan Barnard and Jonathan Spencer (Routledge, 2009), https://search-credoreference-com.ezproxy.goshen.edu/content/entry/routencsca/ethnography/0.

    4. Andrew D. Asher and Susan Miller, “So You Want to Do Anthropology in Your Library? Or a Practical Guide to Ethnographic Research in Academic Libraries,” March 22, 2011, http://www.erialproject.org/wp-content/uploads/2011/03/Toolkit-3.22.11.pdf.

    5. Michael Khoo, Lily Rozaklis, and Catherine Hall, “A Survey of the Use of Ethnographic Methods in the Study of Libraries and Library Users,” Library & Information Science Research 34, no. 2 (April 2012): 82–91, https://doi.org/10.1016/j.lisr.2011.07.010.

    6. Bryony Ramsden, “Ethnographic Methods in Academic Libraries: A Review,” New Review of Academic Librarianship 22, no. 4 (October 1, 2016): 355–69, https://doi.org/10.1080/13614533.2016.1231696.

    7. Juliet Kerico Gray et al., “Applying Spatial Literacy to Transform Library Space: A Selected Literature Review,” Reference Services Review 46, no. 2 (June 11, 2018): 303–16, https://doi.org/10.1108/RSR-02-2018-0023; Angela Pollak, “Visual Research in LIS: Complementary and Alternative Methods,” Library & Information Science Research 39, no. 2 (April 1, 2017): 98–106, https://doi.org/10.1016/j.lisr.2017.04.002.

    8. Asher and Miller, “So You Want to Do Anthropology in Your Library? Or a Practical Guide to Ethnographic Research in Academic Libraries.”

    9. Alison Head, “Project Information Literacy,” Project Information Literacy, February 26, 2020, https://www.projectinfolit.org/; Andy Priestner and Matt Borg, User Experience in Libraries: Applying Ethnography and Human-Centred Design (Abingdon, Oxon: Routledge, 2016); “UXLibs—the Conference, the Books, the Training Courses,” 2020, http://uxlib.org/.

    10. Craig M. MacDonald, “‘It Takes a Village’: On UX Librarianship and Building UX Capacity in Libraries,” Journal of Library Administration 57, no. 2 (February 17, 2017): 194–214, https://doi.org/10.1080/01930826.2016.1232942.

    11. Brian L Griffin, “Metatheory or Methodology? Ethnography in Library and Information Science,” Information Research 22, no. 1 (March 2017): 1–13, http://informationr.net/ir/22-1/colis/colis1640.html; Donna Lanclos and Andrew D. Asher, “‘Ethnographish’: The State of the Ethnography in Libraries,” Weave: Journal of Library User Experience 1, no. 5 (2016), https://doi.org/10.3998/weave.12535642.0001.503.

    12. Rebecca Kuglitsch and Juliann Couture, “Things That Squeak and Make You Feel Bad: Building Scalable User Experience Programs for Space Assessment,” Weave: Journal of Library User Experience 1, no. 8 (2018), https://doi.org/10.3998/weave.12535642.0001.801.

    13. Lynda M. Duke and Andrew D. Asher, College Libraries and Student Culture: What We Now Know (Chicago: American Library Association, 2012), https://worldcat.org/oclc/704391709.

    14. Joe Marquez and Annie Downey, “Service Design: An Introduction to a Holistic Assessment Methodology of Library Services,” Weave: Journal of Library User Experience 1, no. 2 (2015), https://doi.org/10.3998/weave.12535642.0001.201; Joe Marquez, Annie Downey, and Ryan Clement, “Walking a Mile in the User’s Shoes: Customer Journey Mapping as a Method to Understanding the User Experience,” Internet Reference Services Quarterly 20, no. 3–4 (October 2, 2015): 135–50, https://doi.org/10.1080/10875301.2015.1107000.

    15. Andrew Asher et al., “Mapping Student Days: Collaborative Ethnography and the Student Experience,” Collaborative Librarianship 9, no. 4 (2017): 293–317.

    16. Joseph Robbins and Barbara Fister, “Research in the Lived Experience of Gustavus Students,” 2018, https://gustavus.edu/library/concertFiles/media/Lindell18.pdf.

    17. Nancy Fried Foster, “Reflections on Ethnographic Studies in a Community College Library System” (Ithaka S+R, September 27, 2016), https://doi.org/10.18665/sr.284329; Mariana Regalado and Maura A. Smale, “‘I Am More Productive in the Library Because It’s Quiet’: Commuter Students in the College Library,” College & Research Libraries 76, no. 7 (2015): 899–913.

    18. Duke and Asher, College Libraries and Student Culture: What We Now Know.

    19. Asher et al., “Mapping Student Days,” 295.

    20. Asher and Miller, “So You Want to Do Anthropology.”

    21. Arnold Arcolio, interview by authors, WebEx, December 5, 2014.

    22. Matthew B. Miles, A. M. Huberman, and Johnny Saldaña, Qualitative Data Analysis: A Methods Sourcebook, 3rd ed. (Thousand Oaks, CA: SAGE Publications, 2014).

    * Ruth Szpunar is Information Fluency Coordinator at Private Academic Library Network of Indiana (PALNI), email: ruth@palni.edu; Eric Bradley is Information Fluency Coordinator at Private Academic Library Network of Indiana (PALNI) and Head of Research & Instruction at Goshen College, email: ebradley@goshen.edu. ©2023 Ruth Szpunar and Eric Bradley, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

