More than a Decade Later: Library Web Usability Practices at ARL Academic Libraries in 2007 and 2020
This study compares library web usability practices in 2007 and 2020 at academic libraries that are institutional members of the Association of Research Libraries. The authors performed chi-square and t-tests to determine whether there were differences in establishing policies/standards/guidelines (PSGs), conducting usability tests, and providing resources between samples of libraries from both years. There was no statistically significant difference between the number of libraries with and without PSGs in both samples. In 2020, the level of perceived importance of usability testing significantly decreased, and the resources needed for web usability initiatives doubled. The authors suggest that academic and research libraries foster a culture of web usability to actualize and optimize usability endeavors.
Introduction
In this digital age, the World Wide Web is the dominant medium for accessing information. As such, it is essential for web developers to make web-based information systems usable across various platforms. Usability scholars such as Nielsen, Norman, and Shneiderman have provided principles for best usability practices.1 Additionally, the International Organization for Standardization (ISO) and the U.S. Department of Health and Human Services (HHS) have published standards and guidelines for web developers to create information systems with superior usability.2
Researchers in information system success modeling indicate that the quality of information, systems, and services is positively associated with intention to use and user satisfaction, which leads to the continued use of a system.3 Continued use of such a quality information system can in turn lead to higher returns on investment.
Academic libraries have put tremendous effort and funding into providing electronic resources and services via their library web portals. Hong et al.4 revealed that perceived ease of use and usefulness can influence users’ acceptance and use of digital libraries. If libraries do not take these usability characteristics into account, they risk underutilization of their resources.5
As electronic resources grow exponentially, academic libraries must develop web portals with quality usability to prompt continued use of these resources, thus making libraries’ investment cost-effective. To accomplish this goal, a sound infrastructure is indispensable, which includes employing web usability experts, establishing and implementing institutional usability policies/standards/guidelines (PSGs), and providing necessary resources. In 2007, Chen, Germain, and Yang explored the ways that academic members of the Association of Research Libraries (ARL) met these infrastructure objectives.6 In this study, the authors have attempted to identify whether web usability infrastructure and efforts devoted to web usability testing have increased at these libraries over the last decade.
Problem Statement
In the library and information science literature, research on web usability usually addresses a specific aspect, such as case studies of usability testing,7 discussions on web accessibility policies,8 or web team development.9 Instead of focusing on a particular area, Popp in 2001 examined several aspects of web usability practices at ARL member libraries, such as testing, obtaining web assessment training, and supporting professional development.10
As there was a void in the literature investigating holistic web usability, in 2007 Chen et al. expanded the scope of Popp’s study by incorporating PSGs and resources into their research.11 They observed that of the eighty-four participating libraries, only twenty-five had web usability PSGs, even though the perceived importance of usability testing was high. Additionally, 85 percent of the libraries had tested their websites. Nevertheless, due to a lack of infrastructure and buy-in, there was minimal iterative testing of the various components of the library web portal. Furthermore, there were just twenty libraries with dedicated, full-time usability staff. Based on these research outcomes, Chen et al. advocated education and organizational support for usability initiatives.12
It has been over a decade since Chen et al.’s initial study.13 There are still few systematic studies of organizational web usability infrastructure. Therefore, the authors conducted a comparative study to determine whether ARL academic libraries have demonstrated a stronger commitment in their usability initiatives since then. Through this research, we aim to
- determine if there are more web usability PSGs in ARL academic libraries in 2020 than in 2007;
- compare the perceived importance of web usability in 2020 and 2007;
- assess if more usability testing, including iterative testing, has been conducted since 2007; and
- evaluate if there are more resources (e.g., committees, staffing, and training) devoted to usability initiatives in 2020 than in 2007.
The issues and degrees of progress identified in these results will help advance web usability enterprises in the information science and higher education communities.
Literature Review
In 1988, Norman advocated for the importance of usability by promoting simple design focused on the successful interaction between an object and its user.14 Based on system engineering principles, Nielsen proposed five measurable usability attributes: easy to learn, efficient to use, easy to remember, low error rate, and overall user satisfaction.15 ISO defined usability as the “[e]xtent to which a product can be employed by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”16 Palmer extended ISO’s goal-oriented perspective by highlighting a system’s information architecture.17
As web technologies emerged, Brophy and Craven regarded web usability as “the experience the user has when reading and interacting with a website.”18 The authors of this study took a holistic approach to the subject by introducing a working, multifaceted definition that addressed the gaps in the ISO definition concerning content, cognitive capacity, affect, and interactivity.19 In 2018, ISO took a more inclusive stance in redefining usability and expanded its scope to include products and services.20
Nielsen, Rosenfeld and Morville, and Shneiderman indicated that websites built for optimal usability during the development cycle enable users to interact more easily with and yield greater satisfaction from the systems.21 Several studies revealed that websites with high levels of usability will engender user satisfaction, and that users will hence revisit these sites.22 Likewise, in e-learning, a quality interface and useful content facilitate coherent teaching and learning, which increase acceptance and satisfaction.23 Because academic libraries rely heavily on web technology to provide access to resources and services, it is thus essential that the design of their online system reflects users’ mental models and usability best practices.
Library professionals have adopted usability principles when developing their online portals. For example, they have conducted usability tests across platforms, including the library’s main pages,24 lower-level pages,25 OPACs,26 and discovery systems27 to ensure quality control. With the widespread use of mobile devices, libraries have also conducted usability testing on their mobile library websites.28
Academic libraries apply various web usability testing methods. Card sorting is an option for the preliminary stage of development, since it takes users’ mental models into account when designing intuitive information architectures.29 The think-aloud protocol allows users to articulate their thought processes while navigating web resources.30 Paper or online prototyping is a cost-effective method for constructing the initial layout of a website, as it is easy to make design modifications in the early stages.31
Sometimes, usability testing is conducted by experts in this area. Cognitive walkthrough, a process whereby experts emulate a novice navigating a system, yields information on its learnability and the ease of identifying its most straightforward path to accomplish a specific task.32 Similarly, heuristic evaluation involves expert inspection of a system based on a set of established standards or guidelines.33 Task analysis examines whether a system’s design aligns with the sequence of activities necessary to complete a specific task.34
As usability testing technology advances, some usability practitioners augment traditional methods with additional tools; for example, log analysis35 and eye tracking.36 Researchers also conduct focus groups or surveys to solicit feedback from users.37
Usability testing is an on-going, indispensable process throughout the system development life cycle.38 Iterative testing enables web designers to detect flaws and make improvements.39 These usability initiatives require considerable personnel, time, technical expertise, funding, and other resources throughout the various phases of the process.40 Teams can provide valuable support, but members with limited expertise in these areas may hinder a team from working at its full potential.41 However, Nichols et al. noted that while some team members may not have a high level of usability training, they still bring important knowledge about users to the process.42 Lacking staff expertise, some organizations opt to hire outside consultants to conduct usability testing.43 Cervone’s model posited that whether usability training is knowledge-based or skill-based, it should be an organization-wide endeavor.44 These diverse views “move usability towards an institutional value.”45
Usability PSGs provide uniformity for quality information system design. After exploring web policies available on selected academic libraries’ websites, Lingle and Delozier provided a list of elements for library website policies. These elements include mission statement, target audience, scope and content, selection criteria, web administration, training, URL creation, types of platforms used, security levels, backup plan, and design. Their list mainly focuses on the collection, technical, and procedural aspects of policies, not usability per se.46 ISO issued sets of usability guidelines and specifications for facilitating user-centered design.47 The HHS publication Research-Based Web Design & Usability Guidelines provides institutions with a blueprint for establishing local policies for usability best practices.48 Additionally, Nielsen’s seminal ten heuristics serve as general principles for creating intuitive web user interfaces.49 Finally, for a system to be usable it must first be accessible. The Web Accessibility Initiative at the World Wide Web Consortium emphasizes prioritizing web accessibility for persons with disabilities.50 Common elements for web usability PSGs derived from these authoritative usability guidelines include identifying goals, understanding user requirements, meeting user’s expectations, considering user interface issues, providing useful content, structuring content for easy navigation, using plain language, allowing user control and flexibility, preventing errors, avoiding information overload, addressing accessibility, and measuring outcomes of use (e.g., effectiveness, efficiency, satisfaction, user experience, etc.).
Although library professionals have applied these guidelines and standards toward general evaluation of their websites, there is little discussion in the library and information science literature specifically related to web usability policies.51
The rapid evolution of web technologies has made it more common to offer online learning and information services (including seeking and disseminating information) since Chen et al. explored web usability practices in ARL academic libraries in 2007.52 The transition from in-person to virtual environments further highlights the importance of web usability. Quality library web usability facilitates seamless interaction for teaching, learning, and research, thus providing better user experience for patrons of academic libraries. Achieving ultimate web usability requires a sound infrastructure and continuous efforts. A comparative study on these usability aspects will shed light on the progress made and the challenges encountered by the ARL academic libraries. The results can help library professionals, including library administrators, reflect on their library web usability practices. Additionally, the insights derived from this research can serve as informed strategies for advancing web usability enterprises in the information science and higher education communities to enhance user satisfaction.
Methods
In 2007, Chen et al. selected the ARL academic libraries for exploring web usability practices because they were regarded as the top research libraries in North America.53 As the authors of this study intended to determine whether ARL academic libraries have demonstrated a stronger commitment to web usability initiatives in the past decade, surveying the current state of web usability practices at these institutions had to take place first. To achieve this goal, the authors adapted Chen et al.’s54 survey questionnaire. They added the “Library student worker” option to the question on testing population and the “Eye tracking” option to the question on usability testing methods, as well as new questions on testing a mobile version of library websites, availability and utilization of usability labs, and how existing usability PSGs and practices have influenced user experience. Furthermore, the authors added the phrase “in the past ten years” to the question on usability testing efforts to replicate the timeframe of the former study, which transpired approximately ten years after initial web usability testing initiatives occurred at academic libraries. These additions and modifications to the original survey questionnaire were meant to account for new methods and emerging web technologies, such as increased use of eye-tracking systems and mobile devices. In a forthcoming article, the authors provide a comprehensive report on the current state of web usability practices in the ARL academic libraries.55 For the comparative analyses, only responses to common questions used in both the 2007 and 2020 surveys were considered (see appendix). Thus, the authors did not anticipate that the changes made to the survey would affect the comparability of results between the current and former studies.
The rationale for adapting Chen et al.’s56 survey instrument was as follows: the target population was the same; to make the comparison meaningful, the scope of the investigation and the survey instrument should remain the same; the questionnaire consisted of quantitative and qualitative elements, providing a more complete view of the issues under examination; and the survey questions had been tested through two pilot studies to ensure validity and reliability.
The quantitative questions included multiple choice and Likert scale items focusing on usability PSGs, usability testing, and resources. The open-ended questions, pertaining to challenges encountered in the implementation of usability PSGs, web usability practices, and future plans for usability initiatives, allowed the authors to collect qualitative data which could not be captured via quantitative-oriented queries.
The authors followed the same approach to identifying appropriate survey recipients. We visited the ARL academic libraries’ website directories in September 2019 and identified position titles or departments with responsibility for usability initiatives. The authors then contacted potential individuals to determine whether they were the appropriate survey recipients; if they were not, we requested a referral.
Upon receiving IRB approval from the university, we sent the survey questionnaire via SurveyMonkey to the 105 ARL academic libraries at the end of October 2019. We followed up with emails and phone calls to increase the response rate. Due to the COVID-19 pandemic, there were delays in response submissions; however, as the survey solicited information on usability practices in the past ten years, the responses would not be affected by the interruption caused by the pandemic. In Chen et al.’s 2007 study, eighty-four institutions participated in the survey.57 In the 2020 study, by the close of the survey in mid-May 2020, ninety-one institutions had responded, yielding an 87 percent response rate and a strong representation of ARL academic libraries.58
The authors exported the data from SurveyMonkey to Excel for quantitative analyses. We performed chi-square tests of independence and t-tests to determine whether there were differences in terms of PSG establishment, usability testing, and resource availability between the 2007 and 2020 samples. Additionally, we downloaded responses to the open-ended questions and coded them using the themes that had emerged in the 2007 study, which applied grounded theory method. Discrepancies in coding were resolved through discussions.
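For readers who wish to reproduce this kind of analysis programmatically, the sketch below illustrates the two test types described above—a chi-square test of independence on a contingency table and an independent samples t-test—using Python’s SciPy library. This is only an illustrative example: the authors conducted their analyses in Excel, and the counts and hours shown here are hypothetical placeholders rather than figures from the study.

```python
# Illustrative sketch of the statistical tests described above (chi-square test of
# independence and independent samples t-test). All numbers are hypothetical
# placeholders, not data from the 2007 or 2020 surveys.
import numpy as np
from scipy import stats

# Chi-square test of independence on a 2x2 contingency table:
# rows = survey year (2020, 2007); columns = libraries with / without a PSG.
observed = np.array([
    [35, 45],   # hypothetical 2020 counts
    [25, 55],   # hypothetical 2007 counts
])
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi-square = {chi2:.3f}, df = {dof}, p = {p:.3f}")

# Independent samples t-test comparing, for example, weekly staff hours devoted
# to usability work reported by the two samples.
hours_2020 = [20, 25, 10, 18, 30, 22]   # hypothetical responses
hours_2007 = [12, 15, 8, 20, 10, 14]
t_stat, p_val = stats.ttest_ind(hours_2020, hours_2007)
print(f"t = {t_stat:.3f}, p = {p_val:.3f}")
```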
Findings
Development and Implementation of Website PSGs and Web Usability PSGs
The authors mainly used the chi-square (χ2) test of independence and independent samples t-test to conduct analyses and comparison of the data from 2007 and 2020. Table 1 shows that the numbers of libraries with or without library website PSGs remained similar (χ2 = 0.074, df=1, p > 0.1). However, there was a significant difference (33 percent in 2020 vs. 30 percent in 2007) in the numbers of libraries with web usability PSGs (χ2 = 8.219, df=1, p < 0.05). Likewise, there was a notable increase (41 percent in 2020 vs. 36 percent in 2007) in the number of universities with web usability PSGs (χ2 = 34.181, df=1, p < 0.001).
TABLE 1. Libraries/Universities with/without PSGs

| | Library Website PSGs, 2020 N (%) | Library Website PSGs, 2007 N (%) | Library Web Usability PSGs,* 2020 N (%) | Library Web Usability PSGs,* 2007 N (%) | University Web Usability PSGs,** 2020 N (%) | University Web Usability PSGs,** 2007 N (%) |
|---|---|---|---|---|---|---|
| With | 35 (38) | 34 (40) | 30 (33) | 25 (30) | 37 (41) | 30 (36) |
| Without | 56 (62) | 50 (60) | 51 (56) | 58 (69) | 41 (45) | 31 (37) |
| Not sure | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 22 (26) |
| No answer | 0 (0) | 0 (0) | 10 (11) | 1 (1) | 13 (14) | 1 (1) |
| Total | 91 (100) | 84 (100) | 91 (100) | 84 (100) | 91 (100) | 84 (100) |

*p < 0.05; **p < 0.001
Table 2 reveals that in terms of implementing all three types of PSGs, there were no obvious differences regarding the various levels of difficulty between both samples. This result was confirmed by non-significant chi-square (χ2) tests. Comparable numbers of libraries in 2020 and 2007 indicated the level of difficulty was moderate or higher in implementing library web PSGs (χ2 = 6.26, df=4, p > 0.1), library web usability PSGs (χ2 = 3.71, df=4, p > 0.1), and university web usability PSGs (χ2 = 7.0, df=4, p > 0.1). Independent samples t-tests revealed no statistically significant differences in the mean levels of difficulty in implementing library web PSGs (t(62) = 1.356, p > 0.05) and library web usability PSGs (t (52) = 0.298, p > 0.1) in both 2020 and 2007. Yet, the mean level of difficulty for implementing university web usability PSGs was higher in 2020 (M=2.3) than 2007 (M=1.65) (t(54) = 2.744, p < 0.005).
TABLE 2. Levels of Difficulty in Implementing Library/University Web Usability PSGs

| | Library Web PSGs, 2020 N (%) | Library Web PSGs, 2007 N (%) | Library Web Usability PSGs, 2020 N (%) | Library Web Usability PSGs, 2007 N (%) | University Web Usability PSGs, 2020 N (%) | University Web Usability PSGs, 2007 N (%) |
|---|---|---|---|---|---|---|
| Not Difficult | 4 (11) | 5 (15) | 3 (10) | 4 (16) | 7 (23) | 12 (46) |
| Slightly Difficult | 6 (17) | 8 (23) | 8 (27) | 2 (8) | 9 (30) | 8 (31) |
| Moderately Difficult | 11 (31) | 16 (47) | 13 (43) | 15 (60) | 12 (40) | 5 (19) |
| Very Difficult | 5 (14) | 3 (9) | 2 (7) | 2 (8) | 2 (7) | 0 (0) |
| Extremely Difficult | 0 (0) | 0 (0) | 2 (7) | 0 (0) | 0 (0) | 0 (0) |
| No Answer | 9 (26) | 2 (6) | 2 (7) | 2 (8) | 0 (0) | 1 (4) |
| Total | 35 (99*) | 34 (100) | 30 (101*) | 25 (100) | 30 (100) | 26 (100) |

*Percentage did not add up to 100 due to rounding.
In 2020, 22 (62.9%), 25 (82%), and 23 (76.7%) of the libraries experienced at least slight difficulty in implementing their library web PSGs, library web usability PSGs, and university web usability PSGs respectively, as compared to 27 (79.4%), 19 (76%), and 13 (50%) in 2007. The most frequent rating for difficulty in implementing the three types of PSGs was “Moderately Difficult” in both 2020 and 2007.
In both 2020 and 2007, the most frequently cited obstacles to implementing the library-specific PSGs were “Enforcement/agreement” (18 in 2020 vs. 20 in 2007), “Resources” (13 in 2020 vs. 7 in 2007), and “Lack of skills/training” (8 in 2020 vs. 10 in 2007). “Technical issues” was rated as the most challenging aspect in implementing the university web usability PSGs. The least cited reasons included “Resistance to change,” “One size doesn’t fit all/complexity,” “Unclear PSGs,” “Difficulty with OPAC,” and “Difficulty with lower-level pages” (table 3).
TABLE 3. Number of Libraries That Cite Various Reasons for Their Difficulty in Implementing PSGs

| Reason | Library Website PSGs, 2020 | Library Website PSGs, 2007 | Library Web Usability PSGs, 2020 | Library Web Usability PSGs, 2007 | University Web Usability PSGs, 2020 | University Web Usability PSGs, 2007 |
|---|---|---|---|---|---|---|
| Enforcement/Agreement | 8 | 13 | 10 | 7 | 1 | 2 |
| Lack of skills/training | 3 | 7 | 5 | 3 | 3 | 1 |
| Resources | 6 | 4 | 7 | 3 | 4 | 1 |
| Getting informed | 1 | 3 | 3 | 1 | 2 | 2 |
| Resistance to change | 3 | 2 | 4 | 2 | 0 | 0 |
| One size doesn’t fit all/complexity | 2 | 2 | 0 | 1 | 0 | 1 |
| Technical issues | 1 | 1 | 1 | 2 | 5 | 4 |
| Difficulty with lower-level pages | 2 | 0 | 1 | 2 | 0 | 0 |
| Difficulty with OPAC | 1 | 0 | 0 | 1 | 0 | 0 |
| Unclear PSGs | 0 | 0 | 0 | 0 | 0 | 2 |
Usability Resources: Committees/Task Forces
As shown in table 4, the authors observed that in 2020 there were decreases in the numbers of the following committees compared to 2007: usability committees (9 percent vs. 18 percent), web advisory committees (34 percent vs. 62 percent), and website usability subcommittees (2 percent vs. 14 percent). A chi-square test of independence confirmed that the two samples were significantly different (χ2 = 55.04, df=4, p < 0.001). More libraries had a web advisory committee than the other two types in both 2020 and 2007. Additionally, there were substantially fewer libraries that had at least one type of committee in 2020 (37 percent) than in 2007 (71 percent), and there were notably fewer libraries with two or more types of committees in 2020 (6 percent) than in 2007 (20 percent).
TABLE 4. Committees Formed by Responding Libraries

| | Usability Committee, 2020 N (%) | Usability Committee, 2007 N (%) | Web Advisory Committee, 2020 N (%) | Web Advisory Committee, 2007 N (%) | Website Usability Subcommittee, 2020 N (%) | Website Usability Subcommittee, 2007 N (%) |
|---|---|---|---|---|---|---|
| Yes | 8 (9) | 15 (18) | 31 (34) | 52 (62) | 2 (2) | 12 (14) |
| No | 62 (68) | 42 (50) | 39 (43) | 14 (16) | 68 (75) | 34 (40) |
| Not Sure | 0 (0) | 1 (1) | 0 (0) | 3 (4) | 0 (0) | 3 (4) |
| No Answer | 21 (23) | 26 (31) | 21 (23) | 15 (18) | 21 (23) | 35 (42) |
| Total | 91 (100) | 84 (100) | 91 (100) | 84 (100) | 91 (100) | 84 (100) |
Usability Resources: Web Usability Personnel
Data from table 5 reveal that there was a statistically significant difference in the number of libraries employing dedicated web usability staff in 2020 and 2007 (χ2 = 10.55, df=2, p < 0.01).
TABLE 5. Libraries Employing Dedicated Web Usability Staff

| | 2020* N (%) | 2007 N (%) |
|---|---|---|
| With | 32 (35) | 24 (29) |
| Without | 44 (48) | 57 (68) |
| No answer | 15 (16) | 3 (4) |
| Total | 91 (99**) | 84 (101**) |

*p < 0.01; **Percentage did not add up to 100 due to rounding.
Additionally, the average number of dedicated staff hours devoted to usability testing differed significantly, as confirmed by an independent samples t-test result (t (29) = 1.824, p < .05) with M=20.678 in 2020 and M=14.214 in 2007. However, the mean number of regular staff hours devoted to usability testing had no significant difference (t (42) = 0.997, p > .1) between 2020 (M=7.687) and 2007 (M=5.567).
In 2020, thirty-two libraries responded to the question on web usability training; in 2007, that number was twenty-four (table 6). The numbers for all types of training increased in 2020 compared to the 2007 sample. More dedicated staff had training in “Web usability” than in “Human-Computer Interaction (HCI)” or held a “Degree or certificate in information science.” The most noticeable differences were in staff receiving HCI (66 percent vs. 25 percent) and web usability (97 percent vs. 58 percent) training. In 2007, fewer dedicated staff had training in HCI compared with the other two forms of education, as evidenced by a significant chi-square (χ2) test of independence (χ2 = 7.23, df=2, p < 0.05). The two library samples also differed in the numbers of staff with the various types of training (χ2 = 8.72, df=2, p < 0.05).
TABLE 6. Types of Web Usability Training for Dedicated Staff

| | 2020 (n=32) N (%) | 2007 (n=24) N (%) |
|---|---|---|
| Human-Computer Interaction (HCI) | 21 (66) | 6 (25) |
| Web usability | 31 (97) | 14 (58) |
| Degree or certificate in information science | 19 (59) | 14 (58) |
| No specific training | 1 (3) | 5 (21) |
| Total | 72* | 39* |

Note: n refers to the number of libraries that answered this question. N refers to the number of dedicated staff. *As multiple responses were allowed for this question, the totals for 2020 and 2007 add up to more than 32 and 24 respectively.
As shown in table 7, based on thirty-seven and fifty-one responses in 2020 and 2007 respectively, the chi-square result did not show statistically significant differences in the numbers and types of training obtained by regular staff with web usability responsibility (χ2 = 0.38, df=2, p > 0.1). Web usability was the dominant training type for both years, followed by degree or certificate in information science, and then HCI.
TABLE 7. Types of Training for Regular Staff with Web Usability Responsibility

| | 2020 (n=37) N (%) | 2007 (n=51) N (%) |
|---|---|---|
| Human-Computer Interaction (HCI) | 17 (46) | 16 (31) |
| Web usability | 30 (81) | 36 (71) |
| Degree or certificate in information science | 19 (51) | 21 (41) |
| No specific training | 6 (16) | 10 (20) |
| Total | 72* | 83* |

Note: n refers to the number of libraries that answered this question. N refers to the number of regular staff with web usability responsibility. *As multiple responses were allowed for this question, the totals for 2020 and 2007 add up to more than 37 and 51 respectively.
The authors aggregated all categories of available resources—committees, training formats, outside assistance, and staff—to further examine if there was any difference between the two samples. An independent samples t-test revealed a statistically non-significant difference in the mean level of aggregated resources (t (172) = 0.968, p > .1), with M=3.758 in 2020 and M=4.190 in 2007.
Usability Testing: Perceived Importance of Usability Testing
A non-significant chi-square (χ2) test of independence indicated that the 2020 and 2007 samples rated the importance of usability testing similarly (χ2 = 10.66, df=5, p > 0.05). As shown in table 8, the majority of participants regarded usability testing as at least “Somewhat important,” with 83.6 percent in 2020 and 91.7 percent in 2007. Additionally, 56.1 percent in 2020 and 73.8 percent in 2007 rated it “Important” or higher. However, a closer look at the data showed that the mean level of importance of usability testing in 2020 (M=2.037) was lower than that in 2007 (M=2.367) (t(155) = 2.034, p < 0.05).
TABLE 8. Responding Libraries’ Rating on the Importance of Usability Testing

| | 2020 N (%) | 2007 N (%) |
|---|---|---|
| Not important | 3 (3.2) | 2 (2.3) |
| Somewhat important | 25 (27.5) | 15 (17.9) |
| Important | 25 (27.5) | 21 (25) |
| Very important | 18 (19.7) | 34 (40.5) |
| Extremely important | 8 (8.9) | 7 (8.3) |
| No answer | 12 (13.2) | 5 (6.0) |
| Total | 91 (100) | 84 (100) |
The authors coded and categorized responses to an open-ended question soliciting additional comments on the importance the library places on usability testing. The results reveal that “Staff/Resources” and “Iterative testing” were the most frequently mentioned in 2020. In 2007, “Iterative testing” was the most commonly referenced, followed by “Buy-in” and “Staff/Resources” (table 9). Additional themes mentioned in the 2020 sample were “Culture of usability,” “Support from library administration,” and “Enforcement/Agreement.”
TABLE 9. Additional Comments by Respondents on the Importance Their Libraries Placed on Usability Testing

| Theme | 2020 N | 2007 N |
|---|---|---|
| Iterative testing | 18 | 19 |
| Buy-in | 6 | 13 |
| Staff/Resources | 25 | 12 |
| On-campus usability partnership | 1 | 6 |
| Committee | 7 | 6 |
| Accessibility | 3 | 3 |
| Web usability PSGs | 6 | 2 |
| Training | 3 | 2 |
Usability Testing: Platforms and Activities
In line with participants’ rating on the importance of usability testing, an overwhelming majority of libraries in both 2020 (85.7 percent) and 2007 (84.5 percent) conducted usability testing, with a slight increase in 2020. A chi-square (χ2) test of independence confirmed this difference (χ2 = 15.55, df=2, p < 0.001).
The authors examined usability testing activities conducted at the pre-, during, and post-design stages of the library websites (table 10a), OPACs (table 10b), and lower-level pages (table 10c) in both 2020 and 2007. Chi-square (χ2) tests of independence confirmed no differences:
- Websites (Pre: χ2 = 5.57, df=5, p > 0.1; During: χ2 = 3.47, df=5, p > 0.1; Post-design: χ2 = 8.85, df=5, p > 0.1),
- OPACs (Pre: χ2 = 10.69, df=5, p > 0.05; During: χ2 = 7.29, df=5, p > 0.1; Post-design: χ2 = 8.06, df=5, p > 0.1),
- Lower-level Pages (Pre: χ2 = 4.09, df=5, p > 0.1; During: χ2 = 4.72, df=5, p > 0.1; Post-design: χ2 = 4.24, df=5, p > 0.1).
In addition, an independent samples t-test confirmed that there was no statistically significant difference in the mean total amount of testing performed on the three platforms (t(136) = 0.745, p > 0.1) in 2020 (M=6.82) and 2007 (M=6.23).
TABLE 10a. Number of Libraries Testing on Their Websites: Pre, During, and Post-design Phases

| Testing Frequency | Pre 2020 | Pre 2007 | During 2020 | During 2007 | Post 2020 | Post 2007 |
|---|---|---|---|---|---|---|
| 1 | 18 | 15 | 14 | 12 | 21 | 16 |
| 2 | 10 | 13 | 15 | 16 | 2 | 11 |
| 3 | 6 | 9 | 5 | 11 | 9 | 9 |
| 4 | 4 | 1 | 3 | 2 | 1 | 2 |
| 5 or more | 16 | 10 | 20 | 15 | 21 | 15 |
TABLE 10b. Number of Libraries Testing on Their OPACs: Pre, During, and Post-design Phases

| Testing Frequency | Pre 2020 | Pre 2007 | During 2020 | During 2007 | Post 2020 | Post 2007 |
|---|---|---|---|---|---|---|
| 1 | 14 | 8 | 14 | 9 | 10 | 11 |
| 2 | 3 | 6 | 9 | 6 | 8 | 10 |
| 3 | 5 | 2 | 3 | 4 | 6 | 4 |
| 4 | 0 | 1 | 2 | 1 | 2 | 2 |
| 5 or more | 10 | 2 | 9 | 2 | 13 | 3 |
TABLE 10c. Number of Libraries Testing on Their Lower-Level Pages: Pre, During, and Post-design Phases

| Testing Frequency | Pre 2020 | Pre 2007 | During 2020 | During 2007 | Post 2020 | Post 2007 |
|---|---|---|---|---|---|---|
| 1 | 10 | 11 | 12 | 11 | 16 | 12 |
| 2 | 12 | 9 | 12 | 18 | 8 | 11 |
| 3 | 3 | 7 | 3 | 7 | 4 | 10 |
| 4 | 3 | 1 | 2 | 3 | 4 | 4 |
| 5 or more | 9 | 5 | 12 | 7 | 10 | 7 |
Table 11 shows that while the numbers of libraries conducting usability tests on their websites (χ2 = 0.63, df=2, p > 0.1) and lower-level pages (χ2 = 0.7, df=2, p > 0.1) at the various stages were similar, there were differences in testing OPACs. Compared with 2007, more libraries in 2020 tested their OPACs at “All three stages,” but fewer tested at “Any one of the three stages” (χ2 = 11.4, df=2, p < 0.01).
TABLE 11. Number of Responding Libraries Testing on the Three Web Platforms During the Development Life Cycle

| | Websites, 2020 N (%) | Websites, 2007 N (%) | OPAC, 2020 N (%) | OPAC, 2007 N (%) | Lower-level Pages, 2020 N (%) | Lower-level Pages, 2007 N (%) |
|---|---|---|---|---|---|---|
| All three stages | 42 (64) | 37 (57) | 23 (48) | 8 (19) | 26 (49) | 26 (49) |
| Any two of the three stages | 15 (23) | 18 (28) | 14 (29) | 12 (28) | 15 (28) | 18 (34) |
| Any one of the three stages | 9 (13) | 10 (15) | 11 (23) | 23 (53) | 12 (23) | 9 (17) |
| Total | 66 (100) | 65 (100) | 48 (100) | 43 (100) | 53 (100) | 53 (100) |
Usability Testing: Populations
For both samples, “Undergraduates,” “Graduates,” “Faculty,” and “Staff” were the top four populations recruited for usability testing (table 12). The participating libraries recruited “Undergraduate students” (χ2 = 0.5, df=2, p > 0.1), “Graduate students” (χ2 = 2.17, df=2, p > 0.1), “Faculty” (χ2 = 1.48, df=2, p > 0.1), “Staff” (χ2 = 5.7, df=2, p > 0.05), “Alumni” (χ2 = 4.51, df=2, p > 0.1), and “Public users” (χ2 = 5.62, df=2, p > 0.05) in both years at indistinguishable rates. Yet, the authors observed different levels for the other testing populations: “Administrators” (χ2 = 7.67, df=2, p < 0.05), “Non-library users” (χ2 = 9.67, df=2, p < 0.05), “IT Professionals” (χ2 = 12.22, df=2, p < 0.005), “Persons with disabilities” (χ2 = 6.82, df=2, p < 0.05), and “Researchers” (χ2 = 8.57, df=2, p < 0.05).
TABLE 12. Testing Population Used by Participating Libraries Conducting Usability Tests

| Testing Population | 2020 Yes | 2020 No | 2020 No answer | 2020 Total | 2007 Yes | 2007 No | 2007 No answer | 2007 Total |
|---|---|---|---|---|---|---|---|---|
| Administrators | 13 | 61 | 4 | 78 | 20 | 41 | 10 | 71 |
| Alumni | 12 | 62 | 4 | 78 | 11 | 49 | 11 | 71 |
| Faculty | 61 | 13 | 4 | 78 | 60 | 7 | 4 | 71 |
| Graduates | 70 | 4 | 4 | 78 | 68 | 1 | 2 | 71 |
| Undergraduates | 73 | 1 | 4 | 78 | 68 | 1 | 2 | 71 |
| Public users | 23 | 51 | 4 | 78 | 16 | 43 | 12 | 71 |
| Non-library users | 14 | 60 | 4 | 78 | 15 | 41 | 15 | 71 |
| IT Professionals | 13 | 61 | 4 | 78 | 21 | 37 | 13 | 71 |
| Persons with disabilities | 22 | 52 | 4 | 78 | 29 | 33 | 9 | 71 |
| Researchers | 30 | 44 | 4 | 78 | 35 | 25 | 11 | 71 |
| Staff | 55 | 19 | 4 | 78 | 58 | 7 | 6 | 71 |
Usability Testing: Methods
According to table 13, participating libraries in both 2020 and 2007 applied the same top three methods to conduct usability tests: “In-person observation,” “Think-aloud,” and “Card sorting.” “Keystroke path collection,” “Cognitive walk-through,” and “Filmed observation” were the three least often applied methods for 2020. “Keystroke path collection,” “Filmed observation,” and “Heuristic evaluation” were the least used measures in 2007. Overall, most of the methods used remained the same, which was confirmed by a chi-square (χ2) test of independence (χ2 = 1.63, df=2, p > 0.1). An independent samples t-test confirmed that there was no statistically significant difference (t (139) = 0.779, p > 0.1) in the mean number of testing methods between 2020 (M=4.808) and 2007 (M=5.058).
TABLE 13. Usability Testing Methods Applied by Libraries for Conducting Usability Tests

| Usability Testing Method | 2020 Yes | 2020 No | 2020 No answer | 2020 Total | 2007 Yes | 2007 No | 2007 No answer | 2007 Total |
|---|---|---|---|---|---|---|---|---|
| Card sorting | 51 | 22 | 5 | 78 | 40 | 27 | 4 | 71 |
| Cognitive walk-through | 25 | 48 | 5 | 78 | 39 | 24 | 8 | 71 |
| Filmed observation | 26 | 47 | 5 | 78 | 23 | 38 | 10 | 71 |
| Heuristic evaluation | 28 | 45 | 5 | 78 | 32 | 28 | 11 | 71 |
| In-person observation | 65 | 8 | 5 | 78 | 61 | 6 | 4 | 71 |
| Keystroke path collection | 8 | 65 | 5 | 78 | 17 | 41 | 13 | 71 |
| Paper prototyping | 34 | 39 | 5 | 78 | 36 | 28 | 7 | 71 |
| Task analysis | 41 | 32 | 5 | 78 | 39 | 27 | 5 | 71 |
| Think-aloud | 63 | 10 | 5 | 78 | 57 | 9 | 5 | 71 |
In 2020, the top three methods used to solicit feedback were “Surveys,” “Interviews,” and “Focus groups”; the same three methods were the most commonly used in 2007, though in the order “Focus groups,” “Surveys,” and “Interviews” (table 14). The three least used approaches were identical for both samples: “Listserv postings,” “Pop-up windows via the library website,” and “Website call for input.” A chi-square test indicated that the two library samples utilized significantly different methods to solicit feedback (χ2 = 8.59, df=2, p < 0.05).
TABLE 14. Methods Used to Solicit Feedback

| Method | 2020 Yes | 2020 No | 2020 No answer | 2020 Total | 2007 Yes | 2007 No | 2007 No answer | 2007 Total |
|---|---|---|---|---|---|---|---|---|
| Focus groups | 43 | 27 | 8 | 78 | 55 | 11 | 5 | 71 |
| Interviews | 45 | 25 | 8 | 78 | 49 | 15 | 7 | 71 |
| Listserv postings | 11 | 59 | 8 | 78 | 13 | 45 | 13 | 71 |
| Pop-up windows via the library website | 28 | 42 | 8 | 78 | 13 | 43 | 15 | 71 |
| Surveys | 53 | 17 | 8 | 78 | 51 | 13 | 7 | 71 |
| Website “call for input” | 32 | 38 | 8 | 78 | 45 | 19 | 7 | 71 |
Usability Testing: Future Plans
In 2020 and 2007, all participating libraries had future plans for their web usability, with conducting usability testing as the top priority, followed by acquiring resources in 2020 and redesigning the library website in 2007 (table 15). The chi-square (χ2) test of independence result (χ2 = 5.52, df=2, p > 0.05) showed a non-significant difference, indicating that the two sets of libraries had somewhat consistent plans in place.
TABLE 15. Future Plans Identified by Responding Libraries

| Future Plan | 2020 N | 2007 N |
|---|---|---|
| Conduct usability testing | 22 | 26 |
| Redesign library website | 11 | 14 |
| Use alternative methods (focus groups, interviews, surveys, click paths) | 6 | 9 |
| Acquire resources (outside assistance, funding) | 5 | 8 |
| Conduct iterative testing | 4 | 7 |
| Add usability committee, personnel, task force | 12 | 6 |
| Test OPAC | 1 | 6 |
| Implement CMS | 0 | 4 |
| Establish policies | 2 | 3 |
| Test lower level pages | 0 | 1 |
| Total | 63 | 84 |
Discussion
Development and Implementation of Website PSGs and Web Usability PSGs
The authors expected that there would be significant increases across the three types of PSGs in 2020 compared to 2007. However, the results indicated the increases for the categories of library web usability and university web usability PSGs were only 3 percent and 5 percent, respectively. Chen et al. found a 31 percent increase from 1977 to 2003, and an additional 22 percent increase between 2003 and 2008, when examining the collection development policies at ARL libraries.59 By contrast, the growth rate of library web usability PSGs is only 3 percent over thirteen years. Data analysis in this study suggests that possible causes for this limited increase were lack of priority, resources (e.g., a usability-focused committee), and buy-in.
Web usability PSGs provide an accountability mechanism for quality design. As the web is the dominant medium for information seeking and online learning, the accountability issue cannot be ignored. This is especially true during the COVID-19 pandemic, when most tasks or services are conducted virtually. We encourage libraries to use well-established standards, heuristics, and guidelines to create in-house web usability PSGs for best practices. Administrators need to be educated so that they are in sync with library stakeholders. Their understanding and knowledge can facilitate a shared vision and shared governance for web usability and make it a priority.
Comparable numbers of libraries indicated various levels of difficulty in implementing the three types of PSGs in 2007 and 2020. Both samples encountered the same top three challenges when implementing library-specific PSGs: Enforcement/agreement, Lack of skills/training, and Resources. This implies that participating libraries did not make any higher degree of a commitment or investment to web usability to reduce the level of difficulty.
Compared with 2007, fewer libraries in 2020 had difficulty in implementing library website PSGs. This might be because these PSGs mainly deal with procedural logistics. Their development is usually centralized among a limited number of IT staff members. Thus, decision making and implementation of library website PSGs on such matters as URL creation, platform selection, security settings, backup plans, and user rights can be most efficiently handled by a small number of IT professionals.
On the contrary, more libraries in 2020 had difficulty implementing library web usability PSGs addressing more complicated issues, such as information organization, which can impact users’ information seeking processes. Since the design of a library web portal involves various web authors who may have competing perspectives, the distributed model presents challenges in achieving consensus. Also, these web authors may lack understanding of the usability principles and users’ mental models embedded in usability PSGs. Additionally, the lack of a systematic scheme of holding web authors accountable makes enforcement of usability PSGs difficult. Another contributing factor is that vendor products offer limited control for libraries over their design processes, including usability.
Libraries need to be proactive and involve stakeholders when establishing PSGs to engender buy-in. With consensus from web authors, enforcement/agreement becomes less of an issue. In addition, it is indispensable to raise web authors’ awareness and understanding of PSGs through regular education and communication strategies.
Likewise, in 2020 the number of libraries lacking skills/training for implementing general library website PSGs was lower than in 2007, although that number was higher in 2020 for library web usability PSGs. General library website PSGs focus on web management matters and usually fall under the charge of staff with web technology expertise. Once PSGs are established, they are easier to implement. In contrast, not all web authors have credentials or knowledge in web usability, HCI, or user experience (UX), thus making it more challenging to implement library web usability PSGs. To resolve this issue, administrators should employ qualified personnel and provide appropriate training in web usability.
Data analysis showed that the need for supporting resources more than doubled in 2020. This might explain why the level of difficulty for implementing usability PSGs did not decrease. More resources would reduce complications associated with a lack of usability experts, training, committees, or infrastructure. Investing in usability resources will enable library staff to take on initiatives more readily or be more responsive to challenges.
Technical issues were the main obstacle in implementing university web usability PSGs in both years, and the level of difficulty increased in 2020. This might be due to university web usability PSGs’ failing to take into consideration the complexity of information architecture in library web portals. A marketing design approach supports the main function of university websites, which are for browsing and finding university specific information. In contrast, library web portals are research oriented, which requires usability PSGs to guide a seamless human-computer interaction and address extensive cognitive processes. Lacking programming and scripting skills may be another factor. Based on this discovery, the authors suggest that academic libraries collaborate with their campus IT departments to create a set of comprehensive and robust PSGs to account for the unique needs of the library web portals.
Usability Resources: Committees/Task Forces
Committees provide a mechanism to lead the usability effort in a coherent manner. In 2020, the number of libraries with committees (i.e., web advisory, usability, and usability subcommittees) decreased significantly. This might be due in part to semantics; for example, committee names varied across libraries and included Website Steering Group, Web Content Group, Library Assessment Steering Committee, UX Team, etc. Other libraries applied an ad hoc approach for point-of-need projects. An approach to addressing these issues is for libraries to establish and maintain usability-focused committees to carry out all aspects of web usability endeavors.
Usability Resources: Web Usability Personnel
The number of dedicated web usability staff and the average number of hours they devoted to web usability increased in 2020. Since training for web usability is manageable and readily available in various formats (e.g., webinar, conference, workshop, etc.), more dedicated staff in both samples received training in this area than in HCI. Compared with 2007, the proportions of dedicated staff receiving training in HCI and in web usability each increased by about 40 percentage points in 2020. This is encouraging, as it indicates that libraries invested in usability expertise. By contrast, there were no differences in the mean number of hours and training types for regular staff with web usability responsibility. This finding further implies that libraries acknowledged the necessity of employing dedicated staff to address web usability initiatives.
The increase in dedicated web usability personnel in 2020 showed progress in usability efforts, but this trend did not occur in resources as a whole. We encourage libraries to support all resources necessary to adequately conduct usability testing for designing and implementing a quality web presence.
Usability Testing: Perceived Importance of Usability Testing
The degree of importance that the participating libraries placed on usability testing significantly declined in 2020, with a 17.7 percentage point decrease in ratings of “Important” or higher. This signals that libraries did not perceive usability testing to be as important as they did in 2007. This negative change might also explain the reduction in the number of usability-focused committees, the downward trend in buy-in, and the substantially increased demand for resources.
Usability Testing: Platforms and Activities
The number of participating libraries conducting usability testing in 2020 (85.7 percent) was a little more than one percentage point higher than in 2007 (84.5 percent). While the majority of the libraries have tested their web portals, the authors had expected that all ARL academic libraries would have conducted usability testing. The results also indicated that usability testing on library websites and lower-level pages throughout the development cycle remained stagnant in 2020.
As libraries’ main pages are the gateways for accessing library’s resources and services, it is imperative that these sites provide seamless interactions between their users and needed materials. With the increased use of vendor products (e.g., LibGuides) to create lower-level pages, libraries should proactively collaborate with vendors to perform usability testing on those applications. It is disconcerting that over the last decade there has not been more testing on both the libraries’ main and lower-level pages.
However, the authors noted an increase in the number of libraries conducting OPAC testing. This is not surprising, since OPACs have gone through a dramatic transformation into discovery systems mimicking the Google search engine. Discovery systems connect users with a variety of electronic resources, which creates a high level of complexity. Thus, conducting usability testing is crucial to ensure their information retrieval function aligns with users’ mental models.
Usability Testing: Populations
As undergraduates, graduates, faculty, and staff are the major stakeholders on campus, it is natural that libraries in both samples used these populations most frequently for usability testing. The decrease in recruiting administrators to participate in testing raised some concerns, since testing this target audience can help decision makers gain insights into web usability. Administrators that lack this kind of exposure might underappreciate users’ experiences with their web portals. This also might be a contributing factor to the insufficient buy-in and allocation of resources.
Unlike administrators, who have decision-making power, researchers are also key library stakeholders at research institutions. The authors encourage libraries to increase the involvement of this population so that web portals can more effectively facilitate their research.
Another important target cohort is persons with disabilities. With the fast evolution of web technologies, it is critical to address both accessibility and usability issues on behalf of this unique population. Though some libraries have applied software to monitor accessibility of their web portals, there is more to usability than just access, so libraries should not lose sight of the usability aspect. To meet the special needs of these users, libraries can collaborate with campus disability centers to recruit participants for usability testing.
Compared with the 2007 sample, the 2020 participating libraries reduced the recruitment of IT professionals for testing. Perhaps libraries tried to avoid bias since this user group is usually better versed in web development. Also, since these IT professionals are website developers or designers, subjectivity issues may arise if they test on their own products.
Usability Testing: Methods
The average number of usability testing methods was consistent between 2007 and 2020. The top three approaches remained the same: In-person observation, Think-aloud, and Card sorting. As libraries strove to create user-centered web portals, it is understandable that they applied known user-focused techniques to gain insights into users’ information behaviors and mental models.
Surveys, interviews, and focus groups were the three most commonly used methods for soliciting feedback in both samples. However, participating libraries in 2020 opted for quantitative approaches more frequently, as evidenced in the increase in using surveys and the decrease in both the focus groups and interview methods. Quantitative techniques are advantageous in collecting large datasets without involving much staff time. Additionally, the authors observed other differences in 2020, including the nearly 50 percent increase in the use of pop-up windows and the over 20 percent drop in the website “Call for input” method. Although surveys are convenient and time efficient, they are inadequate for garnering the kind of user feedback in real time provided by qualitative methods, such as focus groups and interviews.
Usability Testing: Future Plans
In reviewing the themes mentioned in responding libraries’ future plans, we saw consistency across the years. Usability testing, website redesign, and resources continued to be the top priorities. The authors suggest academic libraries strive to foster a culture of usability, garner support from library administration, and devise a system for enforcement/agreement to build and maintain a sound infrastructure for web usability initiatives.
Conclusion
In this study, the authors compared web usability practices at ARL academic libraries in 2007 and 2020. We found a significant decrease in the mean level of perceived importance of usability testing in 2020, which was reflected in an overall stagnation in library-specific PSGs, usability testing, and resource availability. However, usability testing on OPACs and dedicated web usability personnel have increased.
Rapid web technology evolution continues to impact the development and design of library web portals. The web also serves as a common platform for current initiatives such as Open Access (OA), Open Educational Resources (OER), and Digital Scholarship. However, the emphasis on web usability practices in ARL libraries has decreased at a time when it should arguably be a higher priority. While library professional associations, such as ACRL and ARL, are advocating OA, OER, Digital Scholarship, and the information literacy framework, the authors suggest that these organizations also take a lead in promoting web usability. This includes making recommendations for establishing PSGs, providing educational resources for carrying out web usability initiatives, and fostering leadership in library web usability endeavors.
The advocacy of library professional associations can facilitate academic and research libraries’ efforts to cultivate a culture of web usability that is conducive to actualizing web usability efforts. This is especially important with the expansion of web technologies for accessing library resources and services remotely, as well as transitioning in-person teaching and learning to virtual environments. The COVID-19 pandemic further accelerated these movements. Achieving a seamless virtual environment with quality web usability for positive user experience requires concerted efforts from stakeholders with a shared vision and values.
Appendix. Survey Questions Used for the Comparative Study
- Does your library have a web site policy, guidelines, or standards that address usability issues?
- □ Yes
- □ No
- If your web policy, guidelines or standards are available electronically, please provide the URL below or send it via e-mail to ychen@albany.edu.
- Please rate the level of difficulty of implementing the policy, guidelines or standards.
- □ Not Difficult
- □ Slightly Difficult
- □ Moderately Difficult
- □ Very Difficult
- □ Extremely Difficult
- If you have had difficulties implementing your policy, guidelines, or standards, please describe them below:
- Regardless of your response to Question 1 about a general web site policy, does your library have specific policies, guidelines, or standards regarding web usability?
- □ Yes
- □ No
- If your web usability policy, guidelines, or standards are available electronically, please provide the URL below or send it via e-mail to ychen@albany.edu.
- Please rate the level of difficulty of implementing the web usability policy, guidelines, or standards.
- □ Not Difficult
- □ Slightly Difficult
- □ Moderately Difficult
- □ Very Difficult
- □ Extremely Difficult
- If you have had difficulties implementing your web usability policy, guidelines, or standards, please describe them below:
- Does your college or university provide an institutional web usability policy, guidelines, or standards?
- □ Yes
- □ No
- If your institution’s web usability policy, guidelines, or standards are available electronically, please provide the URL below or send it via e-mail to ychen@albany.edu.
- Does your library follow this institutional policy, guidelines, or standards? If your answer is no, please tell us why your library does not follow this institutional policy, guidelines, or standards.
- □ Yes
- □ No
- Please rate the level of difficulty of implementing the policy, guidelines, or standards.
- □ Not Difficult
- □ Slightly Difficult
- □ Moderately Difficult
- □ Very Difficult
- □ Extremely Difficult
- If you have had difficulties implementing your university’s web site policy, guidelines, or standards, please describe them below:
- What kinds of committees or task forces does your library have to oversee web usability? (Please check all that apply.)
- □ Usability Committee
- □ Web Advisory Committee
- □ Website Usability Subcommittee
- □ Other (please specify)
- How important is usability testing in your library?
- □ Not Important
- □ Somewhat Important
- □ Important
- □ Very Important
- □ Extremely Important
- Please use the space below if you have any more specific comments about the importance your library places on usability testing.
- In the past 10 years, has your library conducted any usability testing of its web sites?
- □ Yes
- □ No
- If your library has not conducted web usability testing, what are the reasons for that?
- Questions 18–20: Please indicate the number of times you have conducted usability testing in each category.
- Main library website:
- □ Pre-website development None 1 2 3 4 5 or more
- □ During website development None 1 2 3 4 5 or more
- □ Post-website development None 1 2 3 4 5 or more
- OPAC:
- □ Pre-website development None 1 2 3 4 5 or more
- □ During website development None 1 2 3 4 5 or more
- □ Post-website development None 1 2 3 4 5 or more
- Lower-level library Web pages:
- □ Pre-website development None 1 2 3 4 5 or more
- □ During website development None 1 2 3 4 5 or more
- □ Post-website development None 1 2 3 4 5 or more
- If you perform Web usability testing, which populations are included? Please check all that apply.
- □ Administrators
- □ Alumni
- □ Faculty
- □ Graduate students
- □ Undergraduate students
- □ Library student workers
- □ Public users
- □ Non-library users
- □ Information technology (IT) professionals
- □ Persons with disabilities
- □ Researchers
- □ Staff
- □ Other (please specify)
- Which usability testing methods have you used? Please check all that apply.
- □ Card Sorting
- □ Cognitive Walk-Through
- □ Eye Tracking
- □ Filmed Observation
- □ Heuristic Evaluation
- □ In-Person Observation
- □ Keystroke Path Collection
- □ Paper Prototyping/Storyboarding
- □ Task Analysis
- □ Thinking Aloud
- □ Other (Please Specify)
- Did you use any of the methods listed below to receive additional input on library web usability? Please check all that apply.
- □ Focus groups
- □ Interviews
- □ Listserv postings
- □ Pop-up windows via the library website
- □ Surveys
- □ Website “Call for Input”
- □ Other (please specify)
- Does your library have a regular staff member who is primarily dedicated to issues of web usability? (i.e., web usability is the main focus of his or her job.)
- □ Yes
- □ No
- If your library has a regular staff member who is primarily dedicated to issues of web usability:
- Is that staffer full-time or part-time?
- Roughly how many hours of that person’s typical workweek are dedicated to web usability?
- In which unit or department of the library is this person employed?
- What is this person’s title?
- What types of training has this staff member had regarding web usability? (Please check all that apply.)
- □ Training in human-computer interaction
- □ Training in web usability
- □ Degree or certificate in Information Science
- □ No specific training
- □ Other (please specify)
- Regardless of your response to the previous question, does your library have any (additional) regular staff member whose regular duties include issues of web usability?
- □ Yes
- □ No
- If your library has an additional regular staff member whose regular duties include issues of web usability:
- Is that staffer full-time or part-time?
- Roughly how many hours of that person’s typical workweek are dedicated to web usability?
- In which unit or department of the library is this person employed?
- What is this person’s title?
- What types of training has this staff member had regarding web usability? (Please check all that apply.)
- □ Training in human-computer interaction
- □ Training in web usability
- □ Degree or certificate in Information Science
- □ No specific training
- □ Other (please specify)
- Regardless of your responses to the previous questions, do you receive assistance from another unit of your university (e.g., Information Technology), or do you hire an outside consultant for web usability projects?
- □ Yes, another unit of the university
- □ Yes, an outside consultant
- □ No
- If your library receives assistance from another unit of your university or hires an outside consultant:
- What is the title of your library staff member who coordinates or oversees the activities of these entities?
- In which unit or department is the coordinator employed?
- Please use the space below if you would like to elaborate on your library’s staff alignment with regard to issues of web usability.
- Please provide details on future web usability plans your library may have.
- Please feel free to provide any additional comments you may have about library website usability.
Notes
1. Jakob Nielsen, “Heuristic Evaluation,” in Usability Inspection Methods (New York, NY: John Wiley & Sons, 1994), 25–62; Donald A. Norman, The Design of Everyday Things (New York, NY: Doubleday, 1988); Ben Shneiderman, Designing the User Interface: Strategies for Effective Human-Computer Interaction, 3rd ed. (Boston, MA: Addison-Wesley, 1998).
2. International Organization for Standardization (ISO), Ergonomic Requirements for Office Work with Visual Display Terminals, DIS 9241-11. Part 11: Guidance on Usability, 1998; International Organization for Standardization. ISO 9241-11:2018(en) Ergonomics of Human-System Interaction—Part 11: Usability: Definitions and Concepts, 2018, 6, https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en; United States Department of Health and Human Services, Research-Based Web Design & Usability Guidelines, (Washington DC: U.S. Government Printing Office, 2006).
3. William H. DeLone and Ephraim R. McLean, “The DeLone and McLean Model of Information Systems Success: A Ten-Year Update,” Journal of Management Information Systems 19, no. 4 (2003): 9–30, https://doi.org/10.1080/07421222.2003.11045748.
4. Weiyin Hong et al., “Determinants of User Acceptance of Digital Libraries: An Empirical Examination of Individual Differences and System Characteristics,” Journal of Management Information Systems 18, no. 3 (2001/2002): 97–124, https://doi.org/10.1080/07421222.2002.11045692.
5. Hong et al., “Determinants of User Acceptance of Digital Libraries,” 97–124; Beth Thomsett-Scott and Frances May, “How May We Help You? Online Education Faculty Tell Us What They Need from Libraries and Librarians,” Journal of Library Administration 49, no. 1–2 (2009): 111–35, https://doi.org/10.1080/01930820802312888.
6. Yu-Hui Chen, Carol Anne Germain, and Huahai Yang, “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” Journal of the American Society for Information Science and Technology 60, no. 5 (2009): 953–68, https://doi.org/10.1002/asi.21032.
7. See, for example, Blake Lee Galbreath, Corey Johnson, and Erin Hvizdak, “Primo New User Interface: Usability Testing and Local Customizations Implemented in Response,” Information Technology & Libraries 37, no. 2(2018): 10–35, https://doi.org/10.6017/ital.v37i2.10191; Sarah Guay, Lola Rudin, and Sue Reynolds, “Testing, Testing: A Usability Case Study at University of Toronto Scarborough Library,” Library Management 40, no. 1/2(2019): 88–97, https://doi.org/10.1108/LM-10-2017-0107.
8. David A. Bradbard, Caro O. Peters, and Yoana Caneva, “Web Accessibility Policies at Land-Grant Universities,” The Internet and Higher Education 13, no.4 (2010): 258–66. https://doi.org/10.1016/j.iheduc.2010.05.007; Tim Spindler, “The Accessibility of Web Pages for Mid-Sized College and University Libraries.” Reference & User Services Quarterly 42, no. 2 (2002): 149–54.
9. Jennifer Church and Kyle Felker, “Web Team Development,” portal: Libraries and the Academy 5, no.4 (2005): 545–54, https://doi.org/10.1353/pla.2005.0048; Jane Nichols, Alison M. Bobal, and Susan McEvoy, “Using a Permanent Usability Team to Advance User-Centered Design in Libraries,” Electronic Journal of Academic and Special Librarianship 10 (2009, Summer): 1–8, https://digitalcommons.unl.edu/ejasljournal/117/
10. Mary P. Popp, “Testing Library Web Sites: ARL Libraries Weigh In,” in Proceedings of the ACRL Tenth National Conference, Denver, CO, March 15–18, 2001 (Chicago, IL: American Library Association, 2001), 277–81.
11. Chen et al., “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” 953–68; Popp, “Testing Library Web Sites,” 277–81.
12. Chen et al., “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” 953–68.
13. Ibid.
14. Norman, The Design of Everyday Things.
15. Jakob Nielsen, Usability Engineering (Boston, MA: Academic Press, 1993).
16. International Organization for Standardization, Ergonomic Requirements for Office Work, 2.
17. Jonathan W. Palmer, “Web Site Usability, Design, and Performance Metrics,” Information Systems Research 13, no. 2 (2002): 151–67, https://doi.org/10.1287/isre.13.2.151.88
18. Peter Brophy and Jenny Craven, “Web Accessibility,” Library Trends 55, no. 4 (2007): 950–72, https://doi.org/10.1353/lib.2007.0029, 960.
19. Yu-Hui Chen, Carol Anne Germain, and Abebe Rorissa, “Defining Usability: How Library Practice Differs from Published Research,” portal: Libraries and the Academy 11, no. 2 (2011): 599–628, https://doi.org/10.1353/pla.2011.0020
20. International Organization for Standardization, ISO 9241-11:2018.
21. Nielsen, Usability Engineering; Louis Rosenfeld and Peter Morville, Information Architecture for the World Wide Web (Sebastopol, CA: O’Reilly, 1998); Shneiderman, Designing the User Interface.
22. B. A. Ramadhan, Retno A. S. Lestari, and Erlinda Muslim, “Classification of Design Attributes for FMCG (Fast Moving Consumer Goods) Products Official Store in E-Commerce Website to Increase Usability and User Satisfaction,” IOP Conference Series: Materials Science and Engineering 505 (July 2019): 012082, https://doi.org/10.1088/1757-899X/505/1/012082; Viswanath Venkatesh, Hartmut Hoehle, and Ruba Aljafari, “A Usability Evaluation of the Obamacare Website,” Government Information Quarterly 31, no. 4 (2014): 669–80, https://doi.org/10.1016/j.giq.2014.07.003; Yassierli Yassierli, Vinsensius Vinsensius, and M. S. Syed Mohamed, “The Importance of Usability Aspect in M-Commerce Application for Satisfaction and Continuance Intention,” Makara Journal of Technology 22, no. 3 (2018): 149–58, https://doi.org/10.7454/mst.v22i3.3655.
23. Ahmed Alanazi et al., “The Role of Task Value and Technology Satisfaction in Student Performance in Graduate-Level Online Courses,” TechTrends 64, no. 6 (2020): 922–30, https://doi.org/10.1007/s11528-020-00501-8; Tjie Lianawati Christian, Dennis Jaya, and Rulyna, “Impact of English Online Learning Website Quality to User Satisfaction in Jakarta,” in 2017 International Conference on Information Management and Technology (ICIMTech), Yogyakarta, Indonesia, November 15–17, 2017 (Piscataway, NJ: IEEE), 278–83, https://doi.org/10.1109/ICIMTech.2017.8273551.
24. Guay et al., “Testing, Testing,” 88–97; Troy A. Swanson et al., “Guiding Choices: Implementing a Library Website Usability Study,” Reference Services Review 45, no. 3 (2017): 359–67, https://doi.org/10.1108/RSR-11-2016-0080.
25. Christopher Chan, Jennifer Gu, and Chloe Lei, “Redesigning Subject Guides with Usability Testing: A Case Study,” Journal of Web Librarianship 13, no. 3 (2019): 260–79, https://doi.org/10.1080/19322909.2019.1638337; Suzanna Conrad and Christy Stevens, “‘Am I on the Library Website?’: A LibGuides Usability Study,” Information Technology & Libraries 38, no. 3 (2019): 49–81, https://doi.org/10.6017/ital.v38i3.10977.
26. Asma Khatun and S. M. Zabed Ahmed, “Usability Testing for an Open-Source Integrated Library System: A Task-Based Study of the Koha OPAC Interface,” The Electronic Library 36, no. 3(2018): 487–503. https://doi.org/10.1108/EL-03-2017-0049
27. Blake Lee Galbreath, Corey Johnson, and Erin Hvizdak, “Primo New User Interface: Usability Testing and Local Customizations Implemented in Response,” Information Technology & Libraries 37, no. 2 (2018): 10–35, https://doi.org/10.6017/ital.v37i2.10191; Marlen Prommann and Tao Zhang, “Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation,” Information Technology & Libraries 34, no. 1 (2015): 77–105, https://doi.org/10.6017/ital.v34i1.5600.
28. Ping Ke and Fu Su, “Mediating Effects of User Experience Usability: An Empirical Study on Mobile Library Application in China,” The Electronic Library 36, no. 5 (2018): 892–909, https://doi.org/10.1108/EL-04-2017-0086; Gabriella Sekar Shada and Media Anugerah Ayu, “Designing Android User Interface for University Mobile Library,” in 2018 International Conference on Computing, Engineering, and Design (ICCED), Bangkok, Thailand, September 6–8, 2018 (Piscataway, NJ: IEEE), 224–29, https://doi.org/10.1109/ICCED.2018.00051.
29. Jobke Wentzel, et al., “Card Sorting to Evaluate the Robustness of the Information Architecture of a Protocol Website,” International Journal of Medical Informatics 86, (2016):71–81. https://doi.org/10.1016/j.ijmedinf.2015.12.003
30. Denton et al., “Usability Testing as a Method to Refine a Health Sciences Library Website,” 1–15; Saeeda Sherman Rahman and Jing Nong Weng, “Component Based Method for Usability Testing of a Website,” Advanced Materials Research 765–767 (2013): 1507–11, https://doi.org/10.4028/www.scientific.net/amr.765-767.1507; Swanson et al., “Guiding Choices,” 359–67.
31. Shaun Ellis and Maureen Callahan, “Prototyping as a Process for Improved User Experience with Library and Archives Websites,” Code4Lib Journal, no. 18 (October 3, 2012), https://journal.code4lib.org/articles/7394; Brian Still and John Morris, “The Blank-Page Technique: Reinvigorating Paper Prototyping in Usability Testing,” IEEE Transactions on Professional Communication 53, no. 2 (2010): 144–57, https://doi.org/10.1109/TPC.2010.2046100.
32. Eman Ahmed Al-taisan, Ghadah Salman Alduhailan, and Majed Aadi Alshamari, “Using a Discount Usability Engineering Approach to Assess Public Web-Based Systems in Saudi Arabia,” Information Technology Journal 15, no. 1 (2016): 26–30, https://doi.org/10.3923/itj.2016.26.30; Cathleen Wharton et al., “Applying Cognitive Walkthroughs to More Complex User Interfaces: Experiences, Issues, and Recommendations,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Monterey, CA, May 3–7, 1992 (New York, NY: Association for Computing Machinery), 381–88, https://doi.org/10.1145/142750.142864; Junior Tidal, “One Site to Rule Them All, Redux: The Second Round of Usability Testing of a Responsively Designed Web Site,” Journal of Web Librarianship 11, no. 1 (2017): 16–34, https://doi.org/10.1080/19322909.2016.1243458.
33. Nielsen, Usability Engineering; Shneiderman, Designing the User Interface.
34. Nora Almeida and Junior Tidal, “Mixed Methods not Mixed Messages: Improving Libguides with Student Usability Data,” Evidence Based Library and Information Practice 12, no.4 (2017): 62–77, https://doi.org/10.18438/B8CD4T; Prommann and Zhang, “Applying Hierarchical Task Analysis Method,” 77–105.
35. Melisa M. Gustafson, “They Searched What? Usage Data as a Measure of Library Services and Outreach,” Serials Librarian 74, no.1–4 (2018): 240–43; Junior Tidal, “Using Web Analytics for Mobile Interface Development,” Journal of Web Librarianship 7, no.4 (2013): 451–64, https://doi.org/10.1080/19322909.2013.835218
36. Jiahui Wang et al., “Exploring Relationships between Eye Tracking and Traditional Usability Testing Data,” International Journal of Human-Computer Interaction 35, no.6 (2019): 483–94, https://doi.org/10.1080/10447318.2018.1464776; Wegner et al., “Value of Eye-Tracking Data for Classification of Information Processing–Intensive Handling Tasks: Quasi-Experimental Study on Cognition and User Interface Design,” Journal of Medical Internet Research Human Factors 7, no.2 (2020): e15581, https://doi.org/10.2196/15581
37. Suzanna Conrad and Nathasha Alvarez, “Conversations with Web Site Users: Using Focus Groups to Open Discussion and Improve User Experience,” Journal of Web Librarianship 10, no.2 (2016): 53–82, https://doi.org/10.1080/19322909.2016.1161572; Adrian St. Patrick Duncan and Fay Durrant, “An Assessment of the Usability of the West Indies (Mona, Jamaica) Main Library’s Website,” The Electronic Library 33, no.3 (2015): 590–99, https://doi.org/10.1108/EL-11-2013-0207; James Miller, “The Design Cycle and a Mixed Methods Approach for Improving Usability: A Case Study,” Journal of Web Librarianship 13, no.13 (2019): 203–29, https://doi.org/10.1080/19322909.2019.1600451
38. Gregg Bailey, “Iterative Methodology and Designer Training in Human-Computer Interface Design,” in INTERCHI ’93: Proceedings of the INTERCHI ’93 Conference on Human Factors in Computing Systems, Amsterdam, Netherlands, April 24–29, 1993 (Amsterdam, Netherlands: IOS Press), 198–205, https://doi.org/10.1145/169059.169163; Nielsen, Usability Engineering.
39. Jennifer C. Romano Bergstrom et al., “Conducting Iterative Usability Testing on a Web Site: Challenges and Benefits,” Journal of Usability Studies 7, no. 1 (2011): 9–30.
40. Katy Kavanagh Webb et al., “Our Experience with User Experience: Exploring Staffing Configurations to Conduct UX in an Academic Library,” Journal of Library Administration 56, no. 7 (2016): 757–76, https://doi.org/10.1080/01930826.2015.1109892; Kimberly Mullins, “Research Plus™ Mobile App: Information Literacy ‘On the Go,’” Reference Services Review 45, no. 1 (2017): 38–53, https://doi.org/10.1108/RSR-03-2016-0020.
41. Nora Dethloff and Elizabeth M. German, “Successes and Struggles with Building Web Teams: A Usability Committee Case Study,” New Library World 114, no.5/6 (2013): 242–50, https://doi.org/10.1108/03074801311326867; Jane Nichols, Alison M. Bobal, and Susan McEvoy, “Using a Permanent Usability Team to Advance User-Centered Design in Libraries,” Electronic Journal of Academic and Special Librarianship 10, (2009 Summer): 1–8, https://digitalcommons.unl.edu/ejasljournal/117/
42. Nichols, Bobal, and McEvoy, “Using a Permanent Usability Team,” 1–8.
43. Heather Jeffcoat King and Catherine M. Jannik, “Redesigning for Usability: Information Architecture and Usability Testing for Georgia Tech Library’s Website,” OCLC Systems & Services 21, no.3 (2005): 235–43. http://doi.org/10.1108/10650750510612425.
44. H. Frank Cervone, “Usability Training: An Overlooked Component in an On-Going Program of Web Assessment and Development,” OCLC Systems & Services 21, no. 3 (2005): 244–51, https://doi.org/10.1108/10650750510612434.
45. Krista Godfrey, “Creating a Culture of Usability,” Weave: Journal of Library User Experience 1, no. 3 (2015), https://doi.org/10.3998/weave.12535642.0001.301, 6.
46. Virginia A. Lingle and Eric P. Delozier, “Policy Aspects of Web Page Development,” Internet Reference Services Quarterly 3, no. 2 (1998): 33–48. https://doi.org/10.1300/J136v03n02_07
47. International Organization for Standardization, Ergonomic Requirements for Office Work; International Organization for Standardization. ISO 9241-11:2018(en).
48. United States Department of Health and Human Services, Research-Based Web Design.
49. Jakob Nielsen, “Heuristic Evaluation,” 25–62.
50. World Wide Web Consortium, Web Accessibility Initiative, “Web Content Accessibility Guidelines 1.0” (1999). https://www.w3.org/TR/WAI-WEBCONTENT/
51. Duncan and Durrant, “An Assessment of the Usability,” 590–99; Yavuz Inal, “University Students’ Heuristic Usability Inspection of the National Library of Turkey Website,” Aslib Journal of Information Management 70, no.1 (2018): 66–77, https://doi.org/10.1108/AJIM-09-2017-0216; Katja Kous et al., “Usability Evaluation of a Library Website with Different End User Groups,” Journal of Librarianship and Information Science 52, no.1 (2020): 75–90, https://doi.org/10.1177/0961000618773133; Laura Manzari and Jeremiah Trinidad-Christensen, “User-centered Design of a Web Site for Library and Information Science Students: Heuristic Evaluation and Usability Testing,” Information Technology and Libraries 25, no.3 (2006): 163–69, https://doi.org/10.6017/ital.v25i3.3348.
52. Chen et al., “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” 953–68.
53. Ibid.
54. Ibid.
55. Yu-Hui Chen, Carol Anne Germain, and Abebe Rorissa, “The Current State of Library Web Usability Practice at ARL Academic Libraries,” portal: Libraries and the Academy 23 (2023, forthcoming).
56. Chen et al., “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” 953–68.
57. Ibid.
58. Chen et al., “The Current State of Library Web Usability Practice at ARL Academic Libraries.”
59. Chen et al., “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” 953–68.

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.