03_Wiggins_etal

Digital Scholarship Programs in Practice

Digital scholarship programs, university units of relatively recent origin, provide support and community for scholars integrating digital technologies into their research, teaching, and engagement work. But they have not been well defined in higher education scholarship and sometimes not even well understood on their campuses. To clarify the nature of digital scholarship programs, we surveyed what they do in practice. Using a combination of systematic searching of university websites and a survey instrument with 12 qualitative and 5 quantitative questions, we investigated the infrastructure, activities, and perceived successes and challenges of digital scholarship programs at Carnegie Classification R1: Doctoral Universities. Our study reveals that these programs exist at more than three-fourths of R1s, and their staff serve a critical function at the intersection of technology and scholarship. While our survey finds many commonalities between digital scholarship programs, such as supporting the application of research programming languages or offering professional development training, it also illustrates that these units have more heterogeneity and a broader scope than more established scholarly support units at research institutions. The degree to which we find digital scholarship programs already representing interunit partnerships and striving for even more collaboration illustrates increased cooperation and a will for further coordination in the face of a culture of internal competition under academic capitalism. Digital scholarship programs’ partnership structures offer higher education a model for building bridges between organizational silos in a fashion that respects the autonomy and distinctiveness of individual units, reduces internal competition, and offers user-centered scholarly support.

Introduction

Librarians Joan Lippincott, Harriette Hemmasi, and Viv Lewis define digital scholarship programs as “service organizations, staffed by individuals with specialized skills, who support work in the digital environment.”1 If that definition sounds broad, that is because digital scholarship programs combine three terms that each encompass a wide range of activities in their own right. “Digital” refers to any technology that uses binary numbers to function, encompassing activities that range from the basic arithmetical functions of an electronic calculator to holographic simulations of thermodynamic processes. Similarly broad in its designation, “scholarship” often acts as a catch-all for the professional activities of students, researchers, and faculty within academe, including but not limited to research, teaching, and engagement efforts. And “program” in the context of the academy signals all sorts of internal organizational administrative structures, from degree programs to faculty development units to community relations organizations. We, the authors of this study, have wrestled with the expansive, nebulous meaning of such units firsthand. Each of us plays a role in the University of Minnesota’s digital scholarship program, which is both historically rooted in our university libraries and presently a partnership among nine units across the institution. The complexity of our own program has prompted us to explore what a digital scholarship program is for ourselves and for our peers. To this end, we decided to ask not what a digital scholarship program is theoretically, but rather what it does in practice. Using a combination of systematic-review methods and a survey instrument with 12 qualitative and 5 quantitative questions, we investigated the infrastructure, activities, and perceived successes and challenges of digital scholarship programs at Carnegie Classification R1: Doctoral Universities.

Our study reveals the structure, practices, obstacles, and aims of digital scholarship programs at large research universities. It establishes a baseline of digital scholarship program design from which stakeholders may consider the design and redesign of these relatively novel units. Our research finds that digital scholarship programs tend to be structured as support-oriented organizations most commonly rooted in libraries but often partnered with or funded by other units within a university. These programs exist at more than three-fourths of R1 institutions, where their staff serve a critical function at the intersection of technology and the three core facets of scholarship: research, teaching, and engagement. Our survey finds many commonalities in the work of digital scholarship programs; for instance, our evidence suggests that most programs help scholars use research programming languages and offer professional development training on how to use emerging software. But our findings also illustrate that these units have more heterogeneity and a broader scope than other more established scholarly support units—such as centers for teaching and offices of information technology—at research institutions. Additionally, our research reveals how the premier research institutions of the United States have addressed the ascendant role of digital technologies in scholarship and should help top-level university administrators, university library leaders, and those involved in digital scholarship themselves better understand the purpose and potential of these now prevalent organizations. The degree to which we find digital scholarship programs already representing interunit partnerships and striving for even more collaboration illustrates that digital scholarship program leaders and their partners are working to build cooperation and coordinate their scholar support efforts even in the face of higher education’s culture of internal competition under academic capitalism.
Our finding that digital scholarship programs tend to stem from and maintain roots in academic libraries, but rarely stem from or are supported by libraries alone, has implications for the design of other programs and for higher education administration generally. Digital scholarship programs, we find, offer the potential to become a model for building bridges among organizational silos in a fashion that respects the autonomy and distinctiveness of individual units, reduces internal competition, and offers a user-centered design for scholarly support.

Literature Review

Although a handful of surveys about digital scholarship programs have been conducted, there has not been a systematic survey to date of Carnegie R1 institutions focused explicitly on broad programmatic support of digital scholarship as defined above. Two Systems and Procedures Exchange Center (SPEC) kits from the Association of Research Libraries (ARL) surveyed libraries regarding digital humanities and digital scholarship activities, infrastructure, funding, and staffing in 2011 and 2016.2 The Digital Scholarship Centers Interest Group from a different library association, the Association of College & Research Libraries (ACRL), also conducted a survey in 2015/2016 to gather information on digital scholarship centers and the services they provide.3 This ACRL survey follows a related one done in 2008 by yet a third library association, the Council on Library and Information Resources (CLIR), about digital humanities centers in the United States.4 However, these studies included not only digital scholarship programs, but also programs that were exclusively focused on the more narrowly defined “digital humanities” support. Additionally, these studies centered on library-based programs. As ARL is primarily composed of libraries at research-intensive universities in North America, there is considerable overlap with institutions with Carnegie R1 Classification. Unlike these surveys, which were sent to ARL member contacts and focused primarily on library-based support and programming for digital humanities and digital scholarship, our survey is the first to investigate digital scholarship programs regardless of where they reside, inclusive of those that are administratively outside a university library.

Despite their library-centered design and conflation of digital scholarship and digital humanities programs, these studies still hold relevance for our own. Bryson et al. indicate a “work in progress” or “in development” status for digital humanities support at many institutions; 48 percent noted they provide ad hoc services for digital scholarship projects in the humanities.5 Results from Mulligan’s study “Supporting Digital Scholarship” found that a majority of ARL member libraries have dedicated digital scholarship or digital humanities units in their libraries. Fifty-nine percent of the 70 responding libraries noted that a department or unit was created or reorganized specifically to support digital scholarship activities, while another 11 percent were in the process of creating such a unit.6

All of the studies examined noted that digital scholarship activities are supported by different types of staff, including librarians, library support staff, IT staff, faculty, and students. Bryson et al. found that the total number of permanent staff supporting digital humanities activities overall ranges from 0.5 to 16, with 4.31 as the mean.7 Mulligan asked respondents to note the number of staff with responsibilities for specific digital scholarship activities; in the case of computational text support, the number of staff ranged from 1 to 5, with 2.42 as the mean.8 Bryson et al. provided data on the types of staff dedicated to supporting digital humanities and those called in on an ad hoc basis. Among the 51 respondents to this question, “digital humanities/scholarship librarian” was the most common category for dedicated staff with 13, followed by “IT staff” with 7.9

Both ARL studies indicate a strong need to collaborate with other university departments to fully support digital scholarship activities. Seventy-five percent of the respondents in the 2011 study conducted by Bryson et al. note collaborations with other units, most notably university information technology units, humanities computing units, and academic departments and centers.10 Mulligan specifically asked in their 2016 survey whether researchers can find support for 20 digital scholarship services inside the library, elsewhere at the university, or outside the university. For instance, 59 out of 72 respondents indicated 3-D modeling and printing support was available outside the library, compared to only 42 who reported such support in the library. Support for statistical analysis is also primarily provided outside the library. In contrast, support for digital preservation, data curation, and metadata management is primarily available in libraries.11 Like Bryson et al., Mulligan found strong collaborations with information technology units and humanities computing units. Partnerships with external digital humanities centers or multi-institutional entities were also noted by some respondents.12 In addition to providing information and statistics on successful partnerships, Zorich noted that 78 percent of the 32 responding digital humanities centers described partnerships that were to some degree unsuccessful. Reasons noted include lack of institutional support, staff turnover, loss of funding, mismatched expectations, and communication issues.13

A number of trends are evident in the literature regarding specific digital scholarship support services offered. Ippoliti and Mulligan report that more than 75 percent of responding libraries offer services related to data curation and preservation, text mining and analysis, metadata, digital publishing, and Geographic Information Systems (GIS).14

By and large, a multipronged marketing approach is used to promote digital scholarship services with methods such as email, the library website, digital signage, posters, listservs, social media, speaking at orientations and other events, and word of mouth. Ippoliti found that email and word of mouth were the most frequently employed followed by the library website and social media.15 Bryson et al. noted a reliance on subject librarians as the main way to advertise services.16 Comments in Mulligan’s survey results also noted subject librarians as a prominent means of marketing these services.17

Although digital scholarship services and activities are funded in myriad ways, the library’s operating budget is by far the most prevalent source. Ninety percent of respondents selected the library’s general budget in Bryson et al.18 and, similarly, 100 percent did in Mulligan.19 Sixteen of the 71 respondents (23%) also noted a designated digital scholarship budget in Mulligan.20 More than 70 percent of respondents noted grant funding to the library as a funding source.21 Zorich noted that many digital humanities centers are considering establishing endowments as a means for more stable funding. Twenty-two percent of centers have endowments in place.22 Mulligan corroborated this trend, with 25 percent of respondents selecting endowments as a funding source.23 A decrease in financial support for digital scholarship services and activities among academic departments is also evident in the literature. Whereas Bryson et al. found that 50 percent of the digital scholarship and digital humanities programs received funding from academic departments,24 this amount decreased to 18 percent, according to Mulligan. However, support from the university is still significant, as 27 percent of digital scholarship programs receive funding from central university funds.25

Mulligan noted that most institutions provide support for faculty, students, staff and other groups affiliated directly with an institution, while 23 percent extend support to external researchers and 15 percent to the general public.26 Likewise, Bryson et al. found only 29 percent of respondents provide support to outside researchers. Looking specifically at affiliated users, this study found that 98 percent of respondents provide digital humanities support to faculty, whereas only 85 percent provide support to graduate students, 77 percent to postdoctoral or affiliated researchers, and 65 percent to undergraduates.27 This primacy of faculty and graduate students over undergraduates is corroborated in a number of free-text comments in Mulligan’s 2016 study.28

While far-reaching studies of digital scholarship programs are scarce, many institutions have published case studies about how they have implemented programs. Since digital scholarship support varies by institutional size and type, these case studies offer road maps for organizations still working to implement digital scholarship programs of their own. Librarians at Kansas State, for example, illustrate how they have built a digital scholarship program around digital publishing and a robust institutional repository, stressing the impact well beyond the confines of their campus.29 At University of Colorado-Boulder, the team of librarians and technologists that comprise that school’s Center for Research Data and Digital Scholarship focus on the promotion of data literacy.30 University of North Carolina-Charlotte librarians describe the shift from à la carte digital services to a more formal program and staffing model in their Digital Scholarship Lab.31 Miller describes the elements that led to Middle Tennessee State University’s Digital Scholarship Lab—which combines campuswide collaborations with integration of existing library services to provide a range of digital scholarship support and outreach—as an “if they ask, try” model.32 Mitchem and Rice described how Appalachian State University built a digital scholarship program, despite a culture of skepticism around the concept on their campus.33 Longmeier and Murphy discussed how a logic model and corresponding data-gathering plan were used to assess digital scholarship services at the Ohio State University Libraries to measure sustained engagement and long-term impacts.34 While each of these case studies offered expert advice for practitioners on other campuses, they are far from universal experiences and point toward the need for more expansive and systematic study. 
Though our study overlaps with some of the above literature, our methods surface programs that support digital technology across the university, highlighting the depth of libraries’ involvement in these programs without ignoring the significant contributions of other units to these endeavors.

Data, Methods, Research Design

Identifying Digital Scholarship Programs at R1 Institutions

Many institutions around the world and many types of institutions within the United States are home to digital scholarship programs. However, for the purposes of our study, we chose to investigate only those programs at universities categorized by the Carnegie Classification of Institutions of Higher Education as “R1: Doctoral Universities—Very high research activity.” The R1 designator represents institutions of higher education physically located in the United States that awarded at least 20 doctorates in the 2016–2017 academic year, reported at least $5 million in research expenditures, and ranked “very high” in the Carnegie Classification’s index measuring aggregate research activity, its index measuring per-capita research activity, or both.35 Though the R1 designator includes some overlap with other groupings that have been used to investigate digital scholarship programs in the past, we selected the R1 classification so that our pool of participants would be restricted to higher education institutions in a single country. Other groupings, such as the member institutions of the Association of Research Libraries, include institutions from multiple countries (such as Canadian universities) and non–degree-granting institutions (such as the Boston Public Library), which introduce the confounding factors that come with an international sample and additional diversity in institutional mission. We recognize that there is a notable presence of digital scholarship programs both at smaller institutions and beyond the United States. However, digital scholarship programs at liberal arts colleges, community colleges, and other institutions of higher education often present distinctive resource constraints and tend to support few if any graduate students, making budgetary and user comparison difficult.
And while it is possible to identify large, degree-granting research institutions throughout the world, we feel that each country or at least each region of the world’s digital scholarship program landscape deserves an article-length analysis. It is our hope that other research teams use and modify our baseline survey to study their own national contexts so that eventual cross-national comparison may become possible.

Our research of digital scholarship programs at R1 institutions took a two-phased approach. In the first phase of our work, we conducted what evidence synthesis scholars Maria Grant and Andrew Booth classify as a comprehensive “mixed studies review/mixed methods review” that combined a “mapping review/systematic mapping” with supplemental “stakeholder consultation.”36 We began by searching the public websites of each R1 institution to identify whether a given institution maintained a digital scholarship program. As mentioned at the outset, the component terms of “digital scholarship program” refer to an expansive set of possibilities that are only marginally narrowed when compounded. But because we hoped to let these programs speak for themselves to reveal their practices, we paired “fuzzy”-yet-systematic searching with direct correspondence to determine whether a university had a “program” that supported the use of “digital” technologies in “scholarship.” Navigating first to each institution’s main page, we queried its domain search to return results for “digital scholarship.” In many cases, the top results were easily identifiable as a digital scholarship program with names such as Digital Scholarship Services, Digital Scholarship Lab, or Center for Digital Scholarship. This search also regularly returned results of clearly identifiable digital scholarship programs without “digital scholarship” in their title, such as Scholars’ Lab or Innovate Make Create. For institutions where a digital scholarship program was not found during the initial search, we searched the two terms without quotes on the university website and combed the top 20 results for any university programs that might reasonably be identified as supporting digital scholarship by some other name. One “positive result” in this search identified a digital scholarship program that was no longer in service.
For all remaining institutions that did not display any clearly identifiable digital scholarship program, we sent an email to an Associate Dean of Libraries or Associate University Librarian asking if their institution has a digital scholarship program we had overlooked. Out of the 54 cases that required such contact—representing 41 percent of the 131 R1 institutions—an additional 24 digital scholarship programs were identified. Email from the library administrators also confirmed that 30 institutions had no digital scholarship program whatsoever. From these results, we identified a point of contact for each digital scholarship program. We manually scanned each program site to identify its leader and their email (87), or, in the cases in which leadership was not identifiable (13), we used the program’s generic email address. To these 100 email contacts, we sent our survey. One of these contacts let us know that their digital scholarship program was no longer in service, making our maximum possible N = 99, representing 76 percent of R1 institutions.

The Survey Instrument and Process

We designed our 17-question survey to accomplish three goals: 1) gain a description of the infrastructure of each program; 2) develop a qualitative outline of each program’s activities; and 3) invite respondents to reflect on their program’s perceived successes and challenges. In the service of our first goal, we asked participants for the program’s name, their title, the job classes dedicated to the program’s administration, and the program’s funding sources. To meet our second goal, we asked respondents about the percentage of activities, topics, and technologies their program most frequently supports, how their program conducts outreach to potential users, the academic demographics of their users, how their program overlaps with others on campus, their means of assessing their work, and what they are prioritizing for the near future. Finally, to our third goal, we asked respondents to identify and describe their program’s successes and challenges in recent years. Though we reviewed the instruments of related studies before drafting our own survey, we found no clear model that met these specific aims. Since our questions were not drawn from previous studies, we sent a test version of the instrument to an external volunteer in a digital scholarship program leadership role at an institution outside the R1 classification and took the survey ourselves. We calibrated the questions to our aims based on these tests.

The survey was submitted to IRB and determined to be “Not Human Research.” Nevertheless, we asked participants for consent and offered the option of withdrawing that consent or concluding the survey at any time. We distributed our survey on May 4, 2020, via email from the first author’s email address. Potential respondents were given one month to complete the survey using a Qualtrics link. After seven days, we sent a reminder email to those digital scholarship program leaders who had yet to complete the survey, and we sent one more reminder two weeks after that. A third and final reminder with a deadline extension was sent one week later. The full survey instrument is available in the appendix.

Data Analysis Methods

Survey responses were exported from Qualtrics and imported into Python for data analysis using the pandas, numpy, and matplotlib packages.37 Of the five quantitative questions in the survey, one asked respondents to share the number of employees in their digital scholarship program (Q4). These figures were summed and plotted according to employee type. The remaining four quantitative questions asked respondents to estimate percentages of contributions associated with specific types (of employee, funding source, activity, and user). These data were analyzed by calculating the mean percentage for each type or category in question.
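As a sketch of the quantitative workflow described above, the following pandas snippet sums employee counts by type and takes category means for the percentage questions. The data and column names are illustrative stand-ins, not the actual Qualtrics field names or survey values.

```python
import pandas as pd

# Illustrative slice of the survey export; values and column names are
# made up, not the study's actual data.
q4_counts = pd.DataFrame({        # Q4: number of employees by type
    "librarians": [3, 1, 0],
    "technologists": [1, 0, 2],
})
q5_percent = pd.DataFrame({       # percentage-style questions (e.g., Q5)
    "librarians": [100, 40, 60],
    "technologists": [0, 30, 20],
})

# Q4 figures were summed according to employee type
totals = q4_counts.sum()

# Percentage questions were analyzed by calculating the mean per category
means = q5_percent.mean()
```

A plotting step with matplotlib would then chart `totals` by employee type, as the study describes.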

The 12 qualitative questions were analyzed using automated methods, lightweight manual coding, and full inductive human coding workflows. Word frequencies were calculated to explore key concepts in free-text answers for questions about institution names (Q1), program names (Q2), and job titles of respondents (Q3). Text normalization was used to count technology support topics (Q10) by developing a dictionary to replace variations on common technologies named in free-text answers (such as “Adobe,” “Adobe CC,” “adobe creative suite,” and “Adobe CS”) with controlled terms for each technology or tool (like “adobe”). Two of the authors inductively coded the relatively simple and short free-text answers about funding sources (Q6), user support topics (Q9), types of outreach (Q11), and duplicative sources (Q13). Responses for funding sources and duplicative sources were each tagged with one code per response. Most responses for user support topics and types of outreach also received one code per response, though 1.5 percent of the user support responses and 10 percent of the responses for outreach received two codes. For these four questions, authors 1 and 2 looked at the survey answers independently to draft potential codes. After discussing and refining initial code categories via email, one author coded the remaining responses following the agreed-upon category definitions. Due to the relatively straightforward nature of these category assignments, they were not independently coded.
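The dictionary replacement described for Q10 might look like the following minimal sketch; the variant spellings and helper names are illustrative, not the study's actual dictionary.

```python
from collections import Counter

# Illustrative (not the study's actual) dictionary mapping free-text
# variants of a technology name to a single controlled term.
CONTROLLED_TERMS = {
    "adobe": "adobe",
    "adobe cc": "adobe",
    "adobe creative suite": "adobe",
    "adobe cs": "adobe",
}

def normalize(answer: str) -> str:
    """Lowercase a free-text answer and map it to its controlled term."""
    cleaned = answer.strip().lower()
    return CONTROLLED_TERMS.get(cleaned, cleaned)

# Counting support topics after normalization collapses the variants
counts = Counter(normalize(a) for a in ["Adobe CC", "adobe creative suite", "ArcGIS"])
```

Answers not found in the dictionary pass through lowercased, so uncontrolled tools are still counted, just not merged.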

The final four questions of the survey asked respondents to share perceptions of their programs’ major successes and challenges, to reflect on future priorities, and to describe assessment practices. Free-text responses for each question were inductively coded by authors 1 and 2. First, each author independently created codes related to program practices and applied those to 20 percent of the responses for each question. The authors then discussed their initial coding and finalized a codebook that was used to independently code the remaining survey responses [see data repository documentation38]. When the independent coding was complete, the authors discussed, tracked, and resolved all coding discrepancies for the four question responses. Intercoder percentage agreement for the four questions ranged from 94 to 96 percent, and each question had between 20 and 24 possible codes. Each question response was coded for multiple themes related to program practices. All nonblank responses to the four questions were assigned at least one code. The most codes assigned to any single response was 10, and more than 50 percent of all question responses were coded with at least three practice codes.
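Intercoder percentage agreement can be computed several ways; since the study does not specify its formula, the sketch below takes exact agreement on the full code set per response as one plausible reading. The function and example codes are hypothetical.

```python
def percent_agreement(codes_a, codes_b):
    """Percentage of responses on which two coders assigned identical code sets.

    codes_a and codes_b are parallel lists, one set of codes per response.
    Exact match per response is one plausible formula; the study does not
    specify how its 94-96 percent agreement was calculated.
    """
    matches = sum(1 for a, b in zip(codes_a, codes_b) if set(a) == set(b))
    return 100 * matches / len(codes_a)

# Hypothetical codes from two coders for four responses
coder1 = [{"training"}, {"funding"}, {"outreach", "staffing"}, {"funding"}]
coder2 = [{"training"}, {"funding"}, {"outreach"}, {"funding"}]
agreement = percent_agreement(coder1, coder2)
```

A per-decision variant (agreement counted code by code rather than response by response) would yield a different, usually higher, figure.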

Results

We received 77 survey responses from 99 invited participants and ultimately removed six replies from this initial set: three duplicate responses from institutions where multiple individuals replied; one response withdrawn at the participant’s request; one response that did not name an institution and so could not be included; and one respondent who did not consent to the survey (upon email follow-up, this respondent indicated they were no longer at the institution). Our final data set includes 71 responses, which constitutes a 72 percent response rate for the survey overall. Ten of the 71 survey responses in our dataset were partially completed, and all completed responses for individual survey questions are included in the analysis below.

Program names (Q2, n = 64). Seven respondents either did not indicate a program name or noted that the program was not yet formally titled. Of the remaining 64 responses, 56 included the term “digital” in the program title, and 42 included the exact phrase “digital scholarship.” Given that a website search for “digital scholarship” was a primary means of identifying institutions to survey, this is no surprise. A variety of place and service terms were present in the program names, demonstrating little cross-institutional consensus: “services” was present in the program title at 13 institutions, “center” at 12, “lab” at 6, “commons” and “studio” at 4 each, and “hub” at 3. Relatively few programs signaled disciplinary affiliations, though 6 programs have “humanities” in the program title, 5 have “arts,” and 2 have “sciences.”

Job titles (Q3, n = 71). The majority of respondents’ job titles reflected their roles as heads, directors, or co-directors of digital scholarship programs (44), while 6 respondents serve as associate university librarians (AUL) or in equivalent positions. Twelve respondents had librarian job titles—7 of those as “digital scholarship librarians”—and fewer respondents identified as assistant (1) or associate professors (2), coordinators (2), strategists (2), analysts (1), or postdoctoral fellows (1). Some job titles reflected dual roles (for example, Head of digital scholarship program and Associate Professor), and one survey was co-submitted by both an AUL and librarian.

Number of employees (Q4, n = 71). At the institutions reflected by the survey, librarians, curators, and archivists (185) are far and away the most common full-time employee types to contribute at least 50 percent of their time to digital scholarship programs, followed by technologists (74), faculty (36), administrative staff (31), postdocs (17), and other (21). Significant variation exists from institution to institution, however. At the low end, five institutions noted that no employees contribute more than 50 percent of time to their programs, while a majority of institutions (38) listed between 1 and 5 employees. Five institutions at the higher end shared that 14 or more employees contribute 50 percent or more of their time to their programs, with 18 employees comprising the most substantial workforce of responding institutions. Two institutions listed only faculty members, another two listed only postdocs, while 14 listed only librarians as contributors to their programs.

Percentage of employee type contribution (Q5, n = 71). The percentage by which employee types contribute to the core activities of the program largely mirrors the number of full-time employees indicated in Q4 above. The mean contributions for all survey respondents show librarians, curators, and archivists contributing at 51 percent, followed by technologists (15%), faculty (11%), administrative staff (7%), graduate students (6%), undergraduates (4%), postdocs (3%), and other (2%). It is notable that the only employee class to contribute 100 percent of the core activities of an institution’s program were librarians, who constituted 100 percent of the contributions at 10 institutions. The highest portions other job classes were noted to contribute were 90 percent (postdocs), 80 percent (faculty and administrative staff), 70 percent (graduate students), 65 percent (technologists), and 25 percent (undergraduates).

Q6: Funding sources (n = 71) and Q7: Percentage of funding (n = 68). Respondents submitted free-text responses to the prompt to list “all units (e.g., Office of Information Technology) or sources (e.g., Mellon Foundation) that provide funding for your digital scholarship program.” These answers were qualitatively coded by authors 1 and 2 into eight general categories: Libraries, Grants, Gifts, Office of President/Provost/Research, IT, Colleges, Institutes/Centers, and Departments. Three respondents listed funding sources in Q6 but provided no corresponding percentages of contributions in Q7, so they were dropped from the analysis. Thirty of the 33 institutions that listed a single funding source were funded entirely by the library system at their institution.

Each respondent provided a percentage contribution for each funding source identified in Q6. To calculate the mean contribution of each of the eight categories of funding sources, categories that a respondent did not list were assigned a contribution of zero percent, and the mean was then calculated for each category across all respondents. Because each respondent's contributions sum to 100 percent, these means are equivalent to dividing the sum of the percentage values reported for a single category by the sum of the percentage values reported for all categories [see table 1].
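The zero-filling step described above can be sketched as follows. This is a minimal illustration with invented respondent data; the function name, category labels, and example values are ours, not the survey's.

```python
# Minimal sketch of the Q7 mean-funding calculation described above.
# Assumption: each respondent reports percentage contributions only for
# the sources they listed; unlisted categories count as 0 percent.

CATEGORIES = [
    "Libraries", "Grants", "Gifts", "Office of Provost/President/Research",
    "IT", "Colleges", "Institutes/Centers", "Departments",
]

def mean_contributions(responses, categories):
    """Mean percentage contribution per category across all respondents,
    treating categories a respondent did not list as contributing 0 percent."""
    n = len(responses)
    return {c: sum(r.get(c, 0) for r in responses) / n for c in categories}

# Invented example: three hypothetical respondents, each summing to 100%.
responses = [
    {"Libraries": 100},
    {"Libraries": 70, "Grants": 30},
    {"Gifts": 80, "Office of Provost/President/Research": 20},
]
means = mean_contributions(responses, CATEGORIES)
# means["Libraries"] is (100 + 70 + 0) / 3
```

Because every respondent's reported percentages sum to 100, the category means themselves sum to 100, which is why the table 1 means can be read as each source's share of overall program funding.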

TABLE 1

Counts, Percentage of Institutions Reporting, and Mean Funding for Each Funding Source Category Identified in Q6/Q7

Funding Source | Count | % of Respondents Receiving Funds from Source (Q6) | Mean Funding Contribution (Q7)
Libraries | 62 | 91.18% | 78.23%
Grants | 28 | 41.18% | 2.60%
Colleges | 20 | 29.41% | 3.71%
Office of Provost/President/Research | 18 | 26.47% | 6.29%
Gifts | 7 | 10.29% | 6.10%
IT | 7 | 10.29% | 2.40%
Departments | 5 | 7.35% | 0.45%
Institutes/Centers | 5 | 7.35% | 0.22%

There are notable differences between the percentage of respondents receiving funding from specific sources and the size of the contributions those sources made to digital scholarship programs. While grants were listed by 28 institutions, making them the second most common type of funding source, their average contribution across all digital scholarship programs was only 2.60 percent. Conversely, gifts, though less commonly listed as a funding source, contributed more significantly to the overall budgets of the programs where they were present, averaging 6.10 percent across all programs. Specific respondents also provide notable exceptions to the general trend of libraries providing much of the funding at most institutions: one respondent reported that their program is funded entirely by grants; a second reported that 86 percent of the program is funded by the Office of President/Provost/Research, with the remaining 14 percent funded by grants; and a third was funded 80 percent by gifts and 20 percent by the Office of President/Provost/Research.

FIGURE 1

The Mean Percentage of Categories of Activities and Support That Digital Scholarship Programs Reflected in the Survey Engage In (Q8)


Q8: Categories of activities (n = 66). Looking at the activities and support that digital scholarship programs engage in, the means across the 66 respondents show a relatively even split among research, teaching, and outreach/engagement. Most institutions engage in all three primary categories of activity: only five institutions responded that research comprises none of their activities or support, another three listed teaching at zero percent, and two engage in no outreach or engagement activities. At the high end, one institution noted that outreach and engagement constituted 100 percent of its activities; but, on the whole, institutions largely reported a relatively even spread across these categories.

Q9: Individual support topics (n = 65). Respondents could enter up to five topics for which individuals at their campus most often sought support from their digital scholarship program. Authors 1 and 2 then qualitatively coded these responses into 26 discrete categories.39 Considering that only a single institution included the word “data” in the title of their program, it was surprising to note that the two most common topic categories for which individuals seek support were Data visualization (36) and Data analysis and programming (32). Other areas of common support across institutions were GIS and geospatial (30), Digital archives, collections, and exhibits (30), Text mining (28), Online publishing and scholarly communication (27), and Web design and development (17).

Q10: Technology support topics (n = 71). Respondents each shared up to five software applications, hardware items, and/or programming languages for which individuals most often come to their program for support. Of the 94 technologies mentioned overall, only five were noted by more than 20 institutions: Python was the most common technology mentioned (39 times), followed by R/RStudio (30), WordPress (28), Omeka (25), and ArcGIS (22). The remaining 89 technologies were mentioned by 10 or fewer respondents each, a long tail of application support in areas such as design (Adobe products [8]), publishing (Pressbooks [3]), web development (Scalar [9], HTML/CSS [3]), research computing (NVivo [7], Gephi [3]), and GIS (StoryMaps [6], QGIS [4]).

FIGURE 2

The Total Number of Times Each of the Ten Most Common Categories of Technologies Supported by Digital Scholarship Programs Were Identified by Survey Respondents in Q10 of the Survey


Q11: Outreach activities (n = 65). Survey participants were asked to share up to five of the most effective ways they share program activities, services, and events with their constituents. Sixty-five respondents replied to the question, each providing at least two modes of outreach, while 26 of those respondents provided examples for all five modes. Authors 1 and 2 qualitatively coded the free-text responses into 16 different categories. Email—a category that includes regular newsletters as well as direct email to individuals on campus and professional listservs—was the most frequently mentioned kind of outreach, appearing 66 times, more than twice as often as the next most common category, department visits (including departmental orientations).40 Social media (mentioned 26 times) and website posts (20) are also common modes of outreach but were mentioned far less often than email. More traditional modes, such as signage (11)—including print flyers/handouts as well as digital displays—and tabling at campus events (10), were rarely mentioned as viable ways to communicate with campus users.41 Only two respondents mentioned arranging more formal outreach via campus communications/PR professionals.

Q12: Who uses the program (n = 66)? The primary users of digital scholarship programs across institutions are faculty (34%) and graduate students (29%), followed by undergraduates (14%) and librarians, curators, and archivists (10%). The long tail of users includes postdocs (3%), along with technologists, administrators, community members, and other users (2% each). No program appears to cater specifically to a single category of user, as the highest percentage of use that any institution reports is 75 percent by faculty, 70 percent by graduate students, and 65 percent by undergraduates.42

FIGURE 3

The Mean Percentage of User Types at Digital Scholarship Programs as Collected in Q12 of the Survey


Q13: Duplicative services (n = 54). Seventeen institutions did not list any campus units that offer services or programs that are closely related to, or duplicative of, their own program offerings. Only one institution listed 10 units (the maximum number allowed by the survey), another listed seven, and all others listed fewer than six; most commonly, institutions listed three or four similar or duplicative services at their campuses. Authors 1 and 2 qualitatively coded the 193 services listed overall by the 54 responding institutions into one of 16 categories chosen to encapsulate the distinct and common divisions within a university's administrative structure. Research Centers—coded to include research institutes and all named centers except those focused on teaching and learning—were the most common kind of similar or duplicative service (49), constituting just over 25 percent of the identified services overall. Central IT, Libraries, Labs, and Research Divisions each accounted for between 10 and 12 percent of the services named here, with the remaining 10 categories of similar/duplicative services each representing less than 5 percent of the total.

Q14: Major successes (n = 61). Question 14 asked respondents to identify the major successes of their digital scholarship program during the past several years. Authors 1 and 2 coded the 61 free-text responses into 21 program practice themes; the initial independent coding matched on 1,216 out of a possible 1,281 codes, yielding an intercoder percentage agreement of 95 percent.
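The intercoder agreement figures reported here and for Q15–Q17 follow the standard percentage-agreement measure: with 61 responses and 21 themes, each coder makes 61 × 21 = 1,281 binary present/absent decisions, and agreement is the share of decisions on which the two coders match (1,216 / 1,281 ≈ 0.949). A minimal sketch follows, with hypothetical coder data and a function name of our own:

```python
# Percentage agreement between two coders, where each coder assigns a set
# of themes (from a fixed codebook) to each free-text response.
# Each (response, theme) pair counts as one binary coding decision.

def percent_agreement(coder_a, coder_b, themes):
    """coder_a, coder_b: lists of theme sets, one set per response."""
    assert len(coder_a) == len(coder_b)
    possible = len(coder_a) * len(themes)
    matched = sum(
        (t in a) == (t in b)          # True when both coders agree on the theme
        for a, b in zip(coder_a, coder_b)
        for t in themes
    )
    return matched / possible

# Invented example: two responses, three themes, one disagreement,
# so 5 of the 6 binary decisions match.
a = [{"building partnerships"}, {"building partnerships", "adding staff"}]
b = [{"building partnerships"}, {"adding staff"}]
rate = percent_agreement(a, b, ["building partnerships", "adding staff", "delivering training"])
```

Note that this simple measure does not correct for chance agreement (as Cohen's kappa would); the article reports raw percentage agreement only.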

After resolving coding differences, the most common successes mentioned were related to building initiatives/services (44%) and building partnerships (44%). The building initiatives/services theme was used to capture any mention of launching or expanding initiatives, services, or other strategic and intentional areas of support. Common examples from the responses were initiatives related to library publishing, open access, research data management, as well as faculty and graduate fellowships. The practice of building partnerships was the other most common success mentioned and includes the development of new partnerships or the strengthening of existing partnerships with other institutional units or community partners. It was not used when respondents mentioned “partnerships” with clients of the program, such as faculty or graduate students.

The third most common practice identified as indicative of a program’s success was delivering training/events (38%), which includes workshops, lectures, and other noncurricular events. A number of codes were used for other teaching and learning practices: the offering degree/certificate/credit courses (13%) theme includes more substantial and formal teaching paths; supporting teaching projects (10%) includes curriculum integration, assignment development, and in-class visits; while creating guides/OER (8%) was used for the creation of open online educational resources, including library guides.

Other common responses relate to the development of a well-used and sufficiently resourced digital scholarship program. These largely concern the ability of the program to meet institutional needs: expanding utilization (25%), securing budget/grants/gifts (23%), adding staff (11%), improving/maintaining reputation (11%), and building physical spaces (10%). Supporting research projects (23%) and offering grants/fellowships (18%) were other common successes and speak to a focus on research support. Less commonly mentioned activities relate to building or maintaining digital tools and infrastructure: licensing/maintaining a tool (18%) refers to the support of open source and commercial software (such as digital libraries/repositories and content management systems) as well as hardware infrastructure, while developing a tool (3%) includes the development of entirely new digital systems.

TABLE 2

The Number of Major Success Categories Mentioned by Respondents in Q14 (n = 61)

Success Category | Count | % of Respondents Reporting Success
Building Initiatives/Services | 27 | 44%
Building Partnerships | 27 | 44%
Delivering Training/Events | 23 | 38%
Expanding Utilization | 15 | 25%
Supporting Research Projects | 14 | 23%
Securing Budget/Grants/Gifts | 14 | 23%
Offering Grants/Fellowships | 11 | 18%
Licensing/Maintaining a Tool | 11 | 18%

Q15: Challenges (n = 56). Fifty-six individuals responded to the question "What are the primary challenges your program currently faces or has recently overcome?" Authors 1 and 2 coded these responses using 20 inductive themes, some of which correspond to codes used for Q14 and Q17; their initial independent coding matched on 1,145 of a possible 1,200 codes, yielding an intercoder percentage agreement of 95 percent.

The most common challenges mentioned by survey respondents related to insecure/inadequate funding (39%) and understaffing (38%). Another notable staffing challenge, tracked separately for instances when staff members either left for other jobs or job positions were lost, was turnover/loss of staff (21%). These challenges clearly intersect with another prominent theme, insufficient capacity, which was mentioned in 32 percent of the responses and was used when a lack of ability in services, programs, or events to meet demand was noted or respondents shared a sense of being “spread thin.” More than one in 10 (13%) respondents also noted challenges in broadening/deepening expertise, a theme that was the second most common future priority mentioned in Q17.

Other key challenges relate to a program’s placement within an institution and its ability to promote and effectively communicate its role. Hierarchy/institutional barriers (32%), the third most common challenge encountered, was coded when respondents shared issues related to institutional and organizational structures. Respondents also regularly noted outreach difficulties: promotion of program (27%) was coded in cases where promoting digital scholarship services was a challenge, and communication (16%) for difficulties communicating about the program and/or a lack of user understanding of the program and its services.

Building initiatives/services, the top theme for both program successes (Q14) and future priorities (Q17), was only raised as a challenge in 11 percent of responses, suggesting that, while building new initiatives and services has been and will continue to be important work for digital scholarship programs, it is not, in and of itself, a prominent challenge. Similarly, building partnerships—the second most common theme for both successes (Q14) and future priorities (Q17)—was less commonly mentioned as a challenge, though it was still present in 20 percent of the responses.

Q16: Assessment (n = 59). Question 16 asked respondents to identify how their “program assesses its own impact, successes, and challenges” and was answered by 59 respondents. The free-text answers were assigned 23 codes to reflect various assessment practices, the intercoder percentage agreement for which was 96 percent, with 1,301 codes matching out of a total universe of 1,357 possible codes.

TABLE 3

The Number of Challenge Categories Mentioned by Respondents in Q15 (n = 56)

Challenge Category | Count | % of Respondents Reporting Challenge
Insecure/Inadequate Funding | 22 | 39%
Understaffing | 21 | 38%
Hierarchy/Institutional Barriers | 18 | 32%
Insufficient Capacity | 18 | 32%
Promotion of Program | 15 | 27%
Turnover/Loss of Staff | 12 | 21%
Building Partnerships | 11 | 20%
Communication | 9 | 16%

Unsurprisingly, the most common practices mentioned to assess digital scholarship programs were the collection of consultation statistics (41%) and training/event statistics (37%), the latter of which was defined to include in-class instruction statistics. Other areas where activities were counted, though less frequently, were: website analytics (15%), which included data collected to reflect visits to digital scholarship websites and guides; inquiry statistics (14%), used to track the number of inquiries, new users, and requests; project/publication statistics (12%), used to track the outputs of scholars who use the digital scholarship program; and funding statistics (5%), used when funding, such as the number and amount of grants and outside gifts received, was tracked for assessment purposes.

The third most prevalent assessment practice mentioned was the utilization of user surveys (36%), including needs assessments, which lends a qualitative aspect to the otherwise heavily quantitative measures that were most frequent. Other qualitative data was collected in person via informal feedback (17%), senior leadership/partner input (8%), and leadership/staff retreats (7%). Along with the collection of data and user input, respondents commonly mentioned practices such as report writing (27%) and strategic planning (20%) as critical to assessing a program's impact, successes, and challenges. Communicating success stories (10%) was noted for cases when respondents mentioned documenting and sharing best practices or user success stories, and standardized evaluations (10%) for cases where respondents used assessment forms or templates required or recommended by institutions or professional organizations.

TABLE 4

The Number of Assessment Categories Mentioned by Respondents in Q16 (n = 59)

Assessment Category | Count | % of Respondents Reporting Assessment Type
Consultation Statistics | 24 | 41%
Training/Event Statistics | 22 | 37%
User Surveys | 21 | 36%
Report Writing | 16 | 27%
Strategic Planning | 12 | 20%
Informal Feedback | 10 | 17%
Website Analytics | 9 | 15%
Inquiry Statistics | 8 | 14%

Q17: Priorities (n = 60). Question 17 asked respondents to share the most important priorities for their program across the next three to five years. Sixty free-text responses were collected and coded using 24 program practice themes, 21 of which were the same as those used for Q14. Authors 1 and 2 created three new codes—broadening/deepening expertise, completing scholarly projects, and social justice—none of which appeared in survey responses to Q14. The initial independent coding matched on 1,368 out of a possible 1,440 codes, yielding an intercoder percentage agreement of 95 percent.

Building initiatives/services (52%) was the most frequently mentioned priority for future work, followed by broadening/deepening expertise (35%), which was assigned for responses that mentioned building staff expertise in new domains or tools to help expand or improve digital scholarship program offerings. The expertise of staff in digital scholarship programs appeared to be notable to respondents primarily when it was lacking: it was not mentioned by any respondents as an example of success in a program (Q14), nor was it ever mentioned directly as something to be formally assessed (Q16), yet it was identified as a priority for the future at more than a third of the institutions. Adding staff was also a more prevalent theme as a future priority (23%) than it had been as a notable success (11%, Q14), suggesting that for many digital scholarship programs, human resources and expertise are key issues for their continued success. Building physical spaces appears to be a continued focus, mentioned by 15 percent of respondents for Q17, compared to 10 percent of respondents for Q14. And building partnerships (35%), though slightly less prevalent than it had been as a notable success in Q14 at 44 percent, was mentioned as often as broadening/deepening expertise as an area of future focus.

Themes related to teaching activities were not uncommon as future priorities but were significantly less prevalent than they had been as successful practices in Q14, suggesting that some programs find their teaching activities to be sufficient at current levels.

TABLE 5

Percentage of Themes Identified as Successes and Future Priorities by Respondents

Teaching Practice Theme | Successes (Q14, n = 61) | Future Priorities (Q17, n = 60)
Delivering Training/Events | 38% | 18%
Creating Guides/OER | 8% | 8%
Supporting Teaching Projects | 10% | 7%
Offering Degree/Certificate/Credit Courses | 13% | 7%

Other areas that were mentioned less frequently as future priorities than as notable successes were supporting research projects (15% in Q17, down from 23% in Q14), offering grants/fellowships (5%, down from 18%), and licensing/maintaining a tool (10%, down from 18%). Areas that were mentioned at almost exactly the same rates as future priorities and key successes were expanding utilization (27%, compared with 25% in Q14), securing budget/grants/gifts (22%, compared with 23%), and improving/maintaining reputation (12%, compared with 11%).

The code sustaining the digital scholarship program—used when respondents mentioned working to improve the sustainability of their program (such as aligning or dropping services to improve the program, or building political capital)—appeared as a future priority in 12 percent of responses to Q17 but as a notable success in only 3 percent of responses to Q14.

Social justice was named explicitly as a future priority by three respondents, while another referred to the need to reorganize the program in such a way as to not report up through a “particularly racist leadership structure,” leading to the theme’s appearance in 7 percent of the responses to Q17.

TABLE 6

The Number of Future Priority Categories Mentioned by Respondents in Q17 (n = 60)

Future Priority Category | Count | % of Respondents Reporting Priority Type
Building Initiatives/Services | 31 | 52%
Broadening/Deepening Expertise | 21 | 35%
Building Partnerships | 21 | 35%
Expanding Utilization | 16 | 27%
Adding Staff | 14 | 23%
Securing Budget/Grants/Gifts | 13 | 22%
Program Evaluation | 13 | 22%
Delivering Training/Events | 11 | 18%

Discussion

Since no study of digital scholarship programs has specifically analyzed programs at Carnegie Classification R1 institutions, the foundational finding of our research is that 99 institutions out of the 131—more than three-fourths of all R1 institutions—are home to a digital scholarship program of some sort. This prevalence signals that support for the integration of digital technologies into scholarly activities is seen as critical enough to warrant a dedicated unit at most very-high-research-activity doctoral institutions.

Our survey, however, reveals much beyond the prevalence of these programs. As we had hoped, it offers insight into their practices. Generally, such programs tend to present themselves as "digital scholarship" programs and to be led by a director. Our findings show, however, that there is a broad spread of program names and leadership titles, pointing to the relative novelty of these programs in the landscape of higher education. Inclusive of this leadership, we found that digital scholarship programs are staffed by teams averaging five individuals who dedicate the majority of their time to the program's activities, a team-size average that represents a slight increase from the findings of the 2011 survey of ARL institutions conducted by Bryson et al.43 Our findings that more than half of the individuals who dedicate the majority of their time to digital scholarship programs are academic librarians, and that 10 programs are administered entirely by librarians, are also in line with the fact that the overwhelming majority of previous studies have focused on digital scholarship programs residing administratively within libraries. That said, our survey also reveals that very nearly half of the staff of digital scholarship programs come from other job classes, with strong representation from technologists, faculty, and administrators. Moreover, we found that 85 percent of programs have at least some mix of job classes, which echoes the findings of Bryson et al., Mulligan, and Zorich.44 This suggests that most digital scholarship programs are indeed cross-unit collaborations.

While collaboration between types of staff is common, libraries tend to be the main funding source for digital scholarship programs, and it is likely that some of the nonlibrarian/archivist/curator categories of staff, such as technologists and administrators, are employed by libraries. Indeed, 42 percent of responding institutions were fully funded by libraries, and 91 percent fund their digital scholarship programs at least in part through libraries. Though the history of digital humanities and its centers has been explored in depth, the origins of digital scholarship and its centers are comparatively underexplored. In a brief commentary on the history of formal programs for supporting digital scholarship, Ed Ayers suggests that such programs took root in the first decade of the twenty-first century.45 However, neither Ayers nor others trace the original institutional units or original funding sources of programs. Our findings, along with the findings of Bryson et al., Mulligan, and Zorich, demonstrate that libraries had an outsized role in digital scholarship programs throughout the second decade of the twenty-first century,46 but the question—why did digital scholarship programs tend to develop from within libraries?—remains unsettled. The fact that six of our respondents' programs were funded only by gifts, colleges, or offices of a provost, president, or research suggests that alternative units from which to build and manage digital scholarship programs exist.
Further complicating previous assumptions about libraries' dominance of digital scholarship programs is the fact that "building partnerships" tied as the most commonly reported measure of success. Not only did 44 percent of respondents identify partnership building as a success, but multiple respondents directly reported this as alleviating confusion between units and facilitating access to infrastructure and personnel or, in the words of one respondent, creating "a user-centered … comprehensive suite of services."

Speaking to the breadth of digital scholarship programs is our finding that these programs support research, teaching, and outreach and engagement—the three core activities of scholarship—in roughly equal measure. That diversity of support activities is further reinforced by the dozens of distinct support topics and the nearly 100 distinct technologies that respondents mentioned supporting. While our survey suggests a few key applications in certain areas of support—such as Python and R/RStudio for research computing and WordPress and Omeka for web development—digital scholarship programs otherwise support a wide array of applications with few areas of direct overlap. This breadth suggests that digital scholarship programs live up to the name of the field, supporting and directly engaging with a rich spectrum of technologies rooted in zeros and ones across all aspects of scholarship, from research to engagement to teaching and beyond. This breadth of focus may also be evident in the fact that the term "data" appeared in the digital scholarship program name at only a single institution, despite the frequent support offered by digital scholarship programs for common data analysis tools such as Python and R/RStudio.

This wide range of support is likely the core reason that so many respondents report their programs as having some overlap and duplication with other units on campus. Our qualitative analysis found at least 16 distinct types of duplicative units, with particular overlap between digital scholarship programs and research centers, central IT units, and libraries. But while the array of support that digital scholarship programs offer often feeds boundary confusion with other units, we found that such programs focus the vast majority of their support activities—around 80 percent—on faculty, graduate students, and academic staff with research or teaching responsibilities, such as librarians and postdoctoral researchers, which aligns with the "scholarship" support mission of these programs. Reflecting this service to an internal population of scholars and scholars in training, these programs tend to use more traditional forms of digital communication (such as email and websites rather than social media) and high-touch outreach at faculty meetings and scholarly events such as research talks.

Though our short-answer and quantitative questions reveal much about the form and function of digital scholarship programs, our open-ended qualitative questions reveal even more about what these programs do in practice. Extant literature on digital scholarship programs has proceeded from the assumption that these programs are "support" programs. We find little evidence to dispute this generalization and, indeed, find activities related to the "support" of scholarship to be the most common benchmark for success, with "building initiatives/services," "supporting research projects," "offering grants/fellowships," "licensing/maintaining a tool," and "supporting teaching projects" each reported as key successes by at least half a dozen institutions. The centrality of digital scholarship programs' support mission was reinforced further by the qualitative responses describing how these programs evaluate themselves. The top three methods of program evaluation—consultation statistics, training/event statistics, and user surveys—all speak to a mission of supporting scholars. But while digital scholarship programs clearly serve a supporting role, many also act as intellectual hubs, and their staff author or collaboratively produce scholarship too. This fact is evidenced by responses that measured success through securing external grants; offering degrees, certificates, and credit courses; and publishing scholarship.

Beyond the practices of digital scholarship programs, our survey also offers insight into the hurdles and hopes of these programs, revealing the tensions they experience within their universities and the socioeconomic pressures they face from the world at large. At the turn of the century, Sheila Slaughter and Larry Leslie outlined the neoliberalization of American universities and demonstrated how "academic capitalism" was reorienting the behaviors of universities and their staff. Universities, Slaughter and Leslie illustrated, were increasingly basing their decisions on revenue generation and increasingly behaving like their counterparts in the for-profit sector.47 The responses to our qualitative questions illuminate Slaughter and Leslie's concept of academic capitalism in microcosm. Concerns about funding insecurity, understaffing, institutional inefficiencies and barriers, insufficient capacity, and staff turnover—all difficulties related to external market pressures and internal pressures to do more with less—represent five of the top six areas of concern for digital scholarship programs. A concern that intersects and ties together many of these categories of challenges is a culture of internal competition between digital scholarship programs and other units that are outside their partnership or even, in a few cases, in overt and direct conflict with them. As one respondent stated, "Our university IT department, who are fully aware of our strategic plans for data viz and other services, are beginning to offer their own competing services to get funding from research grants." Concerns like this also speak to common struggles to promote the program, build partnerships, and clearly communicate what the program does and does not support, categories that were reported as challenges by at least 15 percent of respondents.

One challenge that was a particular result of the timing of our survey, but is likely only to grow in impact, is the COVID-19 pandemic. Our survey was distributed in May 2020, roughly two months after COVID-19's initial exponential spread in the United States, which led to the physical closure of most academic institutions and the pivot to online support. It is perhaps surprising, then, that only 14 percent of respondents identified challenges related to COVID-19 specifically in their responses. While many institutions were still only beginning to come to terms with the potential impact of the pandemic on institutional budgets and the well-being of their constituents, the authors suspect that COVID-19 would have been a far more prominent theme had the survey been distributed later in 2020, as higher education institutions began to more specifically reckon with the lengthy and significant impact of the virus on their operations and the world generally.

One program pointed to the particularly notable challenge of racism, reporting that “white supremacy and blatant racism” were occluding the program from “the budget, staff or other material resources to do the work.” Though only one respondent reported this challenge, research on the administration of higher education institutions suggests that structural racism of this ilk is prevalent throughout academia.48 While neither we nor others have studied the social demographics of digital scholarship program leadership, the general underrepresentation of Black people, Indigenous people, and People of Color in leadership roles throughout higher education—but especially in the information technology and research offices sectors—may explain why only one respondent thought this challenge to be notable.49 Racism was only reported by one institution as a challenge, but four institutions identified social justice of some sort as a priority for the next three to five years. With this code representing only 7 percent of respondents, diversity, equity, inclusion, and other forms of antibias work did not appear to be a particular priority for most digital scholarship programs at the time of this survey.

The activities that programs aspire to focus on in the next half-decade or so correspond closely with the categories around which they measure their successes. It seems programs hope to do more of what they have done well, such as building support services and expanding expertise. Concerns about internal marketing, internal competition, securing internal and external funding, expanding staff capacity, finding dedicated physical space, and generally sustaining the program again speak to the inescapability of the academic capitalism within which these programs exist. But, while digital scholarship programs strive to perform well in the face of the market pressures that increasingly pervade academia, many also seek to design their programs’ future around ideals of partnership and resource sharing. Building partnerships with other units in their university was tied for the second most common goal of digital scholarship programs, and narrative responses expressing such sentiments hoped to “deep[en] our relationships with … other allied units on campus,” “physically and conceptually connect existing (but historically siloed) experts, services, and activities,” “redirect funding to a shared pool … or partner more closely with campus affiliates,” achieve “better integration with more units on campus,” “continu[e] to build a network among practitioners on campus,” “develop a cohesive network for providing services… and expertise across campus,” and “integrate our unit more closely with other related units on campus.” The reasons given for pursuing partnership ranged from decreasing redundancy to clarifying access to support resources to working with greater efficiency, but most commonly these efforts aimed to quell internal competition; respondents never mentioned external market pressures or management strategies as influences on these aims.
Given these responses, it appears that building a culture of cooperation is an aim of digital scholarship programs second only to their core mission of supporting more projects that integrate digital technologies into scholarly activities.

Conclusion

Our survey and analysis of digital scholarship programs at R1 universities present the structure, practices, obstacles, and aims of these relatively novel units within the landscape of large research institutions. Digital technology is increasingly present in contemporary scholarship, and these programs represent a critical means for scholars to enhance their research, teaching, and engagement work with established and emergent digital techniques and tools. By better understanding how digital scholarship programs at the premier research institutions of the United States have sought to build support structures for the application of digital technologies to scholarship, we are hopeful that our work sets a foundation on which comparative research of program design can be built.

To date, surveys of digital scholarship programs have presumed that these units were library-centered. Our work tells a more complex story. Academic libraries do, indeed, maintain an outsized role in funding, staffing, and housing digital scholarship programs. However, we find that cross-unit collaboration is a critical component of the design of digital scholarship programs, too. If digital scholarship program leaders can point leaders in higher education toward successful models of digital scholarship services and programs, they strengthen their case for investments in well-defined and user-centered services, professional development training and events, depth and breadth of support staff expertise, coordinated messaging to faculty and students, and, above all, cross-unit partnerships. And while we have focused on digital scholarship programs in our work, we believe our findings hold potential for program design in other fields as well. Within academic libraries, units such as scholarly communications and research data services, which are rooted in libraries yet connected formally or informally to other campus units, may draw lessons from this work. Moreover, our findings related to cross-unit coordination and partnership should inform central administration planning within large research institutions when they consider the design or redesign of any units with a mission of supporting scholarship.

APPENDIX

The Survey Instrument

By proceeding to take this survey, I agree to participate in the R1 Digital Scholarship Program Study to be conducted by Cody Hennesy, Emily Janisch, Alexis Logsdon, Brian Vetruba, and Benjamin Wiggins of the University of Minnesota Libraries, Twin Cities. I may withdraw from the survey at any time without consequence. No means of identifying me will be provided to any future researcher in any event.

  • □ Yes, I consent to participate
  • □ No, I do not consent

Program Background

Q1. What is the name of your institution? ______ (text entry)

Q2. What is the name of your digital scholarship program or service? ______ (text entry)

Q3. What is your job title? ______ (text entry)

Q4. How many full-time employees dedicate at least 50% of their time to the administration or primary activities of your digital scholarship program?

  • □ Faculty ______ (number of employees)
  • □ Librarians, curators, and archivists ______ (number of employees)
  • □ Technologists ______ (number of employees)
  • □ Administrative staff ______ (number of employees)
  • □ Postdocs ______ (number of employees)
  • □ Other ______ (number of employees)

Q5. Across everyone involved in the program (whether formally or informally affiliated), at what percentage do the following job classes contribute to the core activities of the program? (Percentages should add up to 100%)

  • □ Faculty ______ (percentage of contribution)
  • □ Librarians, curators, and archivists ______ (percentage of contribution)
  • □ Technologists ______ (percentage of contribution)
  • □ Administrative staff ______ (percentage of contribution)
  • □ Postdoctoral ______ (percentage of contribution)
  • □ Graduate students ______ (percentage of contribution)
  • □ Undergraduate students ______ (percentage of contribution)
  • □ Other ______ (percentage of contribution)

Q6. List the name of all units (such as Office of Information Technology) or sources (for example, Mellon Foundation) that provide funding for your digital scholarship program:

  • □ Funding source 1 ______ (text entry)
  • □ Funding source 2 ______ (text entry)
  • □ Funding source 3 ______ (text entry)
  • [Up to fifteen funding sources accepted]

Q7. What is the rough percentage of overall funding that each unit has provided the digital scholarship program over the course of the past 12 months? (Percentages should add up to 100%)

  • □ Funding source 1 (determined via text entry from Q6) ______ (percentage of contribution)
  • □ Funding source 2 (determined via text entry from Q6) ______ (percentage of contribution)
  • □ Funding source 3 (determined via text entry from Q6) ______ (percentage of contribution)
  • [Up to fifteen funding sources accepted]

Program Activities and Engagement

Q8. What percentage of the following categories of activities and support does the digital scholarship program engage in? (Percentages should add up to 100%)

  • □ Research ______ (percentage)
  • □ Teaching ______ (percentage)
  • □ Outreach and Engagement ______ (percentage)
  • □ Other ______ (percentage)

Q9. For what topics do individuals most often seek support from your program? (examples: web design, data visualization, text mining) List up to five.

  • □ Topic one ______ (text entry)
  • □ Topic two ______ (text entry)
  • □ Topic three ______ (text entry)
  • □ Topic four ______ (text entry)
  • □ Topic five ______ (text entry)

Q10. For what software applications, hardware, and/or programming languages do individuals most often seek support from your program? (examples: WordPress, Raspberry Pi, Python) List up to five.

  • □ Technology one ______ (text entry)
  • □ Technology two ______ (text entry)
  • □ Technology three ______ (text entry)
  • □ Technology four ______ (text entry)
  • □ Technology five ______ (text entry)

Q11. What are the most effective ways you share your program’s activities, services, and events with your campus users? (examples: email newsletter, department visits, tabling) List up to five.

  • □ Outreach one ______ (text entry)
  • □ Outreach two ______ (text entry)
  • □ Outreach three ______ (text entry)
  • □ Outreach four ______ (text entry)
  • □ Outreach five ______ (text entry)

Q12. Who uses your program? Estimate how much each of the following groups uses your program’s service offerings. (Percentages should add up to 100%)

  • □ Faculty ______ (percentage of use)
  • □ Graduate students ______ (percentage of use)
  • □ Undergraduate students ______ (percentage of use)
  • □ Postdoctoral ______ (percentage of use)
  • □ Librarians, curators, and archivists ______ (percentage of use)
  • □ Technologists ______ (percentage of use)
  • □ Administrative staff ______ (percentage of use)
  • □ Community members ______ (percentage of use)
  • □ Other ______ (percentage of use)

Q13. Please list any units on your campus that offer services or programs that are closely related to, or duplicative of, those you offer. List up to ten.

  • □ Unit 1 ______ (text entry)
  • □ Unit 2 ______ (text entry)
  • □ Unit 3 ______ (text entry)
  • [Up to ten units accepted]

Program Perceptions

Q14. What would you identify as the major successes of the program in the past several years?

________________________________________ (text entry, long form)

Q15. What are the primary challenges your program currently faces or has recently overcome?

________________________________________ (text entry, long form)

Q16. Please describe how your program assesses its own impact, successes, and challenges.

________________________________________ (text entry, long form)

Q17. What do you envision as the most important priorities for your program across the next three to five years?

________________________________________ (text entry, long form)

Notes

1. Joan Lippincott, Harriette Hemmasi, and Viv Lewis, “Trends in Digital Scholarship Centers,” EDUCAUSE Review 16 (2014), https://er.educause.edu/articles/2014/6/trends-in-digital-scholarship-centers [accessed 1 August 2020].

2. Tim Bryson et al., Digital Humanities, SPEC Kit 326 (Washington, DC: Association of Research Libraries, 2011), https://doi.org/10.29242/spec.326; Rikk Mulligan, Supporting Digital Scholarship, SPEC Kit 350 (Washington, DC: Association of Research Libraries, 2016), https://doi.org/10.29242/spec.350.

3. Cinthya Ippoliti and ACRL Digital Scholarship Centers Interest Group, “Survey of Digital Scholarship Centers Final Report” (March 11, 2016), https://drive.google.com/file/u/1/d/0B6OjJzCOmpfudDNzVG1vTUE3ck0 [accessed 7 November 2019].

4. Diane M. Zorich, A Survey of Digital Humanities Centers in the United States, CLIR Publication No. 143 (Alexandria, VA: Council on Library and Information Resources, 2008), https://www.clir.org/pubs/reports/pub143/ [accessed 5 August 2020].

5. Bryson et al., Digital Humanities, 11.

6. Mulligan, Supporting Digital Scholarship, 45.

7. Bryson et al., Digital Humanities, 19.

8. Mulligan, Supporting Digital Scholarship, 36.

9. Bryson et al., Digital Humanities, 18.

10. Bryson et al., Digital Humanities, 50.

11. Mulligan, Supporting Digital Scholarship, 13.

12. Mulligan, Supporting Digital Scholarship.

13. Zorich, A Survey of Digital Humanities Centers in the United States, 34–35.

14. Ippoliti and ACRL Digital Scholarship Centers Interest Group, “Survey of Digital Scholarship Centers Final Report,” 5–6; Mulligan, Supporting Digital Scholarship, 13.

15. Ippoliti and ACRL Digital Scholarship Centers Interest Group, “Survey of Digital Scholarship Centers Final Report,” 10–11.

16. Bryson et al., Digital Humanities, 39–40.

17. Mulligan, Supporting Digital Scholarship.

18. Bryson et al., Digital Humanities, 42.

19. Mulligan, Supporting Digital Scholarship, 91.

20. Mulligan, Supporting Digital Scholarship, 91.

21. Bryson et al., Digital Humanities, 42; Mulligan, Supporting Digital Scholarship, 91.

22. Zorich, A Survey of Digital Humanities Centers in the United States, 32.

23. Mulligan, Supporting Digital Scholarship, 91.

24. Bryson et al., Digital Humanities, 42.

25. Mulligan, Supporting Digital Scholarship, 91.

26. Mulligan, Supporting Digital Scholarship, 28.

27. Bryson et al., Digital Humanities, 38.

28. Mulligan, Supporting Digital Scholarship.

29. Rebel Cummings-Sauls et al., “Transcending Institutions and Borders: 21st Century Digital Scholarship at K-State,” Kansas Library Association College and University Libraries Section Proceedings 6, no. 1 (November 9, 2016), https://doi.org/10.4148/2160-942X.1059.

30. Shelley L. Knuth et al., “The Center for Research Data and Digital Scholarship at the University of Colorado-Boulder,” Bulletin of the Association for Information Science and Technology 43, no. 2 (2017): 46–48, https://doi.org/10.1002/bul2.2017.1720430215.

31. Heather McCullough, “Developing Digital Scholarship Services on a Shoestring: Facilities, Events, Tools, and Projects,” College & Research Libraries News 75, no. 4 (April 1, 2014): 187–90, https://doi.org/10.5860/crln.75.4.9103.

32. A. Miller, “DS/DH Start-Ups: A Library Model for Advancing Scholarship through Collaboration,” Journal of Web Librarianship 10, no. 2 (April 2, 2016): 83–100, https://doi.org/10.1080/19322909.2016.1149544.

33. Pamela Price Mitchem and Dea Miller Rice, “Creating Digital Scholarship Services at Appalachian State University,” portal: Libraries and the Academy 17, no. 4 (October 10, 2017): 827–41, https://doi.org/10.1353/pla.2017.0048.

34. Meris Mandernach Longmeier and Sarah Anne Murphy, “Framing Outcomes and Programming Assessment for Digital Scholarship Services: A Logic Model Approach,” College & Research Libraries 82, no. 2 (2021), https://doi.org/10.5860/crl.82.2.142 [accessed 1 April 2021].

35. Carnegie Classifications of Institutions of Higher Education, 2018 Carnegie Classifications Update Facts and Figures (Bloomington, IN: Center for Postsecondary Research Indiana University School of Education, 2019), https://carnegieclassifications.iu.edu/ [accessed 1 August 2020].

36. Maria J. Grant and Andrew Booth, “A Typology of Reviews: An Analysis of 14 Review Types and Associated Methodologies,” Health Information & Libraries Journal 26, no. 2 (June 2009): 91–108, https://doi.org/10.1111/j.1471-1842.2009.00848.x.

37. Charles R. Harris et al., “Array Programming with NumPy,” Nature 585, no. 7825 (September 2020): 357–62, https://doi.org/10.1038/s41586-020-2649-2; John D. Hunter, “Matplotlib: A 2D Graphics Environment,” Computing in Science & Engineering 9, no. 3 (May 1, 2007): 90–95, https://doi.org/10.1109/MCSE.2007.55; Wes McKinney, “Data Structures for Statistical Computing in Python,” Proceedings of the 9th Python in Science Conference 445 (2010): 51–56.

38. Benjamin Wiggins et al., R1 Digital Scholarship Program Survey Dataset, 2020 (November 10, 2020), distributed by the Data Repository for the University of Minnesota, https://doi.org/10.13020/tdtb-2b96.

39. Fields in which respondents entered more than one topic were coded for multiple categories.

40. Respondents often mentioned multiple discrete forms of email outreach in their answers (such as newsletters or personal email), which, when coded to the more general category of “Email,” resulted in more responses for “Email” than the total number of respondents.

41. There is some likelihood that such “on campus” promotion was underreported because respondents completed the survey when most if not all universities in the survey were running reduced in-person operations in the midst of the COVID-19 pandemic.

42. An outlier here reported 100 percent use for the “Other” user category, but the same respondent failed to answer the preceding three questions or any of the following questions, so we suspect this was entered in haste. Nothing else in their response suggests they would fail to serve the primary categories of University constituents.

43. Bryson et al., Digital Humanities.

44. Bryson et al., Digital Humanities; Mulligan, Supporting Digital Scholarship; Zorich, A Survey of Digital Humanities Centers in the United States.

45. Edward L. Ayers, “Does Digital Scholarship Have a Future?” EDUCAUSE Review 48, no. 4 (2013): 24–26, https://www.learntechlib.org/p/133289/ [accessed 1 August 2020].

46. Bryson et al., Digital Humanities; Mulligan, Supporting Digital Scholarship; Zorich, A Survey of Digital Humanities Centers in the United States.

47. Sheila Slaughter and Larry L. Leslie, “Expanding and Elaborating the Concept of Academic Capitalism,” Organization 8, no. 2 (2001): 154–61.

48. Paul H. Barber et al., “Systemic Racism in Higher Education,” Science 369, no. 6510 (September 18, 2020): 1440–41, https://doi.org/10.1126/science.abd7140.

49. Jacqueline Bichsel et al., Professionals in Higher Education Annual Report: Key Findings, Trends, and Comprehensive Tables for the 2019–20 Academic Year (Knoxville, TN: College and University Professional Association for Human Resources, 2020), https://eric.ed.gov/?id=ED604980 [accessed 1 August 2020].

* Benjamin Wiggins is Digital Arts, Sciences, & Humanities Program Director in the University Libraries and Affiliate Associate Professor in the Department of History at the University of Minnesota; email: benwig@umn.edu. Cody Hennesy is the Journalism & Digital Media Librarian in the University Libraries at the University of Minnesota; email: chennesy@umn.edu. Brian Vetruba is European Studies, Jewish Studies, and Linguistics Librarian in the University Libraries at the University of Minnesota and Bibliographer for Germanic Literature and Scandinavian Studies at the University of Chicago Library; email: bvetruba@umn.edu. Alexis Logsdon was Humanities Research and Digital Scholarship Librarian at the University of Minnesota, and currently works as a content strategist in the private sector. Emily Janisch is Online Teaching Project Coordinator in Liberal Arts Technology & Innovation Services at the University of Minnesota; email: janis036@umn.edu. ©2022 Benjamin Wiggins, Cody Hennesy, Brian Vetruba, Alexis Logsdon, and Emily Janisch, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

