2017 Empirical Librarians Conference, F.D. Bluford Library, N.C. A&T State University, Greensboro, North Carolina, Friday, February 24, 2017

Conference Program Abstracts for Empirical Librarians 2017

A summary of the accepted and confirmed presentations and lightning talks for the 2017 Empirical Librarians Conference.



Getting Started: The Librarian’s Role in Supporting Research Reproducibility

Jeanne Hoover, East Carolina University
Shaina Jamani, JoVE

Multiple published reports (Ioannidis, 2005; The Guardian, 2008; Nature, 2016) suggest that scientific research faces a reproducibility crisis wherein scientists are unable to reproduce up to 80% of the results found in published studies. This has created concern among researchers, funding institutions, publishers, and the general public as to which findings and advancements can be relied on, and how quickly and affordably new scientific advancements can be achieved.
In this presentation, we will first explore the current landscape of research reproducibility and highlight what different groups, such as publishers and technologists, are doing to help scientists publish with greater transparency. Then we will turn our attention to librarians, who have the opportunity to play an active role in the research reproducibility conversation. We will discuss how librarians are already supporting research reproducibility, as well as steps to learn more and do more about this important issue. Our goal is to open the discussion to everyone to share their ideas on avenues for further research and/or collaboration opportunities.

Librarian Integration in a Working Group of the REDCap International Consortium

Fatima Barnes, Howard University

With the rapid growth of translational science, novel opportunities have arisen for biomedical librarians to contribute in unique ways. Two librarians who actively supported research initiatives at their institutions using REDCap (Research Electronic Data Capture), a free, globally utilized software system for collecting and storing research data, were invited to assist in the development of an electronic data collection instrument library application (the Shared Data Instrument Library) housed in REDCap. The librarians assisted with the instrument curation process, governed by the REDCap Library Oversight Committee (REDLOC), by assuming primary investigative roles for determining copyright ownership and subsequently obtaining permission from copyright holders to adapt instruments into REDCap. Over time, the librarians' role expanded to include helping to assess validity and usage of the instruments and contributing expertise to committee discussions. Additionally, one librarian served as co-chair of the committee in 2013-2014. Integration of biomedical librarians into the REDLOC Committee demonstrates the value and applicability of librarianship skills in translational science.

More is More: Planning for Unexpected Developments in Library Research

Justin de la Cruz, Atlanta University Center Robert W. Woodruff Library

Librarians typically run into some standard issues when conducting research in libraries. For example, there is a widely recognized problem with the recruitment and sustained engagement of participants, which can affect research sampling and therefore the generalizability of research results. Another issue is that researchers often have trouble adhering to their proposed project timelines due to time committed to other assigned duties, unexpected developments, and the natural ebb and flow of their library's operations. During my first research project in an academic library setting, I purposefully dedicated planning time to figuring out various approaches to answering my research question. Initially I was interested in only conducting a survey, but my planning resulted in a mixed-methods approach that allowed me the flexibility of using surveys, focus groups, and follow-up interviews to address my topic. By listing all of these methods in my initial IRB proposal, I was able to save time and avoid the trouble of seeking approval for additional research methods during the course of my data collection. This conference session will advocate for mixed-methods research and for planning multiple avenues of research in advance. I will outline my research project, including all of the large and small changes that took place over its course, and thereby advocate for flexibility when planning for and executing library research. In particular, I will present practical solutions to participant recruitment and retention, including seeking faculty support, meeting participants where they are, and providing targeted incentives. Additionally, I will provide ideas that new researchers could use to jump-start their projects, including tips on keeping research goals realistic, staying motivated, keeping timelines flexible, and following where the research leads.

Retraction Action: Deduping and Machine Learning with Twenty Thousand Citations

Ciara Healy, Perkins Library at Duke University

Beginning as part of a faculty member's grant proposal, I took on the large project of creating a data set of citations. Specifically, the data set comprises all retractions in the science and social science literature, initially replicating a comprehensive survey method used by Grieneisen and Zhang. When the grant was not funded, I took it up as an independent research project, with the initial result of over twenty thousand citations stored in the citation manager Zotero. In this presentation I will discuss my search strategies, specifically how they differed from Grieneisen and Zhang's; how Zotero worked as a place to organize and analyze the citations; and finally the methods I used to clean and de-dupe the set, which form the bulk of my work. In addition, I will detail the soft-skill methods I used to recruit help from within the library to get the set ready for ingest into the institutional repository, including describing the data for use by others and preparing to offer the set as open content.
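The abstract does not specify the cleaning method used, but as a rough illustration of the kind of de-duping such a citation set requires, one common approach is to build a normalized key for each record (preferring a DOI, falling back to a cleaned title) and keep only the first record seen per key. The field names below are assumptions for the sketch, not the presenter's actual schema:

```python
import re

def normalize(record):
    """Build a dedupe key: prefer the DOI; fall back to a normalized title."""
    doi = (record.get("doi") or "").strip().lower()
    if doi:
        return ("doi", doi)
    # Lowercase the title and collapse punctuation/whitespace variations.
    title = re.sub(r"[^a-z0-9]+", " ", (record.get("title") or "").lower()).strip()
    return ("title", title)

def dedupe(records):
    """Keep the first record seen for each key; drop later duplicates."""
    seen = set()
    unique = []
    for rec in records:
        key = normalize(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical sample: the second record duplicates the first (DOI case differs).
records = [
    {"doi": "10.1000/xyz123", "title": "Retracted Study A"},
    {"doi": "10.1000/XYZ123", "title": "Retracted study A."},
    {"doi": "", "title": "Another Retraction"},
]
print(len(dedupe(records)))  # 2
```

In practice, records exported from a citation manager such as Zotero would first be converted to this dictionary form (e.g., from a CSV export), and fuzzier matching on titles may be needed when neither record carries a DOI.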

Supporting Broader Impacts

Danica Lewis, North Carolina State University

Federal granting agencies, particularly the National Science Foundation (NSF), evaluate proposals on both their intellectual merit and their potential to advance desired societal outcomes. Most researchers are well prepared to prove the intellectual merit of their proposed research thanks to their training in research design and project investigation, but without similar training in communicating broader impacts, many researchers struggle to meet this second merit review criterion. In less translational fields, broader impacts can be especially difficult to identify and communicate. Libraries are well situated to help researchers with broader impact statements and activities. They are already working to fill knowledge gaps in data management, are a neutral crossroads and collaborative space, have venues for talks and outreach, are practiced in user design, and provide access to a host of technological resources. This talk will address the ways in which libraries can, and already are, supporting broader impact activities, and how those services can be communicated to both researchers and granting agencies.

Survey Design 101

Leo Lo, Old Dominion University

Surveys are among the most popular assessment methods used by academic librarians. However, survey research methodology is rarely taught in library school, and few librarians have the training to collect accurate data and perform useful analysis. This presentation will help attendees understand the purposes of survey research, how to design a survey-based library assessment, and the steps in designing a questionnaire. The presenter, who is currently pursuing a master's degree in survey research, will use examples to guide attendees through the basics of phrasing questions to minimize response errors and using pre-tests to improve surveys, and will discuss general best practices in conducting survey research.

Under Construction: Building Research Data Services at UNCG

Anna Craft and Lynda Kellam, UNC Greensboro

Many university libraries are expanding their research services to include support for research data needs. But in a climate of growing demands and limited resources, how can libraries explore these options and address these needs without breaking the bank in terms of staff time or expertise?

The University Libraries of The University of North Carolina at Greensboro began exploring these questions in 2012 and are continuing to grow a suite of research data support services to meet demands on campus and to educate new researchers. This collaborative effort builds on existing strengths within the Libraries by bringing together personnel from Research, Outreach, and Instruction (ROI); Technical Services; and Electronic Resources and Information Technology (ERIT).

This presentation will discuss the scope and component pieces of the University Libraries' research data management services. We will present on the development of our service model and various associated activities, including support for data management planning, a series of workshops for graduate students created in collaboration with our Institutional Review Board, outreach and training efforts, and institutional repository integration.

Presenters will address specific roles, workflows, challenges, and lessons learned, and will provide perspectives from both public and technical services. Throughout the presentation we will encourage participants to consider points for collaboration within their institutions to support research data management.

Understanding Student Behavior to Support Student Success: Two Empirical Lenses for an Impactful Open Education Program

Will Cross, Lillian Rigling, & Eka Grguric, North Carolina State University Libraries

Librarians have become leaders in the open education movement, leveraging collections and expertise to assure that all students have access to instructional materials. In order to effectively support students and engage the faculty and administrators who make decisions about instructional materials, librarians need empirical information about student experiences. To inform our outreach and support, the NCSU Libraries have taken a two-pronged approach to understanding student needs and behaviors.

First, we mined library request and usage data from our textbook collections, and analyzed this data in conjunction with available course and bookstore data. This data provides detailed information about materials students are unable or unwilling to purchase from the bookstore. By examining this data we have been able to develop a clearer picture of the “pain points” where high-cost materials did the greatest harm to student access. This allowed us to engage in targeted advocacy with departments and faculty.

In addition, we did deep research on informal student information-seeking behavior. By examining student use of social media sites to locate and evaluate assigned learning materials, we sketched a picture of student culture at NCSU. Mapping these social interactions helped us identify patterns of information sharing based on department and demographics as well as the use of market “safety valves” such as used books, access codes, and engagement with pirate sites.

Taken together, these two streams of research have helped us understand our users and develop a stronger open education program. By doing similar empirical research, other institutions can bolster their own outreach by gathering granular information about student use and improve their understanding of student information-sharing to design more impactful services and programs.

Utilizing Public Services Staff to Test Web Assessment Tools and Create a Sustainable Iterative Usability Testing Framework

Ashley Brewer & Leo Lo, Old Dominion University

Old Dominion University Libraries is currently developing a strategic assessment plan with the intention of creating a culture of using empirical data to inform decisions. One of the priorities is assessment of the libraries’ online presence, with the website as the main focal point. Most research and literature on usability studies focuses on direct assessment of users, which requires a great deal of time and planning for study design and participant recruitment. While conducting direct user assessment is important, valuable, and a priority for ODU Libraries, we wanted to pilot a preliminary assessment with our own Public Services staff, who are super-users of our web tools and the frontline in interacting with patrons, both in person and virtually over chat, often at users’ direct points of frustration or need. Based on a review of academic library and market usability assessment surveys and questionnaires, as well as our own analytics data, we designed a questionnaire about our website, discovery system, and online tools for Public Services staff. We then analyzed the feedback we received, as well as the questionnaire itself, both to refine our future testing of Public Services staff and to inform our design of usability testing for our user population, with the goal of creating a sustainable usability testing model that facilitates an agile development approach to iterative improvements to our web tools.
In this presentation, we will focus on the larger picture of ODU Libraries’ new assessment planning and the creation of a new Online User Experience Librarian position; our approach to the usability study of our online presence; the design and objectives of the questionnaire; our methodology and analysis of data from Public Services staff; and how we intend to use this preliminary data to inform web improvements, testing of our user population, and the implementation of a sustainable and scalable agile development model.



Developing and Implementing a Usability Study: What We Learned

Alisha Webb and Amy Bondy, Guilford Technical Community College

In spring 2015, the Guilford Technical Community College Library started using OCLC's WorldCat Discovery Service as the primary gateway for students to search for and find both print and digital library resources. However, soon after the implementation of this new search tool, the librarians started asking questions: How does this system help students find information? How easy is it to use? Is it intuitive? Do we need to make any changes? To get answers to these questions, we had two options: 1) rely on our trusty librarians' intuition, or 2) design and implement a usability study to get answers backed up by data. While the idea of conducting a formal research project was a bit intimidating to two novice researcher-librarians, we chose option #2. Join us as we outline each step of our research process from "idea" to "implementation". We will share our successes, our roadblocks, and other tips we learned along the way.

The Digital Silk Road at Guilford College

Rachel Sanders & Tierney Steelberg, Guilford College

The Digital Silk Road is a multi-disciplinary course that has been taking place over the last few months at Guilford College. The course involves students doing high-level research on specific topics related to regions of the historic Silk Road and creating a website based on that research. The process has been a partnership between the professors teaching the course and the librarians who have supported the research. Librarians have also been a central part of the website process, serving as the primary source of information with regard to instructional technology.

An important part of the process involved an explanation of Digital Humanities in general, followed by an overview of the intricacies of historical research. A LibGuide was created and multiple technology exploration sessions were held to assist students in the process of documenting their learning, creating digital maps of their historical research, and preserving digital multimedia sources for use on the website. Students relied on a number of both primary and secondary materials to complete their research.

Finding Your Audience: Helping Digital Humanities Scholars Define their Active Users

Adam Griggs, University of North Carolina - Chapel Hill

The focus on digital scholarship has presented exciting opportunities for humanities scholars to engage with new audiences on the web. However, these Digital Humanities projects are often the product of researchers' and collaborators' drive to create novel insights through original research. Their work is directed at a particular scholarly audience, but this audience can shift after being discovered online. In my presentation, I will discuss how I have assisted Project Vox, a Digital Humanities initiative, to better understand its actual users and how this has guided our assessment of the overall project. I was able to use Google Analytics to assess the number of active and returning users, as opposed to one-time visitors, and what their behaviors revealed about their interests. There were limitations to this approach, especially regarding questions about what users liked about the site and its usability. I then engaged directly with a cohesive group of actual users to elicit more detailed feedback, by developing a Qualtrics survey and administering it to some of Project Vox's enthusiastic and previously known contacts. This work has gone on to form the basis of our evaluation of the project and a touchstone for evidence-based decision making about its future.
