Auraria Library: Taking Steps to Address Algorithmic Bias in Library Systems

By: Sommer Browning, Associate Director of Technical Services, Auraria Library, and Kelsey Brett, Head of Discovery and Metadata, Auraria Library

Auraria Library is the library for the Community College of Denver, Metropolitan State University of Denver, and the University of Colorado Denver. We are committed to equity, diversity, and inclusion (EDI) in our services, resources, spaces, and organization. One way we engage with these topics at Auraria Library is through a Technical Services Inclusion and Equity group, where we discuss EDI initiatives specifically related to our work in the technical services division. We also hold book and film discussions to raise awareness of inequities within librarianship and to develop a shared understanding of EDI.

In 2020, we focused on algorithmic bias and how it can perpetuate racism against marginalized communities. As a division, we read Masked by Trust: Bias in Library Discovery by Matthew Reidsma and discussed the book with the author. From it, we learned how algorithmic bias plays out in library software, specifically discovery layers. We also watched Shalini Kantayya's film Coded Bias, which was screened at the 2020 virtual Denver Film Festival. The film explores the wider implications of this kind of bias, showing how facial recognition software and surveillance technology actively disenfranchise Black and brown people. Through these readings, viewings, and discussions, we learned about the harm that algorithmic biases cause and that libraries, which often claim neutrality, are not exempt from imposing those biases on their users.

Like all academic libraries, Auraria Library licenses many commercial software systems that use algorithms for relevancy ranking, resource recommendations, facet filtering, and more. The most important and most heavily used of these systems is the library's discovery layer, called Start My Research on the library's website. Start My Research is powered by Summon, a product we license from ProQuest/Ex Libris, and it is the primary search tool for discovering and accessing the library's eBooks, online journals, databases, streaming videos, and other scholarly resources. We thought it was important to acknowledge that biases exist even in Start My Research and that they shape what our students, faculty, and staff discover. To that end, in the fall of 2020, Auraria Library released a statement acknowledging algorithmic bias within library systems, and we curated a guide to promote resources that address the topic.
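
Relevancy ranking is one concrete place where this kind of bias can take hold. The sketch below is purely illustrative and is not how Summon actually scores results; it shows a common ranking pattern in which text relevance is blended with a historical usage signal, so materials users already click keep rising while rarely surfaced materials keep sinking. Every record, field name, and weight in it is invented.

```python
# Illustrative only: a toy relevancy ranker, NOT Summon's actual algorithm.
# It blends query/text similarity with a historical click-through signal,
# a common pattern that can turn past user behavior into a feedback loop.
from dataclasses import dataclass


@dataclass
class Record:
    title: str
    text_match: float  # hypothetical 0-1 query/text similarity
    click_rate: float  # hypothetical historical click-through rate


def score(record: Record, popularity_weight: float = 0.5) -> float:
    """Blend content relevance with past user behavior.

    The larger popularity_weight is, the more past clicks dominate, so
    whatever users already favored keeps outranking newer or less-surfaced
    material, regardless of how relevant it actually is.
    """
    return ((1 - popularity_weight) * record.text_match
            + popularity_weight * record.click_rate)


records = [
    Record("Long-popular survey text", text_match=0.70, click_rate=0.90),
    Record("Newer, highly relevant monograph", text_match=0.85, click_rate=0.10),
]

# The popular but less relevant item wins: 0.80 beats roughly 0.48.
for record in sorted(records, key=score, reverse=True):
    print(f"{score(record):.2f}  {record.title}")
```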

ProQuest/Ex Libris has been transparent about their work to reduce algorithmic bias. They recognize that biases exist within their algorithms and encourage customers to report biases when they encounter them. Reporting the negative impacts of algorithmic bias has led to improvements in discovery system algorithms in the past. Many of the biased results that Matthew Reidsma exposed in his book in Summon's "Topic Explorer" feature, a widget that provides a short reference entry when a term is searched, can no longer be replicated because the vendor adjusted their algorithms to reduce offensive associations between terms. However, such biases still exist, and it is important to continually report them.
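
That those specific pairings could be fixed suggests that reported associations are handled case by case, on top of the underlying algorithm. The sketch below is a speculative illustration of that general "report and patch" pattern, not ProQuest/Ex Libris's actual mechanism, which is not public; every term, name, and function in it is hypothetical.

```python
# Hypothetical sketch of a "report and patch" layer over a topic-suggestion
# algorithm; the vendor's real mechanism is not public. All names and
# example terms here are invented.

# Term -> reference-entry pairings as produced by the underlying algorithm.
topic_suggestions: dict[str, str] = {
    "information literacy": "Information literacy (reference entry)",
}

suppressed_terms: set[str] = set()     # reported pairings to hide entirely
manual_overrides: dict[str, str] = {}  # reported pairings with a corrected entry


def report_bias(term: str, replacement: str | None = None) -> None:
    """Record a user report: suppress the pairing, or swap in a fix."""
    if replacement is None:
        suppressed_terms.add(term.lower())
    else:
        manual_overrides[term.lower()] = replacement


def topic_for(term: str) -> str | None:
    """Return the reference entry for a search term, honoring reports first."""
    key = term.lower()
    if key in suppressed_terms:
        return None                    # show no topic card at all
    if key in manual_overrides:
        return manual_overrides[key]   # show the corrected entry
    return topic_suggestions.get(key)  # otherwise trust the algorithm
```

A patch layer like this can only fix pairings that someone has already noticed and reported, one at a time, while the algorithm that generated them keeps producing new ones. That is why such biases persist and why continual reporting matters.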

In the spring of 2021, Auraria Library will conduct a survey asking for feedback on search results and giving users an opportunity to report biases they have encountered while using Start My Research. We want to hear directly from our users about their experiences using our discovery system, and we will share these experiences with our vendors so they can improve their results.

Check out these resources to learn more about this topic.

  • Kantayya, Shalini, director. Coded Bias. 7th Empire Media, 2020.
  • Reidsma, Matthew. Masked by Trust: Bias in Library Discovery. Sacramento, CA: Library Juice Press, 2019.