
Library 

Exploration 

Reimagined.

How do you choose just one online library catalog to serve

31 libraries and thousands of patrons,

all with different needs, devices, and habits?


ROLE

UX Researcher

Experience Strategist

TIMELINE

13 Months

SKILLS

User Interviews

Usability Testing

Mixed-Methods User Research

Rubric Design

Data Analysis

Information Synthesis

Strategy Development

Report Generation

Team Collaboration

TOOLS

Microsoft Forms

Microsoft Excel

Microsoft Word

Zoom

Google Forms

Mural

PowerPoint

Canva

Google Docs

Otter.ai

COMPANY

Cooperative Computer Services (CCS)


A consortium-wide UX evaluation to identify a library discovery system that could make searching and borrowing simpler, faster, and more accessible. Through patron testing, surveys, demo rubrics, and cost analysis, I compared four discovery layers and guided CCS toward adopting a platform that best balanced usability, functionality, and long-term value.

OVERVIEW


THE CHALLENGE

One system, many needs. Which catalog delivers?

Patrons were finding it harder to search, discover, and manage their accounts through CCS’s aging catalog interface. The system’s design and functionality hadn’t kept pace with user needs. Our challenge was to find a discovery layer that could make the experience seamless, accessible, and enjoyable for every patron across 31 libraries.

Outdated look, outdated feel.

Search that didn't deliver.

Not accessible to all.

Disconnected library experience.

MY IMPACT

  • Led a full-scale discovery layer evaluation integrating usability testing, staff scoring, and cost analysis.

  • Designed research instruments, from usability scripts and post-test surveys to demo rubrics and reference interview guides.

  • Synthesized qualitative and quantitative data into actionable insights for leadership.

  • Created a visual framework in Mural mapping patron pain points and emerging usability themes.

  • Informed the Governing Board’s final decision with a comprehensive, evidence-based evaluation report.

  • Established a repeatable evaluation model for future technology and vendor assessments at CCS.

  • Strengthened cross-team collaboration between UX, member libraries, and system administrators.


So what does a truly user-friendly library search experience look like today?

CCS libraries had been using PowerPAC, a legacy discovery layer that no longer met patron expectations for accessibility, search experience, or mobile responsiveness.


The goal: to identify a new discovery system that could offer a modern, intuitive, and accessible search experience for both staff and patrons.

[Screenshots: the legacy PowerPAC catalog]

KEY DELIVERABLES

1. Final Discovery Evaluation Report

2. Multi-phase Research Plan

3. Usability Testing Report

4. Post-test Survey Summary

5. Vendor Demo Scoring Rubrics

6. Post-demo Staff Survey Summary

7. Reference Check Interview Synthesis

8. Comparative Analysis Report

9. Accessibility Evaluation Summary

10. Cost and Implementation Overview

Here’s a closer look at the research framework that guided this project.


1. DEFINING THE RESEARCH PLAN

Grounding every decision in real patron experience.

I developed a structured, multi-phase plan that combined data-driven assessment with real user insights. Each stage built on the last, starting with user interaction, moving through internal and staff evaluations, and concluding with external feedback, to capture a full picture of system performance. Cost and implementation considerations were layered in to support a well-rounded recommendation.

1. Usability Testing

2. Search Exercise

3. Post-test Survey

4. Vendor Demos

5. Reference Checks

6. Cost Review

7. Data Synthesis

2. RECRUITING PARTICIPANTS

Real patrons. Real searches. Real feedback.


From digital natives to casual users, how do different patrons navigate the same catalog?

I recruited 20 library patrons across diverse demographics, ensuring a mix of ages, devices, and comfort levels with technology.

After I contacted our member libraries, four expressed interest and assisted with recruitment by displaying sign-up banners on their catalogs, promoting the usability testing, and offering gift cards as a participant incentive.

3. DESIGNING THE USABILITY TEST

Empathy through structure.

With 20 patrons, I conducted hands-on usability sessions across four discovery layers: PowerPAC, Vega Discover, Aspen Discovery, and BiblioCore.

Each participant completed the same set of standardized tasks across all four catalogs.
Sessions followed a think-aloud protocol, allowing me to capture real-time thoughts and emotional cues. I tracked completion rates, errors, navigation paths, and nonverbal behaviors, from hesitation to moments of ease.
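For the technically curious, here is a minimal Python sketch of how per-catalog session metrics like these could be tallied. The records, task names, and counts below are illustrative placeholders, not the study's actual data.

```python
# Minimal sketch: tallying completion rates and error counts per catalog.
# All records below are illustrative, not the study's actual session data.
from collections import defaultdict

# Each record: (catalog, task, completed, error_count)
sessions = [
    ("PowerPAC", "find a title", True, 2),
    ("Vega Discover", "find a title", True, 0),
    ("BiblioCore", "find a title", True, 1),
    ("Aspen Discovery", "find a title", False, 3),
    # ...one row per participant-task pair in a real study
]

stats = defaultdict(lambda: {"attempts": 0, "completions": 0, "errors": 0})
for catalog, task, completed, errors in sessions:
    s = stats[catalog]
    s["attempts"] += 1
    s["completions"] += int(completed)
    s["errors"] += errors

for catalog, s in stats.items():
    rate = s["completions"] / s["attempts"]
    print(f"{catalog}: {rate:.0%} completion, {s['errors']} total errors")
```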


POST-TEST SURVEYS

After each test, participants filled out a post-test survey rating usability, satisfaction, and overall preference. These responses, paired with qualitative notes, were coded and synthesized to uncover recurring pain points and themes.
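To show the mechanics of that coding step, the sketch below averages 1-5 post-test ratings per catalog. The numbers are placeholders, not real survey responses.

```python
# Illustrative sketch: averaging hypothetical 1-5 post-test ratings.
from statistics import mean

ratings = {
    "BiblioCore":    {"usability": [5, 4, 5], "satisfaction": [5, 5, 4]},
    "Vega Discover": {"usability": [4, 4, 5], "satisfaction": [4, 3, 4]},
    "PowerPAC":      {"usability": [3, 3, 4], "satisfaction": [3, 4, 3]},
    "Aspen":         {"usability": [2, 3, 3], "satisfaction": [3, 2, 3]},
}

for catalog, scales in ratings.items():
    summary = ", ".join(f"{s}: {mean(v):.1f}" for s, v in scales.items())
    print(f"{catalog} -> {summary}")
```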

[Chart: "Favorite" catalog selections across Vega Discover, BiblioCore, Aspen, and PowerPAC]

Key Takeaways:

BiblioCore was the most preferred catalog, praised for its clean layout and strong search functionality.

"BiblioCore just feels polished, and it knew what I meant, even when I typed it wrong."

Vega Discover was liked for its visual design and ease of use, but its oversized layout hurt its scores.

"Vega was beautiful, but the images were too big and I kept forgetting to hit apply."

PowerPAC maintained some support due to familiarity; participants also appreciated its simplicity.

"PowerPAC is fine. I know where everything is, but it's kind of stuck in time."

Aspen had the lowest ratings, with recurring comments about layout clutter and inconsistent results.

"Aspen had good features but it overwhelmed me at times."

4. SYNTHESIZING INSIGHTS

From quotes to themes.


I synthesized the data in Mural, mapping patron quotes and behaviors into emerging pain points:

  • Filtering & Sorting Confusion: Many struggled to locate or apply filters intuitively.

  • Visual Clarity: Interfaces with clean layouts and fewer distractions enabled faster task completion.

  • Accessibility Gaps: Limited color contrast and small clickable areas affected users with visual or motor challenges.

  • Consistency Matters: Patrons valued predictable elements like consistent icons, button placement, and clear feedback.

These patterns highlighted what patrons valued most: clarity, predictability, and low cognitive effort.
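A simple way to surface themes like these from coded quotes is to count how often each code appears. The sketch below shows the idea; the codes mirror the themes above, while some of the quotes are invented stand-ins.

```python
# Sketch of the affinity-mapping roll-up: counting coded themes across
# patron quotes. Codes mirror the themes above; some quotes are stand-ins.
from collections import Counter

coded_quotes = [
    ("filtering_sorting", "I kept forgetting to hit apply."),
    ("visual_clarity", "The images were too big."),
    ("filtering_sorting", "Where do I narrow this down?"),
    ("accessibility", "The gray text is hard to read."),
    ("consistency", "The buttons moved around between pages."),
]

theme_counts = Counter(theme for theme, _ in coded_quotes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} mention(s)")
```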

5. VENDOR DEMOS & EVALUATION RUBRICS

Making subjective experiences measurable.

To ensure consistency, I designed detailed rubrics for both patron and staff evaluations. Each rubric used a 1–5 scoring scale, assessing usability, accessibility, customization, integration, and vendor support. Separate rubrics were later adapted for the DBM (Database Management) team and CCS staff during vendor demos.
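To make the scoring concrete, here is a minimal sketch of how a 1-5 rubric can roll up into a weighted total. The criterion weights and sample scores are assumptions for illustration; the actual rubrics defined their own criteria and weighting.

```python
# Minimal rubric roll-up sketch. Weights and scores are illustrative
# assumptions, not the rubric actually used at CCS.
CRITERIA_WEIGHTS = {
    "usability": 0.30,
    "accessibility": 0.25,
    "customization": 0.15,
    "integration": 0.15,
    "vendor_support": 0.15,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine one evaluator's 1-5 ratings into a weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

demo_scores = {"usability": 4, "accessibility": 4, "customization": 3,
               "integration": 5, "vendor_support": 5}
print(f"Weighted demo score: {weighted_score(demo_scores):.2f} / 5")
```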

To complement the usability findings, I designed post-demo surveys to capture staff impressions and confidence in each system. The results were presented through a range of data visualizations.


Findings:

  • Vega Discover stood out for its modern interface, robust patron engagement tools, and responsive vendor team.

  • Aspen was praised for flexibility but seen as high-maintenance.

  • BiblioCore earned points for polish but lacked control over configuration.


6. INTRODUCING THE CANDIDATES

Four contenders, one future.

Each discovery layer offered distinct strengths:

  • PowerPAC: Familiar legacy system.

  • Vega Discover: Modern interface, strong integrations, vendor responsiveness.

  • Aspen: Open-source flexibility, good customization.

  • BiblioCore: Polished design, popular in large systems.

[Screen recordings of each discovery layer, including PowerPAC]

7. REFERENCE INTERVIEWS

Learning from real-world implementations.

To validate our findings beyond demos, I conducted reference interviews with libraries already using Vega Discover, BiblioCore, and Aspen.

These conversations revealed how each system performed in real-world settings, from vendor responsiveness and feature development to integration and patron reception.

These real-world insights reinforced the usability and support findings from earlier phases, giving the recommendation strong practical grounding.


8. ACCESSIBILITY EVALUATION

Validating vendor claims against real patron experience.

I reviewed each vendor’s Voluntary Product Accessibility Template (VPAT) and accessibility statements to understand their compliance with WCAG standards. To validate these claims, I cross-referenced them with insights from patron testing, where real users encountered color contrast issues, small touch targets, or confusing focus states.

Findings:

  • Vega Discover demonstrated steady progress toward WCAG compliance and clear transparency in its accessibility roadmap.

  • BiblioCore had partial alignment but limited recent updates in its VPAT.

  • Aspen Discovery offered open-source flexibility but lacked formal accessibility documentation.

  • PowerPAC struggled with low contrast and inconsistent keyboard navigation.

These insights were reinforced through external resources, including guidance from the Library Accessibility Alliance and LEAP (OCLS) accessibility studies.
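One concrete check behind contrast findings like these: WCAG 2.x defines a contrast ratio between foreground and background colors, with 4.5:1 as the AA threshold for normal text. The sketch below implements that standard formula; the sample colors are hypothetical, not taken from any of the evaluated catalogs.

```python
# WCAG 2.x contrast ratio check (standard formula; sample colors are
# hypothetical, not taken from any of the evaluated catalogs).
def relative_luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB hex color per WCAG 2.x."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def linearize(c):
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)),
                             reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio("#767676", "#FFFFFF")  # mid-gray text on white
print(f"{ratio:.2f}:1 -> {'passes' if ratio >= 4.5 else 'fails'} WCAG AA")
```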

9. COMPARING COSTS AND IMPLEMENTATION

Balancing experience with feasibility.

What we compared:
Each discovery layer was evaluated for both cost-effectiveness and ease of rollout; a rough cost model sketch follows the list below. Key considerations included:

  • Licensing and support costs

  • Implementation and training requirements

  • Ongoing maintenance workload

  • Vendor responsiveness and product roadmap stability
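Here is the kind of rough multi-year model that can frame such a comparison. Every figure below is a placeholder, not a vendor quote.

```python
# Hypothetical five-year cost comparison; all dollar figures are
# placeholders, not actual vendor pricing.
def five_year_cost(license_annual, implementation, support_annual, years=5):
    """Rough total cost of ownership over a fixed horizon."""
    return implementation + years * (license_annual + support_annual)

candidates = {
    "Aspen Discovery": five_year_cost(10_000, 20_000, 8_000),
    "BiblioCore": five_year_cost(40_000, 30_000, 5_000),
    "Vega Discover": five_year_cost(25_000, 15_000, 4_000),
}
for name, total in sorted(candidates.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${total:,} over 5 years")
```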


Findings by discovery layer:

  • Aspen Discovery: Open-source and highly customizable, but demanded significant in-house technical expertise and ongoing maintenance.

  • BiblioCore: Polished and feature-rich, built within its own tightly integrated ecosystem. However, each additional module carried separate costs, making it the most expensive option overall, with a risk of increased dependency on the discovery system.

  • Vega Discover: Balanced cost and partnership value, with a responsive vendor team, steady feature development, and lower internal support needs.

10. COMPARATIVE ANALYSIS OF CATALOGS

Bringing every insight together.

To ensure a fair and comprehensive evaluation, I consolidated findings from multiple sources such as usability tests, staff rubrics, post-test surveys, vendor demos, reference interviews, and cost data.

Each discovery layer was compared across critical dimensions: usability, accessibility, integration, vendor responsiveness, and overall sustainability.

  • Vega Discover excelled in usability and vendor communication.

  • BiblioCore offered strong community engagement but limited flexibility.

  • Aspen Discovery provided customization strengths but required higher technical maintenance.

  • PowerPAC, the current system, lagged behind in accessibility and design intuitiveness.

The analysis highlighted clear trends in user satisfaction and operational feasibility, setting the foundation for an informed recommendation to leadership.
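As a sketch of how that consolidation can work quantitatively, the snippet below combines normalized phase scores using simple weights. Both the weights and the 0-1 scores are illustrative assumptions, not the study's actual figures.

```python
# Illustrative consolidation of normalized scores from each research
# phase. Weights and scores are assumptions, not the study's figures.
PHASE_WEIGHTS = {"usability_tests": 0.35, "staff_rubrics": 0.25,
                 "reference_checks": 0.20, "cost_review": 0.20}

phase_scores = {
    "Vega Discover": {"usability_tests": 0.80, "staff_rubrics": 0.85,
                      "reference_checks": 0.80, "cost_review": 0.75},
    "BiblioCore":    {"usability_tests": 0.90, "staff_rubrics": 0.80,
                      "reference_checks": 0.85, "cost_review": 0.55},
    "Aspen":         {"usability_tests": 0.60, "staff_rubrics": 0.70,
                      "reference_checks": 0.65, "cost_review": 0.70},
    "PowerPAC":      {"usability_tests": 0.50, "staff_rubrics": 0.45,
                      "reference_checks": 0.50, "cost_review": 0.80},
}

for catalog, scores in phase_scores.items():
    total = sum(PHASE_WEIGHTS[p] * scores[p] for p in PHASE_WEIGHTS)
    print(f"{catalog}: weighted total {total:.2f}")
```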


THE OUTCOME

The final synthesis pointed clearly toward BiblioCore as the best-fit discovery layer for CCS.

  • Proven reliability: Backed by years of success with large public library systems.

  • Comprehensive ecosystem: Integrates discovery, events, and more under one platform.

  • Brand alignment: Offers a polished experience that fits CCS’s community-focused goals.

  • Vendor reputation: Highly regarded for responsive support and proactive communication.

  • Future scalability: Chosen for its potential to expand into a unified digital ecosystem.

NEXT STEPS

  • Implement BiblioCore across member libraries.

  • Survey libraries on catalog setup preferences.

  • Set a clear launch timeline and milestones.

  • Sync and migrate catalog data.

  • Collaborate with the vendor on refinements and rollout.

REFLECTIONS


This project reminded me that great UX isn’t just about data; it’s about connecting people, systems, and decisions. It deepened my empathy for both patrons and staff and sharpened my time management across a complex, multi-layered process.
