Usability testing with Optimal Workshop

Usability testing is one of the best parts of my job. I love hearing from users about how they interact with the library’s website and then figuring out what we can change to better meet their needs.

The dark side of this testing is the sheer time involved. Recruiting, scheduling, and sitting down with each individual user can be a daunting commitment of staff hours. I’ll say upfront: that type of testing is still great! It definitely has a place. But we’ve started using a tool that lets us run more tests, more often: Optimal Workshop.

One important bit: While Optimal Workshop has a free plan, you’ll get the most out of it if you spring for the paid level. It’s on the pricey side, but keep in mind that they offer a 50% discount to educational customers.

What we did

We used two of the suite’s three tools in a study earlier this year: Chalkmark and Optimal Sort. We advertised the tests with a pop-up on our homepage that was displayed to half of our visitors, and all respondents could enter a drawing for a $50 Amazon gift card at the end. We expected to run the tests for at least two weeks to get enough responses, but after just a week we had more than 500 and were able to end the study early. That number exceeded my wildest expectations!
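The pop-up itself doesn’t require anything fancy: randomly assign roughly half of homepage visitors to see the invitation, and remember that assignment so nobody gets re-prompted on every visit. Here’s a minimal sketch in TypeScript of one way to do it; the survey URL, storage key, and banner markup are placeholders rather than our production code.

```typescript
// Minimal sketch: invite ~50% of homepage visitors to the study.
// The survey URL, storage key, and markup below are placeholders.
const SURVEY_URL = "https://example.optimalworkshop.com/your-study";
const STORAGE_KEY = "survey-invite";

function maybeShowInvite(): void {
  let assignment = localStorage.getItem(STORAGE_KEY);
  if (assignment === null) {
    // Flip a coin once per browser and remember the result.
    assignment = Math.random() < 0.5 ? "invited" : "control";
    localStorage.setItem(STORAGE_KEY, assignment);
  }
  if (assignment === "invited") {
    const banner = document.createElement("div");
    banner.className = "survey-invite-banner";
    banner.innerHTML =
      `<p>Help us improve the library website (and enter a gift card drawing): ` +
      `<a href="${SURVEY_URL}">take a short test</a>.</p>`;
    document.body.prepend(banner);
  }
}

maybeShowInvite();
```

Here’s how we used each tool: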

Chalkmark

Think of Chalkmark as a first-click test. You display a screenshot or design draft to your users, and ask them where they’d click first to accomplish a given task. Results are displayed in a heatmap that’s easy to parse at a glance. For example, we asked users where they’d click first to search for a book on our homepage:


82% of clicks were either in our main search box or on the link to our catalog. That’s great! They were able to find their way to a book search easily. Another 7% clicked on our Research Tools menu. While that’s not ideal, it’s also not a bad option; they’ll see a page with a link to the catalog next. That leaves about 11% of our users who went astray. Thanks to some demographic questions we asked, we know a little about them and can try to figure out what was confusing or unintuitive to them in future tests. We can also view other heatmaps based on those demographic questions, which is proving useful.

(Side note: We asked library staff to take the same test and got very different results! Fascinating, but the implications are still unclear and a topic for another time.)

Optimal Sort

Analogous to an in-person card sorting exercise, in an Optimal Sort test users are shown a list of text items and asked to sort them into categories. We used it to get at how our menu navigation could or should be organized. Results are shown in a matrix of where each item got sorted:


Our results mostly validated our existing menu organization choices, but along the way we accidentally discovered something interesting!

We provided users with the option to sort items into a category called “I don’t know what these items are”. The original idea was to keep users from sorting an item randomly when they genuinely had no idea where it should go. But a couple of items proved unexpectedly popular in this category, so now we know that some of our naming conventions need to be addressed.

Optimal Workshop’s third tool is Treejack, which is designed to test a site structure. We haven’t used it yet, but I’m looking forward to putting it through its paces.

Summing Up

Our website is an iterative project, one that is never truly finished. Optimal Workshop lets us run frequent tests without significant staff time spent on execution, and it lets us reach more users than we ever could in person. Even the free plan, with its 10-response limit, is still useful enough to get actionable data in the right context.

Are any other libraries using it? I’d love to hear what you’re testing.

Building a new Library.unc.edu

Screenshot of a spreadsheet with a content audit in progress. Columns include Page title, URL, Coder, and Code(s).

Background

Library.unc.edu last received a comprehensive overhaul in 2013. Since then, the Libraries have substantially changed as an organization, and a new site needs to serve as a front door to a single, unified set of services, resources, and collections. The Libraries also have a large amount of legacy web content, some of it almost as old as the web itself, which presents unique challenges in moving forward. I’m leading the team working on this overhaul, with an estimated completion of December 2023.

This work notably includes merging the UNC Libraries’ website with the UNC Health Sciences Library’s website.

Process

We began with a full information architecture audit of both major sites and examples of legacy content. We looked at published pages and asked ourselves about currency, responsibility, and types of content.

Next, we grouped similar content into types.

Alongside this, we reviewed over 8,000 site searches by hand to understand what users come to our site for. We also interviewed internal stakeholders about their requirements for web content.
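We did the review and coding of those searches by hand, but grouping and counting duplicate queries first makes that kind of volume much more manageable. As an illustration only (the file name and one-query-per-line format are assumptions, not our actual export), here’s a TypeScript sketch that tallies a search log so the most common queries float to the top:

```typescript
// Sketch: count how often each search term appears in an exported log.
// Assumes a one-term-per-line text export named "site-searches.txt";
// real export formats vary by analytics platform.
import { readFileSync } from "node:fs";

const terms = readFileSync("site-searches.txt", "utf8")
  .split("\n")
  .map((line) => line.trim().toLowerCase())
  .filter((line) => line.length > 0);

const counts = new Map<string, number>();
for (const term of terms) {
  counts.set(term, (counts.get(term) ?? 0) + 1);
}

// Print the 50 most common queries for the hand-review spreadsheet.
const top = [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 50);
for (const [term, count] of top) {
  console.log(`${count}\t${term}`);
}
```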

We found lots of common threads in our content – repeated elements that follow similar formats. We started thinking about content in a fundamentally different way – at the item level, not at the page level. For example, our instruction services shouldn’t be a collection of static pages. They should be represented as a list of services, searchable and filterable by instructors looking for classroom assistance.
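To make that shift concrete, here’s a rough sketch of what item-level content looks like in practice. The field names below are illustrative examples rather than our actual schema, but the point is that a service becomes a structured record you can search and filter, not a page someone has to maintain:

```typescript
// Illustrative shape for an item-level "service" record; the fields are
// examples, not a production schema.
type Audience = "undergraduate" | "graduate" | "faculty" | "staff";
type Format = "in-person" | "online" | "asynchronous";

interface InstructionService {
  title: string;
  description: string;
  audiences: Audience[];
  format: Format;
  library: string; // owning library or department
  requestUrl: string;
}

// With structured records, a "page" becomes a filtered view of the list.
function filterServices(
  services: InstructionService[],
  opts: { audience?: Audience; format?: Format }
): InstructionService[] {
  return services.filter((s) => {
    if (opts.audience !== undefined && !s.audiences.includes(opts.audience)) {
      return false;
    }
    if (opts.format !== undefined && s.format !== opts.format) {
      return false;
    }
    return true;
  });
}
```

An instructor looking for online help aimed at undergraduates then sees the result of one filter call instead of hunting through a pile of static pages.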

We’ve iterated on and validated that item-level architecture through user research. And today we’re working on implementing it. We’ve partnered with designers to bring it to life, and will continue to iterate based on research and feedback until launch.

My Roles & Methods

  • Project Manager
  • Content audit data processing
  • Card Sorts
  • Optimal Workshop’s Treejack for validating architectures

Visualizing User Needs by User Types

By asking simple demographic questions, user research can provide deep insight into the different ways different kinds of people use our sites & products. This is my favorite example of how visualizing this data can have an impact on stakeholders:

We were conducting a first-click test using Optimal Workshop’s Chalkmark tool. Participants are shown a screenshot, given a task, and asked where they’d click first to accomplish it. Chalkmark makes it easy to visualize this data as heatmaps. When paired with demographic survey questions, it becomes possible to filter those heatmaps by user type. The results can be striking.

We asked users where they’d click first to find a book.

All responses

Heatmap of user clicks on the UNC Libraries' Homepage, with over 70% clustered on the screen's main search box.
About 70% used the main search box, and 15% clicked on the link to the Catalog.

Responses only from Libraries’ staff

Heatmap of user clicks on the UNC Libraries' Homepage, with over 50% clustered on the Catalog link.
52% went to the Catalog link, and only 33% used the search box.

This was one of the first visual examples we had that we are not our users. It was a powerful tool for showing stakeholders and establishing the need for further research into the design of our tools & services. We behave differently than our users do, and we can’t make design decisions based on our own assumptions or on the way we work. This one survey convinced the organization that we needed a dedicated UX Research team.

UNC Libraries’ Intranet

Screenshot of results from a task in a study: 94% of users were able to find procedures for a common task in their department.

Summary

My goal was to analyze gaps in the functionality and structure of the UNC Libraries’ intranet. Changes I made to the information architecture, based on my usability tests, increased task completion rates from 74% to 91%.

My Role

I served as primary researcher to analyze and improve the information architecture of the Libraries’ intranet. Because the intranet is built in SharePoint, we had limited control over the interface, but we had full control over architecture and structure and were able to make key improvements. I conducted all research for this project. We’re revisiting this research in 2019 to look at how the intranet’s content and structure have evolved. Qualtrics and Treejack work well for gathering and analyzing this kind of data.

Methods & Tools Used

  • Qualtrics Surveys
  • Interviews
  • Optimal Workshop’s Treejack

Sample Research Reports

UNC Libraries’ Library Catalog

Screenshot of the UNC Libraries' catalog, showing search results

Summary

Since 2017 I’ve served as UNC Libraries’ Product Owner of the new Library Catalog. This collaborative project, known as TRLN Discovery and built in cooperation with Duke and NCSU, replaced the Libraries’ aging public catalog back end and interface. Try it yourself. The catalog is used by almost two million visitors per year to find Library materials.

My Role

As UNC’s Product Owner for the catalog, I work closely with POs at our partner institutions to gather and evaluate user needs, prioritize development tasks, and communicate with stakeholders. Under my leadership and advocacy, project staff built a roadmap of twice-monthly usability testing, with results communicated to stakeholders and integrated into development. Our research reports, drawing on a variety of methodologies, always make practical recommendations to stakeholders and administration. Whenever possible, I invite those stakeholders to witness a test in progress; I find this invaluable for building the case for a change. Since the product launch in 2019 I’ve worked to prioritize fixes & improvements, gather continual stakeholder input, and manage complex software updates.

Methods & Tools Used

  • Agile workflows
  • Qualtrics: Surveys to gather feedback and user needs.
  • Optimal Workshop’s Chalkmark: First-click testing on UI concepts
  • Observational Tests
  • Interviews
  • Log & Analytics Analysis

Sample Research Report

Final report of an observational study
Representative of many other studies from this project.

Spring Webinars on Usability Testing & WordPress

I’m excited to announce two free webinars I’m doing this spring with ASERL – one in March, and one in April. Registration (again: free!) is open for both. Recordings will be available afterward. (And I know March 11th isn’t technically Spring, but I like to pretend it is.)
 

Assess Your Website Cheaply or Free

Friday 3/11, 2PM EST
Register

You don’t have to break the bank to test your website! This webinar will introduce you to tools that you can use for free to remotely get in the heads of your users.

You’ll learn about common remote usability testing techniques like:

  • Card sorting
  • First click testing
  • A/B testing

Services like Optimal Workshop and others make it possible to use all these techniques at low or no cost. And you can do it all remotely without even placing a burden on your staff. In this webinar you’ll get an introduction to these tools and hear about how they’ve been used to improve the UNC Libraries website.


Building an Academic Library Website in WordPress

Monday 4/11, 2PM EDT
Register

WordPress isn’t just the most popular blogging software in the world; it’s also a powerful content management system that runs more than 23 percent of all websites. The current version alone has been downloaded more than 30 million times, and the WordPress community has built more than 43,000 plugins to extend and enhance the system. Academic libraries are using WordPress to create community-oriented websites, blogs, subject guides, digital archives, and more.

This practical session will walk you through the entire process of creating a basic WordPress website for your library, including:

  • Setting up a simple WordPress website from scratch
  • Selecting a theme and customizing the look of your site
  • Using plugins to enhance and improve your WordPress site
  • Maintaining and updating your WordPress website for the long haul

You’ll also learn about how UNC Libraries migrated their website to WordPress, including challenges encountered and lessons learned along the way.

Assess Your Website with Free Usability Testing Tools

Back in April I did a webinar for NCLA’s Technology and Trends roundtable: “Assess Your Website with Free Usability Testing Tools.”

I didn’t get around to posting it at the time, but the recording is freely available! I covered how to use tools like Optimal Workshop, Optimizely, and Marvel to do quick and free usability testing of your site. It’s a beginner-friendly presentation, starting with an intro to just what usability testing is and why it’s useful.

The slides are available below, but they probably make more sense if you just watch the recording instead.