Thursday, May 15, 2008

Two weeks of online library interaction

You Must See This

The Resource Discovery Task Force (RDTF) tested 28 library homepages for two weeks, using the online tool CrazyEgg (CE) to record user site interaction. We collected a lot of data, which you really *must see* to begin to understand the significance of the study:
How It Works

Using a tiny bit of JavaScript, CE records the operating system, browser, referrer, associated search terms, window size, and time to click for every user-generated click on the webpage (technically, for the roughly 95% of users with JavaScript enabled). This data is aggregated into multiple views to help you understand how users interact with your site--where they click, how long they take to find a link, etc.

The screen-captures of the heatmaps we've collected show which spots on the pages were most frequently clicked--the hotter the area, the more clicks.
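CE's actual script is proprietary, but the core idea behind a heatmap is simple to sketch. The hypothetical `buildHeatmap` helper below (my illustration, not CE's API) bins recorded click coordinates into grid cells and counts the clicks per cell--those counts are what get rendered as hot and cold spots:

```javascript
// Minimal sketch of heatmap-style click aggregation (not CrazyEgg's code).
// Each click record carries the page coordinates where the user clicked.
function buildHeatmap(clicks, cellSize) {
  const counts = {}; // "col,row" -> number of clicks landing in that grid cell
  for (const click of clicks) {
    const col = Math.floor(click.x / cellSize);
    const row = Math.floor(click.y / cellSize);
    const key = col + "," + row;
    counts[key] = (counts[key] || 0) + 1;
  }
  return counts;
}

// Example: three clicks, two landing in the same 50px cell.
const clicks = [
  { x: 10, y: 12 },  // cell 0,0
  { x: 30, y: 40 },  // cell 0,0 again -- this cell gets "hotter"
  { x: 120, y: 80 }, // cell 2,1
];
const heatmap = buildHeatmap(clicks, 50);
// heatmap["0,0"] is 2, heatmap["2,1"] is 1
```

The rendered heatmap is just this grid of counts mapped onto a color scale and overlaid on a screenshot of the page.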

Analyzing the Results

Individually, libraries can learn a lot about which links on their site are popular and which links are not. It's simple to see the results, change a few links, reorganize your page a bit and retest. In a few iterations of your design, you'll greatly improve the usability of your site.

On our campus, we have a common library site template. Looking across all the libraries using the template, here's what I believe to be true:
  • Headers - institutionalized and standardized content works well. Our headers see very consistent use across all the implementations. I believe this means they are well designed and very effective.
  • Databases - if you look at the Business library's results, you'll see their users really want simple access to database links. Looking across all the homepages, it turns out that quick, homepage-level access to subject-specific database links is the right way to go.
  • Search - our library template buries the optional search box in the footer of the design. When elevated, such as Wendt Library's search box, users opt for search with much more frequency. Having a consistent and comprehensive search solution across the campus library websites would be a major boost to usability.
  • Serial Content - many library sites have "dynamic" content indicating news and events or recent additions to the collection. These links are not frequently clicked, which makes me think we need to consider the staff cost of generating this content. Certainly, we should strongly consider downsizing the footprint of this content on our homepages.
Usability and Maturity

What do we do with all this new information about user interaction on our sites? Answer: we begin to shape a better, more user-friendly web presence for our libraries. Jakob Nielsen wrote a classic pair of web posts on the 8 stages of usability maturity. These posts are a great read and help illustrate the difficulty of achieving great usability in any corporation.

The stages:
  1. Hostility Toward Usability
  2. Developer-Centered Usability
  3. Skunkworks Usability
  4. Dedicated Usability Budget
  5. Managed Usability
  6. Systematic Usability Process
  7. Integrated User-Centered Design
  8. User-Driven
I think our libraries are somewhere between stages 3 and 4 at the moment. We have a few large projects (such as the RDTF) in the works to measure and recommend usability enhancements. There is a formal staff-time commitment (LWS Web Site Committee) towards improved design and functionality across our library system.

Gathering data on user interaction is critical to improving usability. Every webmaster in our libraries should seriously consider this study. We've done some good work towards improving our sites, but we have a lot of work left to do to make them great. I hope this study leads to continued user-data gathering across campus. I also hope our future LWS brownbags lead to a greater sense of web-development community within the libraries.

Your Turn

If you made it to the brownbag Wednesday, you saw me demo the "confetti" view CE provides. This is where the CE data truly shines. Unfortunately, I cannot give everyone on campus the password/login to the CE site itself to produce these reports... having that information would allow you to add/delete tests or cancel our account altogether.

This was a one-off study, so all we can make available are the screenshots and data collected during the testing phase. However, you and your library *should* strongly consider signing up for your own CE account (their product is amazing, so be nice and purchase a paid account!). Running tests across many of your pages helps you gain a better perspective on how your site could be improved to better serve your patrons.

If you have any questions about buying or using CE, just let me know. BTW--I'm not paid by CE in any way, I am just a big fan.

Questions? Comments?

Please let me know what you think of these screenshots. I would love to see many people comment on their reactions to seeing this data.

Cheers,
- Eric for the RDTF

5 comments:

Ron W said...

the link to the complete study does not work

Eric said...

DOH! Thanks for the tip Ron. This is fixed.

Steve said...

Very well done, Eric. Thanks for taking us into the skunky stage...

Ron W. said...

Based on the click study, I made a small change to our home page. The two items in the lefthand box were switched, so that the audio links, which were the most-clicked links during the study, are now at the upper left.

Ron W. said...

forgot to add the link: http://music.library.wisc.edu