Restructuring, Without Restructuring, the Navigation Experience of the Georgia O’Keeffe Collections Online Website

Abstract

The team behind the Georgia O’Keeffe Collections Online website asked us to provide insights into the functionality of the current website and offer recommendations to guide their interface redesign, which is currently in development. Our team evaluated the usability of the Collections website using remote moderated user testing. While the website provides an extensive resource to browse, we found that clarifying how information is searched for and displayed, along with adding features to track viewing history, would improve the overall navigation experience of the website.

Team

Our team of four is made up of students in the second semester of our respective programs. Victoria, Akshata, and Tanisha are pursuing an M.S. in Information Experience Design, while I am pursuing an M.S. in Data Analytics and Visualization.

My Role

My role in this project included designing and moderating user tests on the Georgia O’Keeffe Collections Online Website based on client concerns; analyzing test data and consolidating it into recommendations; developing mock-ups for recommendations; synthesizing methods, findings, and recommendations into a usability report; and creating a presentation of the report contents to the client.

Deliverables

Presentation Deck
Usability Report

Making Usability Recommendations to a Collections Website Currently in Redevelopment

Initial Impressions: Vast Database, but Overwhelming Structure

Our client was the Georgia O’Keeffe Museum Collections, represented by Liz Neely, who has been working with the museum for almost a decade. The Collections website, separate from the Georgia O’Keeffe Museum website, allows users to browse the work, possessions, and correspondence of Georgia O’Keeffe to gain more insight into her life and who she was as an artist. The Collections website is a vast database of objects and offers a unique feature that allows users to explore relationships between them. However, the overall structure of the website is not intuitive to navigate, and the filters require refinement for more effective browsing.

Our Approach: Usability Test for Navigation Feedback & UX Research

Based on our interest in working with this client, Victoria, Akshata, Tanisha, and I were placed in a group to conduct a moderated usability test on the existing Collections website. We chose this method because it allowed us to collect thorough feedback: hearing participants think aloud while completing tasks, gathering quantitative data through questionnaires, and asking for feedback directly in real time. These additional metrics were important because our client also wanted feedback on the user experience, as well as general thoughts on mock ups of the interface redesign, which was not yet functional when we began this project.

Initial Client Meeting: Investigate How the Website is Used. Make Recommendations Based on Findings.

Prior to meeting with our client to discuss their goals, and based on exploring the website individually, we agreed as a team that the website has a major information architecture issue. Fixing it effectively would require a major restructuring of the current website, which we knew was not logistically feasible.

Our meeting with the client was very informative in learning more about the current functions and goals of the website. The client was most interested in finding out how people use the website (e.g., filters, search bar) and whether they were finding the information they were looking for. Even though the client expressed interest in “sky’s the limit” thinking for our proposed recommendations, we had to consider what is actually feasible and manage expectations when recommending larger-scale solutions.

Defining Target Users

User Profiles: Researchers & Art Enthusiasts

The client expressed the desire for this website to be a research resource for the life and works of Georgia O’Keeffe, so we knew that researchers and art enthusiasts would be the main user group. However, since it is an extensive collection of her works, we felt it was important to widen the pool to anyone with an interest in art, to ensure our recommendations support an intuitive experience for all. In this initial planning stage we decided on two screening questions: (1) Do you have an interest in art? and (2) Have you been to an art museum? The intention was to keep the pool of potential participants wide while still ensuring some familiarity with art and the details one expects to see when looking at artworks.

Recruiting a Representative Sample, but Also Considering Overall Accessibility

In our initial meeting, the client offered to put us in contact with several researchers for the Collections website, but we also agreed to do our own recruiting to round out the rest of our eight required participants. We reached out to the list of researchers provided to us, and only two responded that they were willing to participate in our user test. Knowing we had to recruit six additional participants, we brainstormed people within our own personal circles, across a wide variety of ages and professions, whom we felt fit the user profile, and distributed the screening questions to them. The target audience of researchers and art enthusiasts are likely people comfortable using the internet and databases, so we recorded this in our pre-test survey to help gauge how easily participants were able to navigate the tasks.

Reflection: There was a misunderstanding within our team about whether and when to implement the screening questions. Based on the recorded responses in our Google form, it looks like I was the only one on my team who distributed them. Because we were hoping to cast a wide net, I don't think missing this step affected our overall results. However, in the future there needs to be more clarity within the team that this is an important step: we should ensure we are actually screening potential participants against a profile, instead of relying on our perception that they fit it.

Identifying the Navigating Styles of Users…and then the Usability Problems

Structuring the Usability Test

We knew early in planning our usability test that we wanted two team members present for every test: one moderator and one observer taking notes. This allows the moderator to focus on hosting and prompting the participant, asking appropriate follow-up questions as needed, and ensuring the participant feels comfortable throughout the test. We then started developing the script for the usability test while concurrently building the Google form that would be used alongside it. The form held a pre-test questionnaire, a consent form, the task list, and a post-test questionnaire.

Simulating Research Queries

We decided to develop a series of tasks that ask for information that one might need when doing a research paper on Georgia O’Keeffe. The goal was to tailor the tasks in a way that users would be forced to use the different ways one could query information on the website: by using the filters and the search bar. Additionally, we wanted to make sure the tasks covered the different pages on the Collections website, from her art to her correspondences, her library, and her possessions.

Developing 6 Tasks to Test Different Filtering Methods

This was one of the more challenging parts of developing the usability test, as the tasks are really the best indicator that you understand the client’s goals. It was also challenging because, as a team, we had already discussed issues we had with the website. Even though we did our best not to write tasks tailored toward the problems we had identified, I personally feel our biases came through a bit. However, I’m not sure this can ever be completely avoided.

Reflection: The biggest indicator that makes me question whether our biases came through is that we weren’t surprised by any of the issues our participants found. I was hoping that if we made our tasks general enough, we would find some usability issues we didn't anticipate, since everyone would approach the tasks differently. In the future we could spend a little more time being intentional about what we want to test conceptually, and then boil down from there. Our strategy felt like we knew, for example, that there were many issues with the filters, so we wrote tasks that exposed those issues, which only reinforced our own observations. We could have been more thoughtful in tailoring our tasks to fit our scenario more cohesively and sequentially, like parts of an overall goal.

Developing a Post-test Questionnaire to Evaluate New Design

We conducted our usability test on the working website, but our client was also interested in getting feedback on the mock ups for the redesign that was in development. To collect this, we showed screenshots of the website participants had just been tested on next to mock ups of the new website that our client provided. We then asked, “Can you provide some feedback on these mockups of the new Georgia O’Keeffe Collections website shown below & what you like & dislike about them in comparison to the original website you just browsed?” to see if we needed to make additional recommendations for the interface.

Pilot Testing Before Moderating Our User Test

I chose to pilot test with someone also taking INFO 644 this semester, but in a different group. This was a strength because they understood the process and were able to offer an outside, credible perspective on how we were approaching our test. It gave me a chance not only to review our script in practice, but also to work out the kinks of moderating remotely through Zoom. We used a Google Form with sections for our pre-test questionnaire, a consent form, written reminders of the current task for participants to return to, and our post-test questionnaire. Our script was mostly developed at this point, but I chose to moderate the test more casually, asking for the pilot participant's thoughts at points in the test that I wasn’t sure how to navigate. It was great to get feedback on that experience, and on whether toggling back and forth between the Collections website and the Google form felt clunky or too distracting. The pilot also exposed some unclear phrasing in our tasks, which we refined after comparing all our pilot tests.

Pro: New Design Addresses Many Concerns and is Well-Received. Con: Navigating is Hard, Structure is Unclear

From our eight usability tests, our team organized and condensed all the issues we found into 17 unique usability issues, as shown in the chart below on the right. Then, each of us selected a color and placed eight circles on the usability issues we felt were most important to address. We developed our recommendations from the issues that received the most circles, as shown in the chart on the right, and began brainstorming potential solutions to incorporate into our mock ups. From there, we each selected a mock up to work on individually, then came back together as a team to review the first pass for critiques and suggestions.

Unforeseen Challenges While Moderating User Tests

The tests I moderated went smoothly; however, I observed and took notes for a couple of user tests with older participants who struggled with our tasks. Despite feeling confident in their experience using the internet, it was clear that they didn’t fully understand how to query information and had difficulty juggling the web browser, Zoom, and the Google form together.

Reflection: It was interesting to see firsthand how less tech-savvy individuals approached the tasks, and what they looked for and preferred when querying information. It was also a great experience to see how the moderators of those tests tried to defuse participants' frustrations, either by reassuring them that we were testing the website rather than their ability, or sometimes by skipping the task altogether. This demographic is often unaccounted for, despite having the most time to use a resource like this; we felt empathetic toward these struggles, so much so that we ended up focusing an entire recommendation on improving the experience for this group.

Deciding on a Mock Up Style

Our team felt conflicted about which style to use for our mock ups: the style of the website we tested, or the style of the new mock ups. On the one hand, we tested all the functionality on the old website; on the other, we knew that style was on its way out. We were unsure about the timeline for the launch of the redesign, but from a practical standpoint it made sense to me to work in the new style. We reached out to the client for clarity, and she left it to our judgement. Ultimately, we decided to present mock ups in the style of the new website, while including mock ups in the old style in the appendix for reference.

Post-Survey Results Reveal Participants Liked the New Design

The new design already addresses some concerns that we had with the old website, including adding descriptions to the individual pages within the collections, as well as condensing the footer. Overall, it is a cleaner design that uses color blocking to break up the page and organize the information. In our post-test questionnaire, we found our participants liked the new designs, which put to rest the question of which style we should use for our mock ups and presentation.

Recommendation 1: Clarify the "About" Page

When we structured a task asking for information about Georgia O’Keeffe’s biography, users tended to click on the “About” page thinking it would lead them to information about Georgia O’Keeffe. However, the “About” page currently leads to a page about using the Collections website. We recommend clarifying this by renaming the page “About the Online Collection”.

Recommendation 2: Condense Filter Categories

The list of filters in the drop-down menu on the current website is overwhelming, and overly specific categories defeat the purpose of having a filter, as shown above. We recommend simplifying the filtering options by separating medium out of the current “Material” filter, as shown below. A similar approach could be applied to all lengthy filters.

Recommendation 3: Reorganize Object Details

The amount of detail shown for each artwork or object was overwhelming to our participants. Additionally, information was not consistently displayed in the same areas of the page. We recommend reorganizing the details by placing the description to the right of the object of focus, organizing the details into drop-down menus below the object, adding filters to related artwork to replace the structure of the “Relationships” tab on the current website, and clarifying the headings for related exhibitions with subheadings.

Recommendation 4: Add an Accessibility Feature

Since over 25% of our participants were over the age of 55, we felt it was important to address a demographic that is often forgotten when designing web experiences. Participants over 55 had a harder time finding the details we requested in our tasks due to issues with text legibility and extended screen time. We recommend an accessibility feature that gives users the ability to listen to the artwork description and details.

Recommendation 5: Introduce a New Feature to Track Viewing Sessions

According to our post-test questionnaire, 75% of users felt lost navigating the website, which affects how the website can be used as a research resource. While this would require significant back-end development and scoping, we recommend adding an option to save items to view within the same session, which could later be scaled up to a more permanent record. Additionally, we recommend adding a “Recently Viewed” bar across the bottom of the page to keep track of browsing history. The mock up on the right shows the bookmark icon on the object page, while the mock up on the left shows the “Saved Items” page itself.
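As a rough illustration of why the session-scoped version is the lightest-weight starting point, here is a minimal client-side sketch (function names and object IDs are hypothetical; in a real browser this in-memory store would be replaced by sessionStorage, which clears automatically when the session ends, so no back-end account system is needed at first):

```javascript
// Minimal sketch of a session-scoped "Saved Items" list.
// The Map below stands in for window.sessionStorage so the idea
// can be demonstrated outside a browser; object IDs are made up.
const store = new Map();

function getSaved() {
  // Read the saved list back out of storage (empty list if none yet).
  return JSON.parse(store.get('savedItems') || '[]');
}

function toggleSaved(objectId) {
  // Clicking the bookmark icon saves an item; clicking again unsaves it.
  const saved = getSaved();
  const i = saved.indexOf(objectId);
  if (i === -1) saved.push(objectId);
  else saved.splice(i, 1);
  store.set('savedItems', JSON.stringify(saved));
  return saved;
}

// Example: a user bookmarks two objects during a browsing session.
toggleSaved('okeeffe-radiator-building');
toggleSaved('okeeffe-jimson-weed');
```

Scaling this up to a permanent record would mean swapping the session store for saved lists tied to user accounts on the back end, which is where the larger development effort comes in.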

Client is Enthusiastic About Implementing All Recommendations

Our client appreciated the insights we gathered from our usability test and was excited by the mock ups we created. Overall, she felt our recommendations made sense, and she agreed that many of the issues we found needed to be addressed. She was very engaged throughout the presentation, asking clarifying questions and requesting additional information along the way. Afterward, we went through each recommendation in a more in-depth discussion of how it could be implemented. Most importantly, she felt all our recommendations were feasible and was excited about developing a plan to get the easier ones in place as soon as possible, as well as determining the full scope of the more involved ones.

Using Our Usability Test for Next Steps in Grant-Funded Audience Research

Between the time we began working with the Georgia O’Keeffe Collections Online website on the usability report and our presentation, our client secured a grant to fund additional audience research. She is looking forward to using our findings to guide the scope of this new research, noting, “You all were the first to dive in here, and I will be using this as we can make changes, and also as we make scopes with our audience research consultants, and maybe even work with future students at Pratt. I’m really impressed by your work, and it will be used to make changes, and also to inform our future work so I really appreciate it.”

Final Thoughts on Our Process

Getting to work on this client-focused, real-world project to assess usability was a fun and challenging final project. I think we delivered a strong final product on aggressive deadlines and through a highly collaborative process. Some major lessons: keep both our goals and the client’s goals in view, be thorough in making sure every step cohesively aligns with those goals, consider usability for users of all levels of browsing experience (not just the convenient ones to test), and remember that communication among teammates is key to smooth execution. If we were to continue this project, I would like not only to conduct more user tests with more thorough follow-up questions on the nuances of the challenges participants faced, but also to start analyzing and developing recommendations for the other usability issues we identified.