Help Center & Help Menu Usability Testing
I was the lead researcher for this project, driving all efforts: defining the research opportunities and goals; designing the research study; identifying, recruiting, and scheduling study participants; gathering and analyzing user feedback; and communicating the results in written and verbal form to stakeholders (UX Designers, Tech Writers, Product Managers, and Technical Application Scientists).
Define & Hypothesize
Project Background
When I joined at the beginning of the summer, the team had just launched a new version of the Help Center website and a new Help popup menu on the product dashboard (together, "Help 2.0"). Since the Help 2.0 content would correspond to new product features launching soon in BaseSpace Sequence Hub, we wanted to understand users' overall experience and potential pain points when navigating, searching, and understanding the content in the Help 2.0 system.
Old version of Help Center website
New version of Help Center website
Research Objectives
1. Can users find the right content?
- Can they find the right article or WalkMe guide for a given topic via Help 2.0?
- How do users navigate through Help 2.0?
- What do users think about the new interfaces?
- WalkMe Help icons: Can users understand what they mean?
2. Can users understand the content?
- How do the article vs. WalkMe guide experiences compare to each other? Do users prefer one over the other? If so, in what circumstances?
- Can users fully grasp the new concepts to be introduced by the new product features?
- What can help with better understandability and clarity of the Help 2.0 contents?
Design & Plan
Research Design + Learning Plan
With the research objectives above in mind, I decided to gather user feedback by conducting task-based usability testing. Each 1-on-1 session would last about one hour and would have the participant try to find appropriate help content for major user task scenarios. These task scenarios were identified and prioritized by analyzing Google Analytics user metrics and monitoring tech support call logs.
Before the testing, the participants would fill out a pre-survey via SurveyMonkey on their current usage patterns and behaviors when looking for help on our product. Specific questions revolved around which features they use the most, in what situations they frequently seek help, and how they seek it.
The first 25 minutes of the testing focused on the Help popup menu. The tasks revolved around understanding users' behavior patterns and pain points when accessing help content from the dashboard, and learning about their experience with article-based vs. WalkMe-based help.
The next 25 minutes focused on the new Help Center website, aiming to assess the usability and navigability of the new site and examine the understandability of the content it provides.
The last 10 minutes consisted of wrap-up questions that would review the overall user experience of Help 2.0, and gather any other final feedback.
Overview of the test design
Participant Recruiting
I aimed to recruit existing BaseSpace Sequence Hub users who were naïve to the newly launching version of the product, and therefore naïve to Help 2.0. This ensured we had participants who would actually be using Help 2.0 when the new features later rolled out within a product they were already familiar with.
I also worked with the Sr. Researcher on the team to aggregate previous user research participant databases, and with a Business Analyst to gather all internal user contacts who use our product. I then set up user groups on company internal communication platforms (Yammer and Workplace), which served as active channels for finding and recruiting study participants more easily.
Through these efforts, I was able to recruit and schedule 9 internal users: 2 pilot participants (who helped iterate the test design and protocol), 4 Technical Application Scientists, 2 Field Application Scientists, and 1 bioinformatician.
Conduct & Iterate
Study Logistics
After the learning plan was shared with stakeholders for final feedback and iteration, the usability studies were conducted over the following two weeks.
First, I provided participants with a test user account for the functioning prototypes. The participants shared their computer screens via the WebEx video conferencing platform, and their screens and voices were recorded with their permission. I conducted all sessions via UserZoom, deciding whether each would be run in person or remotely based on the participant's geographical location. I introduced the Concurrent Think-Aloud (CTA) protocol to the participants and encouraged them to freely share their thoughts.
Overview of Help 2.0 usability testing schedule
Data Collection
Notes were taken during the testing and saved in real time to the team's user research database on Confluence. The screen recordings were later reviewed to fill any gaps in the notes. The test design and script also received minor tweaks based on the completed sessions.
Synthesize & Communicate
Data Analysis
The gathered notes were organized into a single Excel sheet and affinitized to group the user feedback under common themes and patterns. This extensive thematic analysis led to clear identification of users' concerns, frustrations, and wants across the different elements of Help 2.0, including the icons, UI navigation, content types, search functions, and more. After reviewing the video/audio recordings, I also gathered verbatim quotes and video snippets that effectively captured the users' experience and directly supported the research findings.
Research Findings
At the end of the data analysis, I was able to summarize the key findings as follows:
- Users appreciate the interactive WalkMe guides that appear on the product dashboard, but know they can’t be used for all purposes.
- "Walk-thru" help content is effective for guiding first-time users through complicated tasks step by step, while article-based content is better for refreshing memories, sharing instructions with others, and tasks that have longer instructions and require more complex judgment.
- The new Help Center website’s content documentation can benefit from more verbal and visual guidance.
- Users had difficulty navigating the menu, front page, and content due to confusing wording and a lack of visual guidance. Help articles sometimes had insufficient and/or incorrect information, which hindered users' comprehension.
- Help site’s search bar should provide the most relevant results in order to gain trust from first-time users.
- Users want to see improved search results from the search bar mechanism, and do not want to be overwhelmed when results show everything that includes the keywords.
- Users have concerns moving to the new version of the BaseSpace Sequence Hub, based on what they saw from Help 2.0 contents.
- New concepts and changes: What changes are coming up & why are they being made? How will I be notified about them?
- Changes to workflow: How will the new UI and features affect how I currently use BaseSpace Sequence Hub? What if some features only benefit certain types of users?
- Autonomy: Can I decide what new features I want to use? Do I have the choice to opt out? Can I try this feature without having coding knowledge, or would I mess something up by trying it?
- Data integrity: Are my data going to stay intact? Will they be stored somewhere else? How will I be able to find them?
Delivering the Results + Design Recommendations
I created a written report on the team's Confluence database with detailed descriptions and examples for the research findings above, broken down by the different elements of the Help 2.0 system. The key usability pain points were supported by screenshots and relevant user quotes that helped communicate user concerns effectively. This written report focused on telling the story of the users' feedback in a coherent, linear fashion, and included other supporting materials related to planning and executing the study.
Afterwards, I also created a PowerPoint deck with all the highlights and key takeaways from the usability testing, and presented it at a 1-hour meeting with the relevant stakeholders. The slides focused on clear visual representation of the users' feedback and concerns, along with concrete design and content recommendations for the designers, tech writers, and product managers.
Impact
This research project provided the team with in-depth insights into how users want to look for help when they need it, and into the kinds of concerns they had about the upcoming launch of the new version of the product.
It also identified clear opportunities and provided concrete recommendations to improve the navigability, discoverability, and understandability of the Help 2.0 experience, which stakeholders took into consideration after the findings were presented.