A tilted iPhone displaying the Q-Life Employee 'Resources' screen in front of a straight-on view of an iPhone displaying the 'Journal' screen

UX Research, UX/UI Design

Q-Life Employee

Redesigning a workplace mental health and well-being platform for a new audience

Company

JackHabbit Inc.

Sector

Health and Wellness, B2B2C

My Role

Research Lead, UX/UI Designer

Year

2023

An isometric repeating view of six unique, redesigned Q-Life Employee mobile screens against a blue-green gradient background

Project Background

The Challenge

Our company had previously launched a mental health and wellness platform for post-secondary students. We wanted to expand this success into the workplace market. To make a new version of the platform resonate with a larger employee demographic, we needed insight into employees’ diverse needs and use contexts.

The Approach

We set out with two goals that would impact most of the app’s features and interactions: 

 

  1. Compile a list of recommendations to improve usability and appeal
  2. Adapt the current user experience (UX) to meet the diverse needs and contexts of a new employee audience

 

Because of our project’s unique starting point, I structured our strategy as a customized version of the design thinking process. The project phases were:

 

  1. Research

    • Test the current application with our new audience

    • Empathize with their unique needs and contexts

  2. Define and communicate our findings and opportunities
  3. Ideate design solutions
  4. Prototype new designs

 

I chose tools and methods that would deliver the types of insights we could leverage while working within our time and budget constraints.

Icons: Flaticon.com

The project's customized process: Test and Empathize, Define, Ideate, Design and Prototype

My Role

My responsibilities included:

 

  • Creating our UX research strategy
  • Conducting research
  • Processing and analyzing the data
  • Presenting findings to stakeholders
  • Collaborating with development and leadership on solutions feasibility
  • Redesigning the user interface to improve usability and engagement

Two Q-Life mobile screens on the left, followed by an arrow pointing to a blank white mobile screen with a blue question mark at its center. The screens are against a light blue-green gradient background.

The Q-Life platform before redesign

Research

Creating a UX Research Plan

To inform our approach, I began by asking:

 

  • What relevant data do we already have?

  • What new data do we need?

  • What resources can we leverage and what are our constraints?

 

Our team had feedback from a previous user survey conducted with our student demographic. This gave us some general usability insights. Now, we needed insights specific to an employee demographic with its diverse needs and use contexts. Given our limited time, I opted for a more qualitative approach using the following research methods:

 

  • Contextual inquiry (i.e., observing people use the app as they naturally would in everyday life)

  • Usability testing

  • 1:1 Interviews

  • Initial and closing surveys

An isometric view of a three-page document entitled Q-Life Employee User Experience Research Plan. The document's sections are: Project Brief, Background, Objectives (Research Objectives, Business Objectives, Research Success Criteria, Deliverables), Hypotheses/Assumptions, Methods, Participants, Setting and Tools, Timeline, Project Documents Repository.

The User Experience Research Plan

Initial Survey: Discovering the Needs of a New Audience

An initial survey sent to 15 testers helped us get an aggregated snapshot of their day-to-day lives; the impact of work on their health and well-being; and how an app could best fit with their routines, preferences, and needs.

An isometric view of survey result examples consisting of two pie chart visualizations and two bar graph visualizations

A few survey questions and answers

Contextual Inquiry:

Observing Application Use Within Everyday Contexts

When using an application, what we say and what we do can differ significantly. To offset the limitations of self-reporting with observational data, I conducted observation sessions with 3 of our 15 testers. These were people employed in varied jobs, including remote office work, on-site labour, and customer service.

 

I observed each person doing a set of prompted tasks on our platform in which they interacted with each area and feature. Some of these tasks included logging a journal entry, completing a daily check-in, and progressing through educational content.

 

I kept an audio recording running so participants could talk through their actions at a natural, uninterrupted pace. These contextual observations gave us helpful, sometimes surprising insights.

Usability Testing

I conducted moderated usability testing with 3 of our testers and collected unmoderated usability testing data from the remaining testers. In both cases, I gave each tester a set of tasks followed by questions about their experience. I based these questions on Jakob Nielsen’s 10 Usability Heuristics for user interface (UI) design. This focus, combined with mixed-moderation reporting, gave us a more thorough view of our UI's usability.

 

Notably, the moderated group gave us some unexpected insights. For instance, I observed the interface confuse a couple of participants as they worked through a specific task. Yet once they eventually completed it, they reported a positive experience and had forgotten their earlier frustration and confusion.

An isometric view of a usability testing task prompt and its follow-up questions. The prompt reads, "Activity 1: Sign-up and installation. Time required: ~5 minutes. Check your email for the Q-Life Employee app installation link and click the link to complete the sign-up process."

Usability testing: a task prompt and follow-up questions

Defining Research Findings

Identifying Feedback Themes

We had prior usability data from a student survey, and now had data from our employee group as well. It was time to synthesize the combined data into actionable items that told clear stories about our audience's needs. To do this, I compiled a thematic analysis, which showed us which issues were most common and which were outliers, and which features were most used and appreciated.

An isometric view of two pages of the User Feedback Thematic Analysis. The pages are entitled "Most Common Feedback Areas" and "Most Common Pain Point or Suggestion Themes". Both pages include data for area/theme, percentage of total feedback, and feedback count. Both include corresponding pie chart visualizations.

A sample of the thematic analysis compiled from survey data
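At its core, a thematic analysis like the one above is a tally. For the technically curious, here is a minimal TypeScript sketch of that counting step; the FeedbackItem shape and its field names are hypothetical, not the project's actual data model.

```typescript
// Hypothetical shape for one coded feedback item; field names are illustrative.
interface FeedbackItem {
  participantId: string;
  theme: string; // e.g. "Navigation", "Layout", "Requested Features"
  quote: string;
}

// Tally feedback per theme and express each count as a share of all feedback,
// mirroring the "feedback count" and "percentage of total" columns above.
function summarizeThemes(
  items: FeedbackItem[]
): { theme: string; count: number; share: string }[] {
  const counts = new Map<string, number>();
  for (const item of items) {
    counts.set(item.theme, (counts.get(item.theme) ?? 0) + 1);
  }
  return [...counts.entries()]
    .map(([theme, count]) => ({
      theme,
      count,
      share: `${((count / items.length) * 100).toFixed(1)}%`,
    }))
    .sort((a, b) => b.count - a.count); // most common themes first
}
```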

Mapping Pain Points and Opportunities

With the newly gathered usability testing data, I could begin looking at patterns and answering overarching questions:

 

  • What were the successes, pain points, and themes here?

  • How might we leverage this data to enhance our platform in the most efficient way?

 

To answer these questions, I created an affinity diagram, separating out the data points and grouping them by area, task flow, and pain point.

An affinity diagram divided into four quadrants: General, UI, Features, Content. Within each quadrant are colourful digital post-it notes, each containing a user feedback data point. These data points are further organized into subcategories such as Layout, Navigation, Requested Features, etc. The whole diagram is against a light blue-green gradient background.

An affinity diagram organizing data points into application areas and common feedback items
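In data terms, affinity diagramming is a nested grouping. The sketch below is illustrative only: the quadrant names come from the board above, but the Observation type and the code itself are hypothetical.

```typescript
// Hypothetical shape for one research observation; field names are illustrative.
interface Observation {
  area: "General" | "UI" | "Features" | "Content"; // the four board quadrants
  subcategory: string; // e.g. "Layout", "Navigation", "Requested Features"
  note: string; // the text of one digital post-it
}

// Group observations into area -> subcategory -> notes, mirroring the diagram.
function buildAffinityGroups(
  observations: Observation[]
): Map<string, Map<string, string[]>> {
  const groups = new Map<string, Map<string, string[]>>();
  for (const obs of observations) {
    const areaGroup = groups.get(obs.area) ?? new Map<string, string[]>();
    const notes = areaGroup.get(obs.subcategory) ?? [];
    notes.push(obs.note);
    areaGroup.set(obs.subcategory, notes);
    groups.set(obs.area, areaGroup);
  }
  return groups;
}
```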

Ideation

Communicating Findings and Recommendations

I organized our pain points and recommended solutions into a report, which I shared with our Development Lead to get her ideas and feedback. Her front-end development and accessibility expertise gave us a valuable added perspective on development feasibility and potential accessibility enhancements.

 

We then presented the finalized Pain Points and Recommendations Report to the wider team. Communicating our findings in a concise, solutions-oriented way empowered our team to engage with the data, empathize with our audience, and ideate new opportunities.

A screenshot of page 1 of the Pain Points and Recommendations Report containing sections entitled "Signup and Installation", "Look and Feel", and "Navigation". Within each section are a list of pain points on the left with their recommended solutions on the right.
A screenshot of page 2 of the Pain Points and Recommendations Report containing a section entitled "Hub". Within the section is a list of pain points on the left with their recommended solutions on the right.
A screenshot of page 3 of the Pain Points and Recommendations Report containing a section entitled "Logging, Journal, History". Within the section is a list of pain points on the left with their recommended solutions on the right.
A screenshot of page 4 of the Pain Points and Recommendations Report containing a section entitled "Assessments and Results". Within the section is a list of pain points on the left with their recommended solutions on the right.
A screenshot of page 5 of the Pain Points and Recommendations Report containing sections entitled "Resources" and "User Profile". Within the sections are a list of pain points on the left with their recommended solutions on the right.

Pain Points and Recommendations Report

Prioritizing Solutions

Our development team then met to finalize the next steps in our design and development strategy. We reviewed and prioritized improvements within an impact-effort matrix, brainstorming adjustments as needed. This method let us see at a glance how much effort each enhancement would require and which would have the greatest impact for our platform users.

Two prioritization matrices on a light blue-green gradient background. The left matrix is entitled "Reviewed and Prioritized". The right matrix is entitled "Completed". Both contain design and development items organized along a vertical axis that reads "high impact/low impact" and a horizontal axis that reads "low effort/high effort".

A teamwide prioritization whiteboarding session using an Impact-Effort Matrix
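As a rough illustration of the prioritization logic: in the actual session, items were placed on a whiteboard by hand, but the same sorting can be expressed as scoring each item on impact and effort and assigning it to one of the four familiar quadrants. The scores and threshold below are hypothetical.

```typescript
// Hypothetical scoring model; the real session used whiteboard placement,
// not numeric scores.
interface Improvement {
  name: string;
  impact: number; // 1 (low) to 5 (high)
  effort: number; // 1 (low) to 5 (high)
}

type Quadrant = "Quick win" | "Big bet" | "Fill-in" | "Money pit";

// Classify an item into an impact-effort quadrant around a midpoint of 3.
function quadrantFor({ impact, effort }: Improvement): Quadrant {
  if (impact >= 3) return effort <= 3 ? "Quick win" : "Big bet";
  return effort <= 3 ? "Fill-in" : "Money pit";
}

// Example items modelled on two of the redesigns described below;
// the scores are invented for illustration.
const items: Improvement[] = [
  { name: "Collapsible hamburger navigation", impact: 5, effort: 2 },
  { name: "Lesson card graphics", impact: 4, effort: 4 },
];
for (const item of items) {
  console.log(`${item.name} -> ${quadrantFor(item)}`);
}
```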

Design

Navigation Redesign

After ideating and prioritizing with my team, I jumped into Figma and began the design phase. This involved redesigning key parts of the user interface to improve its usability and engagement.

Before

A view of the original Q-Life Employee mobile navigation menu. The dark blue menu is under a darker blue banner containing a wordmark logo that reads "Q-Life". The menu is horizontal across the top and requires side-scrolling. The visible menu items are: Hub, Journal, Assessments, Courses. The letter 'R' fades offscreen on the right. The image is against a light blue background.

—  Horizontal web navigation hides some menu items, requiring the user to swipe to view more.

—  A non-collapsible menu unnecessarily takes up valuable screen space.

After

An animated view of a newly designed Q-Life Employee mobile navigation menu. The menu is a hamburger menu on the far right of a light blue top bar. A Q-Life logo is far left on the top bar. The menu expands into a drop-down menu of menu items and corresponding icons. The expanded menu lists vertically: Hub, Journal, Assessments, Courses, Resources. The image is against a light blue background.

—  The familiar hamburger menu on mobile reduces work for the user and makes all options visible at once.

—  The collapsible menu frees up valuable screen space on smaller mobile screens.
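The redesign was delivered as Figma prototypes, so no production code is shown here. Still, as a sketch of why the pattern works, a collapsible menu reduces to a single boolean of UI state. Below is a minimal, hypothetical React + TypeScript version (the actual stack isn't specified in this case study) listing the five redesigned menu items:

```tsx
import { useState } from "react";

const MENU_ITEMS = ["Hub", "Journal", "Assessments", "Courses", "Resources"];

// A minimal collapsible ("hamburger") menu: one boolean controls whether the
// vertical item list renders, freeing screen space whenever the menu is closed.
export function HamburgerMenu() {
  const [open, setOpen] = useState(false);

  return (
    <nav aria-label="Main">
      <button
        aria-expanded={open}
        aria-label={open ? "Close menu" : "Open menu"}
        onClick={() => setOpen((o) => !o)}
      >
        ☰
      </button>
      {open && (
        <ul>
          {MENU_ITEMS.map((item) => (
            <li key={item}>{item}</li>
          ))}
        </ul>
      )}
    </nav>
  );
}
```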

Lesson Card Redesign

Before

A view of an original lesson card on the Hub screen of the Q-Life Employee mobile application. The card contains a title "Financial Security" followed by a green filled bookmark icon and the words "Remove from your library". Below is a paragraph of text teaching financial literacy. It contains no graphics. The mobile screen is on a light green background.

—  Large blocks of text with no visuals can feel overwhelming for some people. This also reduces scannability.

After

A view of a redesigned lesson card on the Hub screen of the Q-Life Employee mobile application. The card contains an image of a calculator on papers displaying financial data charts. Under the image is the title "Financial Security" followed by three lines of text teaching financial literacy, an ellipsis, and a link saying, "Read more". At the bottom right is a green, filled bookmark icon indicating the lesson has been saved. The mobile screen is on a light green background.

—  Hiding text overflow in an expandable card breaks up the text block while still allowing readers to expand for further reading.

—  Adding relevant graphics to each lesson card adds visual interest and improves scannability.
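The same caveat applies here: the sketch below is a hypothetical React + TypeScript illustration of the "Read more" pattern, not the shipped implementation. The preview is cut at an arbitrary character limit and expanded on demand:

```tsx
import { useState } from "react";

// Expandable lesson card: show a short preview with "Read more", and the full
// text once expanded. The 200-character preview length is an arbitrary choice.
export function LessonCard({ title, body }: { title: string; body: string }) {
  const [expanded, setExpanded] = useState(false);
  const needsTruncation = body.length > 200;
  const preview = needsTruncation ? body.slice(0, 200).trimEnd() + "…" : body;

  return (
    <article>
      <h3>{title}</h3>
      <p>{expanded ? body : preview}</p>
      {needsTruncation && (
        <button onClick={() => setExpanded((e) => !e)}>
          {expanded ? "Show less" : "Read more"}
        </button>
      )}
    </article>
  );
}
```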

Check-In Card Redesign

Before

A view of the original Check-In card displayed in the Hub screen of the Q-Life Employee mobile application. The card is white, entitled "Daily Check-In Questions". Below the title is the question, "Were you eager to start your day?" To the right of the question are a small green checkmark and a small red 'x'. Below this is a lightly outlined box containing 10 small, grey circles. The first circle is lightly highlighted. The mobile screen is on a light blue background.

—  Usability testing revealed that users sometimes interpreted the small "✓"/"✗" answer controls as "right"/"wrong", which caused confusion.

—  The dot progression too closely resembles radio buttons, which can also cause confusion about its functionality.

After

A view of a redesigned Check-In card displayed in the Hub screen of the Q-Life Employee mobile application. The card is light blue with rounded corners and a light shadow. The card title reads "Daily Check-In". Below the title is the question, "Were you eager to start your day?". Below the question are two large answer controls in dark teal. The left reads, "Yes" with a trailing checkmark icon. The right reads, "No" with a trailing 'x' icon. Below is a thin progression bar. The mobile screen is on a light blue background.

—  Larger, labeled answer controls improve clarity and clickability on mobile screens.

—  Using a simpler progression bar improves clarity and saves valuable mobile screen space.

Results

Project Successes

14

Platform enhancements implemented

32

Usability improvements recommended

1

Platform redesign ready to prototype for a new audience

  Hub

A redesigned Q-Life Employee mobile screen. The screen displays the Hub view. In the Hub is a light blue Daily Check-In card, and lesson cards. The top lesson cards contain graphics, lesson titles, and a few lines of text, followed by a link to view more and a bookmark icon.

  Journal

A redesigned Q-Life Employee mobile screen. The screen displays the Journal view, containing an interactive Daily Check-In card and an interactive Journal Card.

  Resources

A redesigned Q-Life Employee mobile screen. The screen displays the Resources view containing interactive video cards with colourful images of water and clouds. The top card is titled "Practice Mindfulness" and includes a description of the guided video resource.

  Assessments

A redesigned Q-Life Employee mobile screen. The screen displays the Assessment view. A top, secondary menu reads "Open Assessments", "Completed Assessments", and "Assessment Results". Below this menu are three assessment cards. The top card contains an image of a smiling factory worker and is titled "Employee Wellness Assessment". The second card contains an image of two smiling office employees in a meeting and is titled "Workplace Wellness Assessment".

  Courses

A redesigned Q-Life Employee mobile screen. The screen displays the Courses view. The top course card displays an image of a silhouette climbing a mountain. The card is titled "Q-Life Resilience Training". Below the title is the course's module count and a brief description of the course objectives. Below the description is a button that reads "View Course". The next course card is titled "Financial Literacy Training" and displays a relevant image, module count, description, and "View Course" button.

Redesigned screens

Reflections

Highlights

What went especially well?

Unexpected insights

Observing and talking with our audience as they used the platform challenged my expectations. I had insider knowledge of how the UI was supposed to work, so uncovering unexpected use cases was a great learning experience.

 

Team-wide collaboration

Collaborating closely with our team’s lead developers in the ideation and prioritization phases allowed our solutions to reflect our development capabilities. It also let us leverage our team's diverse expertise in back-end development and accessibility.

Takeaways

What lessons will I apply to future projects?

Explore new tools for quantitative research

I am now exploring new third-party research tools that enable teams to accomplish broader quantitative research goals on tight timelines and budgets.

 

Ensure higher unmoderated tester completion rate

With busy lives, people understandably cannot always complete unmoderated testing. In the future, I would better anticipate this by:

  • Broadening our tester pool (this can be accomplished with secure third-party tools)
  • Better incentivizing testing completion (as company budget allows)
  • Creating a testing experience that lets unmoderated testers choose which tasks to complete and allows them to opt for only short tasks if needed

Next Steps

How would I iteratively build on this project?

  • Continue with implementation and iterative testing

  • Create a platform onboarding experience

  • Create a branded design system for this new product

A cropped isometric view of redesigned Q-Life Employee mobile screens against a blue-green gradient background