NantHealth: Coaching Centers

Note: This case study focuses on UX processes and findings rather than on actual product interface details, which are protected by a Non-Disclosure Agreement. Please email me if you would like more details about the UI.


Provide design recommendations
on health coach workflows


UX Designer


Contextual Inquiry
User interviews
Design Workshop
User surveys
High-Fidelity Prototypes (HTML/CSS/JS)
Usability Testing


At NantHealth, I worked on a cloud-based software program that helps health coaches deliver clinically approved content to wellness program participants through pre-approved interaction protocols. Although the UX team conducted weekly usability tests with health coaches to evaluate new features, we needed more information on how users worked with the current version of the product. Because the product team was remote, it was difficult to understand how users interacted with the suite of products. A group of four product managers and I were therefore selected to visit two coaching centers to observe user workflows and pain points. As the sole member from the UX team, I led the effort in organizing the research trip.


First, since we would be split across two sites in two time zones, I knew standardized data collection would be important for later analysis. I created a wiki page and held a kick-off meeting to discuss best practices for contextual inquiry and user interviews. I also created interview scripts and a common note-taking method to streamline the process of analyzing results.

Contextual Inquiry

Once onsite, we observed the health coaches on their calls with participants, noting how they prepared for a call (which documents they referred to, which programs they opened on their screen), their process during the call and any difficulties they faced, and their documentation afterward. We then asked follow-up questions and noted any outside tools they used, as well as any documentation they kept outside of the software.

Design workshop

To better understand user needs, I conducted a design workshop with the coaches, where I asked them to create an empathy map of wellness program participants, and then another empathy map of a hypothetical coach persona. I chose an empathy map because I wanted to bring back to the team a more contextual and complete picture of our users: what they saw, heard, and felt in their work, as well as what they struggled with and what they considered to be 'wins'. The product team also did not have direct access to the wellness program participants, so the coaches, as the most knowledgeable people available, were our best route to a more complete picture of participant characteristics.

Through the empathy mapping exercise I learned of the competing pressures that coaches felt throughout their day. For example, by protocol they were expected to deliver clinically approved content according to script, but in order to keep participants engaged they often needed to improvise. Also, many of them had chosen to work in the wellness profession out of a sincere desire to help others, so at times it was painful when participants seemed distant or only interested in earning reward points.

I also asked the coaches to visually depict their workflows in a journey map, which helped me see what they considered to be their major pain points. Common sources of anxiety were rescheduling participants (which often required extensive maneuvering within the scheduling software) and unreliable network connectivity.


Lastly, I sent a follow-up survey to coaches to help the team understand their demographics and how they rated the severity of the problems we had seen. By triangulating our methods (observation, in-person workshop, anonymous survey), we could validate our findings and compare what users personally considered the most severe issues with the software against what we had observed.

Analysis and recommendations

After the trip, I analyzed the data we had collected by sorting and comparing the responses we had received for common questions and tasks. When sorting qualitative data, I usually color-code responses so that it's easy to see in which areas users had significant issues (orange), where they struggled but were able to complete their tasks (yellow), and where there was positive feedback or success (green).
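The color-coding step above amounts to tagging each response with a severity code and grouping by it. A minimal sketch of that idea (the quotes and the `group_by_severity` helper are hypothetical, and the three codes simply mirror the orange/yellow/green scheme described above):

```python
from collections import defaultdict

# Hypothetical coded responses: (quote, severity code).
# "orange" = significant issue, "yellow" = struggled but completed,
# "green" = positive feedback or success.
responses = [
    ("Rescheduling a call took several minutes of workarounds", "orange"),
    ("Found the final notes page after some searching", "yellow"),
    ("Liked the prompts in the call script", "green"),
    ("Had to copy notes into a spreadsheet when the system was down", "orange"),
]

def group_by_severity(coded_responses):
    """Group (quote, code) pairs by their severity code."""
    groups = defaultdict(list)
    for quote, code in coded_responses:
        groups[code].append(quote)
    return dict(groups)

grouped = group_by_severity(responses)
# The "orange" bucket surfaces the areas with significant issues first.
significant_issues = grouped.get("orange", [])
```

Grouping this way makes the most severe areas easy to scan and count before writing up recommendations.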


Based on the quotes we had collected during contextual inquiry and the empathy map workshops, I created user personas of coaches and participants. The goal of the personas was to succinctly capture the main defining characteristics of the coaches and their usual daily workflows.


I documented the coaches' workflow during a typical session, so that we could easily see their process and the time it took for each major step.

Pain points

To better illustrate the pain points in a typical user's workflow, I visualized the different programs used by the coaches (vertical axis) as they went through a typical workflow (horizontal axis).


  1. Getting the system stable and free of workarounds is a high priority: Lack of system reliability means coaches have difficulty trusting the system and have to rely heavily on outside documentation to complete their tasks.
    • Coaches summarize notes in a final notes page (whether or not they complete the script documentation) and copy and paste them into a spreadsheet or other tool to use when the system is down.
    • Coaches are hesitant to explore new functionality outside of their learned workflows.
  2. The workflow of reviewing status, call execution and documentation needs to be made more efficient, so that coaches can handle heavier call volumes and participant data is more easily trackable.
    • Coaches need to quickly access a complete view of a participant’s information before, during and after a call – they currently maintain separate documentation to keep track of participant information.
    • Many coaches write free-text notes in the final notes page, instead of inline within call scripts, leading to potential loss of trackable data for client companies. Integration between the various applications will help reduce time spent switching between applications or re-entering data.
  3. The redesign of calendar/scheduling needs to be prioritized: scheduling calls and setting up the calendar takes up a significant portion of coaches’ time and was a common complaint in interviews.
    • Lack of system notifications for important events such as a participant schedule change means that coaches have to keep computers nearby at all times (even on weekends).

Redesign and testing

Based on our findings, I restructured the information architecture of the site, focusing on reducing the coaches' need for outside documentation, and on better supporting larger call volumes by streamlining communication via notes and reducing the number of steps needed to view participant information. I then conducted usability tests of the redesigns of key problem areas.

To view the presentation and UI recommendations, please send me an email or use the request form below.

Request Materials

Challenges and what I learned

Understanding timing of design recommendations

Because the company had devoted so many resources to this onsite research trip, I assumed that we wanted to come away with significant improvements to coach workflows. However, I faced a significant barrier: the development cycles were very behind schedule, so product management was more interested in completing requirements than in redesigns.

To persuade product management of the feasibility of the redesigns, I made sure to include developers in the conversations and to get realistic estimates of what the work would entail. I had found that simple CSS changes were at times assumed to be more difficult for development than they actually were, and having a clear estimate of the actual time needed to implement a change helped, especially when small changes could yield significant improvements for users.

User-centered design in a feature-focused environment

I thought that by gaining more context about our users and their workflows, we could make our design process more user-oriented, rather than driven by feature requirements. However, our design process did not really change after the trip: the product management team still received feature requests from upper management fairly late in the game. By the time a UX designer was asked to design a feature, it often had to be built to very specific client requirements.

Through trial and error I found that what was most effective at creating positive change was introducing small, tangible improvements to our existing way of working. For example, while continuing to push for more formative user research, I focused on conducting more usability tests and surveys to better validate our results.