SuperDocACO App

Increasing discoverability of time-sensitive hospital alerts by 3.5x through research-driven design

Mobile

B2B

Research

PBACO is a population health startup that partners with physicians to improve healthcare quality for patients while reducing costs. Through their SuperDocACO mobile app, physicians are alerted in real time when their patients are admitted to or discharged from the hospital, enabling them to intervene swiftly and prevent unnecessary treatments and hospital readmissions. However, discoverability issues within the app were causing physicians to miss alerts, putting patients at risk both medically and financially. To address this, I conducted a research study aimed at improving usability and patient outcomes.

Company

Palm Beach Accountable Care Organization (PBACO), a population health startup

Team

  • Designer (me)

  • Researcher (me)

  • Product Support Manager (also me, it's a startup!)

Contributions

Interviews, research, interactive prototypes, remote testing, customer support

Design Impact

74%

Completion rate for redesigned features

57

Total participants across 2 remote usability studies

$0

Spent to conduct the research project

Background

SuperDocACO App Overview

Physicians receive an alert whenever a patient is admitted to or discharged from the hospital. They are expected to complete a checklist of items to ensure that the patient is properly taken care of, which may include contacting the patient, sharing medical records, or scheduling a post-discharge examination. As members of the ACO, physicians' app performance is evaluated on a weekly basis.

The SuperDocACO App alerts physicians of patient admissions and discharges in real-time.

Known Obstacles

Through my duties as Product Support Manager, I had a direct line of communication with our users, who consisted of doctors and office staff. Additionally, my conversations with user-facing teams yielded the following themes:

🔍

There was a constant stream of product support tickets from users asking for help finding "missing" alerts

🤷‍♂️

Users often weren't sure what they were expected to do in the app

📚

Hospital alerts often got hidden behind newer alerts and could only be accessed from within the history tab

😮‍💨

Users were often busy juggling multiple tasks and found it difficult to pick up in the app where they last left off

💥

I had difficulty advocating internally for improvements to the app workflow, as low performance was blamed squarely on the users

With the mission of ensuring that fewer patients fell through the cracks, my objectives were to quantify usability issues through a formal research project, and to solve those issues through rapid design, testing, and iteration.

Designs

Early iterations

I went through several iterations as I homed in on the design and incorporated feedback from stakeholders.

Displaying all alerts to improve wayfinding

Users reported difficulty returning to a specific alert they had viewed earlier in the day, often because it had been overwritten by newer alerts. I redesigned the preview screen to show all of a patient's alerts for that week.

Separating alerts by week for easier access

The alert preview screen has been separated into weeks instead of one endless list. Both PBACO and physicians measure their schedules and productivity on a week-to-week basis, so this redesign presents information in a more accessible format.

Showing more details to reduce guessing

In the history tab, there was no easy way for a user to identify which alerts they had already acted on except by opening every single alert. By showing more information on the history tab, users can see at a glance which alerts they may have missed.

Making empty states more useful

The empty alert screen has been redesigned to show users useful contextual info, such as links to unopened alerts on other pages. This helps alleviate concerns that the app isn't "loading properly."

Testing

I tested my designs by creating a remote multivariate test on maze.co and distributing it to a group of 30 users. Participants had to find a specific alert using the existing app interface, then find a different alert using the redesigned interface. They were also asked to rate the usability of each experience and leave comments.

Usability study using the maze.co platform

Out of the 26 participants who took the first test, only 21% were able to complete it, meaning that 79% gave up. The existing app design was so unintuitive that many users thought that the test was broken or flawed!

Redesign A performed better with a 64% completion rate, including users who successfully found the shortest path to the alert.

Usability test outcomes for original design (left) vs Redesign A (right)

Participants rated the usability of each test on a scale from 1 to 5. The existing design received predominantly low scores, while the redesign scored significantly higher, which was better than I expected for a design that nobody had seen before.

Participants found the redesign to be more straightforward than the original.

The usability test also revealed areas of the redesign that needed further improvement. For example, usage patterns showed several users opening every single alert except the right one, a sign that the status indicators were still not intuitive and needed reexamining.

Cumulative heatmap of both tests. Users spent more time on the first design.

My redesigns performed better than the existing app design during testing. However, there was clearly still room to improve.

Designs Round 2

Early iterations

Taking feedback from the first usability study, I iterated on my designs to more closely address user pain points. One item I focused on in particular was redesigning the next action indicator, which had proven ineffective at communicating an alert's status to users.

Simplifying the alert preview

Users were confused by Redesign A's multiple color segments, mistaking alert type for status. Redesign B consolidates the colors and properly labels the alert.

The grey action indicator was swapped out for a checklist that shows at a glance which actions have been completed.

Decluttering language through standardizing syntax

The text description inside an alert can vary depending on the data source, which can result in very inconsistent and wordy descriptions. By standardizing the syntax, alert descriptions become more concise and predictable, and fit better with the other redesigned elements.
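As a rough sketch of the idea (the field names and template here are illustrative assumptions, not the app's actual data model), every alert can be rendered through one template, no matter how verbose the source system's original text was:

```typescript
// Hypothetical sketch of syntax standardization; the interface and template
// are illustrative only, not SuperDocACO's actual data model.
interface HospitalAlert {
  patientName: string;
  eventType: "admission" | "discharge";
  facility: string;
  eventDate: Date;
}

// Every description follows one pattern, no matter which data source
// produced the alert or how wordy its original free-text description was.
function describeAlert(alert: HospitalAlert): string {
  const verb =
    alert.eventType === "admission" ? "admitted to" : "discharged from";
  const date = alert.eventDate.toLocaleDateString("en-US", {
    month: "short",
    day: "numeric",
  });
  return `${alert.patientName} was ${verb} ${alert.facility} on ${date}`;
}

// Example: describeAlert({ patientName: "J. Smith", eventType: "discharge",
//   facility: "Palm Beach General", eventDate: new Date(2021, 5, 4) })
// returns "J. Smith was discharged from Palm Beach General on Jun 4".
```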

Hiding previous alerts

While displaying all alerts on-screen provided a shorter path to content, it did not yield measurable benefits to users during testing and significantly increased screen clutter. Redesign B focuses on only the most recent updates for each patient, clearing up screen space and decreasing cognitive load.

Improving Readability

User feedback suggested that the alert screen was difficult to read. Although this was outside the original scope of the study, I incorporated these readability changes into the redesign as well.

Testing Round 2

A second usability test was sent out to another set of users, with 31 participants ultimately taking the test. This time, users attempted the same task as in the previous study using only Redesign B.

Usability study 2 using the maze.co platform

As seen below, participants were much more successful at finding the correct alert in the latest test: a whopping 33% completed it via the most direct path, contributing to a combined completion rate of 74%, far exceeding the 21% completion rate of the original design.

Redesign B (right) yielded the highest success rate out of all designs.
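This improvement is also the source of the 3.5x figure in this study's headline: 74% ÷ 21% ≈ 3.5, meaning participants found the right alert roughly three and a half times as often with Redesign B as with the original design.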

When asked how straightforward Redesign B was, 33% of participants responded "very straightforward", representing a 10x increase over both Redesign A and the original design.

Participants thought Redesign B (right) had the most straightforward design.

Feedback was also much improved for this round of testing compared to the previous round:

  • "I don't use the app at all and I was able to find it."

  • "I do this often to help my drs find missed alerts."

  • "Very user-friendly. it was quick and easy to locate previous dates."

Naturally, we also received comments indicating places in the app where we could add more signifiers, such as:

  • "It wasn't that obvious that back arrow was previous week"

  • "It should indicate somewhere that this alert is a missing one, like a red flag or something."

Overall, Redesign B was a big improvement over both Redesign A and the original design in terms of usability and user satisfaction.

Takeaways

The two major challenges for this project were convincing internal stakeholders that the problem actually existed, and then making the case that a redesign was the best solution. Due to my product support responsibilities, I interacted with users far more than developers or managers did, which was a major cause of this gap in perspective.

To bridge this gap, a lot of my effort went towards quantifying user pain points in a way that would give my arguments weight. The interviews and usability tests were an effective way of generating quantitative and qualitative user data, which helped tremendously in building my case for the redesign and getting stakeholders on board.

Although development of these features was still ongoing when I left the company, I'm hopeful that their eventual release will benefit both the app's users and their patients.

Next up

Superset

Redesigning Unite Us’ referral system, used to connect over 10 million people to food, housing and social services.

Read Case Study


©2025 Richard Akina
