
INSIGHTS PERFORMANCE METRICS - iOS MOBILE

USERS

Airport (station) gate leads and supervisors


The goal: enhance an existing iOS app with a feature that gives airport station customer service teams immediate access to real-time performance metrics and historical trend insights. By pairing real-time metrics with historical trends, the feature delivers actionable insights that help teams identify patterns, optimize processes, and improve the customer experience.

THE PROBLEM 

Station Managers relied on daily performance metrics delivered via email every morning. While this process provided high-level data for morning management meetings, it presented several challenges:

  • Limited Data: The high-level metrics lacked the depth required to identify trends or inform strategic improvement plans.

  • Accessibility Issues: Once printed and pinned to a bulletin board, the data was inaccessible to employees on different shifts, creating a gap in knowledge sharing and collaboration.

  • Outdated Workflow: The static nature of the printed report did not align with the team's need for dynamic, real-time data to respond to operational changes throughout the day.

Example of the daily metrics email.

THE SOLUTION

I designed and implemented a new feature within an existing iOS app to address these issues. This feature allows Station Managers and their teams to access real-time performance metrics and historical trends directly on their mobile devices, ensuring timely, actionable insights for all team members, regardless of shift.

To succeed, the feature needed to:

  • Simplify Data Presentation: Transform raw metrics into visualizations and summaries that were easy to interpret at a glance.

  • Encourage Engagement: Provide clear insights that prompted users to take meaningful actions to improve customer service and operational performance.

  • Drive Decision-Making: Empower teams to use the data as a foundation for collaborative, data-driven strategies, regardless of their familiarity with analytics.
     

Addressing this challenge was key to fostering a culture of continuous improvement and ensuring the feature delivered real operational value.


DESIGN STRATEGY 

To address the challenge of making performance metrics accessible and actionable for non-data-savvy users, I implemented the following design strategies:
  • Visualizing Data for Clarity - Simplified data visualizations made the information more engaging and easier to comprehend, reducing cognitive load and enabling quicker decision-making.

  • Contextualizing Metrics with Core Values - Each metric was paired with contextual information that tied it back to United's core values and KPIs. This approach helped users see the "why" behind the numbers, emphasizing how their performance impacted customer service and operational success at the airport station.

  • Making Data Feel Personal - To resonate with users, the design framed metrics in a way that felt directly relevant to their daily tasks and goals.

FIRST: DATA VISUALIZATION HIERARCHY 

To ensure users could seamlessly interpret performance metrics, I designed a clear hierarchy for data visualization that guided them from high-level overviews to actionable details.


  • Category Data Visualization - Developed a streamlined approach to represent four overarching categories, providing users with a comprehensive yet accessible view of performance metrics.

  •  KPI Structures - Designed detailed breakdowns of the key performance indicators (KPIs) contributing to each category's score, offering clarity into the underlying metrics and their impact.

1. CATEGORY DATA VISUALIZATION 

I created multiple iterations to represent the category score, exploring various designs to optimize clarity and user comprehension.

 

I advocated for a change to display the overall score as a percentage out of 100%, replacing the previously complex data ratio. This adjustment made the metric more intuitive and aligned with familiar, everyday conventions, ultimately enhancing usability. My recommendation was well-received, and the team agreed to allocate engineering resources to implement this improvement.
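As an illustration of that change, converting a raw data ratio into a familiar percentage can be sketched as follows (the function name, inputs, and rounding are hypothetical, not the production implementation):

```python
def category_score(achieved: float, possible: float) -> float:
    """Convert a raw achieved/possible ratio into a score out of 100%.

    A percentage is more intuitive than the raw data ratio the app
    previously displayed. Names and rounding are illustrative only.
    """
    if possible <= 0:
        raise ValueError("'possible' must be positive")
    return round((achieved / possible) * 100, 1)


# e.g. 172 on-time events out of 200 reads as a familiar 86.0%
print(category_score(172, 200))  # → 86.0
```

Framing the score as "out of 100%" leans on a convention users already know, which is what made the metric easier to read at a glance.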

Different graph iterations to represent the station score.

Of the multiple iterations, this is the design we took into the first round of testing.
I decided on a full donut chart: users can read the length of the arc rather than compare proportions between slices as in a pie chart, and the chart clearly indicates where the station stands relative to goal and when it exceeds goal (https://datavizcatalogue.com/methods/donut_chart.html).

The donut chart used during testing.
2. KPI STRUCTURES

Each category's overall score was calculated based on a set of KPIs. I designed and iterated on multiple approaches to effectively integrate targets and color-coded thresholds, ensuring the visual representation was both informative and intuitive.


These iterations aimed to provide users with a clear understanding of how each KPI contributed to the overall score while leveraging color thresholds to highlight performance levels at a glance.
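A minimal sketch of the threshold logic behind the color coding (the actual bands were defined by the data analytics team; the 10% cutoff below is an assumption for illustration):

```python
def kpi_status(score: float, target: float) -> str:
    """Map a KPI score against its target to a color-coded status band.

    The 10% band below target is a hypothetical cutoff; the real
    thresholds belonged to the data analytics team.
    """
    if score >= target:
        return "green"   # at or above target
    if score >= target * 0.9:
        return "yellow"  # approaching target
    return "red"         # needs attention


print(kpi_status(95, 90))  # → green
```

Encoding the bands as a single function of score and target is what lets every KPI reuse the same color treatment consistently.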

Wireframe design strategy using the category colors defined by the data team.

Of the multiple iterations, this is the design we took into the first round of testing.
The list view provides enough real estate to scale to additional KPIs. I also applied the color treatment that represents the score threshold.

PUTTING THE 2 PARTS TOGETHER

I combined the "Category Data Visualization" and "KPI Structure" into a single, consistent layout for all four categories. This unified approach ensured clarity and usability while maintaining scalability.

 

Key goals for the design:

  • Segmented Control for Timeframes - Introduced a segmented control at the top of the page, allowing users to easily toggle between different timeframes for each category.

  • Overall Score at the Top - Positioned the overall score prominently at the top of the page, establishing a clear visual hierarchy and enabling users to immediately grasp the category's performance.

  • Time-Segmented Data Breakdown - Presented detailed, time-segmented data for each category, helping users analyze and interpret performance metrics within the selected timeframe efficiently.

Final data metrics layout used for the round 1 usability test.

NAVIGATION STARTS WITH THE LANDING PAGE 

The landing page was designed to present the Station Score, a comprehensive aggregate of the four category scores. The primary goal was to provide users with high-level insights that inspire further exploration into their performance and highlight areas for potential improvement.
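Assuming equal weighting for illustration (the actual formula and weights were owned by the data team), the Station Score aggregation might look like:

```python
def station_score(category_scores: dict) -> float:
    """Aggregate the four category scores into a single Station Score.

    Equal weighting is an assumption for this sketch; the production
    formula was defined by the data team.
    """
    if len(category_scores) != 4:
        raise ValueError("expected exactly four category scores")
    return round(sum(category_scores.values()) / 4, 1)


# Category names and scores are hypothetical
scores = {"Dependable": 92.0, "Caring": 88.0, "Efficient": 85.0, "Safe": 95.0}
print(station_score(scores))  # → 90.0
```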

​

​

​

Different wireframes of the landing page.
LANDING PAGE - FIRST ITERATION FOR TESTING
After collaborating to ensure we were representing the data correctly, the screen below is the landing page we decided was worth testing.
*No initial requirements were written; the requirements for what needed to be part of the UI changed constantly.
Image used for the test. UI provided by visual designers; colors defined by the company.

ROUND 1 USABILITY TEST

While other parts of the app were tested (not my designs), I focused specifically on the metrics section to ensure it effectively supported user goals and usability. The primary objectives were:

  • Evaluate Navigation - Assess how users navigate to and within the metrics section to ensure smooth, intuitive interactions.

  • Test Discoverability of Information - Determine how easily users can locate and identify key performance metrics within the redesigned interface.

  • Understand Content in New Staff View - Validate the clarity and relevance of information presented in the new staff-specific view.

  • Set Notification Settings via Profile Section - Observe how users interact with and set up their notification preferences, ensuring alignment with their expectations and needs.

  • Access and Understand App Tips Screens - Verify that users can easily locate, access, and comprehend guidance provided through the app's tips screens.

  • Identify Relevant Performance Metrics - Confirm that the performance metrics displayed are meaningful and actionable, providing users with insights that drive engagement and improvement.

PDF Prototype

By: UX Designer, Michelle Phanthongphay

RESEARCH FINDINGS (DATA METRICS ONLY): LANDING PAGE AND DATA CATEGORY
Excerpt from the report, covering only the performance metrics I worked on.
POST USABILITY TEST; LANDING PAGE RE-STRATEGIZED

To improve usability and align the interface with the brand's identity, I revisited an earlier design iteration. These updates focused on enhancing navigation, interactivity, and visual consistency.

Key Updates:

  • Reintroduction of Button-Based Navigation - Returned to a button-based navigation design for clearer distinction between interactive elements. 

  • Clickable YTD Score Section - Enhanced the design of the Year-to-Date (YTD) score section to visually communicate its interactivity, inviting users to explore additional details.

  • Metrics Categories as Interactive Buttons - Redesigned the metrics categories UI to resemble actual buttons, reinforcing their functionality and making interactions more intuitive.

  • Enhanced Brand Color Integration - Collaborated with the visual designer to incorporate more pronounced brand coloring into the categories. This adjustment emphasized brand identity while improving the visual hierarchy and overall aesthetics.

Before and after landing page designs; visuals provided by the UI designer.
CATEGORY DATA UPDATES
I revised the functionality of the segmented control. The original design used it incorrectly: segmented controls should switch between mutually exclusive datasets or views. To resolve this, I repositioned the control so that it changes the entire page's content based on the selected segment.
Before and after category data layouts.
KPI'S RE-STRATEGIZED

I reworked the use of colors to include clear status labels, ensuring users wouldn't need to remember what each color represented. This was especially important given the varying meanings and lack of clarity in the original design. By explicitly stating the status alongside the colors, users could easily understand performance indicators without unnecessary effort.

Before and after KPI designs.


ADDITIONAL REQUIREMENTS ADDED

As with most projects, the scope and requirements evolved over time. In this case, the product manager approached me with additional requirements that needed to be incorporated into the design.
1. TREND DATA 
A new set of data visualizations showing month-to-month performance compared to the previous year.

Line graphs were the best way to represent the trend: they display quantitative values over a continuous interval or time period, making them ideal for illustrating trends and analyzing changes in data over time. This ensured users could easily identify patterns and make informed decisions from the visualized data.
Wireframe and final UI; UI provided by the visual designer.
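The month-over-month comparison those line graphs plot can be sketched as a simple year-over-year delta (the numbers below are hypothetical):

```python
def yoy_deltas(current_year: list, previous_year: list) -> list:
    """Month-by-month difference between this year's scores and last
    year's -- the comparison the trend line graphs visualize."""
    return [round(c - p, 1) for c, p in zip(current_year, previous_year)]


# Jan-Mar scores, this year vs. last year (hypothetical numbers)
print(yoy_deltas([88.0, 90.5, 86.0], [85.0, 91.0, 86.0]))  # → [3.0, -0.5, 0.0]
```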
2. ADDING A LEGEND
Another new requirement was to add a legend to the app.

Given the complex nature of the data, the legend provided clear explanations of key terms and incorporated definitions and color codes established by the data analytics team.

 

This addition aimed to simplify data interpretation, ensuring all users could effectively engage with and learn from the performance metrics.

​
Pre-usability-test legend.
3. ADDING A THIRD LEVEL OF DATA: ENABLING METRICS

To meet the new requirement of including a third level of data, I designed an interface that detailed the data elements contributing to individual KPIs within the "Dependable" category. This additional layer required displaying both the year-over-year trend graph and the current score for each KPI.

These early designs show the approach I took to displaying the data. I needed to learn how this data would be consumed in order to apply the best interaction.

Wireframes for enabling metrics iterations.

I decided to test the version that included:

  • Category Summary - A high-level overview providing essential context about the category.

  • Static Breakdown of Elements - A detailed, static display of the data elements that contribute to the category score, ensuring clarity and ease of understanding for users.

 

This approach was chosen to strike a balance between providing actionable insights and maintaining simplicity for users navigating the performance metrics.

UI used for the usability test.

ROUND 2 USABILITY TEST

Objectives:

  • Understand Content on Metrics Screens - Assess whether users can effectively interpret and navigate the information presented on the metrics screens.

  • Access Details for Individual Metrics - Evaluate the ease with which users can drill down into individual metrics to view detailed information.

  • Use Metrics Indicators to Identify Problem Areas - Determine if users can utilize performance indicators to quickly identify areas requiring attention or improvement.

PDF Prototype

By: UX Designer, Michelle Phanthongphay

KEY INSIGHTS: EXCERPT FROM REPORT
LANDING PAGE UPDATES FROM RESEARCH - IMPROVING VISUAL CONSISTENCY AND CLARITY

To address inconsistencies in the UI patterns, I collaborated with the visual designer to review and standardize the design elements. This effort ensured a more cohesive and user-friendly interface.

Based on user feedback, I also enhanced the station score by adding a color indicator. Users highlighted the importance of colors for understanding station performance, and this addition aligned the station score with the existing visual language, making it easier to interpret at a glance.

Landing page designs from the 1st and 2nd usability tests.

Post Usability Test

I added color indicators to each category to visually represent their current status. Research confirmed that these indicators are crucial for helping users quickly understand performance levels, making the data more actionable and intuitive.

KPI UPDATES FROM RESEARCH
We learned a lot about how users comprehend the data and how they use color:
  • Color Associations - Users’ mental models were closely tied to the historical use of colors, making the current color-coding system intuitive and effective. Testing confirmed the positive reception, leading to the decision to maintain these associations.

  • A/B Testing Results - 

    • The image placement in the middle tested significantly better than other layouts, indicating stronger engagement and clarity.

    • However, the icons were removed as they did not provide additional value or aid comprehension.

  • Added Descriptions for Colors - Including descriptions alongside the color indicators proved highly beneficial. Users appreciated the additional context, which reduced cognitive load and enhanced understanding.

  • Simplification of Metrics - The "Gap to Target" feature was removed due to its complexity and lack of necessity. Simplifying the design made the metrics more straightforward and user-friendly.

KPI designs from the 1st and 2nd usability tests.

Post Usability Test

Removed "Gap to Target," as research showed it was not important.

Final data deliverable.
PUTTING IT ALL TOGETHER
Expanded view of Dependable MTD stats.
UPDATES TO GRAPH NOT FROM TESTING

In collaboration with the visual designers, we made modifications to the donut chart to enhance clarity and usability. By removing a quarter of the circle, we achieved the following:

  • Reduced Visual Crowding - The simplified chart design is less visually overwhelming while still clearly communicating that the score is out of 100%.

  • Improved Text Placement - The additional space created by the adjustment allowed for more prominent and legible text within the chart, improving overall readability.
     

These changes ensured that the chart remained informative while offering a cleaner and more user-friendly design.

Before and after (at delivery) donut chart designs.

ADDITIONAL REQUIREMENTS ADDED POST TEST

After testing, the product manager requested a leaderboard showcasing airport stations within the same region. The leaderboard allows users to:

  • View Regional Rankings - Display how stations rank against each other within their region, providing a clear performance comparison.

  • Track Rank Changes Over Time - Highlight movement up or down the rankings based on the selected timeframe, using the segmented control for time-based filtering.

     

This feature was designed to foster healthy competition and provide actionable insights for stations looking to improve their performance relative to their peers.
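The ranking and rank-movement logic behind the leaderboard can be sketched like this (station codes and scores are hypothetical):

```python
def leaderboard(prev: dict, curr: dict) -> list:
    """Rank stations by current score (descending) and report movement
    versus the previous timeframe.

    Returns (station, rank, change) tuples; a positive change means
    the station moved up the board.
    """
    def ranks(scores):
        ordered = sorted(scores, key=lambda s: -scores[s])
        return {s: i for i, s in enumerate(ordered, start=1)}

    prev_rank, curr_rank = ranks(prev), ranks(curr)
    board = [(s, r, prev_rank[s] - r) for s, r in curr_rank.items()]
    return sorted(board, key=lambda row: row[1])


# Hypothetical regional stations and scores for two timeframes
board = leaderboard(
    prev={"ORD": 88.0, "DEN": 91.0, "IAH": 85.0},
    curr={"ORD": 93.0, "DEN": 90.0, "IAH": 86.0},
)
print(board)  # → [('ORD', 1, 1), ('DEN', 2, -1), ('IAH', 3, 0)]
```

Recomputing ranks per selected timeframe is what lets the segmented control drive the time-based filtering described above.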

Wireframe and final UI provided by the visual designers.

PUTTING THE PIECES TOGETHER

Delivery screens, in collaboration with the UI designer:

  • Landing Page

  • Enabling Metrics

  • Data Category (x4)

  • Leaderboard

CONCLUSION 

By the end of the project, we had condensed complex data into a user-friendly interface our users could easily understand. The navigation tested positively during usability sessions, indicating users could readily find the metrics they needed.


WHAT I LEARNED

Building strong relationships with designated teams was crucial to successfully completing this project, especially when navigating complex data. These relationships not only enabled effective decision-making but also created a supportive and enjoyable environment. Even during high-pressure meetings, moments of humor and laughter helped maintain morale and fostered a collaborative spirit.

While the process often followed an ad hoc approach, which was less efficient, I'm proud of my ability to keep the project within scope. By managing priorities and resisting the urge to continually expand the requirements, I ensured the delivery remained focused and aligned with the original objectives.

"Saruman believes it is only great power that can hold evil in check, but that is not what I have found. 
I have found it is small everyday deeds of ordinary folk that keep the darkness at bay. Small acts of kindness and love."
- Gandalf
