2023
Research Process
To understand these pain points, I combined Hotjar analytics, short Maze surveys, and user interviews. Hotjar’s heatmaps and session recordings showed that users often hesitated or circled back when they needed help, especially on more complex pages. Feedback collected through Maze confirmed that users preferred getting assistance within the page rather than being redirected elsewhere.
I also benchmarked leading SaaS platforms and noticed a trend: the most successful products provided proactive, contextual help that improved both user satisfaction and retention.
Research Insights
• The legacy tool caused frequent errors and long call-handling times.
• Agents needed better visibility into driver status and fewer manual steps.
• Competitor benchmarking revealed faster, more intuitive workflows.
My core design principles became clear:
• Contextual Help: adapts to the user’s current page or task
• Non-intrusive Access: a floating panel that never blocks key content
• Speed & Simplicity: users can get help or start a chat in two clicks
• Accessibility: fully keyboard-navigable, with strong color contrast and screen reader support (see the sketch below)
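To make these principles concrete, here is a minimal sketch of how such a panel might be wired up in TypeScript. The route paths, article titles, “?” shortcut, and styling are all illustrative assumptions, not the shipped implementation.

```typescript
// Minimal sketch of a contextual, keyboard-accessible floating help panel.
// Routes, article data, and the "?" shortcut are assumptions for illustration.

type HelpArticle = { title: string; url: string };

// Contextual Help: content is keyed to the user's current page.
const helpByRoute: Record<string, HelpArticle[]> = {
  "/bookings": [{ title: "Assigning a driver", url: "/help/assign-driver" }],
  "/billing": [{ title: "Issuing a refund", url: "/help/refunds" }],
};

function renderPanel(route: string): HTMLElement {
  const panel = document.createElement("aside");
  panel.setAttribute("aria-label", "Help panel"); // announced by screen readers

  // Non-intrusive Access: floats in a corner instead of covering content.
  panel.style.cssText = "position:fixed;bottom:1rem;right:1rem;max-width:20rem;";

  const list = document.createElement("ul");
  for (const article of helpByRoute[route] ?? []) {
    const link = document.createElement("a");
    link.href = article.url;
    link.textContent = article.title; // links are keyboard-focusable by default
    const item = document.createElement("li");
    item.append(link);
    list.append(item);
  }
  panel.append(list);
  return panel;
}

// Speed & Simplicity: one shortcut opens the panel, Escape dismisses it.
let activePanel: HTMLElement | null = null;
document.addEventListener("keydown", (e) => {
  if (e.key === "?" && !activePanel) {
    activePanel = renderPanel(window.location.pathname);
    document.body.append(activePanel);
    activePanel.querySelector("a")?.focus(); // move focus into the panel
  } else if (e.key === "Escape" && activePanel) {
    activePanel.remove();
    activePanel = null;
  }
});
```

In production the same panel would also open from the help button and live chat entry point, but the shape stays the same: content keyed by context, a floating container, and full keyboard support.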
Design Process
When designing for content-heavy platforms, I often see users struggle to find the right support at the right time. Traditional FAQ pages and generic help links are common, but my research showed that users want contextual, in-flow assistance that doesn’t interrupt their work.
• Low-fi wireflows to map out improvements.
• Interactive hi-fi prototypes tested with agents.
Two rounds of usability testing:
Test #1: SUS 62 (marginal), n = 10
Test #2: SUS 74 (good), 83 % task success, n = 8
Reflection & Next Steps
• Phased rollout allowed us to test and improve with low risk.
• Agent training and feedback loops were crucial for adoption.
• Next step: add in-car voice support for even faster assistance.
Launch & Measurement
• Soft-launch (25 %): task success 62 % → 78 % within 30 days
• Full roll-out (100 %): task success 83 %, errors –40 %
Metrics Glossary
Task success: % of users who completed their task without help
Feature adoption: % of active users who used new features
Error rate: Frequency of critical user mistakes
SUS: System Usability Scale (score from 0-100; 68 = “average”; scoring shown below)
n = 10 / n = 8: Number of participants in usability tests #1 and #2, respectively
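For readers unfamiliar with SUS scoring, here is the standard formula (Brooke, 1996) applied to one respondent’s answers. The ten responses below are invented for illustration; a study-level score such as 74 is the mean of these per-participant scores.

```typescript
// Standard SUS scoring for a single respondent.
// The ten responses are illustrative, not data from this study.
const responses = [4, 2, 4, 2, 4, 2, 4, 2, 4, 2]; // items 1-10, each rated 1-5

function susScore(answers: number[]): number {
  let raw = 0;
  answers.forEach((answer, i) => {
    // Odd items (1, 3, ...) are positively worded: contribute (answer - 1).
    // Even items (2, 4, ...) are negatively worded: contribute (5 - answer).
    raw += i % 2 === 0 ? answer - 1 : 5 - answer;
  });
  return raw * 2.5; // scale the 0-40 raw sum to the familiar 0-100 range
}

console.log(susScore(responses)); // 75
```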

Outcome
The redesigned employee assistance panel offers a streamlined, intuitive interface that measurably reduces the time and effort required to support customers: task success rose to 83 % and critical errors fell by 40 %. By reorganizing the information architecture and simplifying the interaction flows, the system now lets employees quickly reach the tools they need, leading to faster resolution of customer issues and better overall service quality.
This case study demonstrates a holistic approach to UX design: combining user research, iterative prototyping, and rigorous usability testing to deliver a solution that aligns with both business objectives and the real-world needs of support staff.