Get Work
Rebuilding Trust
in a Critical Work Queue
for 450k+ Users
Project Overview
Role: Co-Lead UX Designer
Team: 5 UX Designers, 10 Developers,
2 Product Managers, SMEs
Platform: Nationwide Accounting Platform
Contribution: Research, Strategy,
UX, UI, Analytics
Details have been modified to protect confidentiality while preserving the integrity of the work.
The Challenge
Context: The Get Work feature is the primary gateway for thousands of users across the country to access their daily work. When it fails, operations stall, and it was failing in critical ways.
The Problem: Analysis of ServiceNow support tickets revealed that 20% of tasks pulled through the system were unworkable and had to be returned to the queue. At scale, that's not a minor inefficiency; it's a systemic trust problem.
Root Causes:
A filtering system that excluded entire user groups — built around a narrow set of user flows, the system left users with specialized skill sets no reliable way to access their work at all. Rigid cascading dropdowns forced multiple dependent selections before surfacing any results.
Data inconsistency eroded trust — users couldn't trust what they were pulling, leading to hesitation, workarounds, and rework that rippled across operations.
The system couldn't scale with the business — as the business grew, adding new filters became increasingly unmanageable, ensuring the problem would only get worse over time.
From rigid cascading dropdowns to a dynamic, role-aware filtering system
Research & Discovery
To understand the full scope of the problem, we conducted a comprehensive research initiative spanning user behavior, expert knowledge, and system architecture.
User Pain Points Investigation:
ServiceNow ticket analysis revealed patterns in user-reported failures, giving us quantitative grounding for the scope of the problem.
Contextual inquiry sessions with users across the platform uncovered how they actually navigated the system, including the workarounds they'd built around its limitations and the filters they needed but didn't have.
SME interviews across different product lines surfaced workflow variations that the existing system was never designed to support.
"This filter doesn't fit my work. I have to go through all the dropdowns just to scroll to the bottom and find out there's nothing there for me."
"I always cross-reference the dashboard to make sure the numbers are actually right."
— Participant feedback, contextual inquiry sessions.
Key Insight: The rigid filtering structure wasn't just a usability problem; it was fundamentally misaligned with how different user workflows actually operated. The one-size-fits-all approach was failing to meet diverse users across the nationwide platform.
Backend Investigation: A deeper look at the data architecture revealed why users couldn't trust their results. Inconsistent labeling and fragmented database relationships meant filtered outputs were unreliable at the source — frontend failures rooted in backend complexity.
This finding directly shaped our next step: mapping the information architecture across every form to give the development team a clear picture of where filters overlapped, conflicted, or were missing entirely.
Strategy & Approach
Focus Areas:
Dynamic, flexible, and scalable filtering — the highest priority across every user group. The existing system was built for a narrow set of workflows and couldn't grow with the business. A rebuilt filtering architecture would serve all users regardless of specialization, and scale as new form types were added over time.
Data reliability and labeling consistency — the UX team flagged widespread inconsistencies in how filters were labeled and how data was structured at the backend level. Without addressing this foundational issue, no filtering solution would hold.
Intentionally Deferred:
A feature for supervisors to distribute work directly from the queue was proposed and generated strong interest, but was intentionally scoped out to keep the project focused and manageable for the development team. It remains a viable opportunity for a future phase.
Rollout Phase:
Rather than rebuilding everything at once, we structured the work around form type groups, releasing 10 forms at a time across roughly 100 form types. This phased approach made the backend data work manageable for developers and allowed the UX team to map and validate each form group thoroughly before moving to the next, ensuring quality and consistency at every stage of the rollout.
PROCESS
Collaborative Problem-Solving
The UX team began with whiteboarding sessions to explore solutions grounded in research findings. Concepts were brought to Product Managers and developers to evaluate technical viability. The complexity of the forms and performance requirements shaped not just what we designed, but how the backend needed to be structured to support it.
Mapping Complexity
To bridge the gap between design intent and technical execution, we developed detailed information architecture maps for each form type to visualize the dynamic filtering logic and show developers exactly which filter options should appear based on user selections. Across roughly 100 form types released in groups of 10, this mapping was critical to maintaining consistency and preventing filter overlap throughout the rollout.
Design System Considerations
When exploring UI patterns, we initially considered toggle switches for their clean appearance. On evaluation, however, we found that toggles didn't align with our design system guidelines: they imply instant feedback, whereas our filtering required a deliberate "apply" action. This led us to propose chip filtering as the right pattern. Because this component didn't yet exist in the design system, we fully documented the specifications and contributed it as a new reusable component, staying true to established design system standards while expanding them.
Filter Chip
Spec sample
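The deliberate-apply behavior that ruled out toggles can be sketched as a small state model: chip selections accumulate in a pending set and only become the active query when the user explicitly applies them. This is an illustrative sketch, not the production implementation; the class and method names are assumptions.

```typescript
// Sketch of the deliberate-apply pattern behind chip filtering:
// chips flip pending state immediately, but no data is fetched until Apply.
class FilterPanel {
  private pending = new Set<string>();
  private applied = new Set<string>();

  toggleChip(id: string): void {
    // Selecting a chip updates pending state only — no instant feedback,
    // which is exactly why a toggle switch was the wrong metaphor.
    if (this.pending.has(id)) this.pending.delete(id);
    else this.pending.add(id);
  }

  hasUnappliedChanges(): boolean {
    return this.pending.size !== this.applied.size ||
      [...this.pending].some(id => !this.applied.has(id));
  }

  apply(): string[] {
    // Only here does the selection become the active query.
    this.applied = new Set(this.pending);
    return [...this.applied];
  }
}
```

Keeping the pending and applied sets separate also makes it trivial to show an "unapplied changes" indicator, a common affordance with this pattern.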
Core Improvements:
1. Flexible Filtering Architecture
Replaced rigid cascading dropdowns with a scalable filter panel built around a 3-tier architecture — designed to serve every user workflow, not just some:
Tier 1: Auto-populated location filters and required form/task type filtering
Tier 2: Expanded filter options based on selections
Tier 3: Additional work queue-specific filters
This structure was driven by both user needs and backend performance requirements to ensure fast data loading across 100+ form types without sacrificing flexibility.
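The tiered structure above can be modeled as a declarative filter configuration, where each filter declares its tier and the selections it depends on. This is a hypothetical sketch with illustrative filter names, not the production schema.

```typescript
// Hypothetical tiered filter configuration (filter names are illustrative).
type FilterTier = 1 | 2 | 3;

interface FilterDef {
  id: string;
  label: string;
  tier: FilterTier;
  // Tier 2/3 filters only surface once their parent selections are made.
  dependsOn?: string[];
}

const FILTERS: FilterDef[] = [
  { id: "location", label: "Location", tier: 1 },          // auto-populated
  { id: "formType", label: "Form / Task Type", tier: 1 },  // required
  { id: "status", label: "Status", tier: 2, dependsOn: ["formType"] },
  { id: "specialty", label: "Specialty", tier: 2, dependsOn: ["formType"] },
  { id: "queueAge", label: "Queue Age", tier: 3, dependsOn: ["formType", "status"] },
];

// Resolve which filters should be visible given the user's current selections.
function visibleFilters(selections: Record<string, string | undefined>): string[] {
  return FILTERS
    .filter(f => (f.dependsOn ?? []).every(dep => selections[dep] !== undefined))
    .map(f => f.id);
}
```

Because new form types only add entries to the configuration rather than new cascading dropdown logic, a structure like this is what lets the system scale without structural overhaul.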
2. Interaction Design
Chip filtering was introduced as the primary selection pattern, allowing users to see all active selections at once without navigating dropdowns
Applied where 2–3 options exist, reducing clicks and improving scannability
A new component that is fully documented and contributed to the design system for reuse across the platform
3. Backend Architecture Cleanup
Standardized labeling across the system to eliminate inconsistencies that had been quietly eroding user trust
Eliminated confusing label usage that was a root cause of unworkable tasks being pulled
Resolving it at the source made the entire filtering experience more trustworthy and reliable
Validation
Pre-Launch
Usability testing conducted remotely with real users on a working prototype
Formal design pipeline review covering design standards, content, and 508 compliance
Legal review of both the interface and data labeling decisions before launch
Post-Launch
Configured behavioral tracking through New Relic, a tool new to the team that required close collaboration with developers to set up and troubleshoot
Custom trackable labels added to measure time on task
Pulse survey conducted after rollout revealed:
Users responded positively to the cleaner, more readable UI
Users with previously unsupported workflows no longer needed workarounds, and they felt included for the first time
A small subset showed initial hesitancy around data accuracy, which was identified as habitual skepticism from the old system, not an ongoing data issue
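The time-on-task measurement described above can be sketched as a small timing helper that reports elapsed time under a custom trackable label. The actual labels and reporting calls aren't shown in the case study, so this sketch injects the reporter (in production it would forward to New Relic) and the label name is an assumption.

```typescript
// Sketch of time-on-task instrumentation. The reporter is injected so the
// timing logic stands alone; in production it would forward to New Relic.
type Reporter = (label: string, attrs: Record<string, number | string>) => void;

function createTaskTimer(report: Reporter, now: () => number = Date.now) {
  const starts = new Map<string, number>();
  return {
    start(label: string): void {
      starts.set(label, now());
    },
    // Report elapsed ms under a custom label, e.g. "getWork.filterApply"
    // (label name is illustrative, not the production scheme).
    stop(label: string): number | undefined {
      const t0 = starts.get(label);
      if (t0 === undefined) return undefined;
      starts.delete(label);
      const elapsedMs = now() - t0;
      report(label, { elapsedMs });
      return elapsedMs;
    },
  };
}
```

Injecting the clock as well as the reporter keeps the helper deterministic under test, which matters when the analytics pipeline itself is new to the team.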
Impact & Results:
eXperience Improvements
Increased flexibility
users now have full control over their filtering experience, with a dynamic system built to serve every workflow type, not just some
Improved efficiency
time on task decreased by approximately 15 seconds per filter interaction, a small per-task gain that compounds significantly across thousands of daily users
Enhanced trust
consistent data labeling and standardized architecture restored confidence in filtered results; initial post-launch hesitancy was behavioral, not systemic
Business Benefits
Scalable solution
the new architecture can accommodate new form types and workflows without structural overhaul, growing with the business as needed
Technical efficiency
the 3-tier filtering structure enables faster, more accurate task retrieval with improved backend performance
Inclusion
two previously unsupported user groups gained reliable access to their work for the first time, eliminating the workarounds they had built around a system that wasn't designed for them
Reduced waste
support tickets decreased following rollout, directly addressing the 20% unworkable task rate that had been impacting operational efficiency
Design system contribution
chip filtering introduced as a new reusable component, now available for adoption across the platform beyond Get Work
Measurement was directional, as SQL tracking was not fully 1:1, but it consistently indicated faster task retrieval across users.
Key Learnings
Cross-functional collaboration
Sharing research findings early with developers, product managers, and stakeholders meant design decisions were grounded in both user needs and technical reality from the start. Alignment wasn't an afterthought; it was built into the process.
Design system evolution
The best solution sometimes requires expanding the system you're working within. Identifying chip filtering as the right pattern and fully documenting it as a new reusable component strengthened the design language for the entire platform, not just Get Work.
Phased delivery
Rolling out 10 forms at a time across 100+ form types wasn't just a scope decision; it was a quality strategy. It gave the team space to map, validate, and refine at each stage rather than absorbing the risk of a single large release.
Closing the loop
Shipping the redesign was not the finish line. Configuring behavioral tracking through New Relic, building custom SQL labels to measure time on task, and running post-launch pulse surveys ensured the work could be evaluated against real usage, not just assumptions. Measuring impact is part of the design process.
Having a great team for this project made it all the more engaging. Great minds, great people, lots of laughter in the midst of this super complex project….what else can a person ask for?! 🥹💖