Enterprise Data Visualization Design System

Enterprise Data & Analytics Dashboard

Modernizing a legacy enterprise dashboard into a scalable, user-centered platform

Role

Senior UX/UI Designer

Timeline

6 months

Team

2 designers, 6 engineers, 2 PMs


The Challenge

Lithia & Driveway (NYSE: LAD), one of the largest automotive retailers in the U.S., relied on a decade-old ASP.NET-based dashboard to deliver analytics across the organization. Originally built for a narrower scope, this legacy system had become outdated and increasingly misaligned with user needs.

The user base spanned multiple organizational levels — C-suite executives, Regional VPs, General Managers, Sales Managers, and accounting staff — each with distinct data needs, goals, and workflows. Creating a unified, flexible solution meant balancing high-level strategic insights with granular, day-to-day operational data.

Core problems:

  • Features were added reactively over time, resulting in fragmented experiences and visual inconsistency
  • No mobile access, creating significant issues for field-based users needing real-time data
  • Executives lacked quick access to performance metrics
  • Dealership staff couldn’t customize views for daily operations
  • Training new users was time-consuming due to UX inconsistencies

This wasn’t just a technology upgrade — it was a strategic redesign grounded in UX research and system scalability.

Research & Discovery

Contextual Inquiry

Conducted formal contextual interviews with end users across five distinct user groups — Executives, Regional VPs/Directors, General Managers, Sales Managers, and Accounting Staff. Using a structured facilitation guide and moderator script, I observed users in their actual work environments to understand:

  • Daily and weekly decision-making workflows
  • Current tools and data sources they relied on
  • Specific tasks that required dashboard data
  • Pain points and workarounds in existing processes

Interview structure:

  • Current state observation — Watched users navigate existing tools and complete real tasks
  • Task walkthroughs — Asked users to demonstrate how they currently access critical data
  • Pain point exploration — Probed frustrations, inefficiencies, and unmet needs
  • Future state visioning — Discussed ideal workflows and desired improvements

This contextual approach revealed not just what users said they needed, but what they actually did — often uncovering gaps between stated preferences and observed behavior.

“By the time I get the report I asked for, the window to act on it has already closed.”

— Director of Marketing

Workflow Mapping

Documented the existing data request pipeline through observation and interviews:

  1. Stakeholder identifies a question requiring data
  2. Files a ticket with data engineering team
  3. Engineer writes a custom SQL query
  4. Results delivered via spreadsheet or slide deck
  5. Stakeholder manually builds charts for presentation to their team

Average turnaround: 4–7 business days per request.

This workflow revealed critical bottlenecks: non-technical users were entirely dependent on engineering resources, creating delays that made data stale by the time it reached decision-makers.

Competitive & Pattern Analysis

Analyzed leading analytics tools — Looker, Tableau, Metabase, and Amplitude — to identify effective patterns for:

  • Dashboard composition and layout
  • Filter and drill-down interactions
  • Data visualization best practices
  • Permission and sharing models

Key Insights

Analysis of contextual interviews and workflow observations surfaced several critical themes:

  1. Context over data — Users didn’t need more charts; they needed metrics paired with context about what changed and why
  2. Role-based needs differ dramatically — Executives needed strategic KPIs at a glance; operational users needed granular, actionable details
  3. Progressive complexity — Executive summaries up front, with the ability to drill into detail on demand
  4. Trust through transparency — Showing data freshness, sources, and definitions increased user confidence in the platform
  5. Shared language gap — Departments used different terms for the same metrics, creating confusion and misalignment

User Personas

Synthesized research findings into detailed user personas that guided design decisions throughout the project. Each persona documented goals, frustrations, needs, motivations, primary activities, and current tools.

Example persona: Sarah, General Sales Manager

These personas became shared references across the product team, ensuring alignment on user needs during design and development decisions.

Design Process

Information Architecture

Structured the platform around flexible, role-based views:

  • Executive dashboards — High-level KPIs with trend indicators and regional comparisons
  • Operational dashboards — Daily metrics for dealership management and sales teams
  • Custom views — User-configurable layouts for specialized workflows
  • Mobile-optimized layouts — Responsive designs for field access
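The role-based structure above can be sketched as a simple configuration: each user group maps to a default view, with a prioritized subset of widgets for mobile. This is an illustrative sketch, not the production implementation; the role names mirror the research user groups, while the widget IDs and field names are assumptions.

```typescript
// Hypothetical sketch of role-to-view mapping. Widget IDs are illustrative.
type Role = "executive" | "regionalVP" | "generalManager" | "salesManager" | "accounting";

interface DashboardView {
  layout: "summary" | "operational";
  widgets: string[];        // widget IDs rendered on the home screen
  mobileWidgets: string[];  // subset prioritized on small screens
}

const defaultViews: Record<Role, DashboardView> = {
  executive: {
    layout: "summary",
    widgets: ["kpi-overview", "regional-comparison", "trend-indicators"],
    mobileWidgets: ["kpi-overview"],
  },
  regionalVP: {
    layout: "summary",
    widgets: ["regional-comparison", "store-rankings", "trend-indicators"],
    mobileWidgets: ["regional-comparison"],
  },
  generalManager: {
    layout: "operational",
    widgets: ["daily-sales", "inventory", "team-performance"],
    mobileWidgets: ["daily-sales"],
  },
  salesManager: {
    layout: "operational",
    widgets: ["daily-sales", "lead-pipeline", "team-performance"],
    mobileWidgets: ["daily-sales", "lead-pipeline"],
  },
  accounting: {
    layout: "operational",
    widgets: ["revenue-detail", "expense-summary"],
    mobileWidgets: ["revenue-detail"],
  },
};

// Resolve the default view for a given role.
function viewForRole(role: Role): DashboardView {
  return defaultViews[role];
}
```

Encoding the views as data rather than hard-coded screens is what makes the later user-configurable layouts possible: a custom view is just a user-owned override of these defaults.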

Design System Foundation

One of my most impactful contributions was designing and partially developing a branded design system for Lithia & Driveway’s enterprise applications.

While other tools in the ecosystem used out-of-the-box Material UI, I created:

  • Custom Figma component library aligned with brand guidelines
  • Material UI theming tailored to enterprise needs
  • Consistent UX patterns that could scale across multiple applications
  • Data visualization standards for charts, tables, and metric cards
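A brand-aligned Material UI theme typically starts from a small set of design tokens shared between the Figma library and the codebase. The sketch below shows one plausible token shape; the names and values are illustrative assumptions, not Lithia & Driveway's actual palette or scale.

```typescript
// Hypothetical design tokens seeding a branded Material UI theme.
// Token names and values are illustrative, not the real brand palette.
interface BrandTokens {
  colors: { primary: string; positive: string; negative: string; surface: string };
  spacingUnit: number; // px base for the spacing scale
  radius: { card: number; button: number };
  typeScale: Record<"h1" | "h2" | "body" | "metric", number>; // font sizes in px
}

const tokens: BrandTokens = {
  colors: { primary: "#1a3c6e", positive: "#2e7d32", negative: "#c62828", surface: "#ffffff" },
  spacingUnit: 8,
  radius: { card: 12, button: 6 },
  typeScale: { h1: 28, h2: 22, body: 14, metric: 32 },
};

// Spacing helper following Material UI's theme.spacing(n) convention.
const spacing = (n: number): number => n * tokens.spacingUnit;
```

Because both the Figma components and the Material UI theme derive from the same tokens, a brand change propagates through every application built on the system instead of being re-specified screen by screen.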

Design system outcomes:

  • Established consistency across the product ecosystem
  • Reduced front-end rework and improved engineering handoff
  • Created a scalable foundation for future enterprise products
  • Elevated design maturity across the organization

Key Components

Customizable dashboards: Users could configure their view by dragging, resizing, and selecting which metrics appeared on their home screen.
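Supporting drag, resize, and show/hide customization means persisting each user's layout as data. A minimal sketch of that saved state, under assumed names (the actual persistence model is not shown in this case study):

```typescript
// Hypothetical shape of a user's saved dashboard layout, persisted
// whenever they drag, resize, or toggle a metric card. Names are illustrative.
interface WidgetPlacement {
  widgetId: string;
  x: number; y: number;   // grid coordinates
  w: number; h: number;   // span in grid units
  visible: boolean;
}

interface SavedLayout {
  userId: string;
  updatedAt: string;      // ISO timestamp
  placements: WidgetPlacement[];
}

// Toggle a metric card on or off without mutating the stored layout.
function toggleWidget(layout: SavedLayout, widgetId: string): SavedLayout {
  return {
    ...layout,
    placements: layout.placements.map((p) =>
      p.widgetId === widgetId ? { ...p, visible: !p.visible } : p
    ),
  };
}
```

Keeping updates immutable (returning a new layout rather than editing in place) simplifies undo and optimistic saves in a React state model.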

Metric cards: Each KPI displayed:

  • Current value with clear visual hierarchy
  • Trend indicators and comparison periods
  • Context-aware drill-down to supporting data
  • Responsive behavior from mobile to desktop
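The trend indicator on each card reduces to a small pure computation over the current value and its comparison period. A sketch under assumed names (the case study does not specify the actual component API):

```typescript
// Hypothetical trend computation for a metric card's indicator.
interface MetricTrend {
  delta: number;            // absolute change vs. the comparison period
  pctChange: number | null; // percent change; null when the baseline is zero
  direction: "up" | "down" | "flat";
}

function computeTrend(current: number, previous: number): MetricTrend {
  const delta = current - previous;
  // Guard the zero baseline rather than rendering a misleading percentage.
  const pctChange = previous === 0 ? null : (delta / previous) * 100;
  const direction = delta > 0 ? "up" : delta < 0 ? "down" : "flat";
  return { delta, pctChange, direction };
}
```

Isolating this logic from rendering keeps the visual treatment (arrow glyphs, color for positive/negative) a pure concern of the design system while the math stays testable.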

Visual hierarchy: Clear distinction between strategic overviews and operational details, preventing information overload while ensuring depth was available on demand.

Mobile responsiveness: Prioritized critical metrics for small screens while maintaining access to detailed views through progressive disclosure.

Prototyping & Validation

Collaborated closely with stakeholders and engineers to prototype and validate key dashboard patterns. Iterative testing focused on:

  • Navigation models for different user roles
  • Customization workflows and save states
  • Mobile interaction patterns for field users
  • Data visualization clarity and accuracy

The new platform was built using a Next.js + React stack, ensuring modern performance, modularity, and long-term maintainability.

Results

The redesigned dashboard successfully addressed the shortcomings of the legacy tool:

  • Improved usability and consistency through a shared design system
  • Mobile access enabled for real-time decision-making in the field
  • Role-based customization supported all key user types
  • Reduced training time due to consistent, intuitive patterns
  • Scalable foundation established for future internal tools across LAD’s digital ecosystem
  • Design maturity increased organization-wide, positioning UX as a strategic partner

Reflection

This project reinforced the importance of rigorous user research and systems thinking in complex enterprise environments. The biggest wins came from deeply understanding user workflows before proposing solutions — observing how people actually worked, not just how they said they worked.

What worked well: Conducting formal contextual inquiry with structured protocols paid dividends throughout the project. Watching users navigate their existing workflows in real time surfaced pain points and workarounds that wouldn’t have emerged in traditional interviews. The persona artifacts became invaluable alignment tools — every design decision could be traced back to specific user needs we’d documented.

Building a comprehensive design system — rather than just designing screens — was equally impactful. It not only solved immediate interface challenges but established a foundation for long-term UX and engineering efficiency across multiple products. The consistency accelerated development and reduced cognitive load for users navigating different parts of the platform.

What I’d do differently: Bring data engineering into the design process earlier. Some of our most elegant UI solutions hit constraints in the data pipeline that could have been identified sooner with closer collaboration. I’d also invest more time in quantitative baseline metrics before launch — while we gathered strong qualitative evidence of improvement, having hard numbers on task completion time and error rates from the legacy system would have made the impact story even more compelling.