Axon Connected
Devices

Product Design Strategy, Metrics and Execution

Principled, Measured Experiences

When I joined Axon in 2021, the user experience across devices and applications had been "arrived at" organically, as a series of disconnected implementations, each appropriate individually but not working well as a connected, seamless, and unified product.

Consistent user feedback calling out functional inconsistencies highlighted the need for coherent, cohesive user experiences across all devices in the platform.

Strategy Definition

Defining the Approach

To design the needed unified user experiences, I started by developing a principle-driven methodology. Starting here helped effectively drive the research, production, and measurement phases, guiding the process of crafting optimal connected first-responder user experiences.

➊ Experience themes across devices

Establishing Experience Design Principles and Metrics

After extensive product audits, user interviews, ride-alongs, and discussions with product leads, I synthesized the findings into themes ➊ which served to guide the experience design principles and articulate customer promises. This principled approach also helped anchor the metrics-driven experience validation.

The experience design principles ➋ I established focused us on:

  • Reliability: The product and features are consistently usable.
  • Clarity: All functions and features are intuitive, requiring minimal onboarding.
  • Flexibility: The product and features are context- and workflow-aware.
  • Connectivity: The product and feature experiences are connected and seamless within the Axon ecosystem.
➋ Experience design principles

The Measurement Process

In addition to improving and refining the design methodology, I implemented a measurement process ➌ to evaluate the quality of released user experiences across Axon's products. I established three touchpoints for assessment, feedback, and identifying improvements through testing, surveys, and in-person interviews.

This enabled data-driven optimizations to enhance customer experiences based on insights from monitoring released products.

  • Pre-Beta Benchmark: 1st internal usability score and task completion analysis — conducted with internal SMEs.
  • During Beta: Weekly survey and customer call cycle flagging pain points and improvements — conducted with Beta partners.
  • Post-Release Benchmark: 2nd customer-focused usability score and task completion analysis — conducted with customers.
➌ Experience benchmarking and measurement

Process Application

I applied this measurement process to the Axon Fleet in-car camera system companion app, which we transitioned from the Windows laptop platform to mobile — iOS and Android phone and tablet form factors — for our national and international markets.

After months of research ➍, design ➎, prototyping ➏ and development, accompanied by progressive internal and external user testing, a coded version was ready for beta testing and user acceptance evaluation ➐.

As we moved to Beta and release, we applied the measurement steps, with the following results:

  • Pre-Beta Benchmark ➑: The SUS score landed in the C+ / good range, identifying 43 issues and recommendations categorized by feature, customer promise, frequency of use, user effort, product, and cross-platform priority.
  • Beta ➒: A weekly four-question experience survey and customer experience calls recorded user pain points and prioritized issues for development and UX focus.
  • Post-Release Benchmark ➓: The application scored at an A+ level on the SUS scale with 8 categorized issues, giving a clear view of the degree of improvement and the positive experience our customers are having with the product.
➍ Fleet in-car video system research
➎ Workflows designed for iOS, Android, mobile, tablet and Apple CarPlay
➏ Workflows prototyped for testing
➐ Customer benchmarking tests
➑ 1st internal SME benchmark result — C+ range, 40+ issues identified.
➒ Weekly experience surveys and customer calls
➓ 2nd customer benchmark result — A+ range, 8 issues identified.
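For context on the benchmark grades: SUS letter grades derive from the standard System Usability Scale formula — each odd-numbered item contributes (response − 1), each even-numbered item contributes (5 − response), and the total is scaled by 2.5 to a 0–100 score. A minimal sketch of that arithmetic (the responses below are illustrative, not actual study data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The sum is scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Illustrative responses only -- not Axon data.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 1]))  # 82.5
```

Scores above roughly 80 correspond to the A range on common SUS grading scales, and scores in the high 60s to low 70s to the C range.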

Takeaways

The feature evaluation process results gave the team confidence in the product release and demonstrated measurable user experience improvements throughout the beta phase.

This approach now ensures well-designed and consistently evaluated features and products, incorporating user input at every stage and improving customer experience and planning along the way.

Outcomes

Unified Experience Design

Unified approach across product and features, covering the Axon Device and Application ecosystem.

Quantified Measurements

Quantified experience measurements and metrics used in planning and reporting.

Increased Customer Satisfaction

A 3% NPS increase for Fleet 3 and a 5% increase for AB4 (body-worn camera).
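For reference, the NPS figures above follow the standard Net Promoter Score formula: the percentage of promoters (scores 9–10 on a 0–10 scale) minus the percentage of detractors (scores 0–6). A sketch with illustrative data, not actual survey results:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no scores provided")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Illustrative: 6 promoters, 2 passives, 2 detractors out of 10 responses.
print(nps([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]))  # 40.0
```

Because passives (7–8) count toward the total but neither term, even a small shift of detractors into passives or passives into promoters moves the score.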