How to enter a mature market (serial protocol analysis) owned by an entrenched competitor?
The competitor’s UX, considered the industry standard, could not handle the latest emerging high-speed serial protocol (PCIE3).
Our system was the most accurate and least invasive of any on the planet, but its user experience was awful.
The instrument's accuracy forced engineers to look at 1GB of data practically one bit at a time.
The business believed with our superior hardware and an improved UX, we could take significant market share from the competition. The only question was, “How?”
BEV Latency display: Each lane of traffic indicates how long a packet request took to be fulfilled. Red exceeds a threshold, indicating potential latency issues.
BEV LTSSM display: PCIE3 circuits "train up" determining what bandwidth they can operate at. The endpoints move through a series of states, prescribed by the protocol. Failure states can be easily seen in the colored bands.
BEV detail view. Each pixel in the BEV represents as much as 1Mb of data. Hovering over the BEV reveals the specific data elements within that sliver of time.
BEV Error display: Things go wrong all the time in a PCIE3 circuit. This customizable display shows errors as symbols, placed in time and channels.
BEV Flow control display: PCIE3 circuits constantly monitor the receiver's ability to accept more data. When the display shows thickening or red lines, the engineer can zoom in to see why data isn't flowing smoothly.
BEV Flow control display detail.
We designed an information visualization offering multiple views into the same data acquisition.
This approach offered validation engineers a variety of ways to view the problem, without risking another data capture.
For example, the summary display (shown above) renders six separate views, all linked together in a “zoomable” interface.
One revolutionary approach was a “bird’s eye view” (the BEV): a 100-pixel-wide container offering five selectable visualizations. With the BEV, the debug engineer could quickly view important attributes of data sets as large as 16 GB!
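The core idea behind the BEV can be sketched in a few lines: bin an enormous acquisition into a fixed-width strip so that each pixel summarizes a large slice of the capture, then color each pixel from an aggregate of its bin. This is an illustrative reconstruction, not the shipped algorithm; all function names and the red/green threshold rule are invented for the example.

```python
# Hypothetical sketch of the BEV concept: map millions of capture events
# onto a 100-pixel strip, one aggregate value (and color) per pixel.
# Names and the coloring rule are illustrative, not from the actual product.

def birds_eye_bins(n_events, width=100):
    """Partition n_events event indices into `width` contiguous bins.

    Returns a list of (start, end) index ranges, one per pixel.
    """
    return [(px * n_events // width, (px + 1) * n_events // width)
            for px in range(width)]

def render_strip(latencies, threshold, width=100):
    """Color a pixel red if any event in its bin exceeds the latency threshold."""
    bins = birds_eye_bins(len(latencies), width)
    return ["red" if any(latencies[i] > threshold for i in range(s, e)) else "green"
            for (s, e) in bins]
```

Because each pixel carries a worst-case aggregate rather than a sample, a single outlier anywhere in a multi-gigabyte capture still shows up in the strip, which is what makes the end-to-end overview trustworthy.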
Specification image of actual front panel showing LEDs that correspond to configuration screen elements.
Software front panel configuration screen showing 16 channels, 8 up and 8 down.
Software front panel configuration screen showing 8 lanes, 4 up and 4 down.
Note how the TSA permits non-standard routing from the physical probe to the logical lane.
Software front panel configuration screen showing self calibration and configuration.
Equally revolutionary was the front panel itself and its configuration screen. Our design for the software interface mimicked the physical front panel, but, unlike the hardware, the visualization simultaneously provided status and configuration.
Debug engineers never simply "configured" test equipment. Instead, they performed a set of experiments to confirm the configuration was working. The display combined configuration with status, letting the engineer quickly determine the state of the configuration without suffering time-consuming data captures.
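The combined configuration-and-status idea can be modeled as each on-screen element carrying both what the engineer asked for and what the hardware currently reports. The class and field names below are invented for illustration; the point is only that a single element answers "how is it set up?" and "is it working?" at once.

```python
# Illustrative model (names hypothetical) of a front-panel lane element that
# holds both its configuration and its live, hardware-reported status.
from dataclasses import dataclass

@dataclass
class LaneElement:
    probe: str          # physical probe connection
    logical_lane: int   # logical lane it is routed to (routing may be non-standard)
    configured: bool    # what the engineer set up
    trained: bool       # live link-training status reported by hardware

    def health(self) -> str:
        """Summarize the lane for display without requiring a data capture."""
        if not self.configured:
            return "unused"
        return "up" if self.trained else "fault"
```

Rendering `health()` beside each lane gives the engineer an immediate verdict on the configuration, which is exactly what the separate configure-then-capture-then-check loop used to cost hours to learn.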
Participatory design artifact. Participants were asked to sketch a desired analysis screen using sticky notes and markers.
Participatory design artifact. Participants were asked to sketch a desired analysis screen using sticky notes and markers.
In-situ observations. Typical test bench environment.
Presumptive design prototype: animated sequences (in PowerPoint) rendered PCIE streams as they trained.
Presumptive design prototype: animated sequences (in PowerPoint) rendered PCIE packets as icons marching across the screen.
This effort built on the work I had already done for the Logic Analyzer product line. Because our target users were exactly the same as for the prior instrument, we could re-use the Personas, work models and experience schematics.
But protocol analysis is not at all like logic analysis, and the incumbent competitor had the benchmark user experience against which every other was measured. We had to design a completely different and more compelling experience if our prospects were even to consider our solution.
I began with a series of in-situ interviews accompanied by participatory design exercises to tease out what was compelling about the competition's application and where it was weak. In addition, I performed an expert review of the competitor's application to identify key usability weaknesses.
In addition, I fielded a series of Presumptive Designs to test ideas about graphical representations of serial protocol information. (From those, I was able to file and be awarded several patents.)
Our best effort at displaying protocol information, circa 2009.
Competitor's "benchmark" display of protocol information, circa 2009.
Competitor's "benchmark" display of protocol information, circa 2009.
Analysis of PCIE variables against potential applications of software.
Analysis of PCIE state diagram.
Analysis of PCIE variables against Infovis data types and rendering guidelines.
From those engagements, we captured several key findings:
Early sketch of multi-window analysis system.
Early rendering of Summary Window based on Participatory Design sessions.
Evolution of protocol rendering: schematic view of end-to-end packets and transactions.
Armed with these and many other insights, I quickly landed on Tufte's "sparklines" as one way to render end-to-end acquisitions in a tight space. Partnering with a lighthouse customer, the hardware and software teams iterated through an agile approach, rapidly developing the "Summary Statistics" screen. (See my CHI2011 Honorable Mention paper for the details on this entire process).
The Summary Statistics screen was just the first step. We knew we had to show the entire protocol stack, not just the aggregate statistics. My strategy going forward was to eliminate separate reporting screens, whenever possible, and instead integrate visualizations directly into configuration and management tasks. This design strategy completely revolutionized the product line's approach to screen development. It was an opportune time to undertake this work, as the underlying architecture of the machine was also undergoing significant refactoring to address the new serial analysis application.
Excerpt from Bird's Eye View specification showing position and layout specs.
Excerpt from Bird's Eye View specification showing position and layout specs.
Excerpt from Schematic View specification showing several different forms of the schematic representation.
By the time I had completed designs for the Bird's Eye View, the product line's software engineering teams had become fully agile. As a result, we were able to quickly iterate on initial design offerings, adding features and capabilities with each release (as opposed to delivering a monolithic release once per year).
In addition to detailed specifications for the visualizations, I crafted prototypes in Processing to refine my designs, test them with users, and communicate them to engineering team members.
A project of this magnitude takes multiple teams. I was the lead UX researcher, designer and architect throughout this multi-year effort.
I brought in senior UX designers to assist on specific initiatives: early prototyping sessions and analysis of PCIE as well as late-stage design development of the schematic view.
I worked closely with both hardware and software engineering teams to confirm I wasn't violating laws of physics, or creating a software stack that was unaffordable.
Finally, I partnered closely with Product Management and Sales to get in front of the right customers and help move our go-to-market plans forward.