Clear Airport Pod UI

CLEAR’s Core Product

CLEAR airport pods are central to its business, not only because they verify travelers’ biometric identities, but more importantly because the pod is where new users are enrolled in CLEAR. However, one of the key challenges with enrollment at airports is that everyone is in a hurry and ambassadors have limited time to sell the product to new customers.

I was brought on to an existing rebuild of the airport pod interface that was struggling to get across the finish line. At the time, my role at CLEAR included leading visual design for the whole design team, leading the CLEAR Design System, and leading design for our flagship mobile app. I was asked to help the team working on the airport pods create a user interface system, based on our broader design system, that could handle the specific needs of this proprietary touchscreen hardware device.

Promotional photo of CLEAR Airport Pod.

Before: An Outdated Platform

The existing interface was over a decade old and built on a very brittle software stack. It was complicated and out of date, but ambassadors were used to it and quite hesitant to adopt a new system.

Examples of the existing experience that was almost a decade old when this project started.

Getting Started: UI Sizing

I began by taking a look at the existing pod experience as well as CLEAR’s recently updated enrollment process for mobile app and web. I identified a few key screens and explored some updated layouts for those screens.

One of the most important things I did at this stage was explore many different pixel densities and zoom levels. The pods used an old and relatively clumsy touch screen, so we had to strike the right balance between information density and touch screen usability. The designers working on this project before me had been using our design system components at 2x size, which had worked well on a UI overhaul I did for our stadium kiosks, but it took up too much space given the complexity some of the pod experience demanded. I had my own version of the touch screen hardware at my desk so I could rapidly experiment with different sizes, and after some experimentation I landed on 1.625x.

Early exploration of key screens using different pixel densities/rem values in order to find the perfect zoom level that ensured component usability and ideal information density. I landed on the third option down from the top.
I tested component size and usability on this Surface Pro 7 at my desk which had the exact same touch display as the CLEAR Airport Pod.

CLEAR Design System (an ongoing project I also led) used root ems (rems) for all of our UI components (shout out to some very smart front-end engineers for suggesting this and eliminating the need for more than one system library). We had made sure of this because we knew the system had to work across consumer mobile and desktop devices as well as a variety of older CLEAR proprietary hardware displays. Rems enabled us to resize the UI by simply changing the root rem value (in this case, for 1.625x, we bumped the root value from 16 up to 26). I then made sure my Figma art board was scaled up or down to match the adjusted rem value the engineers would use to implement the UI.
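The arithmetic behind this approach can be sketched in a few lines. This is a minimal illustration, not CLEAR's actual implementation; the device names and function names are hypothetical, but the scale factors (1x for web, 2x for stadium kiosks, 1.625x for the pod) and the base root value of 16 come from the numbers above.

```typescript
// Base browser root font-size in pixels.
const BASE_ROOT_PX = 16;

// Per-device scale factors (illustrative names; values from the case study).
const deviceScale: Record<string, number> = {
  web: 1,
  stadiumKiosk: 2,
  airportPod: 1.625,
};

// The root font-size a device's stylesheet would set, e.g. on <html>.
function rootFontSizePx(device: string): number {
  return BASE_ROOT_PX * (deviceScale[device] ?? 1);
}

// A component sized in rems renders at rems × root font-size pixels,
// so every component scales together when the root value changes.
function remToPx(rem: number, device: string): number {
  return rem * rootFontSizePx(device);
}

console.log(rootFontSizePx("airportPod")); // 26
console.log(remToPx(3, "airportPod"));     // a 3rem button renders at 78px
```

Because every component is defined in rems, no per-component overrides are needed; a single root value per hardware target rescales the entire library.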

Examples of different products all using the same component library from CLEAR Design System, but with different root rem values to scale the components based on the type of touch display each piece of hardware used.

Exploration: Defining Templates

Another hurdle was layout: we had no UI guidelines for how to lay out an experience on a proprietary hardware device where things like scrolling were just about impossible. So as I got to work exploring UI solutions, I also began to define a set of templates.

UI Exploration

Below are some examples of specific layout explorations from start to finish.

Iteration of terms and conditions from left to right.
Iteration of biometric hub from left to right, top to bottom.
Iteration of biometric capture from left to right, top to bottom.
Iteration of phone number entry from left to right, top to bottom.
Iteration of payment method entry from left to right, top to bottom.
Iteration of ID scanning from left to right, top to bottom.
Iteration of cost breakdown ledger from left to right, top to bottom.
This is an exploration for an expanding progress indicator & navigation component that was not needed in the final strategic direction.

Templatization

Here are a handful of basic UI templates we built into the pod’s system (left) and how they were used for specific tasks in the user flow (right).

I developed modal and full screen takeover patterns that also leveraged the core templates above.
Previous designs had used the device’s native keyboard or various numeric keypads. I streamlined all numeric entry to use the same keypad side panel, which we focused on building as a robust, well-tested, reusable component.

UX & User Testing: Order, Complexity & Context

So far this case study has focused on user interface design, templates, and systems. But while I was establishing a UI direction, I was also thinking a lot about the user flow, pacing, and voice and tone. One of the assumptions I was operating under was that making the product not only look, but also behave, more like a consumer app would set us up for a self-service future AND make the process easier for ambassadors. However, I heard early feedback that ambassadors might have a hard time adopting a radically new system. So once we had a v1 prototype, I drove up to Dallas-Fort Worth International Airport and tested the new flow with several of our most experienced CLEAR ambassadors.

The complete Figma prototype I created for testing.
Me moderating a user test with one of the experienced CLEAR Ambassadors at DFW.

I brought a Surface Pro tablet that used the same screen as our pods, loaded the Figma prototype on it, and attached it to one of the airport pods with black gaffer tape. Then I had the ambassadors go through the flow as if they were enrolling me. Because we were working fast and lean, and it was just me and the ambassadors in the field, I set up my phone on a tripod with a Zoom call going so my team could watch and take notes.

My product teammates following along and taking notes on a Zoom call from my phone.

The biggest takeaways from the research were: 1) the friendly voice and tone came from the human ambassador, so the content design could be much simpler and more straightforward (more single-word labels and fewer conversational sentences); 2) ambassadors were experts in the software and could handle a more complex interface, so things did not need to be simplified nearly as much as we thought; and 3) having consumers interact directly with the pods was years out in our timeline, so it was better to make smaller flow changes for easy adoption by the primary users (CLEAR ambassadors). This user testing proved incredibly valuable for making sure the solution would be effective.

This was one of those cool moments when a test participant was so immersed, she forgot she was using a prototype taped to the pod and tried to actually scan her fingerprints.

Final Designs

Here are some selected moments from the final experience using the templatized system I designed.

Sleep state (left) and start screen (right).
CLEAR Airport Pods at Austin-Bergstrom International Airport with the updated software I designed installed and in use.