Simplifying The Boarding Process (STBP) - AR design project
We can all agree that flying can be an anxiety-inducing experience with complex boarding procedures.
With the COVID-19 pandemic, it has become even more stressful for many people.
This is a look at how AR can reduce cognitive load and make the overall boarding experience easier.
Who are my users? - research & analysis
I conducted interviews with three people, each with a different traveler profile.
I focused on the persona experiencing the most pain.
Further analyzing this persona using a customer journey map, I realized the most painful part of the process was from planning to check-in.
What is the problem?
Understanding which part of the process was most painful helped me crystallize a problem statement:
Frequent business travellers are frustrated with wasting time because they don’t know the minimum buffer time they need in order to not miss their flights.
How can I solve this problem?
Based on this problem statement, I defined the design direction for V1 as: getting users to check-in counters on time without arriving any earlier than they intend to.
To satisfy this direction, I decided to focus on features that simplify and streamline the process, from planning at home to check-in at the airport.
From an AR interaction perspective, the design would focus on requiring minimum effort from users while providing pertinent information at the right time.
Rapid Ideation & Prototyping
I drew heavily on ergonomic principles and agile methodology to keep the experience user-centric through iteration.
I broke down the ideation & prototyping process into 3 areas - AR ergonomics, AR interactions, and storyboarding.
Based on ergonomics best practices, the most important information should be placed between eye level and 35 degrees below it.
For tapping, the distance was based on a person’s arm length (40–55 cm), but to simplify the design, all tappable buttons were placed at 40 cm.
For swiping, the hand needed to be seen by the AR glasses, so users would have to raise their hands.
Most screens would be placed at a 2 m distance.
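The placement rules above boil down to simple trigonometry. Here is a minimal sketch (the constants come from the numbers quoted above; the function names and structure are my own illustration, not part of the original design files):

```python
import math

# Ergonomic constants from the design notes above.
EYE_LEVEL_DEG = 0.0        # top of the comfortable viewing zone
MAX_BELOW_DEG = 35.0       # bottom of the comfortable viewing zone
TAP_DISTANCE_M = 0.40      # all tappable buttons placed at 40 cm
SCREEN_DISTANCE_M = 2.0    # most screens placed at 2 m

def vertical_drop(distance_m: float, angle_below_deg: float) -> float:
    """How far below eye level (in metres) a point sits when placed
    at distance_m and angle_below_deg below the horizontal."""
    return distance_m * math.tan(math.radians(angle_below_deg))

def in_comfort_zone(angle_below_deg: float) -> bool:
    """True if the angle falls in the eye-level-to-35-degrees band."""
    return EYE_LEVEL_DEG <= angle_below_deg <= MAX_BELOW_DEG

# A screen at 2 m may extend up to ~1.4 m below eye level:
print(round(vertical_drop(SCREEN_DISTANCE_M, MAX_BELOW_DEG), 2))  # → 1.4
# A button at 40 cm may extend up to ~0.28 m below eye level:
print(round(vertical_drop(TAP_DISTANCE_M, MAX_BELOW_DEG), 2))  # → 0.28
```

In other words, the closer an element sits to the user, the smaller its usable vertical band, which is one reason tap targets at 40 cm need tighter layout than screens at 2 m.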
User Testing & Feedback
I made props out of wire and heavy-weight sketch paper so I could rapidly prototype and test out the above 3 areas with a user.
After observing and talking to the user, I gathered important feedback!
Interaction Feedback
Typing in mid-air is not comfortable at all
Hand interactions are more laborious overall
Blinking control feels weird
Interaction Changes
Overall, I decided to replace hand interactions as much as possible with interactions that feel more natural to users, such as gaze and speech. I also aimed to reduce the need to interact by automating tasks.
Below is the list of interactions I put together based on the feedback:
Gazing to expand the screen
Gazing to start voice typing
Look away to minimize screen
Auto-scheduling an Uber ride
Auto-minimizing the screen when crossing the street
Automatic people detection
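The gaze interactions above can be sketched as a small dwell-timer state machine: gazing at a screen long enough expands it, and looking away minimizes it. This is a hypothetical illustration of the interaction logic, not code from the project; the class, names, and dwell threshold are all assumptions.

```python
# Sketch of the gaze-to-expand / look-away-to-minimize interaction.
# DWELL_SECONDS is an assumed threshold, not from the original design.
DWELL_SECONDS = 0.5

class GazeTarget:
    def __init__(self) -> None:
        self.expanded = False
        self._gaze_start: float | None = None

    def update(self, gazed_at: bool, now: float) -> None:
        if gazed_at:
            if self._gaze_start is None:
                self._gaze_start = now          # gaze just landed
            elif now - self._gaze_start >= DWELL_SECONDS:
                self.expanded = True            # dwell met: expand
        else:
            self._gaze_start = None
            self.expanded = False               # look away: minimize

screen = GazeTarget()
screen.update(gazed_at=True, now=0.0)
screen.update(gazed_at=True, now=0.6)
print(screen.expanded)  # → True
screen.update(gazed_at=False, now=1.0)
print(screen.expanded)  # → False
```

A dwell threshold matters here because it prevents the "Midas touch" problem: without it, every glance would expand a screen.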
Setup Workflow Feedback
Address section control buttons are confusing: one asks the user to swipe while the other asks the user to tap
Summary page information is confusing. The user has a hard time connecting the buttons with the time
Users don’t care about integration. They just want to know if the app will call an Uber
The last page is confusing. The user is not sure what to do
Too much information is provided repeatedly. The user feels overwhelmed
Setup Workflow Changes
I simplified and rearranged the information presented on each screen to reduce information overload, making the following changes:
Moved address section one step back
Moved leave time field closer to the option buttons
Changed the wording on Uber integration
Clarified the information on the reminder page
Hi-fi Prototype
I used After Effects to model the hi-fi prototype of this experience because of its robust support for embedding animated graphics into video recordings. It helped me visualize the experience in the right context, such as gaze interactions during setup.
The video below helped me visualize interactions while users are moving around, such as the auto-minimize feature when they cross the street.
What I learned
I was happy with how much the design had improved after testing with the user. Rapid prototyping is extremely important in this context (AR glasses) so the 3D usability can be tested as soon as possible.
I also noticed that when designing a new experience, creating familiarity helps establish affordances and reduce friction. For example, tapping a button felt more natural to my users than blinking.
What’s next?
The AR/VR industry is developing at light speed and will continue to accelerate. Keeping up with the latest hardware, such as Varjo and HoloLens 2, is crucial to proposing innovative AR/VR experiences.
I will continue exploring this experience, especially the setup workflow, for which my vision is a hands-free process that feels as easy as telling your mom what to eat for lunch.
More importantly, I will continue to explore VR rapid prototyping tools so I can quickly sketch out ideas and test for AR & VR directly in 3D.
Stay tuned! :)
P.S. Here is a list of deep-dive posts for each stage of this project: