Wes: A.I. Voice Navigation
An A.I. voice navigation concept specifically designed to assist visually impaired people with way-finding when visiting Westfield Century City Mall.
Process
Question
How might we create an interactive system that solves transportation and mobility issues within a city center?
Concept

Since we chose Westfield Century City as our site, we decided to kick it up a notch and explore how we could also help those with visual impairments, from partial to total vision loss, a.k.a. the visually impaired (V.I.).
Site Visit

We took a trip to Westfield Century City and observed some of the way-finding issues that visitors with and without impairments may face. This particular mall is very large and has gone through a billion-dollar renovation, yet way-finding here is difficult for everyone.
The signs are small and scarce, and the digital directory is confusing: it doesn’t provide turn-by-turn guidance, nor is it inclusive of the visually impaired. There was a feature for visitors in wheelchairs, but it simply moves the screen down to be within reach.
Self-Test of Current System

To really get a feel for what it would be like to walk around the mall with a visual impairment, we purchased a walking cane, blindfolded ourselves, and took turns navigating while recording our thoughts on the experience.

Walking around without sight, we had to rely on our hearing, the cane, and our own sense of direction to navigate anywhere. Overall, it felt like moving through a giant space with no idea where to go, compounded by disorienting ambient noise.
“These firsthand observations showed us how difficult, stressful, and unwelcoming the mall can be for visually impaired people.”
Interviews
After experiencing it for ourselves, we needed to speak with our core audience: the visually impaired community. In our conversations, we asked specific questions about their experiences with Westfield Century City and with malls in general. They reinforced our observations about how unpleasant getting around the mall can be, and they offered new insights of their own.

“I can’t really go whenever I want because I need to ask someone to go with me and I have to work around their schedule.” -Katy

“I can’t see details so signs don’t work for me.” -Mark

“[The mall] It’s unpleasant. The directory itself is not accessible and you can’t find where everything is.” -Wyam
Prototype Conception

Our main concept was to create a hands-free, yet interactive, way-finding experience for the visually impaired. From our research, we identified several ways someone with a visual impairment could be assisted, such as an A.I. chatbot, hyperlocal geolocation, and environment descriptions.
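Purely to illustrate how those three ideas might fit together, here is a minimal, hypothetical sketch of a voice-guidance loop. Everything in it (the Waypoint data, the speak stub, and the sample route) is invented for this example and is not part of the actual prototype, which was simulated with a team member's voice.

```python
from dataclasses import dataclass

# Hypothetical waypoint data: in a real system, position updates would come
# from hyperlocal geolocation inside the mall rather than a fixed list.
@dataclass
class Waypoint:
    name: str
    instruction: str   # turn-by-turn guidance
    description: str   # environment description, read aloud on request

SAMPLE_ROUTE = [
    Waypoint("Entrance", "Walk forward about twenty steps.",
             "You are at the main entrance; the atrium opens ahead of you."),
    Waypoint("Atrium", "Turn slightly right and continue thirty steps.",
             "A fountain is on your left and seating is on your right."),
    Waypoint("Elevator", "The elevator is directly in front of you.",
             "The call button is at waist height, to the right of the doors."),
]

def speak(text: str) -> None:
    """Stand-in for text-to-speech output (e.g. played through AirPods)."""
    print(f"WES: {text}")

def guide(route, want_descriptions: bool = True) -> None:
    """Walk the user waypoint by waypoint, optionally adding descriptions
    (the 'levels of assistance' preference raised in testing)."""
    for wp in route:
        speak(wp.instruction)
        if want_descriptions:
            speak(wp.description)
    speak("You have arrived.")

if __name__ == "__main__":
    guide(SAMPLE_ROUTE)
```

In the actual test described below, a teammate played this role live; the sketch only makes the implied structure, turn-by-turn instructions plus optional environment descriptions, explicit.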

Testing
We were fortunate to have a few connections at Wayfinder Family Services, the organization where we met our interviewees. One of them, Mark (quoted above), agreed to test with us one fall afternoon. We coordinated his Uber and met him at the mall, where we wanted to start our test. Using only his voice, team member Chase played the role of the A.I. chatbot and guided our visually impaired guest through the mall. We made sure to record his entire experience.

Test Results
After testing these concepts with both non-V.I. and V.I. participants, we found that the features were genuinely useful and would address the needs they raised around navigating the mall.

"I like a lot of descriptions."
Mark called the day before; he was nervous. It’s in his personality to want to be in the know, especially in a new environment. The descriptions fill in the rest of the picture of what he can’t see.
"Preferences matter, How will I be able set defaults and levels of assistance?"
Iteration
We wanted to give our users a more accurate representation of the concept, so we had them use AirPods for a hands-free experience and also created a mid-fidelity prototype with an onboarding screen.

The onboarding needed to be as simple as possible, so we used the existing Westfield app and added a start button. We feel the onboarding should live within the A.I. itself for a seamless flow into navigation.
Competitive Analysis
Below is a diagram of how Wes compares to other services and apps available to the visually impaired community for visiting malls and other outdoor venues.

Presentation
When we presented our research, prototype, and insights to the stakeholders (judges), the work was met with great enthusiasm. The critique highlighted the high feasibility of the concept, our ability to present our insights clearly, and the pleasant aesthetics of our presentation. The stakeholders never felt lost or confused, and they felt connected to our target audience. This strong positive reaction told us our concept had succeeded.
Reflection
This project really showed how underserved the visually impaired community is and how exclusionary public spaces can be.
Although there are a lot of people out there trying to tackle this issue, there are certainly missed opportunities.
As a team, we worked so well together, with each member so integral to this project, that it’s difficult to pinpoint where one person’s involvement starts and another’s ends.
