Charlie

Voice-Based Navigation Assistant

Overview

Most navigation systems today lean heavily on visual interfaces. From our research we found that this divides the driver's attention, increases cognitive load, and in some cases proves fatal. Our team designed Charlie, a voice-based navigation assistant that communicates in conversational language, adapts to driver behavior, and draws on multimodal cues so drivers can keep their eyes on the road while still making informed decisions.

Role

UX Researcher

UX Designer

Director 

Editor

Timeline

1.5 Weeks

Team

Eugene Meng

Mai Bouchet

Marina Lazarevic

Siyi Kou

Shravya Neeruganti

Sponsored by

University of Washington, Seattle

SEE CHARLIE IN ACTION

Process

01

Research

Secondary Research

Competitive Analysis

Multimodal Systems

Portable Navigation Devices

02

Ideation

Concept Generation

Individual Ideation

Team Ideation

03

Design

Script Creation

Film-Making

Animation

RESEARCH

From our research we found that the majority of navigation systems today rely heavily on visual components. This is dangerous: visual navigation increases the driver's cognitive load and pulls attention away from the act of driving itself. Our research showed that this division of attention has proved fatal.

Our team researching the navigation space.

Problem Space

"When drivers left their route intentionally, navigation systems produced 56% false acoustics and 65% false visual messages respectively."

 

1. Current navigation applications do not adapt well to changes. For example, if a driver chooses to deviate from the route, they receive insufficient feedback or cryptic, unhelpful commands. 

2. Communicating intentions during a trip is difficult. It often forces drivers to fumble with a touchscreen while driving, which is dangerous. 

3. Information exchange loops are inadequate and existing navigation improvements do not focus enough on driver-device interactivity. 

Ideation

After conducting secondary research and developing a clear picture of the problem space, we concluded that the most promising navigation system would adapt to driver behavior and route changes, integrating multimodal capabilities such as tactile, audible, and visual cues to enhance voice interactions. Our team set out to explore how these other sense modalities could be tapped, and began brainstorming how someone would interact with such a navigation system.

Some of our team's concepts during the ideation phase.

Design

After the ideation phase, we identified three ways Charlie could help drivers reach their destination, combining different multimodal elements to create a seamless interaction between Charlie, the driver, and even a passenger seated in the car.

Designing 3 navigation modes 

Guided Mode - Leader

 

Current navigation applications speak a mechanical language and describe the road in numbers, which means little to drivers who lack contextual awareness of the road. In guided mode, Charlie offers navigation information in conversational language and develops a customized language set for each user. To prevent errors and support accurate decisions, Charlie notifies the user when she does not understand the input or the road conditions, and prompts the user to act accordingly.

Learning Mode - Follower

 

In real-life driving, people often want to take a detour, or their preferred routes differ from what their navigation recommends. In learning mode, Charlie is muted and follows the user's lead while machine learning runs in the background. Charlie learns the user's preferred routes and driving habits, and applies them to her future guidance.

Assisted Mode - Helper

 

In assisted mode, a passenger guides the driver using visual guidance from Charlie while Charlie stays muted by default. If the passenger is distracted, the driver can re-activate Charlie by double-tapping or calling Charlie's name to get guidance for the next step. Learning mode can run concurrently with this mode so that Charlie picks up the language of the guiding passenger.
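To make the three modes concrete, here is a minimal sketch of how the mode switching and muting behavior described above might be expressed in code. It is purely illustrative: the names (CharlieAssistant, NavigationMode, handleInput, and so on) are hypothetical and not part of the actual concept, which was explored as a concept video rather than a working system.

// Hypothetical TypeScript sketch of Charlie's three navigation modes.
type NavigationMode = "guided" | "learning" | "assisted";

interface DriverInput {
  kind: "voice" | "double-tap" | "wake-word";
  utterance?: string;
}

class CharlieAssistant {
  private mode: NavigationMode = "guided";
  private muted = false;

  setMode(mode: NavigationMode): void {
    this.mode = mode;
    // Learning and assisted modes keep Charlie quiet by default so the
    // driver (or a guiding passenger) leads the interaction.
    this.muted = mode !== "guided";
  }

  handleInput(input: DriverInput): string | null {
    // Assisted mode: a double tap or the wake word brings spoken guidance back.
    if (this.mode === "assisted" && (input.kind === "double-tap" || input.kind === "wake-word")) {
      this.muted = false;
      return this.nextStepInstruction();
    }

    // Guided mode: when the input is unclear, ask for clarification instead of guessing.
    if (this.mode === "guided" && input.kind === "voice" && !this.understood(input.utterance)) {
      return "Sorry, I didn't catch that. Could you say it again?";
    }

    if (this.muted) {
      // Learning mode: stay silent and record the driver's choices in the background.
      this.recordPreference(input);
      return null;
    }

    return this.nextStepInstruction();
  }

  private understood(utterance?: string): boolean {
    // Stand-in for a real speech/NLU confidence check.
    return Boolean(utterance && utterance.trim().length > 0);
  }

  private recordPreference(_input: DriverInput): void {
    // Stand-in for background learning of preferred routes and phrasing.
  }

  private nextStepInstruction(): string {
    // Stand-in for conversational, context-aware guidance.
    return "After the blue gas station, take the next right.";
  }
}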

Next Steps

Our team would like to test Charlie's intelligence and functionality. We would create a behavioral prototype to understand whether Charlie gives the driver useful and intuitive information. This would be a crucial step in developing the navigation system.
