
UX Design and Research

Designing for autonomous machines automating inventory management


Role

Product Designer

Platform

BrainOS Robot UI

Duration

Jun '22 - Aug '22 (3 months)

Overview

Sense robots help retail businesses like Menards, BJ's, and Walmart with their inventory management. A smart screen display on the robot lets operators interact with the machine to teach routes, assign tasks, and analyze data. The Gen2 robots currently in the market have a steering wheel and a seat for driving the robot manually when necessary. With the Gen3 robots' updated build, operators are likely to wonder,

"How do I move the robot?"

Problem

Unlike the Gen2 robots, Gen3 robots no longer have a seat or a steering wheel for driving the robot manually to teach routes or to assist it when stuck. Instead, a remote controller will be programmed to control it. And because robot operators change so frequently,

"How might we help users with no experience learn how to operate the robot?"

Annotated photo of the Gen2 Sense Robot, with callouts for the steering wheel, sense cameras, seat, and scrubber.

Annotated photos of the Gen3 Sense Robot, with callouts for the display and sense camera.

Solution

By studying the user's workflow, I identified the different scenarios in which an operator would need the controller. Based on these, and in collaboration with the engineers, I designed UI screens that ease the use of the controller to drive the robots, and I conducted usability testing to understand and improve the resulting experience.

Impact

Simplifying the visualization and workflow of using the controller, and identifying and solving for every use case, resulted in:

increased ease of use for end customers

a reduction in crashes and support requests

a projected higher rate of customer satisfaction

__________________

Features

Zero Connect 

Setting up the controller

A streamlined, easy-to-follow setup that allows robot operators to connect quickly and without any hassle.

Quick Controls 

Quick guide for robot operators

A simplified quick reference showing where to find the controls and how to use the controller to move the robot.

Drive App

On-screen controller

An intuitive on-screen controller for when robot operators need to quickly respond to an assist request from the robot and don't have access to the physical controller.

__________________

Define

01.
When is this a problem?

After shadowing a field deployment of one of the Gen2 robots, I identified that the post-deployment workflow for setting up the controller and teaching the robots their routes would change for Gen3 robots and needed to be redesigned.

Deployment workflow: Store receives robot → Unbox → Install → Setup → Teach, annotate, and generate routes → Run routes.

02.
Who is this a problem for?

From the field study and existing personas, I observed that other than having to move the robot to teach routes during deployment, the need to move/assist a robot may also arise while it autonomously runs its daily tasks. This is when the other store associates have to play the operator's role and need to figure out how to move/assist the robot.

"Setting up scanner routes is much more complex than the scrubber. And I have to do both."

- Deepak | Deployment Specialist

"I keep my section neat and well stocked, and I help my customers find things."

- Sarah | Sales Associate

03.
Mapping the different scenarios

I then identified the different scenarios in which an operator would interact with the controller, and broke down and mapped the workflow for each. This map helped me understand when and where the gap between the robots and the operators needed to be bridged, and it revealed how the workflows are interconnected.


Controller workflow diagrams mapping out different scenarios when the user would interact with the controller to move the robot.

Design

01.
Ideation


Goals and assumptions were brainstormed, discussed, and noted.


Mapping of controller requirements and functions

Apart from understanding the goals and potential use cases of the controller, I looked into existing market solutions for pairing wireless devices. Breaking down the AirPods-to-iPhone pairing process highlighted that pairing over Bluetooth and Wi-Fi can massively reduce the number of steps involved and provide a hassle-free pairing and connecting experience.

A step-by-step breakdown of the interactions involved in connecting AirPods to an iPhone.
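The "hold it close and it just connects" interaction can be sketched as a small state machine. This is a hypothetical illustration, not BrainOS code: the state names, the `step` function, and the RSSI threshold are all assumptions made for the sketch; a real implementation would read signal strength from the Bluetooth radio.

```python
from enum import Enum, auto

class PairingState(Enum):
    IDLE = auto()         # controller is off
    ADVERTISING = auto()  # robot broadcasts, waiting for a nearby controller
    DETECTED = auto()     # controller found within the proximity threshold
    PAIRED = auto()       # credentials exchanged, ready to drive

# Hypothetical proximity threshold; a real system would tune this
# against measured Bluetooth signal strength (RSSI).
RSSI_THRESHOLD_DBM = -50

def step(state: PairingState, controller_on: bool, rssi_dbm=None) -> PairingState:
    """Advance the pairing flow by one event, mirroring the
    AirPods-style proximity pairing described above."""
    if state is PairingState.IDLE and controller_on:
        return PairingState.ADVERTISING
    if (state is PairingState.ADVERTISING and rssi_dbm is not None
            and rssi_dbm >= RSSI_THRESHOLD_DBM):
        return PairingState.DETECTED
    if state is PairingState.DETECTED:
        return PairingState.PAIRED  # pairing completes with no further user action
    return state

# Happy path: press start -> hold the controller close -> paired.
s = PairingState.IDLE
s = step(s, controller_on=True)                # start pressed -> ADVERTISING
s = step(s, controller_on=True, rssi_dbm=-42)  # held close -> DETECTED
s = step(s, controller_on=True, rssi_dbm=-42)  # -> PAIRED
print(s.name)  # PAIRED
```

The point of the sketch is the step count: from the operator's perspective there are only two actions (press start, hold close), which is exactly what the AirPods breakdown motivated.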

02.
Prototyping

To develop the concept and gather valuable feedback from users, I designed high-fidelity prototypes that could be used for testing. Based on the identified scenarios, I broke the design down into three user flows:

a. The machine is fresh out of the box. The controller needs to be set up.

The screens were broken down to one focused action at a time, making it easier for the user to follow along and receive timely feedback about what's happening and what's next.

Actions → Feedback

Start button: Clicking this button takes the user to the next screen to begin the setup. When the start button is pressed, the controller signals the robot that it is ON, and the robot automatically directs the user to the next step.

Activate pair / Pairing mode: While pairing mode is being activated, the user stays informed.

Hold close: Holding the controller close to the robot lets the robot detect the device and pair instantly, with no further action required from the user.

The user is then informed that the setup is successful and can continue using the controller.

b. Teach routes. Check controller status and controls.

I designed a controller dashboard, accessible from the menu at all times, so users can find all of the controller's features in one place.

Dashboard callouts: an on-screen controller for cases when the physical controller is not readily available to assist the robot; connection status; battery status; illustrated controls and functions; and the option to connect other nearby controllers. The controller is always accessible from the menu on the top nav bar.

c. Assist the robot. The path is blocked. The controller is broken.

When the controller runs out of charge, can't be found, or isn't working, the on-screen controller comes in handy for steering the robot around obstacles in its way. The single-handle on-screen controller mimics the physical controller's controls, so operators can act quickly without a learning curve.
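One way to make the on-screen joystick feel like the physical stick is to map its displacement to the same drive commands. The sketch below is a hypothetical illustration: the function name, speed limits, and dead-zone value are assumptions made for the example, not values from the product.

```python
import math

# Hypothetical tuning values; the real robot's speed limits are not
# stated in the case study.
MAX_LINEAR_MPS = 0.6    # forward/backward speed cap, meters per second
MAX_ANGULAR_RPS = 1.0   # turn-rate cap, radians per second
DEADZONE = 0.1          # ignore tiny drags near the joystick center

def joystick_to_velocity(dx: float, dy: float):
    """Map a normalized on-screen joystick displacement (dx, dy in [-1, 1])
    to (linear, angular) velocity commands, the same mental model as the
    physical controller's stick: push up to drive, push sideways to turn."""
    magnitude = math.hypot(dx, dy)
    if magnitude < DEADZONE:
        return 0.0, 0.0  # treat as released: stop the robot
    # dy drives forward/backward; dx steers (pushing right turns right,
    # hence the sign flip for a counterclockwise-positive angular rate).
    linear = max(-1.0, min(1.0, dy)) * MAX_LINEAR_MPS
    angular = -max(-1.0, min(1.0, dx)) * MAX_ANGULAR_RPS
    return linear, angular

full_forward = joystick_to_velocity(0.0, 1.0)   # linear 0.6 m/s, no turn
tiny_drag = joystick_to_velocity(0.05, 0.0)     # inside dead zone: robot stays stopped
```

The dead zone matters for the "quick assist" scenario: an operator grabbing the tablet in a hurry shouldn't nudge the robot with an accidental tap.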

Assist screen directing the users to resolve the issue.

The Drive App button takes the user to a screen where they can drive the robot using the on-screen joystick handle.

Quick controls help users act without much delay.

Evaluation

01.
Usability Testing

With the prototypes ready, I collaborated cross-functionally with engineers, product managers, and deployment specialists to refine the design and do some dry runs. Based on their input, I recruited 5 in-house robot operators with a varying range of experience using gaming controllers for the usability test. Covering all the use cases in which an operator would interact with the controller, the test was broken down into 5 scenarios in sequential order.


Pictures captured during the usability test. That's me on the right, holding the Figma-generated interactive prototype on a tablet against a life-size prototype of the Gen3 robot.

02.
Key Findings

1. 3/5 participants couldn't locate the controller on the robot.
2. 2/5 participants found it difficult to access/navigate to the on-screen controller.

It also turned out that participants with more experience using gaming controllers found it much easier to grasp the controls than participants with little or no prior experience.


Test observations and recommendations

03.
Next steps

Based on my recommendations, the engineering team implemented changes and added features that created impact. We were surprised to learn that users found it difficult to test against a stationary robot model, so we decided to conduct a second round of tests with a moving robot to get more accurate feedback.

Deliverables

Final Prototype


Click me to interact with the prototype!

Reflection

Users are always changing

Working on this project taught me how to approach a design problem when the users change frequently: you can't defend a design with "they'll learn over time." With that in mind, it was challenging to design for both new and expert users at the same time without overloading anyone with information, so that new users aren't overwhelmed and expert users aren't annoyed.

Autonomy in everyday life

It was an exciting experience to work at the intersection of humans and robots in everyday settings. Working closely with a company focused on autonomous services for everyday business needs, like cleaning and inventory management, helped me understand the balance it strives to achieve between its user base and the business.
