UX Design and Research
Designing for autonomous machines automating inventory management
Role
Product Designer
Platform
BrainOS Robot UI
Duration
Jun '22 - Aug '22 (3 months)
Overview
Sense robots help retail businesses like Menards, BJ's, and Walmart with their inventory management. A smart screen display on the robot helps operators interact with the machine to teach routes, assign tasks, and analyze data. The Gen2 robots currently in the market have a steering wheel and a seat for driving the robot manually when necessary. With the Gen3 robots' updated build, operators are likely to wonder,
"How do I move the robot?"
Problem
Unlike the Gen2 robots, Gen3 robots no longer have a seat or a steering wheel to drive the robot manually to teach routes or to assist it when it gets stuck. Instead, a remote controller will be programmed to help control it. And because robot operators change so frequently,
"How might we help users with no experience learn how to operate the robot?"
Gen2 Sense Robot (labeled): steering wheel, sense cameras, seat, scrubber.
Gen3 Sense Robot (labeled): display, sense camera.
Solution
By understanding the user's workflow, I identified the different scenarios in which an operator would need to use the controller. Based on these scenarios, and in collaboration with the engineers, I designed UI screens that ease the use of the controller to drive the robot, and I conducted usability testing to understand and improve the resulting experience.
Impact
Simplifying the visualization and workflow of using the controller, and identifying and solving for every use case, resulted in:
increased ease of use for end-customers
reduction in crashes and support requests
a projected increase in customer satisfaction.
__________________
Features
Zero Connect
Setting up the controller
An improved, easy-to-follow setup flow that allows robot operators to connect quickly and without hassle.
Quick Controls
Quick guide for robot operators
A simplified, at-a-glance guide showing where to find the controller and how to use it to move the robot.
Drive App
On-screen controller
An intuitive on-screen controller for when robot operators need to respond quickly to an assist request from the robot but don't have access to the physical controller.
__________________
Define
01.
When is this a problem?
After shadowing a deployment of one of the Gen2 robots in the field, I identified that the post-deployment workflow to set up the controller and teach the robot its routes was changing for Gen3 robots and needed to be resolved.
Deployment workflow: Store receives robot → Unbox → Install → Setup → Teach, annotate, and generate routes → Run routes
02.
Who is this a problem for?
From the field study and existing personas, I observed that, beyond moving the robot to teach routes during deployment, the need to move or assist a robot may also arise while it autonomously runs its daily tasks. This is when other store associates have to play the operator's role and figure out how to move or assist the robot.
"Setting up scanner routes is much more complex than the scrubber. And I have to do both."
- Deepak | Deployment Specialist
"I keep my section neat and well stocked, and I help my customers find things."
- Sarah | Sales Associate
03.
Mapping the different scenarios
I then identified the different scenarios in which an operator would interact with the controller, and broke down and mapped the workflows in each of them. This map helped me understand when and where there was a need to bridge the gap between the robots and the operators. It also revealed how the workflows are interconnected.
Controller workflow diagrams mapping out different scenarios when the user would interact with the controller to move the robot.
Design
01.
Ideation
Goals and assumptions were brainstormed, discussed, and noted.
Mapping of controller requirements and functions
Apart from understanding the goals and potential use cases of the controller, I looked into existing solutions in the market for pairing wireless devices. Breaking down the process of pairing AirPods with an iPhone highlighted that pairing over Bluetooth and Wi-Fi can massively reduce the number of steps involved and provide a hassle-free pairing and connecting experience.
A step-by-step breakdown of the interactions involved in connecting AirPods to an iPhone
02.
Prototyping
To develop the concept and gather valuable feedback from users, I designed high-fidelity prototypes that could be used for testing. Based on the identified scenarios, I broke the design down into three user flows:
a. The machine is fresh out of the box. The controller needs to be set up.
Each screen focuses on one action at a time, making it easier for the user to follow along and receive timely feedback on what's happening and what's next.
Actions and feedback at each step:
1. Clicking the setup button takes the user to the next screen to begin the setup.
2. When the user presses the start button, the controller signals the robot that it is on, and the robot automatically directs the user to the next step.
3. While pairing mode is being activated, the user stays informed.
4. The robot detects the device and continues to connect with the controller.
5. Holding the controller close to the robot allows the robot to detect the device and pair instantaneously, with no further action required from the user.
6. The user is informed that the setup is successful and can now continue to use the controller.
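To make this flow concrete, here is a minimal sketch of the setup sequence modeled as a small state machine, with feedback copy surfaced at every step. All of the names below (SetupState, nextState, feedbackCopy, the event types) are hypothetical and only illustrate the screen sequence described above, not the actual BrainOS implementation.

// A minimal, hypothetical sketch of the out-of-box controller setup flow.
type SetupState =
  | "idle"              // robot display shows the "set up controller" entry point
  | "waitingForPower"   // operator is asked to press the controller's start button
  | "pairing"           // robot activates pairing mode and scans for the device
  | "connecting"        // controller detected nearby; robot completes the handshake
  | "ready";            // setup successful; the controller can now be used

type SetupEvent = "SETUP_PRESSED" | "CONTROLLER_ON" | "CONTROLLER_DETECTED" | "PAIRED";

// Pure transition function: given the current state and an event,
// return the next state (or stay put if the event doesn't apply).
function nextState(state: SetupState, event: SetupEvent): SetupState {
  switch (state) {
    case "idle":            return event === "SETUP_PRESSED" ? "waitingForPower" : state;
    case "waitingForPower": return event === "CONTROLLER_ON" ? "pairing" : state;
    // Holding the controller close lets the robot detect it automatically.
    case "pairing":         return event === "CONTROLLER_DETECTED" ? "connecting" : state;
    case "connecting":      return event === "PAIRED" ? "ready" : state;
    case "ready":           return state;
  }
}

// Feedback copy shown on the robot display for each state, so the operator
// always knows what's happening and what's next.
const feedbackCopy: Record<SetupState, string> = {
  idle: "Set up your controller to drive the robot.",
  waitingForPower: "Press and hold the start button on the controller.",
  pairing: "Looking for your controller. Hold it close to the robot.",
  connecting: "Controller found. Connecting...",
  ready: "Setup successful. You can now use the controller.",
};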
b. Teach routes. Check controller status and controls.
I designed a controller dashboard, accessible from the menu at all times, so that users can find all of the controller's features in one place.
Controller dashboard callouts: the on-screen controller (for cases when the physical controller is not readily available to assist the robot); connection status, battery status, and an option to connect other nearby controllers; and illustrated controls and functions. The controller dashboard is always accessible from the menu on the top nav bar.
c. Assist robot. Path is blocked. Controller is broken.
When the controller runs out of charge, cannot be found, or isn't working, the on-screen controller comes in handy to assist the robot around obstacles in its way. The single-handle on-screen controller mimics the controls of the physical controller so it can be operated quickly, without a learning curve.
Callouts: the assist screen directs users to resolve the issue; the Drive App button takes the user to a screen where they can drive the robot using the on-screen joystick handle; quick controls help users act without much delay.
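As an illustration of how an on-screen joystick like this could translate touch input into motion, here is a minimal sketch. The interfaces, value ranges, speed limits, and the commented-out sendToRobot call are assumptions made for the example, not BrainOS's actual drive interface.

// Hypothetical sketch: mapping the on-screen joystick handle to drive commands,
// mirroring the physical controller so there is no new learning curve.
interface JoystickInput {
  x: number; // -1 (full left) to 1 (full right)
  y: number; // -1 (full reverse) to 1 (full forward)
}

interface DriveCommand {
  linear: number;  // forward/backward speed in m/s
  angular: number; // turn rate in rad/s
}

// Assumed assist-mode limits: keep the robot slow while it is driven manually.
const MAX_LINEAR = 0.5;  // m/s
const MAX_ANGULAR = 0.8; // rad/s
const DEAD_ZONE = 0.1;   // ignore tiny thumb movements near the center

function toDriveCommand({ x, y }: JoystickInput): DriveCommand {
  const filter = (v: number) => (Math.abs(v) < DEAD_ZONE ? 0 : v);
  return {
    linear: filter(y) * MAX_LINEAR,
    angular: -filter(x) * MAX_ANGULAR, // pushing right turns the robot clockwise
  };
}

// Usage: call on every touch-move event from the on-screen handle, then forward
// the command to the robot (transport not shown here).
// sendToRobot(toDriveCommand({ x: 0, y: 0.6 })); // drive forward at 0.3 m/s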
Evaluation
01.
Usability Testing
With the prototypes ready, I collaborated cross-functionally with engineers, product managers, and deployment specialists to refine the design and do some dry runs. Based on their input, I recruited 5 in-house robot operators with varying experience using gaming controllers for the usability test. Considering all the use cases in which an operator would interact with the controller, the test was broken down into 5 scenarios in sequential order.
Pictures captured during the usability test. That's me on the right, holding the Figma-generated interactive prototype on a tablet against a life-size prototype of the Gen3 robot.
02.
Key Findings
1. 3/5 participants couldn't locate the controller in the robot.
2. 2/5 participants found it difficult to access/navigate to the on-screen controller.
It also turned out that participants with more experience using gaming controllers found it much easier to comprehend the controls than participants with little or no prior experience.
Test observations and recommendations
03.
Next steps
Based on my recommendations, the engineering team implemented changes and added features that created impact. We were surprised to learn that users found it difficult to test with a stationary robot model, so we decided to conduct a second round of tests with a moving robot to get more accurate feedback.
Reflection
Users are always changing
Working on this project helped me understand how to approach a design problem when the users change frequently; the design can't lean on "they'll learn over time." With this in mind, it was challenging to design for both a new user and an expert user at the same time, without overloading anyone with information, so that new users aren't overwhelmed and expert users aren't annoyed.
Autonomy in everyday life
It was an exciting experience to work at the intersection of humans and robots in everyday settings. Working closely with a company focused on autonomous services for everyday business needs, like cleaning and inventory management, helped me understand the balance it strives to achieve between its user base and the business.