XINGLU YAO

HUMAN-ROBOT INTERACTION

In this project, we designed a futuristic robot "Robovie" to explore if robots could increase people's creativity. I took multiple responsibilities in this project, including speech script development, robot controller, non-verbal behavior design, exploratory data analysis, etc.

I will specifically talk about how I developed the non-verbal behaviors of Robovie.

Collaborators/Mentors: Takayuki Kanda, Hiroshi Ishiguro, Peter Kahn, Jolina Ruckert, Solace Shen, Heather Gary, Brian Gill
My Role: Speech and Behavior Design, Data Analysis, Literature Review, Research Report, Usability Testing
Timeline: 2013 - 2014 (2 weeks for gestures and eye gaze design)
**All photos belong to the HINTS Lab at the University of Washington

Will you be inspired by a robot?

The Challenge

After we had designed the interaction patterns, creativity tasks, and speech script, I started creating Robovie’s non-verbal behaviors. We specifically needed Robovie’s gestures and eye gaze to accomplish three goals:

  1. Guide the interaction, signaling to participants when to look at the screen, when to work on the Zen rock garden, etc.
  2. Reinforce the robot’s primary attitude: encouraging.
  3. Ease tension.

 

Wizard of Oz Test 

Robovie appeared autonomous to participants but was controlled behind the scenes. Robot controllers listened to participants and then delivered instructions to Robovie from prewritten scripts.
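The Wizard-of-Oz loop above can be sketched as a tiny console: the operator listens, presses a key, and the matching prewritten line is sent to the robot. All names here (`SCRIPT`, `send_to_robot`) are illustrative, not the actual ATR control software.

```python
# Minimal Wizard-of-Oz console sketch (illustrative, not the real system).
# The operator maps what the participant just said to a prewritten response.

SCRIPT = {
    "1": "Hello! I'm Robovie. Today we'll design a Zen rock garden together.",
    "2": "That's a great idea! Can you tell me more about it?",
    "3": "Let's look at the screen for the next step.",
}

def send_to_robot(utterance: str) -> str:
    # Stand-in for the real software's speech command.
    return f"[Robovie says] {utterance}"

def handle_key(key: str) -> str:
    # The operator triggers the appropriate prewritten line; unmapped
    # keys do nothing, so a mistyped key can't derail the session.
    if key not in SCRIPT:
        return "[no-op] unmapped key"
    return send_to_robot(SCRIPT[key])
```

Keeping every utterance prewritten is what makes the robot look autonomous: participants never hear the operator improvise.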

 

 

The robot

I started off by studying Robovie's technical details.

Robovie has two arms (each has 4 degrees of freedom [DOF]), a head (3 DOF), two eyes (each has 2 DOF), a mobile platform (two driving wheels and one free wheel), 10 tactile sensors, an omnidirectional vision sensor, two microphones to listen to human voices, and 24 ultrasonic sensors for detecting obstacles. The eyes have a pan-tilt mechanism with direct-drive motors, and they are used for stereo vision and gaze control. Robovie represents a cutting-edge interactive technology.

 

Story Boards

I translated the script into storyboards and designed Robovie's behaviors. Based on the problem statement, I sorted behaviors into "must have" and "nice to have".

"Must have" behaviors are essential in moving the interaction forward, while "nice to have" behaviors reinforced Robovie's personality and built rapport between participants and Robovie.

 

Implementation

Robovie is a Japanese robot developed by researchers at the Advanced Telecommunications Research Institute International (ATR). ATR developed custom robotics control software specifically for Robovie's speech and non-verbal behaviors.

We programmed combination gestures (e.g., pointing, waving, shaking hands) into Robovie. For simple gestures such as "turn the head" and "maintain eye contact", robot controllers could operate directly through the console.
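One way to picture a "combination gesture" is as a timed sequence of primitive joint commands played back in order. This is a toy playback model with invented joint names and timings, not the actual ATR implementation.

```python
# Toy model: a combination gesture is a list of (joint, target_degrees,
# seconds) steps played sequentially. Joint names and values are invented.

POINT_AT_SCREEN = [
    ("head_pan", 30, 0.5),    # turn the head toward the screen first,
    ("r_shoulder", 60, 0.8),  # then extend the arm the same way so
    ("r_elbow", 20, 0.4),     # gaze and gesture agree
]

WAVE = [
    ("r_shoulder", 90, 0.6),  # raise the arm
    ("r_elbow", 30, 0.3),     # then oscillate the elbow
    ("r_elbow", -30, 0.3),
    ("r_elbow", 30, 0.3),
]

def duration(gesture):
    # Total playback time if the steps run one after another.
    return sum(step[2] for step in gesture)
```

Pre-authoring gestures this way lets the operator trigger a whole multi-joint motion with a single console command during the session.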

 

User Tests

During the pilot test, I discovered some problems and made adjustments accordingly:

"I am not sure if Robovie is paying attention to what I am doing!"
This is a common issue in social robot design. I studied social robotics pioneer Cynthia Breazeal's work to solve this problem. Specifically, I designed the robot to look at the same thing the participant was looking at (drawing on the theory of joint attention). Follow-up interviews with participants confirmed the effectiveness of this design.
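The joint-attention fix can be sketched as a simple rule: whenever the participant's gaze target changes, the robot redirects its own gaze to the same target after a short lag, so the following looks deliberate rather than instantaneous. In the Wizard-of-Oz setup the participant's target would come from the operator; the function name and lag model here are illustrative.

```python
# Joint-attention sketch: the robot's gaze follows the participant's
# gaze target with a fixed lag of `delay_steps` time steps. Illustrative
# only; in the study the operator supplied the target.

def joint_attention(participant_targets, delay_steps=1):
    """Return the robot's gaze target at each time step, trailing the
    participant's target by delay_steps (clamped at the first step)."""
    robot_targets = []
    for t in range(len(participant_targets)):
        robot_targets.append(participant_targets[max(0, t - delay_steps)])
    return robot_targets
```

The small lag matters: mirroring the participant's gaze in the same instant reads as mechanical, while trailing it slightly reads as attention.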

Participants' attention drifted away after thirty minutes.
Since each design session lasted more than half an hour, we needed techniques to keep participants engaged. We added a joke and paired Robovie's speech with gestures to help participants stay focused.

 

Outcome

To evaluate, we recruited forty-eight undergraduate students (aged 18 to 25) and split them into two groups: the "Robovie group" interacted with the robot, while the "control group" went through the same design process without it.

Results showed that, on average, participants in the Robovie condition produced almost twice as many creative expressions as those in the control condition.