Dr. Maja Mataric has many roles, and many robots.
As founder and director of USC’s Interaction Lab, she is primarily focused on creating socially assistive robots with real-world functions, particularly within sectors of society that will soon require our silicon cousins to pitch in and help, like healthcare.
Dr. Mataric is also a professor and Chan Soon-Shiong chair in USC’s Computer Science Department, Neuroscience Program, and Department of Pediatrics, as well as founding director of the USC Robotics and Autonomous Systems Center (RASC), co-director of the USC Robotics Research Lab, and Vice Dean for Research in the USC Viterbi School of Engineering.
But she’s also the founder and CEO of a new robot startup, Embodied, which will launch its first products in 2019. When PCMag dropped by her lab recently, we counted 15 robots in various states of animation, although Dr. Mataric has at least 40 in research rotation at any time.
We spotted: two iPals, the 3.5-foot-tall humanoid robots from AvatarMind; several Bandit robots designed by the Interaction Lab over a decade ago and built by Blue Sky Robotics; and a very pensive Nao full-body humanoid robot from Aldebaran Robotics. Tucked behind some circuit boards on a shelf, meanwhile, was a Maki, the 3D-printed platform developed by Hello-Robo, which PCMag met at the Children’s Hospital of Los Angeles.
Here are edited and condensed excerpts of our conversation.
Dr. Mataric, was your first exposure to robots at MIT?
I grew up in the former Yugoslavia, and robots were not abundant there [Laughs]. So yes, my first introduction to robots was when I came to the US and did my PhD in Computer Science and Artificial Intelligence at MIT. Rodney Brooks was my PhD advisor, and his lab was focused on how physically embodied machines interact with the world, with all the messiness and challenges that come with that. We built robots and then used them to test new algorithms.
I always wanted to use robots to model human and robot behavior in the real world. Because ultimately, I’m interested in social science and what makes people tick, in the here and now, and not in some abstract simulation. One of my reference letter writers compared me to the robopsychologist in Asimov’s stories. I wasn’t flattered at the time, but now I am. It’s the combination of AI and psychology that really fascinates me, or perhaps the use of robotics, which is embodied AI, to help us understand people, and to help people.
Bridget Moynahan did a fine job portraying Dr. Calvin in the movie I, Robot. Although, let’s face it, the robots didn’t behave well in the end.
[Sighs] No. I do get frustrated when people say, ‘Oh, but in real life they’ll be obeying Asimov’s “Three Laws”!’
I’ve been to a lot of military-complex robotic labs, but I’ve never seen a framed copy of those on a command center wall.
Plus, the reason Asimov wrote such beautiful stories out of them is that those rules are highly complex and impossible to encode anyway. They don’t “bottom out,” but are inherently circular: they make great philosophical conundrums but not good robot control programs.
Luckily, you’re not building killer robots here anyway.
No! My goal is to create robots that help people in the real world who really need help. I think people are obsessed with [killer robots] because that’s what humans do. Robots are not humans, but they are created by humans, so they reflect a great deal about us. A robot is not driven by human concerns and issues; it’s just doing its job, not driven by competitiveness or by an awareness of scarce resources. In fact, right now, it’s hard enough to program a robot to pick up something safely or even cross a busy street. [Laughs]
Let’s talk about a couple of your robot case studies.
Back in 2008, we took our Bandit robot, a humanoid torso on a mobile platform, to the Silverado network of elder care facilities to run music-recognition cognitive games with Alzheimer’s patients. The robot challenged the patients to recognize songs from minimal clues. Some of the residents projected their grandchildren onto it, as it was about the same height as a child. They fully integrated Bandit into their narratives, often calling it “my buddy” and looking forward to its visit the next day, as they perceived that it truly cared.
What were you trying to prove? And how did you get empirical results?
First, we wanted to see if the patients would accept the robot. And they did. It engaged them in music-based games drawn from their era (Frank Sinatra, and so on), which produced measurable improvement on the recognition task, demonstrating the robot’s efficacy.
We had physical buttons for the patients to press as part of the task, because it’s important to stay both cognitively and physically active. When the robot asked them to name the tune it was playing, they had to reach out and press the button under the matching written title. The challenge was to see whether the patients could get better at the task over time, with minimal help, even with Alzheimer’s disease. And they could.
Your work on robots in gerontology, particularly with Alzheimer’s patients, reminded me of Robot & Frank. Did you advise on that movie?
Funny enough, I didn’t advise on the production, but Nature interviewed me before the film came out, and my lab got a preview screener. We loved it. Not too long afterwards, I was invited to speak after a screening at the World Science Festival in NYC, on The Future of Computerized Companions. At that event I met one of the people from the production, and he said, “I wish I’d known you before we made the movie!” In fact, a lot of the things the robot says are uncannily like some of the research videos on our site. But that’s great; it means they got it right.
Talk about your robots that work with children who have autism.
Autism is a mystery. It’s a spectrum disorder, which means there is a vast variety of symptoms and severities. The only effective therapeutic approaches are therefore personalized, which creates an opportunity for technology: we can create intelligent machines that adapt to each patient’s needs.
There are many challenges to overcome, of course. For example, certain robots’ motors have a distinct high-pitched sound that some children with autism just cannot cope with, due to sensory sensitivity. There are also certain physical features that some kids love and others do not. So we have to test constantly. When the children become confident enough to interact with the robot, sometimes in more social ways than they can manage with humans, we start to see the effects that therapists want to elicit and reinforce: empathy, perspective-taking, kindness. The robot comes to be seen as a friend, a buddy.
That’s really moving. And a good use case for robots with children on the spectrum.
For me, this interaction is a foundation you can build on to help them learn and practice social and other skills. The perception is that they exist in a very small world, but I don’t believe that. I think they live in an entirely different world, and the robot enables them to gain the skills to make it out here with us, in a neurotypical society, by developing eye contact, communication skills, and so on.
Let’s go behind the scenes: what platforms are you using to give your robots personalities?
We use ROS as the underlying tool because we believe in open-source software. But everything else is at a much higher level. For example, getting the robot to use eye contact, body movement, and natural gesture, to create interaction, to have a personality: for all that we use Python, C++, and C. It depends on what we’re doing, but it’s always a full stack, so our students are equipped to work anywhere in industry or academia. They get mad skills here. By the way, ROS actually originated in my lab, but I hear Stanford University is taking credit for it.
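[For readers unfamiliar with ROS: it structures robot software as independent nodes that exchange messages on named topics. The toy publish/subscribe bus below sketches that communication model in plain Python; it has no ROS dependency, and all names and topics are illustrative, not from the Interaction Lab’s code.]

```python
# Toy publish/subscribe bus illustrating the ROS communication model.
# Plain Python, no ROS dependency; all names and topics are illustrative.
from collections import defaultdict
from typing import Any, Callable

class Bus:
    """Minimal topic-based message bus (a stand-in for ROS transport)."""
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        # Register a callback to run whenever a message hits this topic.
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg: Any) -> None:
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)

# A hypothetical "speech" node reacts to events from a "perception" node.
bus = Bus()
log: list[str] = []
bus.subscribe("gaze/detected", lambda who: log.append(f"robot: hello, {who}!"))
bus.publish("gaze/detected", "child")
print(log[0])  # -> robot: hello, child!
```

In real ROS, the nodes would be separate processes and the bus would be the ROS middleware (e.g., `rospy.Publisher` and `rospy.Subscriber`), but the decoupled topic-based design is the same.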
Oops. Let’s clear that up, then. What’s the backstory?
Brian Gerkey, founder of Open Robotics, developed ROS at Willow Garage. It was based on Player, which he wrote as part of his PhD here at USC, before he did a postdoctoral fellowship at Stanford’s Artificial Intelligence Lab.
Can you talk about Embodied, your commercial products spin-off company?
I can’t say much for now, because we want to avoid hype and have something to show before we tell. I started the company together with Paolo Pirjanian because I got frustrated: people in this field produce so many peer-reviewed papers full of insights, but people in the commercial realm don’t take the time to read them. Also, most commercial robotics products are not aimed at helping people with things they actually need help with.
So Embodied is a result of you doing both?
Yes, I want people to be able to take helpful robots home to improve their quality of life.
We’re raising the Series A funding now. I hope to be able to talk openly about the product next year.
See you next year then.
Wonderful. Until then, if any PCMag readers are in the L.A. area during National Robotics Week, we have a robotics open house on April 11, but do make an appointment ahead of time.