BY: KATY WILLIS
At first glance, Otonaroid and Kodomoroid look like humans. As they read the news, they sound like humans. Touch their silicone skin and they even feel human.
But, of course, they’re not humans.
Otonaroid and Kodomoroid are freakishly lifelike humanoid robots created by Japanese scientist Hiroshi Ishiguro. Some hail them as heralds of a dystopian future in which humanity is enslaved to cold, emotionless, ruthless android overlords; others welcome their creation as the dawn of a new age in which intelligent robots live harmoniously with humans, working together for a better world. Either way, they represent a great leap forward in robotics.
Initially used simply to read the news, the two robots now appear as exhibits at exhibitions and conventions and work as science guides at the Miraikan, Tokyo's museum of emerging science, providing information to visitors.
Though Ishiguro is proud of his creations, he admits that they lack true human-like intelligence. Everything they are capable of comes down to programming. They can lip-sync, make a wide range of facial expressions, respond to set questions, and even use a Siri-like application to provide information—an explanation of the planets' orbits, say, or reviews of a local restaurant—based on a quick Internet search. They cannot, however, recognize or feel emotion, make an informed decision, or apply logic or problem-solving skills.
Still, Ishiguro has faith in his creations' abilities. In fact, he has built a robot replica of himself that he sends abroad to give guest lectures in his stead when he doesn't want to travel or has other engagements. And he is confident that eventually he, or someone else, will find a way to create a robot with true artificial intelligence.
Ishiguro’s robots won the world’s attention for being remarkably human-like, but more ordinary robots have become increasingly commonplace in everyday life. Take Pepper, for example, a robot who provides companionship and information. He is roughly humanoid in shape, though clearly a robot, with a touchscreen on his chest. His sensors interpret your facial expressions, the sounds you make, and the tone of your voice, allowing him to gauge your emotions and respond appropriately.
But if we are turning to robots like Pepper for companionship, perhaps the problem isn’t with the robots’ abilities. Maybe the problem is with us.