Mr Robot: will androids ever be able to convince people they’re human?

Article by SA Mathieson, Guardian Labs, (The Connected University series paid for by Staffordshire University), 29 November 2019

In his quest to build robots that can pass as human, Carl Strathearn has drawn on various fields including prosthetics, materials science, computing and animatronics – the last by talking to John Coppinger, designer of the Jabba the Hutt animatronic puppet used in the 1983 Star Wars film Return of the Jedi. “He helped me out in my master’s when I was doing research in advanced animatronics and gave me plenty of inspiration in my PhD,” says Strathearn.

Jabba may look like a giant slug with a face, but Coppinger made that face more convincing by including pupils that could dilate and contract. As part of his soon-to-be completed doctorate at Staffordshire University, funded by its graduate teaching assistant scholarship scheme, Strathearn reviewed scientific papers that led him to conclude that static pupils are the primary reason that people perceive humanoid robots as artificial. “We look for life in the eyes, they are the first point of contact during face-to-face communication, and we gauge attention and trust from the eyes,” he says.

To test this, Strathearn adapted a prototype medical prosthetic eye built by researchers at Nottingham Trent University, whose pupils widened and narrowed in response to changing light levels using an artificial muscle called a dielectric elastomer. He replaced the prototype’s graphite pupils with graphene, a highly conductive nanomaterial one atom thick, allowing the eyes to work at a much lower voltage. And to make the eye look as realistic as possible, he used a stereolithography 3D printer of the kind normally used by dentists to create a mould for a gelatine iris membrane; this was colourised using gelatine paper carrying an image of an iris that can stretch and contract with the pupil.
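As a rough illustration only (the sensor, the normalised drive range and the mapping below are assumptions, not details of Strathearn’s prototype), the behaviour he describes can be modelled as a simple feedback loop in which an ambient-light reading sets the drive signal applied to the dielectric elastomer, so the pupil widens as the room darkens.

```python
# Illustrative sketch, not Strathearn's control code: a light-driven
# pupil-dilation loop for a dielectric-elastomer eye. All names and
# values here are assumptions made for the sake of the example.

MIN_DRIVE = 0.0   # elastomer relaxed  -> pupil constricted (normalised units)
MAX_DRIVE = 1.0   # elastomer actuated -> pupil dilated

def pupil_drive(light_level: float) -> float:
    """Map ambient light (0.0 = dark, 1.0 = bright) to an elastomer drive level.

    Dimmer light should dilate the pupil, so the drive signal rises as
    the light level falls.
    """
    light_level = max(0.0, min(1.0, light_level))
    return MIN_DRIVE + (1.0 - light_level) * (MAX_DRIVE - MIN_DRIVE)

def set_elastomer_drive(level: float) -> None:
    """Placeholder for the hardware driver powering the artificial muscle."""
    print(f"elastomer drive: {level:.2f}")

if __name__ == "__main__":
    # Simulate the room getting darker, then brighter again.
    for light in (1.0, 0.7, 0.4, 0.1, 0.4, 0.8):
        set_elastomer_drive(pupil_drive(light))
```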

Strathearn’s supervisor, Professor Minhua Eunice Ma, dean of Staffordshire University’s school of computing and digital technologies, says the robot eyes would need to be installed in a robotic head to be tested fully: dilating pupils alone cannot portray discernible emotions, because facial expressions combine multiple elements such as the eyebrows, cheeks and lips. Strathearn’s research had also identified the mouth as the second biggest reason people dismiss humanoid faces, so he created one that could convincingly form vowel and consonant shapes, along with a custom speech application that analyses the acoustic qualities of incoming speech and turns it into servomotor data for the robot. He then built two robot heads: one, called Baudi, with his customised eyes and mouth; the other, named Euclid, without them.
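The speech application itself is not described in detail, but as a hedged sketch of the kind of pipeline involved, the example below measures the loudness envelope of incoming speech and converts it into jaw-servo angles; the frame size, servo range and mapping are illustrative assumptions rather than Strathearn’s actual design.

```python
# Minimal sketch of a speech-to-servo lip-sync pipeline: analyse the
# loudness of incoming speech frame by frame and turn it into jaw-servo
# positions. NOT Strathearn's application; parameters are assumptions.
import numpy as np

SAMPLE_RATE = 16_000   # assumed audio sample rate (Hz)
FRAME_SIZE = 320       # 20 ms frames at 16 kHz
JAW_CLOSED_DEG = 0     # assumed servo angle for a closed mouth
JAW_OPEN_DEG = 30      # assumed servo angle for a fully open mouth

def frame_energies(samples: np.ndarray) -> np.ndarray:
    """Root-mean-square energy of each 20 ms frame of the speech signal."""
    n_frames = len(samples) // FRAME_SIZE
    frames = samples[: n_frames * FRAME_SIZE].reshape(n_frames, FRAME_SIZE)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def energies_to_servo_angles(energies: np.ndarray) -> np.ndarray:
    """Louder speech opens the jaw wider; silence keeps it closed."""
    peak = float(energies.max())
    openness = energies / peak if peak > 0 else np.zeros_like(energies)
    return JAW_CLOSED_DEG + openness * (JAW_OPEN_DEG - JAW_CLOSED_DEG)

if __name__ == "__main__":
    # One second of synthetic "speech": a tone with a rising-and-falling envelope.
    t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
    audio = np.sin(2 * np.pi * 150 * t) * np.sin(np.pi * t)
    angles = energies_to_servo_angles(frame_energies(audio))
    print(f"computed {len(angles)} jaw positions, peak angle {angles.max():.1f} degrees")
```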

But this led to a new problem: “We realised fairly quickly that there was no universal evaluation method for determining the authenticity of humanoid robots,” says Strathearn. The Turing test, proposed by computing and codebreaking pioneer Alan Turing in a 1950 paper, assesses software that “talks” to humans through typed messages. In 2014, Eugene Goostman, a chatbot that emulates a teenager, convinced 33% of judges that it was human, clearing the 30% threshold commonly taken as a pass mark.

But Ma says the original Turing test only considers the handling of text, known in computing as natural language processing. So she and Strathearn devised a multimodal Turing test that assesses whether a robot can accurately impersonate a physical human in four situations: at rest, moving, speaking, and demonstrating intelligence through its body, known as embodied artificial intelligence (EAI). A robot that could pass all four would be capable of expressing emotions and interacting naturally with people, and would be indistinguishable from them in many situations.
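As a purely illustrative way of recording results under such a test (the structure below is an assumption, not the published protocol), each judge’s verdict could be logged per mode, with a robot counted as passing only if it is judged human in all four.

```python
# Illustrative sketch: one possible way to record verdicts from the
# four modes of a multimodal Turing test. The data structure and pass
# criterion are assumptions for illustration, not Ma and Strathearn's
# published evaluation method.
from dataclasses import dataclass

MODES = ("at rest", "moving", "speaking", "embodied AI")

@dataclass
class JudgeVerdict:
    mode: str
    judged_human: bool   # True if the judge believed the robot was human

def passes_all_modes(verdicts: list[JudgeVerdict]) -> bool:
    """The robot passes only if it was judged human in every one of the four modes."""
    judged_human_in = {v.mode for v in verdicts if v.judged_human}
    return all(mode in judged_human_in for mode in MODES)

if __name__ == "__main__":
    results = [JudgeVerdict(mode, mode != "moving") for mode in MODES]
    print(passes_all_modes(results))  # False: movement gave the robot away
```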

Ma adds that the new test can also assess digital humans seen on a screen or in virtual reality, including non-player characters in games. “It will also include emotional and social artificial intelligence, picking up signals and clues on what the other party feels,” she says.

Strathearn is about to begin testing his two robotic heads, but reckons that at present no robot can pass for human even when at rest, although some hyper-realistic works of art can manage this for a few seconds until an observer realises they are not living. Contemporary humanoid robots tend to have either realistic appearances or convincing movement, but not both: “This is a problem in humanoid robotics – the artistic side of things often comes secondary to engineering and programming,” he says.

Ma says that cutting-edge work such as Strathearn’s must combine art and science in an interdisciplinary fashion: “It’s not only useful, it’s essential.” This is true of other projects she and Staffordshire University students are working on to create “augmented reality” characters that can be added digitally to museum environments through a holographic headset. These include a Tutankhamun for the Egyptian Museum in Cairo and a persecuted Jewish family discussing the Kindertransport at the National Holocaust Centre and Museum in Newark, Nottinghamshire.

It is harder to build physical humanoids than virtual ones, but such robots could have practical uses in areas including search and rescue work, entertainment and education, perhaps even allowing students to question a robot version of Alan Turing. Strathearn sees potential in social companions for the elderly, where there are increasing shortages of people willing and able to provide care. “Would it be better than using a human is a difficult question,” he adds.

But he reckons that experimentation and evaluation are the scientific way to find the best uses for humanoid robots. With the rapid evolution of robotics and AI, future humanoid robots could do almost anything humans can; we could have robots sitting next to us on the train or in our workplaces and not know the difference between them and us. Because we have adapted the world to ourselves, the human form is the ideal shape for our built environment, and the best interface for communicating with us is a human face. “We need to build these things to be able to measure their progress,” he says.