Human-like robot creates creepy self-portraits
The world’s first robotic self-portraits, painted by an android called Ai-Da, have been unveiled at a new art exhibit in London, despite the “artist” not having a “self” to portray. The surprisingly accurate images question the role of artificial intelligence (AI) in human society and challenge the idea that art is exclusively a human trait, according to her creators.
Ai-Da is a life-size android artist powered by AI — computer algorithms that mimic the intelligence of humans — that can paint, sculpt, gesture, blink and talk. Ai-Da is designed to look and act like a human woman with a female voice. Her head and torso look like a mannequin’s and she wears a variety of different dresses and wigs, although a pair of exposed mechanical arms does give her away as robotic. A team of programmers, roboticists, art experts and psychologists from the University of Oxford and the University of Leeds in England spent two years, from 2017 to 2019, developing the android, according to The Guardian. She is named after Ada Lovelace, the pioneering English mathematician who is considered one of the first computer programmers.
In the past, Ai-Da’s work consisted of abstract paintings based on complex mathematical models, and her first exhibition raised over $1 million in art sales, according to Artnet. She has even given her very own TEDx Talk. But now Ai-Da has created what are believed to be the first self-portraits made by a machine. Three of these robot selfies went on display at the Design Museum on May 18 in an exhibition titled “Ai-Da: Portrait of the Robot,” which is free to the public and will remain on display until Aug. 29.
“These images are meant to unsettle,” Aidan Meller, the gallery owner behind the creation of Ai-Da, told The Guardian. “They are meant to raise questions about where we are going. What is our human role if so much can be replicated through technology?”
Ai-Da’s new self-portraits are the product of constantly updated AI, inbuilt programming and advanced robotics. The eyes are actually cameras that allow the robot to “look” at what she is painting or sculpting — in this case, herself — and replicate it. The robotic arms are controlled by the AI, which produces realistic portraits while incorporating techniques and color schemes drawn from works by human artists that have been uploaded into the system.
Ai-Da did not decide to create the self-portraits; rather, her creators gave those instructions. Indeed, Ai-Da is not self-aware, feeling or conscious, but the accomplishment is still an example of just how far AI and robotics have come and where they could go in the future, according to Meller.
The timing of the exhibition during the COVID-19 pandemic is also extremely relevant, Priya Khanchandani, head of curatorial at the Design Museum, told The Guardian. “Over the last year, we’ve all had such an intimate relationship with technology, so it is a really good time to reflect on that and critically ask questions of it.”
Artist or artwork?
Although Ai-Da is often labeled as “the android artist” whose paintings and sculptures are considered art, her very existence and persona are also considered artwork. But where does human influence in the form of programming end and Ai-Da’s AI begin? This question has led to controversial and thought-provoking discussions, Ai-Da’s creators said.
“Some people think she is the worst thing ever and feel threatened, and some are really excited,” Meller told The Guardian. “Her very existence is wrong somehow, and we are aware of that.”
Ai-Da also questions a long-standing belief that art is a fundamentally human concept, even though the AI was created and programmed by humans. “I enjoy being someone who makes people think,” Ai-Da told the BBC in an exclusive interview. “I think that art needs more than just the drawing of something; it means communicating something in a way that is relatable.”
Ai-Da’s creators hope that her existence will make us think more about the role of technology, in particular AI, in our everyday lives.
“If Ai-Da does just one important thing, it would be to get us considering the blurring in human-machine relations,” Lucy Seal, project researcher for Ai-Da, told BBC Science Focus magazine, “and encouraging us to think more carefully and more slowly about the choices we make for our future.”
written by Harry Baker – a trainee news writer at Live Science, based in the U.K. He studied Marine Biology at the University of Exeter (Cornwall Campus). After graduating, he created his own blog site “Marine Madness,” where he writes about the weird and wonderful creatures of our oceans and the issues they face in a changing world. He is also interested in evolution, climate change, space exploration and environmental conservation. When not at work he can be found watching sci-fi or reading about octopuses. You can follow him on Twitter @harryjpbaker
Originally published on Live Science.