The dancer, seated under a blue light, rises and walks toward the stage, gazing at their partner in the spotlight: a long, sleek robotic arm. As the two dance together, the device's fluid movements make it look less mechanical, and the researchers hope those movements will read as engaging and believable to the audience.
“When we move one joint, we see movement in other parts of the body,” says Amit Rogel, a graduate researcher in music technology at the Georgia Institute of Technology. In other words, the rest of the body follows that joint. “This coupling between body parts can be seen in animals as well, and it is really what makes human dance look beautiful and natural to our eyes.” Rogel programmed this subtle coupling between parts into the robotic arms. The dance between robots designed by Georgia Tech researchers and dancers from Kennesaw State University was the result of a collaborative project whose performance took place last month.
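Rogel's description of joint coupling can be pictured with a short sketch. The Python snippet below is purely illustrative and not the team's code: it assumes a simple chain of joints and lets a fraction of each commanded motion "leak" into the neighboring joints, the kind of secondary motion that makes a gesture read as connected rather than mechanical.

```python
import numpy as np

def couple_joints(commanded, coupling=0.3):
    """Propagate a commanded joint offset to neighboring joints.

    commanded: array of joint-angle offsets (radians), one per joint.
    coupling:  illustrative fraction of each joint's motion that 'leaks'
               into adjacent joints, mimicking the secondary motion that
               makes biological movement look connected.
    """
    result = commanded.astype(float)
    # Each joint picks up a damped echo of its neighbors' motion.
    result[1:] += coupling * commanded[:-1]   # echo from the previous joint
    result[:-1] += coupling * commanded[1:]   # echo from the next joint
    return result

# Example: only the middle joint is commanded to move, but the coupled
# output also nudges the joints on either side of it.
offsets = np.array([0.0, 0.0, 0.5, 0.0, 0.0])
print(couple_joints(offsets))   # [0.   0.15 0.5  0.15 0.  ]
```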
The goal of the project was not only to create a memorable collaboration between humans and robots but also to build trust between them. Robots are already widespread in our lives, and the number of collaborative robots, which work alongside humans on tasks such as maintenance in car factories and the inspection of manufactured equipment, is projected to grow significantly in the coming years. Yet even as this collaboration increases, human trust in robots remains low, which makes people reluctant to work with them. “People may not understand how robots work,” says Harold Soh, a computer scientist at the National University of Singapore, “and so they do not know exactly what a robot will do.” Soh was not involved in the project, but his research focuses on understanding human-robot interaction and on developing robot behavior that people can trust.
A friendly relationship between robots and humans, despite an unusual appearance
Although humans adore charming fictional machines such as WALL-E or R2-D2, the robots that do the most work for us rarely look as friendly or appealing as those fantasy machines. “When a robot looks and behaves completely differently from a human, it becomes difficult for a person to trust it,” says Soh. Even so, scientists can design a disembodied robotic arm whose motion comes much closer to a human's. “Communicating emotions and social signals through a combination of sound and movement is a compelling approach that can strengthen the interactions between us and robots,” he explains.
That is why the Georgia Tech team decided to program non-humanoid machines, robots that do not look like people, to convey emotion through sound and motion. The researchers have studied and experimented toward this goal for years. For example, to figure out which sounds best convey particular emotions, they asked singers and guitarists to consult a diagram called an emotion wheel during improvised performances, choosing a feeling at set intervals and then adapting their playing or singing to match it. From the data collected, the researchers developed a machine-learning model, one they intended to embed in the robots, to select the machines' next sounds. In effect, they wanted the robots to be able to produce a wide range of sounds, some more complex than others. “You can say you want it to be a little bit happy, excited, or even relaxed, and then you see how the robot arm moves,” says Gil Weinberg, a collaborator on the project and director of the Georgia Tech Center for Music Technology.
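The article does not describe the model's internals, so the sketch below is only a hypothetical illustration of the general idea: a position on an emotion wheel (valence and arousal) is translated into sound and motion parameters a controller could consume. The emotion coordinates, parameter names, and formulas are assumptions, not the Georgia Tech team's design.

```python
import math

# Hypothetical emotion wheel: each labeled emotion is a point in
# valence/arousal space (both in [-1, 1]). The coordinates below are
# illustrative placeholders, not values from the project.
EMOTION_WHEEL = {
    "happy":   (0.8, 0.6),
    "excited": (0.6, 0.9),
    "relaxed": (0.5, -0.6),
    "sad":     (-0.7, -0.5),
}

def emotion_to_parameters(emotion, intensity=1.0):
    """Map a named emotion (scaled by intensity) to simple sound and
    motion parameters for a robotic arm controller."""
    valence, arousal = EMOTION_WHEEL[emotion]
    valence *= intensity
    arousal *= intensity
    return {
        # Higher arousal -> faster, larger gestures.
        "gesture_speed": 0.5 + 0.5 * arousal,           # normalized 0..1
        "gesture_amplitude": 0.4 + 0.4 * abs(arousal),  # normalized 0..1
        # Positive valence -> higher pitch and smoother trajectories.
        "pitch_hz": 220.0 * math.pow(2.0, valence),     # up to one octave up or down
        "smoothness": 0.5 + 0.5 * valence,              # normalized 0..1
    }

# "A little bit happy," as in Weinberg's example.
print(emotion_to_parameters("happy", intensity=0.3))
```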
The team then worked to pair this palette of sounds with the robots' movements. Last year the researchers showed that combining motion with emotion-based sound improved people's trust in robotic arms in a virtual environment (a format necessitated in part by the COVID-19 pandemic).
The experiment initially required the robots to perform only four different movements to convey four different emotions. To expand the machines' emotional repertoire in more recent studies, which have been conditionally accepted for publication in the journal Frontiers in Robotics and AI, Rogel began researching human body language. “For each element of body language, I thought about how to adapt it to robotic movement,” he says. He then invited Kennesaw State University dancers to help refine the motions by comparing their own body behavior with that of the robots. Because the performers meant to convey emotion through body language, Rogel and his colleagues filmed them so they could later develop algorithms the robots could adopt. “Can you make the robots breathe?” asked Ivan Pulinkala, a dance professor at Kennesaw State University. A few weeks later, the arms were equipped with a kind of inhale-and-exhale motion.
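The article does not say how that breathing motion was implemented. One simple way to approximate it, sketched below purely as an assumption, is to layer a slow sinusoidal offset onto a joint angle so that an otherwise idle arm appears to inhale and exhale; the function name, period, and amplitude are illustrative.

```python
import math

def breathing_offset(t, period=4.0, amplitude=0.05):
    """Slow sinusoidal 'inhale/exhale' offset (radians) to add to a joint
    angle so an idle arm still appears to breathe.

    t:         elapsed time in seconds
    period:    seconds per full breath cycle
    amplitude: peak joint-angle deviation in radians
    """
    return amplitude * math.sin(2.0 * math.pi * t / period)

# Sample one breath cycle at one-second intervals to see the rise and fall.
for t in range(5):
    print(f"t={t}s  offset={breathing_offset(t):+.3f} rad")
```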
Pulinkala, who served as the dance director for the performance, was responsible for coordinating the emotion-based movements and sounds during the dance. “At first I was reluctant to take on this responsibility,” he says. “My approach was to somehow evoke a sense of life inside the robots, to make them seem less mechanical to the viewer. I was really looking at how to increase the robots' emotional power. How should a dancer respond to a mechanical object in a way that creates feeling and draws the viewer in?”
According to the dancers, the result was robots that seem slightly more human than their predecessors.
Christina Massad, a freelance professional dancer and dance graduate of Kennesaw State University, recalls that at first she assumed the dancers would simply be moving around the robots, not that the robots would accompany them in the dance. “As soon as I saw the robots moving fluidly, I changed my mind and saw that these robots were more than just machines,” she says. In one of the first rehearsals, she bumped into one of the robots and reflexively said, “I'm sorry.” To her surprise, the robot seemed to laugh and replied, “No problem. I am just a robot.” “But that reaction went beyond the behavior of a mere machine,” Massad continues.
Soh believes the performance on stage was compelling and could help increase the sense of trust between humans and robots. “It is difficult to build and maintain trust in teams of humans and robots, but we are trying to make a difference,” he says.