New research on human-robot interactions shows that we feel a deep connection with our mechanical friends.

For a soldier on Explosive Ordnance Disposal (EOD), a dog isn’t man’s best friend—a robot is. In one of the highest-pressure jobs in the armed forces, EOD personnel must approach and disarm improvised bombs in the field. Remote-controlled robots allow soldiers to work on the bomb at a distance, protecting people from harm and saving lives. But how do the soldiers feel about their mechanical helpers?

To find out, Dr. Julie Carpenter at the University of Washington interviewed 23 EOD personnel who worked with robots regularly. She found that the soldiers often anthropomorphized the robots, giving them human traits and shifting fluidly between describing a robot as a friend or pet and describing it as a useful tool.

“They are very clear that the robot is a tool with capabilities and limitations,” Carpenter said in an interview with Healthline. “They view it as something that’s not organic, not human. But they sometimes interact with it in ways that are organic, are human-like, even as an extension of themselves.”

The soldiers would often refer to a robot as "he" or "she," naming it after a beloved pet, friend, or spouse, and some even dressed up their robots. "Of course, there's an element of humor to it," Carpenter added.

Although the soldiers seemed to cherish their robots, they didn’t hesitate to send the robots into danger. It is, after all, their job. “I want to stress that these are highly trained professionals, and they’re very good at focusing on the task at hand,” Carpenter said. “They’re very good at taking in a lot of important information at once and making decisions about the safety of themselves and those around them.”

However, in less disciplined settings, a human's feelings for a robot could very well influence how he or she interacts with the device, for better or worse. As technology advances to make robots seem more lifelike, humans are being presented with a never-before-seen challenge: how to treat machines that appear to have feelings. "It's a new way of interacting with something," Carpenter said.

To investigate these interactions, social psychologist and computer scientist Dr. Astrid M. Rosenthal-von der Pütten at the University of Duisburg-Essen in Germany teamed up with social psychologist Dr. Nicole Krämer and cognitive psychologist Dr. Matthias Brand. They tested how a group of people would empathize with Pleo, a robotic dinosaur pet that can walk, play, sing, and sleep. Notably, Pleo also cries and appears to suffer when it is mistreated.

The researchers showed subjects a series of videos in which an experimenter interacted with either a woman in a green T-shirt, a green box, or the green Pleo robot, and measured the subjects' physiological responses. First, the experimenter behaved affectionately towards the woman, the box, and the robot. Then, the subjects saw videos of the experimenter mistreating the box and the robot, as well as a doctored video that made it appear the woman was being mistreated, to simulate the effects of observing torture.

Unsurprisingly, watching the "torture" videos conjured unpleasant feelings in the subjects, including physical signs like an elevated heart rate. Although they felt the most concern for the woman in the video, subjects still felt distress when watching Pleo being mistreated, and their physiological responses to the robot's mistreatment were remarkably similar to their responses to the woman's.

Humans aren't the only ones that can respond to a robot's social cues. In another study, Dr. Gabriella Lakatos, a research fellow at Eotvos Lorand University in Hungary, and her colleagues teamed up with researchers at the Wroclaw University of Technology in Poland to test robots on another audience: dogs.

Rightfully dubbed man's best friend, dogs have co-evolved with humans for the past 50,000 years, to the point that they outperform even our cousins, the chimpanzees, at human social tasks, such as responding to tone of voice, making eye contact, and following a pointed finger to an object. This makes them an ideal group in which to test empathy with robots, since unlike humans' feelings, dogs' feelings towards robots aren't biased by any cultural influences, Lakatos said.

In the study, dogs watched a human experimenter either approach a robot and type on its console for a while or spend time walking around the room alongside the robot, "chatting" with it, touching it, and generally interacting with it as if it were human. Then, either the robot or the human would point at food hidden in the room. When the dog had observed the human interacting socially with the robot, it was much more likely to follow the robot's pointing gesture to the food, and it spent more time watching the robot's "head."

“Social robotics is a very dynamically developing field,” Lakatos said. “It can be assumed that socially interactive robots may integrate into human society in the future, as they can fulfill a variety of purposes (e.g., as household assistants or as therapeutic aids).”

A robot that can interact socially might feel more friendly and accessible. As healthcare costs continue to rise, robots that can assist individuals with at-home care and rehabilitation will find their way into homes and hospitals.

“The robot’s role and shape is going to change over time,” said Carpenter. “Is [empathy toward robots] something we want to increase or decrease? How does this affection affect decision-making?”

Rosenthal-von der Pütten shares these concerns. In her paper, she asks, "Is it justifiable from an ethical stance to build a robot that the user feels sorry for when it is switched off? Is it appropriate to design a robot that is so engaging that people become emotionally attached to it, forming a relationship that is comparable to a human-human relationship?"