Blog #6: Crossed Circuits

At Least We Care

In a 2013 study, a team sought to find out whether humans show empathy toward robots. Amar Toor’s article, “The Robots Are Coming, but Will We Love Them?,” recaps that study. Participants watched four videos: a human coddling a robot, a human coddling another human, a human mistreating a robot, and a human mistreating another human.

Third Video. The affect is strange.

The results showed that the participants produced the same neurological patterns for each similar pair of scenarios. That is, they empathized with the robot when it was being mistreated much the same way they did with the human subject.

To what degree was the empathy similar? Who knows. But the test showed that we, or at least those tested, feel some empathy, however little, toward robots. But why feel any empathy at all? It seems to be one of those psychological phenomena, or neurological tics, that humans have.

“It can only be attributable to human error” – HAL 9000

I highly doubt that our empathy readings spike when Leonard from sales loses it and smashes his computer to bits (maybe we empathize with him, but not with the computer). I think this tic has a lot to do with our ability to transpose ourselves onto robots, especially the ones that resemble us, even a little. If it moves like us and speaks our language, both verbal and nonverbal, then we may see ourselves in it, as if some ancient hex were placed on us so that we can’t distinguish the line separating ourselves from others like us. The same goes for almost anything that resembles us: dolls, action figures, and so on. Those examples may be geared toward children’s fancies, but the fact remains.

If the empathy test were done on something “smart,” would we react the same way? A robot vacuum (the kind shaped like a CD player) being smashed repeatedly and violently with a mallet may not elicit the same readings as watching George Jetson bludgeon Rosie the Robot until her circuit board was showing. The two objects perform essentially the same task, cleaning, but Rosie has human characteristics and the robot vacuum doesn’t. I don’t think we have a choice about whether we show empathy to the former; as for the latter, we’ll just buy a new one.

Closing The Gap

Although Rosie the Robot may be an automaton clear of the “uncanny valley,” we wouldn’t sacrifice our dog for her.

Uncanny Valley in a single image

While robots and androids fake human consciousness and characteristics, plants and animals actually have a consciousness, or are thought to, because they operate, and have been operating, without human actors. A parrot repeats whatever we tell it, but its mimicry is different from Siri’s. A dog sits because it has background knowledge; a robotic dog, like the Poo-Chi, sits because it was programmed to react to that command.

I don’t think we feel real empathy toward robots and “smart” objects. I think it’s just our brains firing unauthorized signals, cluing us in on some primordial alert we haven’t yet evolved past. We can hold full conversations with Watson, HAL 9000, GERTY, the Iron Giant, or any other real or imagined object of human imitation, but, still, it would be like talking to a wall in an empty room.


Mufson, Beckett. “Could You Empathize With a Robot?” The Creators Project. The Creators Project, 25 July 2014. Web. 03 Oct. 2014.

Toor, Amar. “The Robots Are Coming, but Will We Love Them?” The Verge. The Verge, 26 Apr. 2013. Web. 03 Oct. 2014.

Wolchover, Natalie. “Why CGI Humans Are Creepy, and What Scientists Are Doing about It.” LiveScience. TechMedia Network, 18 Oct. 2011. Web. 03 Oct. 2014.


2 thoughts on “Blog #6: Crossed Circuits”

  1. I tend to agree with your analysis that we don’t really show empathy towards robots. One of the foundational principles of the philosopher David Hume’s ethical position is that rational beings don’t owe any form of justice to beings that lack the ability to do rational beings harm. While I’m sure many people wouldn’t admit to this, many people act on this premise. If, in the experiment you cited, humans had interacted with a normal robot and a robot with a gun for an arm, I guarantee much more empathy would be shown for the robot with a gun for an arm. So I agree, we don’t show much empathy towards robots or other smart things, but that would quickly change if they all had guns for arms.

  2. I love your post! You stressed a very interesting point with compelling examples. When I watched the video, I actually felt empathy towards the robot dinosaur that was being mistreated, but I couldn’t think of a reason why I felt empathy towards a nonliving thing. The only explanation I came up with, which is also the one you suggest in your post, was that the robot in the video looked like a puppy, so I attributed human characteristics to it. That is why I reacted the same way I would if somebody mistreated an animal before my eyes. It makes no sense on a rational level, but it does on a more emotional level. I also like your approach towards smart things: it’s very original, and definitely something I had never asked myself about. If smart things like robots and computers have no feelings and never will, why are we compelled to feel something? The idea of a primordial instinct that we are not able to control might be a possible answer. This instinct allows us to respond to and interact with what we believe to be a “smart object.”
