The Sword and Laser discussion

This topic is about Do Androids Dream of Electric Sheep?
2014 Reads > DADOES: Empathy

message 1: by Dwayne (new) Oct 31, 2014 11:44AM


Quite honestly, I wouldn't worry myself about that.


But on second "search", maybe not too difficult. This is pretty interesting:
http://www.ai.mit.edu/projects/humano...
(it's amazing what people have posted on the web and how easy it is to find now)

I think from a 2014 point-of-view, it might be hard to believe but not for a 1965 reader. Asimov's I, Robot came out in 1950.
Interestingly, earlier this year, there was a winner for one kind of Turing test:
http://www.bbc.com/news/technology-27...
message 6: by Tassie Dave, S&L Historian (last edited Nov 02, 2014 04:27PM) (new) - rated it 4 stars
Alex wrote: "Interestingly, earlier this year, there was a winner for one kind of Turing test:"
A 33% success rate was always considered a failure when I was at school :-?
I think it is more a sad indictment of the 10 people who were fooled than a success for the programmers.
The fact that the programme was pretending to be a 13-year-old conversing in a foreign language would subconsciously sway humans to forgive quirks in its grammar.


There was a lot of 20th-century philosophy and thought dealing with how industrialized society was dehumanizing people and eroding empathy. I think in a lot of SF of this era, robots and AI are used as a metaphor for the alienating effects of industrialization, and not necessarily as a warning against them specifically.


All of this is to say that, in this novel, it seems it's the humans who act like programmable robots, and they misunderstand something basic about the androids they produce: it's not that the androids don't feel empathy (Robowoman is not at all a fan of using the skin of human babies as a trade good, even if she shows her revulsion a fraction of a second too late); it's that the empathy they feel is different from, and expressed differently from, the weird, twisted things that the remaining humans on Earth feel. But because they're human, the human beings just assume this means the androids can't actually feel empathy.
Does that make sense?

However, if empathy is part of a spiritual nature, then it probably can never be programmed. And if it can't be programmed, it seems to follow that there will likely be a way to test for it.

The human brain IS programmable with empathy. I'm of the opinion that our definition and level of empathy is very much a learned response. Yes, there's an innate biological component that allows us to learn it, but without sharing experience with those around us and learning through cause and effect, the human capacity for empathy can be stunted and weak.
At the same time, sociopaths are considered largely incapable both of empathy and of fully understanding and determining cause and effect without having a template or rule for that experience. That alone makes me think that at least the semblance of empathy can be programmed, even if we can't do it now.
Still, having read Do Androids Dream of Electric Sheep a few times, I think the entire book is (as is typical for Philip K. Dick) a deconstruction of what it is that makes us human. Even the humans in the story don't empathize in what we would consider a "normal" way, often relying on artificial external stimulus to enable what should be a standard human response.

Regardless of this, I have a counterpoint: maybe empathy truly can't be achieved by a non-human or artificial intelligence. Perhaps empathy is a trait associated with the controversial concept of the "soul". I won't really get into my opinion of that, because I don't want to say anything too taboo, but as a Christian I have often viewed the subject in that light as an explanation.

I have to say that the more I think about how empathy is handled in this book, the more unsettled I become.
First, the importance placed on empathy as the defining characteristic of humanity seems impossibly naive to me. If historical atrocities like the treatment of indigenous cultures, chattel slavery, or the Holocaust have taught us anything, it's that empathy is hardly permanent, and far from universal. It seems relatively common to simply decide that my empathy for other humans only stretches so far, only to a certain group. Only to others like me.
The Holocaust, for example, supplies us with many examples of German men and women who were considered the peak of culture and breeding in the so-called civilized world, who set themselves to serve their country by systematically annihilating Jews, Roma and other "undesirables" in death camps, while continuing to pursue their interests in literature, music, philanthropy, vegetarianism and other "enlightened" pursuits.
Back to the book, this brings me to another disturbing point. Compare for a moment the attitude most characters have towards animals with the one they have for the "specials". They consider the "specials" to be degenerate, to be hardly human due to their genetic differences. And yet they treat animals, a different species entirely, with reverence.

Whereas many authors write without implicitly targeting a message or point, Dick always built his books around a theme and a reflection of society or humanity as a whole. He liked to dig into our deepest and darkest points and assumptions about ourselves and shine a light on them.

Barak, I never thought about it in that light, so thank you. It is an interesting idea to ponder, and now that you bring it to mind, I guess that is the thought behind Deckard questioning why the test has never taken into consideration feelings for other androids. He says many times during the book that androids don't feel anything for each other, but this isn't something that is ever proven. Perhaps this is because it would be harder to kill the androids if the bounty hunters thought they had feelings like everybody else. Actually, it is when Deckard starts questioning this that he starts doubting himself, and at that point he seems to conclude more firmly that androids do not care about one another, perhaps a defensive reaction?
Actually, this reminds me of a truly horrible thing a person I once knew said. A local Pakistani family had been killed in a fire and he said he was glad. "What?" he protested against my look of disgust. "They don't feel anything."

http://www.huffingtonpost.com/2014/11...
People who are happy first thing in the morning lack empathy.
Therefore they're "skin jobs" and need to be retired.
Works for me. ;-}