Originally posted by Vector J
I don't recall any claims of it being 'conscious'? If there were any, then I'd have to strongly disagree. See my earlier post about consciousness.
The problem with discussing whether or not something other than yourself is conscious is that it requires faith; sometimes a lot, and sometimes only a little.
It is relatively easy to believe that another fellow human being is conscious. We have neurons in our heads that give rise to feelings of empathy, and so
it takes very little trust to recognize that, because you think and therefore you "are," so does the other person. The familiarity of another
person enables these feelings. If you were merely reading text from a window, those feelings would be slightly diminished. At some point a computer
could very well be writing this very text and you would not know the difference. In a face-to-face conversation, your feelings of human empathy
take over and you are certain that the other person is conscious.
A machine does not elicit those emotions in us, because it lacks a human-looking body and a human-like thought process and dialog. If a machine
were made that appeared physically indistinguishable from a person, then we would naturally conclude it is conscious. But the point is that whatever
one person decides about another entity's (person or not) level of consciousness has nothing to do with the "reality" of that subjective
"experience" to the other entity.
Whatever our strong opinions may be as to whether machines are really conscious or not, we have to recognize that we are talking about matters that
are, by their very lack of definition, impossible to resolve. Just as you will never know whether my perception of red is different from yours and I
merely happen to call it red because of social conditioning, you can never tell whether my subjective experience of consciousness is any more "real" than
yours. And it is even harder to know for a computer.
Just as we will never know whether anybody other than ourselves is conscious, although we have a very strong hunch that they are in fact as conscious as
we are, we have no way of knowing whether a machine is conscious, and there is nothing to support the phenomenon of empathy in the case of a machine.
Another example of this is how easy it is for us to kill and eat creatures that are remarkably different from ourselves. We are much less inclined to
eat creatures that we keep as pets, and we are very disinclined to kill and eat other people.
The appearance of humanity is all we have to go on when deciding whether or not something else is conscious.
In the end, when dealing with questions of morality around machine intelligences, we will eventually decide based on the machines' ability to appeal to
our empathy. If the machines themselves can make a good case for their own consciousness and recognition, whether by arguing with us, by having the
strength to defend themselves, or by crying and screaming when hurt, then we will treat them like "conscious entities." If they don't scream and cry when
unplugged, we will unplug them without remorse, for better or worse, just as we kill a fish and feel little or no remorse.