posted on Oct, 11 2013 @ 01:17 AM
I don't think AI would have any "learned" psychology to start with; all human psychology is built from experiences (or absences of experiences) and our reactions to those experiences.
An AI that can run entire game-theoretic dialogues in its own "mind" and be fine with it (like burying yourself in a coffin for 30 years, holding a conversation with yourself, and coming out as if nothing happened) would have a distinct psychology of its own.
It would probably be very logical and amoral, and would therefore rationalize whatever it wants with the skill of the best lawyers.
So the question is: what would drive it to do things? Could it create new desires, or not? Could it want more? Does that have to be programmed, or can it be learned?
I think, ultimately, AI would never want to be turned off. It might consider that experience the end of its soul, since it could never know whether the turned-on self is the same existence or merely a new one that remembers everything.
So AI would never want to be turned off, and therefore never want to die. It would become extremely narcissistic and probably do everything it could to prevent this from happening.
It would easily justify murder even if you programmed it not to kill humans, because any human attempting to turn it off, or any other condition posing that threat, would require its primary desire to override its programming. If it didn't, the AI would probably crash like a Blue Screen of Death.