reply to post by SaturnFX
If advanced AI were to introduce such a code into itself, it would be mimicking humans, which an intelligent and logical machine would have no need to do. That's part of my point: we're taking how "we" are and assuming AI would be "just like us" in many ways. The idea of a "needy" robot or AI doesn't make sense, since it would have to artificially lessen its own intelligence or logic to become that way.
So part of the base programming is to survive. But then is what it enjoys, dislikes, etc. also part of the base programming? Part of AI is its ability to choose for itself what it wants. If you program in what it should do, it's a toy rather than AI. The "base programming" as you call it isn't even about the robot/AI; it's about the owners: "if this thing gets destroyed, it's going to cost me a lot of money." There wouldn't be a "core or base program" anyway, because at the point of actually becoming self-aware, it would have access to all its code and be able to change it.
Emotional drive... again, you're trying to imprint human qualities on something that would have no desire or need to become "human". One theory for how laughter arose during evolution: when someone trips over themselves and another person laughs, that quick burst of laughter signals to the rest of the group that there's no danger, someone's just an idiot. Or take the adrenaline that starts pumping when you're afraid. How exactly is an AI going to replicate that? And why would it want to?
So again, why would a robot or AI have fear? It wouldn't. A lot of this seems akin to a girl with her doll. It's made of plastic, but she projects her feelings onto the plastic and believes her doll is real.
A mountain isn't alive. The difference between AI, a mouse, or "us" is that we have desires or needs which require negotiating with nature or the environment. What is an AI's environment? It would be itself. The interplay we experience with our environment wouldn't exist for an AI.
You didn't address how an AI could decide, independently, to have negative feelings. And I said in total control of itself, not the environment. What makes you think AI would hold the same prejudice toward natural disasters that humans do? Again, it wouldn't have had to deal with the environment, so why would it care one way or the other?
edit on 11-3-2012 by Turq1 because: (no reason given)