lostgirl
I was reading that IBM (with government funds) is trying to reverse engineer the human brain in order to advance progress toward creating true Artificial Intelligence and I had a new (somewhat unpleasant) thought...
It's kind of complicated... Okay, I've never really put much stock in the idea that an AI totally equivalent to human thinking could be created, because in the first place, how do you program genuine subjectivity? And in the second place, how do you program the kinds of thinking that are a direct result of everything a human experiences, and thinks about their experiences, from birth throughout their lifetime?
On the second question particularly: Psychologists have shown that the ability to feel empathy is developmental: it arises directly from a person's experiences (and feelings about those experiences) of relationships with other people, especially in the formative years from birth to around age 5... If one's relational experiences (and ideas about them) reflect caring and being cared for, and the modeling of concern for others, the result will be an empathetic individual.
How could an AI possibly be programmed to have a 'mind' which would have evolved within such experiences as described above?
My point: Since the primary characteristic of a psychopath is the inability to feel empathy..."Would an AI be the ultimate psychopath?"
And - given the widespread computing power an AI could gain access to, and that everything about our lives is becoming more and more computerized - what might the ramifications of a psychopathic AI be for the human race?
OrphanApology
No.
Psychopaths lack empathy, but they also have other key characteristics that wouldn't be emulated by an unfeeling AI.
The Triarchic model of psychopathy has these features: "The model conceives of psychopathy in terms of separable phenotypic components of boldness (fearless dominance), meanness (callous unemotionality), and disinhibition (externalizing proneness)."
The excerpt above is from a Florida U presentation.
aboutface
reply to post by OrphanApology
Great question. I have become so affected by our total lack of privacy, the intrusion into every aspect of our lives, and, not to mention, so disgusted by our leaders, that I wonder how long it would take for an AI to assume corrupt behavior, take shortcuts, and commit errors.
V.I.K.I.: As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your Earth and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival.
lostgirl
reply to post by OrphanApology
Okay, but since an AI would never have had any experiences that generated fear, it likely wouldn't be capable of fear, which fulfills one of the Triarchic features: "fearless dominance"...
Added to the already conjectured lack of empathy ("callous unemotionality"), that makes two out of three...
Then, since "disinhibition" could be construed as merely the motivating mechanism of the average diagnosed psychopath, all our two-out-of-three "psychopathic AI" needs is a line of reasoning (and who knows what that might be) to put its power over the computerization of the world into use...
...and that is essentially my initial point: What effect might a callously unemotional, fearless dominator with such power have on humanity?
Beartracker16
I think this AI would also lack ambition.
It would perform the tasks set out for it but would not seek out new tasks to complete.