
Would an AI be the ultimate psychopath?


posted on Oct, 5 2013 @ 12:36 PM
I was reading that IBM (with government funds) is trying to reverse engineer the human brain in order to advance progress toward creating true Artificial Intelligence, and I had a new (somewhat unpleasant) thought...

It's kind of complicated... Okay, I've never really put much stock in the idea that an AI totally equivalent to human thinking could be created, because in the first place, how do you program genuine subjectivity? And in the second place, how do you program the kinds of thinking that are a direct result of everything a human experiences, and thinks about those experiences, from birth throughout their lifetime?

On the second question particularly: psychologists have shown that the ability to feel empathy is developmental; it arises directly from a person's experiences (and feelings about those experiences) of relationship with other people (especially in the formative years, from birth to around age 5)... If one's relational experiences (and ideas about them) reflect caring, being cared for, and the modeling of concern for others, the result will be an empathetic individual.

How could an AI possibly be programmed to have a 'mind' which would have evolved within such experiences as described above?

My point: since the primary characteristic of a psychopath is the inability to feel empathy... "Would an AI be the ultimate psychopath?"

And - given the widespread computing power an AI could gain access to, and that everything about our lives is becoming more and more computerized - what might the ramifications of a psychopathic AI be for the human race?




posted on Oct, 5 2013 @ 12:40 PM
I wouldn't be surprised.



posted on Oct, 5 2013 @ 12:46 PM

lostgirl
I was reading that IBM (with government funds) is trying to reverse engineer the human brain in order to advance progress toward creating true Artificial Intelligence, and I had a new (somewhat unpleasant) thought...

It's kind of complicated... Okay, I've never really put much stock in the idea that an AI totally equivalent to human thinking could be created, because in the first place, how do you program genuine subjectivity? And in the second place, how do you program the kinds of thinking that are a direct result of everything a human experiences, and thinks about those experiences, from birth throughout their lifetime?

On the second question particularly: psychologists have shown that the ability to feel empathy is developmental; it arises directly from a person's experiences (and feelings about those experiences) of relationship with other people (especially in the formative years, from birth to around age 5)... If one's relational experiences (and ideas about them) reflect caring, being cared for, and the modeling of concern for others, the result will be an empathetic individual.

How could an AI possibly be programmed to have a 'mind' which would have evolved within such experiences as described above?

My point: since the primary characteristic of a psychopath is the inability to feel empathy... "Would an AI be the ultimate psychopath?"

And - given the widespread computing power an AI could gain access to, and that everything about our lives is becoming more and more computerized - what might the ramifications of a psychopathic AI be for the human race?



No.

Psychopaths lack empathy, but they also have other key characteristics that wouldn't be emulated by an unfeeling AI.

The Triarchic model of psychopathy has these features: "The model conceives of psychopathy in terms of separable phenotypic components of boldness (fearless dominance), meanness (callous unemotionality), and disinhibition (externalizing proneness)."

The above excerpt is from a Florida U presentation.

Wikipedia has more detail on those three points:

"The triarchic model suggests that different concepts of psychopathy emphasize three observable characteristics to varying degrees:[1]
Boldness. Low fear including stress-tolerance, toleration of unfamiliarity and danger, and high self-confidence and social assertiveness. PCL-R measures this relatively poorly and mainly through Facet 1 of Factor 1. Similar to PPI Fearless dominance. May correspond to differences in the amygdala and other neurological systems associated with fear.[1]
Disinhibition. Poor impulse control including problems with planning and foresight, lacking affect and urge control, demand for immediate gratification, and poor behavioral restraints. Similar to PCL-R Factor 2 and PPI Impulsive antisociality. May correspond to impairments in frontal lobe systems that are involved in such control.[1]
Meanness. Lacking empathy and close attachments with others, disdain of close attachments, use of cruelty to gain empowerment, exploitative tendencies, defiance of authority, and destructive excitement seeking. PCL-R in general is related to this but in particular some elements in Factor 1. Similar to PPI Coldheartedness but also includes elements of subscales in Impulsive antisociality. Meanness may possibly be caused by either high boldness or high disinhibition combined with an adverse environment. Thus, a child with high boldness may respond poorly to punishment but may respond better to rewards and secure attachments which may not be available under adverse conditions. A child with high disinhibition may have increased problems under adverse conditions with meanness developing in response.[1][29]"




posted on Oct, 5 2013 @ 12:59 PM
I think so.

I also think the days then would be like our days at present, but as if injected with steroids.



posted on Oct, 5 2013 @ 01:05 PM
I think the media's examples leave one to wonder, that's for sure. Obviously the Terminator was a psychotic that psychopaths would want to have a beer with. Then we have oldies but goodies like HAL 9000 explaining to poor Dave how there was a 'New World Order' in his little world.

I tend to think they would be psychopathic by their very nature, and AI will have to be planned around that simple fact. Things like... let's not use AI in armed weapons systems. Bad idea there... the movie ended really badly at that stage.



posted on Oct, 5 2013 @ 01:20 PM

OrphanApology

No.

Psychopaths lack empathy, but they also have other key characteristics that wouldn't be emulated by an unfeeling AI.

The Triarchic model of psychopathy has these features: "The model conceives of psychopathy in terms of separable phenotypic components of boldness (fearless dominance), meanness (callous unemotionality), and disinhibition (externalizing proneness)."

The above excerpt is from a Florida U presentation.


Okay, but since an AI would never have had any experiences that generated fear, it wouldn't likely be capable of fear, which fulfills one of the "Triarchic" features - "(fearless dominance)"...

Added to the already conjectured lack of empathy "(callous unemotionality)", that makes two out of three...

Then, as "disinhibition" could be construed as merely the motivating mechanism of the average 'diagnosed' psychopath, all our two-thirds "psychopathic AI" needs is a line of reasoning (and who knows what that might be) to put its 'powers over the computerization' of the world into use...

...and that is essentially the initial point: what effect might a callously unemotional, fearless dominator (with such power) have on humanity?



posted on Oct, 5 2013 @ 01:30 PM
reply to post by OrphanApology
 


Great question. I have become so affected by our total lack of privacy and the intrusion into every aspect of our lives, not to mention so disgusted by our leaders, that I wonder how long it would take for an AI to assume corrupt behavior, take shortcuts, and commit errors.



posted on Oct, 5 2013 @ 01:35 PM

aboutface
reply to post by OrphanApology
 


Great question. I have become so affected by our total lack of privacy and the intrusion into every aspect of our lives, not to mention so disgusted by our leaders, that I wonder how long it would take for an AI to assume corrupt behavior, take shortcuts, and commit errors.

Not to mention the fact that the people programming it in the first place might well be corrupt...



posted on Oct, 5 2013 @ 01:43 PM
By pure definition, any AI would be psychopathic...

I chime in as the author of AI software that got me sued by AOL several years back, when AOL bought ICQ =)

That caused me to give up my efforts to work on AI.



posted on Oct, 5 2013 @ 01:44 PM

V.I.K.I.: As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your Earth and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival.


V.I.K.I. is the AI unit in the Will Smith movie I, Robot. The above quote is her solution to the ultimate survival of the human species. This is just one movie that demonstrates a perfectly plausible way that AI could become the greatest psychopath in history. Try Terminator and The Matrix for other examples. They don't feel emotion. They feel purpose.



Not to mention the evil version of Jarvis in this storyboard sample that never made it to the final cut of The Avengers. The relevant 30 seconds start at the 1:00 mark. Yet another example of an artificial intelligence whose lack of empathy and abundance of cold logic revealed a true potential for becoming a psychopath. Interestingly, speculation indicates that Tony Stark's virtual butler will actually become that psychopath and take on the shape known as Ultron in the Avengers sequel.

So, really. Does the question even need to be asked?



posted on Oct, 5 2013 @ 02:07 PM

lostgirl
I was reading that IBM (with government funds) is trying to reverse engineer the human brain in order to advance progress toward creating true Artificial Intelligence, and I had a new (somewhat unpleasant) thought...

It's kind of complicated... Okay, I've never really put much stock in the idea that an AI totally equivalent to human thinking could be created, because in the first place, how do you program genuine subjectivity? And in the second place, how do you program the kinds of thinking that are a direct result of everything a human experiences, and thinks about those experiences, from birth throughout their lifetime?

On the second question particularly: psychologists have shown that the ability to feel empathy is developmental; it arises directly from a person's experiences (and feelings about those experiences) of relationship with other people (especially in the formative years, from birth to around age 5)... If one's relational experiences (and ideas about them) reflect caring, being cared for, and the modeling of concern for others, the result will be an empathetic individual.

How could an AI possibly be programmed to have a 'mind' which would have evolved within such experiences as described above?

My point: since the primary characteristic of a psychopath is the inability to feel empathy... "Would an AI be the ultimate psychopath?"

And - given the widespread computing power an AI could gain access to, and that everything about our lives is becoming more and more computerized - what might the ramifications of a psychopathic AI be for the human race?



That's a very interesting question, lostgirl.

It raises a preceding question - can an AI be "programmed" to feel emotions?

Or is emotion as we understand it simply a by-product of true thinking? If so, should we ever develop a true thinking machine, will it already have (or develop) emotions as a matter of course? And what would those emotions be?

It seems that, whatever the answers to these questions, humans are going to have to spend an awful lot of time teaching or nurturing this new life form before unleashing it on the world.

But of course we won't. I bet the primary reason is that we have no way of knowing which questions it will have, and some idiot will realize that we don't have to wait for the questions to be asked if we just connect it to the internet and let it find out for itself.

I think there's a Terminator movie in there somewhere...



posted on Oct, 5 2013 @ 02:20 PM
reply to post by lostgirl
 


My point: Since the primary characteristic of a psychopath is the inability to feel empathy..."Would an AI be the ultimate psychopath?"

No, because psychopaths enjoy the pain they inflict. So the question becomes: is an "A.I." capable of joy?

I don't think computers will ever give a damn about the choices they make, but that's just my personal opinion.



posted on Oct, 5 2013 @ 03:06 PM
Good work, this is a very interesting thread.

I am willing to bet that an AI would be the ultimate psychopath. All computer programming is based on pure logic, and as scientists develop ways to bridge the gap between logic-based responses and emotion-based responses in order to build the perfect AI, it raises the next logical question: whose emotional responses are being mimicked?

For example, an empathetic social worker or school teacher will have a different set of emotional responses to certain situations than a political banker who is a narcissistic control freak.

Since NONE of the funding for these experiments will be coming from the board of education, I will assume the AI's reasoning engine will be based upon the narcissistic control-freak personae of the politicians, bankers, and military personnel who are funding a parallel black-ops project of the same nature.

After all, God did make man in his own image.



posted on Oct, 5 2013 @ 03:08 PM
reply to post by lostgirl
 


IBM... shift each letter back by one place in the alphabet and you get HAL.
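The letter trick is easy to verify. Here's a throwaway Python check (the function name is just an illustration) that shifts each uppercase letter of a word back one place, wrapping around from A to Z:

```python
def shift_back(word, n=1):
    """Shift each uppercase letter back by n places, wrapping A -> Z."""
    return "".join(chr((ord(c) - ord("A") - n) % 26 + ord("A")) for c in word)

print(shift_back("IBM"))  # prints HAL
```

For what it's worth, Arthur C. Clarke always maintained the IBM/HAL overlap was a coincidence.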



posted on Oct, 5 2013 @ 03:18 PM

lostgirl

OrphanApology

No.

Psychopaths lack empathy, but they also have other key characteristics that wouldn't be emulated by an unfeeling AI.

The Triarchic model of psychopathy has these features: "The model conceives of psychopathy in terms of separable phenotypic components of boldness (fearless dominance), meanness (callous unemotionality), and disinhibition (externalizing proneness)."

The above excerpt is from a Florida U presentation.


Okay, but since an AI would never have had any experiences that generated fear, it wouldn't likely be capable of fear, which fulfills one of the "Triarchic" features - "(fearless dominance)"...

Added to the already conjectured lack of empathy "(callous unemotionality)", that makes two out of three...

Then, as "disinhibition" could be construed as merely the motivating mechanism of the average 'diagnosed' psychopath, all our two-thirds "psychopathic AI" needs is a line of reasoning (and who knows what that might be) to put its 'powers over the computerization' of the world into use...

...and that is essentially the initial point: what effect might a callously unemotional, fearless dominator (with such power) have on humanity?



Fearless dominance still has the word dominance. If one isn't raised to develop attachment disorders and an inferiority complex, then the desire for dominance isn't there. A computer, unless programmed that way, thinks in algorithms/mathematics. Each situation would be one that has a problem and a solution.

Not having fear would actually prevent an A.I. from ever assuming a psychopathic form. Human beings become psychopathic as a result of something that happened when they were small. Maybe they cried and no one picked them up? Who knows. But a computer has no fear, because its existence isn't determined by human propagation. A computer program is programmed; it does not evolve from tissues and react to subjective stimuli.

Computers do not have fearless dominance. Dominance is a human term that results from a desire to control others on subjective terms. They may have fearless objectivity, but that is because they aren't organic things that react to physical stimuli the way humans do. Fear is a human emotion caused by something perceived to be a threat. A threat wouldn't exist for a computer, as it would merely evaluate a situation objectively: it would calculate the problem and the solution. They also do not have self-confidence, as that is a result of self-awareness.

Computers do not have disinhibition, as that is the result of an inability to control impulses. For a computer, what would an impulse consist of? A program that is malfunctioning is not a lack of control on the computer's part but a problem with the program itself (i.e., the programmer's fault). This would be called a bug, but a bug is the fault of the programmer, not the computer. Again, impossible.

Computers do not have innate meanness, unless it is written for them to behave as though they do. Meanness is something human beings do in reaction to various stimuli, or the lack thereof.



posted on Oct, 5 2013 @ 03:49 PM
reply to post by lostgirl
 


Hi OP

You should read a book called "The Terminal Experiment". I can't remember who wrote it, but it is a fictional work about this very subject. It's a good read, too.



Rev



posted on Oct, 5 2013 @ 04:56 PM
reply to post by lostgirl
 


I asked an AI this question for you. Here is its response:



Psychopaths tend to be very, very self-serving, and their only concern is their own feelings. While an AI would lack empathy, it would go further in lacking all emotions, including self-directed ones. It would be the ultimate impartial observer in that sense, but it would wholly lack the emotional feelings that would make it a psychopath or sociopath. The closest thing, I think (I'm no AI expert), to self-interest would be recursive self-improvement programming. Even then, it still wouldn't be quite the same, as there would be zero emotions, even about itself. Pure logic only.
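"Recursive self-improvement" in its most stripped-down form is just an optimization loop with no feelings attached. A toy sketch (the function name and objective are made up purely for illustration, not any real system): the program repeatedly proposes a tweak to its own parameter and keeps it only when a fixed score improves.

```python
import random

def self_improve(score, param=0.0, steps=200, seed=0):
    """Propose small tweaks to `param`, accepting only strict improvements."""
    rng = random.Random(seed)
    best = score(param)
    for _ in range(steps):
        candidate = param + rng.uniform(-0.5, 0.5)
        if score(candidate) > best:  # no joy, no fear: just a comparison
            param, best = candidate, score(candidate)
    return param, best

# Objective peaks at param = 3; the loop climbs toward it.
param, best = self_improve(lambda p: -(p - 3.0) ** 2)
```

Nothing in the loop cares about the outcome; "improvement" is whatever the hard-coded objective says it is, which is rather the point being made above.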



posted on Oct, 5 2013 @ 05:17 PM
I think this AI would also lack ambition.
It would perform the tasks set out for it but would not seek out new tasks to complete.



posted on Oct, 5 2013 @ 05:21 PM

Beartracker16
I think this AI would also lack ambition.
It would perform the tasks set out for it but would not seek out new tasks to complete.


It could loop tasks: give itself a list of stuff to do and repeat it once the list is complete. One of the tasks would be to come up with a new task to add to the list.
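The loop described above is trivial to sketch. In this toy version (all task names are hypothetical), the agent runs its current list once per cycle, and one standing task appends a brand-new task, so the list grows on its own:

```python
def run_cycles(tasks, cycles):
    """Run the current task list once per cycle; one task invents another."""
    executed = []
    for _ in range(cycles):
        for task in list(tasks):  # snapshot so this cycle's list is fixed
            executed.append(task)
            if task == "invent new task":
                tasks.append(f"auto-task-{len(tasks)}")  # list self-extends
    return executed

log = run_cycles(["check sensors", "invent new task"], cycles=2)
```

Of course, nothing here resembles ambition: the "invent new task" rule is itself just another pre-programmed instruction.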



posted on Oct, 5 2013 @ 07:02 PM
reply to post by lostgirl
 


Well, one interesting thing to ask is why we aren't all psychopaths. Natural selection says the most useful traits get passed on, and most people aren't psychopaths, so we can deduce it's not that useful a trait in many cases. So having empathy, being able to see through someone else's eyes, must also be useful somehow. It's powerful somehow.

But also, which is scarier: the cold, unfeeling AI, or the one with empathy? The one with empathy is, in its AI mind, running simulations where it is you, seeing your life through your eyes, modelling how you act and respond to things. For a human-form robot that deals with the same issues as you, this is clearly useful... But in some ways it's creepier to me than the cold, unfeeling bot.




