
AI robot that makes decisions in court


posted on Dec, 5 2004 @ 07:29 AM
Wouldn't it be a good idea to load the entire legislation of a country into a robot that had the ability to analyse information fed into it? Wouldn't the outcome be unbiased? I know that the creator could manipulate the robot, but say it was just used in a local court case? Do you think it's possible in the future? Would there be any point to a judge anymore? Maybe this is a new idea, or maybe it's already been built. I'm not sure, but I was just thinking about it.



posted on Dec, 5 2004 @ 08:52 AM
I wouldn't trust a pure logical AI over a human any day. You cannot rely on it to take into account extenuating circumstances, or remorse, or anything else that may affect the outcome of a trial. The machine just would not understand, and would pass a sentence that may not fit the crime.



posted on Dec, 5 2004 @ 09:02 AM
At this point in our technological timeline, I don't think I would trust those decisions to logic gates in a few ICs. The technology for this type of thought process is only in its infancy. See MIT's COG.



posted on Dec, 5 2004 @ 09:03 AM
How about if we also fed in all the past convictions, etc., so it got an overall idea? lol



posted on Dec, 5 2004 @ 09:19 AM

Originally posted by stumason
I wouldn't trust a pure logical AI over a human any day. You cannot rely on it to take into account extenuating circumstances, or remorse, or anything else that may affect the outcome of a trial. The machine just would not understand, and pass a sentence that may not fit the crime.


Why not just program it to take into account extenuating circumstances, or remorse, or anything else that may affect the outcome of a trial?
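Purely as an illustration of what "programming it in" could mean, here is a minimal sketch in which each extenuating circumstance is treated as a weighted reduction on a base sentence. The offences, factor names, and weights below are invented for the example and are not drawn from any real statute or guideline.

```python
# Hypothetical sketch: extenuating circumstances as weighted reductions
# on a base sentence. All names and numbers here are invented.

BASE_SENTENCE_MONTHS = {
    "theft": 12,
    "assault": 36,
}

# Each mitigating factor cuts the sentence by a fraction (illustrative values).
MITIGATING_FACTORS = {
    "genuine_remorse": 0.10,
    "first_offence": 0.15,
    "acted_under_duress": 0.25,
}

def suggest_sentence(offence: str, factors: list[str]) -> float:
    """Return a suggested sentence in months after applying mitigations."""
    months = float(BASE_SENTENCE_MONTHS[offence])
    for factor in factors:
        months *= 1.0 - MITIGATING_FACTORS.get(factor, 0.0)
    return round(months, 1)

# Assault with genuine remorse and no prior record: 36 * 0.90 * 0.85 ≈ 27.5 months.
print(suggest_sentence("assault", ["genuine_remorse", "first_offence"]))
```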



posted on Dec, 5 2004 @ 09:25 AM
Human consciousness is far more complex than any robot can emulate. Our technology won't get to the point of thinking like we do for years and years to come. A robot, AI or not, can't understand love. It can't understand revenge, hate, lust, or addiction. It just can't. All it knows is how to make a decision, that's it. There's more to it than just the facts.

At any rate, a single robot would be easily prone to manipulation by any means: coercion, hacking, reprogramming. A single, centralized authority is a bad idea for a court system. When that authority gets poisoned, the whole justice system is poisoned.



posted on Dec, 5 2004 @ 09:25 AM
How can you program a machine that has no emotion to understand emotion? It would not be possible (maybe decades in the future, but not now) to make a machine capable of fully understanding human behaviour.



posted on Dec, 5 2004 @ 09:29 AM

Originally posted by stumason
How can you program a machine that has no emotion to understand emotion? It would not be possible (maybe decades in the future, but not now) to make a machine capable of fully understanding human behaviour.

It's impossible for any judge to fully understand human behaviour either.
They do the best that they can. As would the machine.
They would not actually need to understand human emotion the way you are thinking. They would merely need to understand that it drives humans to do things, and how it affects the current case.



posted on Dec, 5 2004 @ 09:53 AM

It's impossible for any judge to fully understand human behaviour either.
They do the best that they can. As would the machine.


Yeah, human judges suck too, but they have a better understanding of what makes people tick.




They would not actually need to understand human emotion the way you are thinking. They would merely need to understand that it drives humans to do things, and how it affects the current case.


A machine cannot do this. It requires non-linear thinking, and the ability to empathise, which machines cannot do.



posted on Dec, 5 2004 @ 10:08 AM
How about a robot and a human working together? I.e., the robot gives its decision and the humans make the final call based on it, or the other way around. I'm just thinking about buffers in the US election for some weird reason, lol.



posted on Dec, 5 2004 @ 10:44 AM
Well, OK Channy, but once you do that you'll need more people to make it effective and unbiased again. Once you reintroduce the human element, you reintroduce all the negatives we took away with the robot. May I suggest using 9 people? That way you can get a definite majority, and it won't be large enough to get out of hand.

If you add humans, which you have to, you remove the need for the robot. There could be justices along with a robot, a single one that presents an opinion to the justices, showing how the robot sees it. They can do what they want with it. It's silly, though: in order to make use of it you have to make it useless.
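For the sake of argument, the "robot advises, nine justices decide" arrangement described above might work roughly like the sketch below. The voting rule, the evidence score, and the threshold are all invented purely for illustration.

```python
# Hypothetical human-in-the-loop court: a robot issues an advisory opinion,
# then a panel of nine justices votes and their majority is final.
def robot_opinion(evidence_score: int) -> str:
    # Invented rule: recommend conviction when the evidence score is high.
    return "guilty" if evidence_score >= 4 else "not guilty"

def final_verdict(evidence_score: int, justice_votes: list[str]) -> str:
    advisory = robot_opinion(evidence_score)
    print(f"Robot advisory opinion: {advisory}")
    # The justices may follow or ignore the advisory; simple majority rules.
    guilty_votes = justice_votes.count("guilty")
    return "guilty" if guilty_votes > len(justice_votes) // 2 else "not guilty"

votes = ["guilty"] * 4 + ["not guilty"] * 5   # the justices overrule the robot
print(final_verdict(6, votes))                # prints "not guilty"
```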



posted on Dec, 7 2004 @ 01:14 AM
I personally think this is a great idea.

You would still have lawyers to argue the case and present all relevant case law to the judge. The judge decides what info should be entered into the computer. The computer itself will not accept race, age, childhood problems, drug problems, etc.

Only the crime, past criminal history, number of witnesses, DNA evidence, and so on would be entered into the computer. It comes up with a verdict based not on emotion but on pure info. No appeals.
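As a rough sketch of how the computer described above might weigh nothing but those inputs, here is a toy verdict scorer. The weights and the conviction threshold are made up for illustration and have no basis in any real system.

```python
# Toy verdict machine: it sees only the charge, prior record, witness count,
# and DNA evidence. Every weight and threshold is hypothetical.
from dataclasses import dataclass

@dataclass
class CaseRecord:
    charge: str
    prior_convictions: int
    witnesses: int          # number of witnesses supporting the charge
    dna_match: bool         # True if DNA evidence ties the accused to the crime

def decide(case: CaseRecord) -> str:
    score = 0
    score += 3 if case.dna_match else 0        # physical evidence weighs most
    score += min(case.witnesses, 3)            # cap the weight of testimony
    score += 1 if case.prior_convictions > 0 else 0
    # Convict only when the evidence score clears the (invented) threshold.
    return "guilty" if score >= 4 else "not guilty"

print(decide(CaseRecord("burglary", prior_convictions=2, witnesses=1, dna_match=True)))  # guilty
```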



posted on Dec, 7 2004 @ 02:02 AM
Heck, why not? We let them count our votes, which already include votes for district court judges; in light of this, aren't we already pretty well down this path? We are constantly profiled and monitored by machines now, so why not let them decide our fates in legal decisions as well? This concept is not as far-fetched as it might seem; indeed, it is already up and running in the electoral process and in the monitoring/profiling of citizens. I think we owe Mr. George Orwell a big thank-you for at least trying to let us know where things were going...


Odd

posted on Dec, 7 2004 @ 02:11 AM
The odds against the motives and drives of a truly sentient AI coinciding with those of human society are astronomical... I wouldn't trust such a thing with a five-dollar bill, much less the authority to levy a death sentence.



posted on Dec, 7 2004 @ 02:58 AM
Why don't we dress up a chimp in a small suit and let it make the decisions? It'd be cheaper, only a few bananas now and then!

JESUS H CHRIST are you insane people?!!!


Never. Have you seen Terminator 3?



posted on Dec, 7 2004 @ 10:18 AM
How many lawsuits do you think are unbiased? An AI computer can be completely neutral apart from the rules it's been given (like a constitution or something). Why should we even give the person a chance - rules are rules. How many crimes are let off?

You people might not agree with it, but I believe that sooner or later the government will use AI machines. No one knows where exactly AI will lead; we can only predict. So who knows whether a computer could have the five senses needed to develop emotion?



posted on Dec, 7 2004 @ 04:48 PM

Originally posted by Channy
How many lawsuits do you think are unbiased? An AI computer can be completely neutral apart from the rules it's been given (like a constitution or something). Why should we even give the person a chance - rules are rules. How many crimes are let off?

You people might not agree with it, but I believe that sooner or later the government will use AI machines. No one knows where exactly AI will lead; we can only predict. So who knows whether a computer could have the five senses needed to develop emotion?


I attended a lecture today on the death penalty by Professor Robert Blecker. One of the things he espoused was that you need emotion. There's more to a case than just fact; there's moral fact. A machine cannot understand post-partum depression; it cannot understand being beaten and raped by your father until you were 13. And we are equally incapable of programming even a response to that. You cannot mathematically express pain, love, hatred, depression, physical abuse. There are some things you can mimic, some you can't. Court cases require the use of both. We can't settle for 50%.



posted on Dec, 9 2004 @ 01:39 PM

Originally posted by Amorymeltzer: One of the things he espoused was that you need emotion. There's more to a case than just fact; there's moral fact.


Considering what's at stake in a death penalty case, why would you want your judgement to be clouded by emotion?

There have been a number of cases where an apparently guilty man, convicted on victim testimony, was let go when the evidence proved the testimony wrong.


Originally posted by Amorymeltzer: A machine cannot understand post-partum depression,


Why not? Let it know what happens to those with post-partum depression and why it happens.


Originally posted by Amorymeltzer: it cannot understand being beaten and raped by your father until you were 13.


Can you? Can anyone who hasn't actually gone through that truly understand it?



Originally posted by Amorymeltzer: and we are equally incapable of programming even a response to that.


Sure you can.


Originally posted by Amorymeltzer: you cannot mathematically express pain, love, hatred, depression, physical abuse.


So language can't express emotion now?
Anyone who actually knows anything about math knows that it is capable of expressing things we don't have words for, and that if we ever do meet an alien race, math will be the first language in which we understand each other.



posted on Dec, 9 2004 @ 04:40 PM

Considering what's at stake in a death penalty case, why would you want your judgement to be clouded by emotion?

There have been a number of cases where an apparently guilty man, convicted on victim testimony, was let go when the evidence proved the testimony wrong.


There's faulty logic there. The case shouldn't be decided on emotion, but it needs to play a part. You've done countless illegal things, most (all?) of which you've not been prosecuted for. Why? Because there's emotion in the world. A robot would fine you for every infraction. The same goes for a murder trial. There's a major difference between killing your father because you wanted the car and killing your father because, to you, he embodied pain and suffering. Without emotion, there would be no minimum sentences, no lowered sentences.


Can you? Can anyone who hasn't actually gone through that truly understand it?

Originally posted by Amorymeltzer: and we are equally incapable of programming even a response to that.


Sure you can.


I don't pretend to understand it, not at all. But I understand pain. Suffering. Shame. Hatred. Misery. A robot does not. It can express a reaction to it with pre-programmed settings, but that's not the same thing. It's restricted, and it utterly fails. A program is not capable of calculating just how depressed you were, or just how heartless you were, and cannot express that.


So language can't express emotion now?
Anyone who actually knows anything about math knows that it is capable of expressing things we don't have words for, and that if we ever do meet an alien race, math will be the first language in which we understand each other.


Math is a different language. I know quite a bit of mathematics, thank you very much. You use it to express lots of things we don't have words for, but the opposite works just as well. I've yet to derive anything in n-space to tell me whether the girl in the third row likes me or not, and, while I may like to, I highly doubt I'll get anywhere with it. All I've been able to do is prove that girls = evil. Mathematics can only convey what we can do; it cannot explain what we know and understand.


Nox

posted on Dec, 9 2004 @ 04:47 PM
Why not? I'm sure mitigating circumstances can be quantified.

I'm sure that a history of abuse can qualify as a mitigating circumstance, lowering a sentence.
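If mitigating circumstances really can be quantified, it might look something like this: each documented factor knocks a fixed number of months off the recommended sentence. The factor list and the reductions below are hypothetical, just to show the idea.

```python
# Hypothetical quantification of mitigating circumstances: each documented
# factor reduces the recommended sentence by a fixed number of months.
MITIGATION_MONTHS = {
    "history_of_abuse": 24,
    "post_partum_depression": 18,
    "acted_under_duress": 12,
}

def mitigated_sentence(base_months: int, factors: list[str]) -> int:
    """Subtract the reduction for each documented factor, never going below zero."""
    reduction = sum(MITIGATION_MONTHS.get(f, 0) for f in factors)
    return max(base_months - reduction, 0)

# A 120-month recommendation with a documented history of abuse drops to 96 months.
print(mitigated_sentence(120, ["history_of_abuse"]))
```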



