
Self-driving cars are already deciding who to kill

posted on Feb, 11 2017 @ 06:52 AM
Will never trust a driverless car.
Not to mention I'd have to fire my chauffeur.




posted on Feb, 11 2017 @ 06:58 AM
It goes deeper than the OP surmises. Think of the supposed laws of robotics; I believe Asimov coined them. You know the ones: Law 1, no robot will harm a human, or something like that.
Now, as has been suggested, think of this scenario, which will happen. You are driving at 40 mph down a road when a pedestrian steps in front of you. To miss the pedestrian your car has to hit a lorry head on. What would the computer be programmed to do?
It might recognize the human and not want to harm them, but would it recognize the lorry only as an object and therefore swerve into its path to avoid the human?
This then opens up another question: does the computer recognize that there is a human on board whom it has to protect as well? As usual it will come down to whoever programs the software. And thinking of more advanced AI, what if it chooses to harm a human in that scenario for its own self-preservation?



posted on Feb, 11 2017 @ 07:14 AM
a reply to: 727Sky

Iran did not hack the Predator 'drone'. Russian GPS technicians spoofed the GPS digital correlator, jamming the GPS signals from the constellation and replacing them with their own navigation data, causing the Predator to 'think' it was near its landing area; it landed itself in Iran instead of at the forward operating base in Afghanistan. A friend of mine was crew chief for that bird. Me? At that time, I was a military anti-spoofing control segment GPS engineer, subcontractor to US Space Command.
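The spoofing idea described above can be sketched with a toy example. This is a simplified 2D illustration with made-up "satellite" positions and a brute-force solver, not real GPS signal processing: the point is only that a receiver solves its position from whatever range signals it locks onto, so an attacker who overpowers the genuine signals with forged ranges moves the position fix wherever they like.

```python
import math

# Pretend "satellite" positions on a 100x100 grid (illustrative assumption).
SATS = [(0.0, 100.0), (100.0, 100.0), (50.0, 0.0)]

def ranges_from(pos):
    """Ranges a receiver at `pos` would measure to each satellite."""
    return [math.hypot(pos[0] - sx, pos[1] - sy) for sx, sy in SATS]

def solve_position(ranges):
    """Toy least-squares fix: brute-force the integer grid point whose
    distances best match the reported ranges. The receiver has no way
    to tell whether the ranges are genuine or forged."""
    best, best_err = None, float("inf")
    for x in range(101):
        for y in range(101):
            err = sum((math.hypot(x - sx, y - sy) - r) ** 2
                      for (sx, sy), r in zip(SATS, ranges))
            if err < best_err:
                best, best_err = (x, y), err
    return best

true_pos = (20, 30)
spoofed_pos = (80, 70)  # where the attacker wants the receiver to believe it is
print(solve_position(ranges_from(true_pos)))     # (20, 30)
print(solve_position(ranges_from(spoofed_pos)))  # (80, 70): the fix follows the forged data
```

Real anti-spoofing work is about detecting exactly this: the receiver trusts its inputs, so the defense has to authenticate or sanity-check the signals themselves.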

Re: the auto-driving cars, I want one!! Yeah, I turn down lots of jobs that are just too far away to take the bus but not close enough (or I don't have enough heart) to drive the distance twice a day, every day. I can just see it now: hop into the electric auto-drive car at about 4am, take my blankie, pillow, and a flask of JD. Kick back, take a couple nips, and snooze all the way to work, hah hah, yeah baybee!!!

As somebody said, in a few years every car may be required to be driverless. I suspect it won't be very long before all cars are manufactured WITHOUT user controls, aside from maybe seat-belts. The first rigs will be semis, then buses, taxis, police cars, fire engines, airplanes. Uber is planning to manufacture tri-copters, so there will be auto-flying electric cars (nothing new in that; it was proposed 25 years ago using GPS automatic-skylane nav). And all the vehicles will be networked by AI traffic computers. We are in the period of the Mark of the Beast... soulless machines thinking for us. People are shoving and pushing, lining up in columns; they just can't get enough of those corporate/federali spy sensors known as "smart-phones", which all the IT creeps use to hack and track us every-which-way-but-loose. Seems all of those folks have got FaceBookitis and are tweet'n twitterphobes... and they're worried about self-driving cars, hah hah hah. I want one, and it would really be kewl if it could fly too!!



plutron




posted on Feb, 11 2017 @ 07:19 AM

originally posted by: LuXTeN
Would you trust your life to a handful of strangers? A hunk of metal that drives itself? Is it worth it or do you live your life in the fast lane tempting fate?





Don't we already trust our lives to a handful of strangers?

We trust the machines that monitor our lives in hospitals.
We trust the train to not come off the tracks.
We trust the thermostats, smoke detectors and carbon monoxide monitors in our homes.
We trust the brakes in our cars as we drive down mountains.
We trust our ovens and countertop microwaves.
We trust x-ray machines and MRIs.
We trust elevators.

Every day we place our trust in machines that have been programmed by people.



posted on Feb, 11 2017 @ 07:34 AM
Just wait, soon after these cars are commonplace, the rich and powerful will be able to buy priority status in which the network will gladly sacrifice a dozen of us regular folk to save the rich.



posted on Feb, 11 2017 @ 07:35 AM
a reply to: LuXTeN


Is it worth it or do you live your life in the fast lane tempting fate?


my answer





posted on Feb, 11 2017 @ 07:42 AM
a reply to: DAVID64

here ya go,




posted on Feb, 11 2017 @ 07:52 AM
had to,








posted on Feb, 11 2017 @ 08:44 AM
It does appear, based on IEEE publications, that self-driving cars will become the next wave of automation. There is a tremendous amount of research being done right now into different aspects of automobile self-control.

But it's not going to replace driving, at least not for some time.

There are simply too many unknowns. Road and utility construction require intuitive responses to unfamiliar conditions, something no computer can accomplish to any reasonable degree. Law enforcement activities, including accident site clean-up, fall into the same category. Then there's the issue of parking lots, where traffic control is largely dictated by non-verbal and implied communication between drivers and traffic flow is based on sometimes confusing lines painted on sometimes damaged pavement.

At best, one will be able to get into their car, drive manually to a major artery, engage self-control, and be alerted to take over when they reach their exit. Even that is years away.

As for the 'moral decisions,' they do not exist. Computers cannot make judgements based on subjective ideals such as the value of one life versus another. Computers can only make decisions based on hard, numerical data. Isaac Asimov was a great writer and an insightful dreamer, but his romantic notion of the three Laws of Robotics was just that: a romantic musing. At best, a self-driving car can be programmed to avoid pedestrians even if that means self-destruction. Even that would not guarantee complete success, any more than a driver desperate to avoid hitting a pedestrian is guaranteed to succeed. At highway speeds, sometimes physics just says "no."
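The point above can be made concrete with a minimal sketch. Everything here is a made-up illustration (the option names, the probabilities, and the cost weights are assumptions, not any vendor's real logic): what looks like a "moral decision" is just the car summing pre-programmed numbers and picking the cheapest option.

```python
# Hypothetical sketch: a "moral" rule reduces to fixed numeric priorities.
# The weights encode "avoid pedestrians even at cost of the vehicle".
PEDESTRIAN_COST = 1000.0
OCCUPANT_COST = 100.0
VEHICLE_COST = 1.0

def cost(option):
    """Hard-coded cost of a maneuver from sensor-derived probability estimates."""
    return (option["p_hit_pedestrian"] * PEDESTRIAN_COST
            + option["p_occupant_injury"] * OCCUPANT_COST
            + option["p_vehicle_damage"] * VEHICLE_COST)

def choose_maneuver(options):
    """Pick the maneuver with the lowest total cost. No judgement involved:
    the car is not weighing lives, just comparing numbers."""
    return min(options, key=cost)

options = [
    {"name": "brake_straight", "p_hit_pedestrian": 0.6,
     "p_occupant_injury": 0.05, "p_vehicle_damage": 0.3},
    {"name": "swerve_into_lorry", "p_hit_pedestrian": 0.0,
     "p_occupant_injury": 0.4, "p_vehicle_damage": 1.0},
]
print(choose_maneuver(options)["name"])  # swerve_into_lorry
```

Whoever sets those weights has made the "ethical" choice long before the car ever sees a pedestrian; at runtime there is only arithmetic.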

Cars need not be networked the way we think of computers being networked. Communication is not the same as networking. In order to hack a machine, it is necessary to re-program that machine, or at least to fool the totality of its sensory systems. General-purpose computers can be hacked easily because their whole design is built around being reprogrammed. A car need not be reprogrammed except for updates, which could (and should) be disallowed during operation. And the increasing number of sensors, required in order to handle varying conditions, makes taking over all of them at once impractical.

There are dangers, yes, and I am far from comfortable sharing the road with an automaton. I have many reservations. But the truth is that pilots do not fly planes either; planes fly themselves, with the pilots acting as backup and take-off/landing operators. Yet we trust our lives to them every day, in higher-speed applications than cars operate in, and with lower chances of survival during failure... and flying is still much safer than driving.

TheRedneck



posted on Feb, 11 2017 @ 09:23 AM
If they made "real" cars again people would enjoy driving again and there would be no demand for robo cars.

Driving is boring when you're in a plastic-and-tin, 4-cylinder, cookie-cutter, emissions-controlled, politically correct clown car.

Bring back the highway yachts, the 454 engines, real steel and chrome, the 130-inch wheelbases, the smog-belching 10 MPG behemoths.

I'm sorry, but this is NOT a car



THIS IS a car:




posted on Feb, 11 2017 @ 09:37 AM
a reply to: Plutron



Iran did not hack the Predator 'drone'. Russian GPS technicians spoofed the GPS digital correlator, jamming the GPS signals from the constellation and replacing them with their own navigation data, causing the Predator to 'think' it was near its landing area; it landed itself in Iran instead of at the forward operating base in Afghanistan. A friend of mine was crew chief for that bird. Me? At that time, I was a military anti-spoofing control segment GPS engineer, subcontractor to US Space Command.


Thanks for the info; all I ever heard and surmised was that Iran did the deed. A great feat and test run for Russian capability. Hopefully there are now ways to avoid that scenario in the future?

Since you were in the service, I can assume this is not just another Democratic tale of "the Russians did it!"



posted on Feb, 11 2017 @ 10:02 AM

originally posted by: jappee
Although having a computer decide to hit the car vs. the pedestrian is troubling... Population control? I think a fat NO.

Maybe not population control but this gives us something to think about.





posted on Feb, 11 2017 @ 10:09 AM

originally posted by: crayzeed
It goes deeper than the OP surmises. Think of the supposed laws of robotics; I believe Asimov coined them. You know the ones: Law 1, no robot will harm a human, or something like that.
Now, as has been suggested, think of this scenario, which will happen. You are driving at 40 mph down a road when a pedestrian steps in front of you. To miss the pedestrian your car has to hit a lorry head on. What would the computer be programmed to do?
It might recognize the human and not want to harm them, but would it recognize the lorry only as an object and therefore swerve into its path to avoid the human?
This then opens up another question: does the computer recognize that there is a human on board whom it has to protect as well? As usual it will come down to whoever programs the software. And thinking of more advanced AI, what if it chooses to harm a human in that scenario for its own self-preservation?


In such a scenario, for simplicity's sake, let's assume a total of 3 lives are at stake: a passenger in each vehicle and the pedestrian. Maybe the algorithm would factor in the probable survival rate for each of the two choices: if the car runs into the pedestrian, what are the odds of a human surviving a 40 mph collision; if the car runs into the lorry, at the current closing rate between the two vehicles, what is the survival probability for the two humans involved? Ultimately, which choice would lead to the survival of more lives?

I would think the first choice would most likely involve one fatality (the pedestrian), and the other choice a higher chance of survival for both motorists. Of course a ton of other factors must be considered, e.g. road condition (is it dry or wet?), the safety technology of each vehicle (advanced airbags and so forth)...



posted on Feb, 11 2017 @ 10:49 AM
a reply to: JD163
What do you choose when you are faced with no good choice?





posted on Feb, 11 2017 @ 12:47 PM
a reply to: NightSkyeB4Dawn

That's interesting.

Personally, in the first scenario I would flip the switch; in the second, I would jump myself. But that's just me.



posted on Feb, 11 2017 @ 12:55 PM

originally posted by: DBCowboy

originally posted by: LuXTeN
Would you trust your life to a handful of strangers? A hunk of metal that drives itself? Is it worth it or do you live your life in the fast lane tempting fate?





Don't we already trust our lives to a handful of strangers?

We trust the machines that monitor our lives in hospitals.
We trust the train to not come off the tracks.
We trust the thermostats, smoke detectors and carbon monoxide monitors in our homes.
We trust the brakes in our cars as we drive down mountains.
We trust our ovens and countertop microwaves.
We trust x-ray machines and MRIs.
We trust elevators.

Every day we place our trust in machines that have been programmed by people.


Speak for yourself. I don't trust any of that, and then some.

I'm hyper vigilant about anything "automated". Ask my wife.

I don't trust people for that matter.



posted on Feb, 11 2017 @ 12:57 PM

originally posted by: NightSkyeB4Dawn
a reply to: JD163
What do you choose when you are faced with no good choice?



Create a third option. I don't believe in "no-win" situations.

Put me there in reality and I'll enhance our understanding of the laws of physics.



posted on Feb, 11 2017 @ 01:01 PM
a reply to: TarzanBeta

You hit the nail on the head, sir.

It's not really automation that scares me, it's always been people that scare me.

People code that software, people either did the maintenance or they didn't, or they did it but not correctly.

Ever since I worked on F-16s in the Air Force, I've understood that it's not the machine you worry about; it's the incompetence of the people working on that machine that should worry you, because that's when things fail.

Most failures boil down to someone not doing their job.

And you want to take the present day worker and create automation?

I don't see everyone super happy and really wanting to do their job; I see people collecting paychecks. Not all of them, but enough to be cause for concern.

It only takes one weak link for the chain of trust and functionality to break.



posted on Feb, 11 2017 @ 01:17 PM

originally posted by: jappee
Although having a computer decide to hit the car vs. the pedestrian is troubling... Population control? I think a fat NO.



Thing is, you might have to make the same choice while driving. And at that point, odds of survivability come into play. If I hit a pedestrian with my car, the odds aren't good that they will survive; but if I hit another car, its occupants have protection from the impact.



posted on Feb, 11 2017 @ 01:26 PM
a reply to: TarzanBeta

I don't trust people either.


