Uber stops all self-driving car tests after fatal accident

posted on Mar, 20 2018 @ 01:04 AM
It's quite clear we don't yet have good enough AI for self-driving cars. They simply cannot handle the countless unpredictable situations that can occur when driving; to account for those you need a kind of general intelligence, which we do not have. We can create a very good approximation of a good driver, but every now and then it will still kill someone, because it doesn't have the general intelligence required to make improvised decisions in unexpected situations.



posted on Mar, 20 2018 @ 01:55 AM
I don't understand how the near-unanimous opinion on here is that the car or driver was absolutely at fault, when no investigation has been published. I am sorry, this rare post from me is a long one.

The Crash

Looking at the other side of the coin, there is an inherent belief that if you step out in front of a car, it will stop. I don't get it. If you step out in front of a train, regardless of whether it is automated (ATO) or operated by a driver, the chances are it will not be able to stop before hitting you. People know that, so if a train is hurtling through a station at 100 mph (160 km/h) you know to step back.

When cars are hurtling along a highway at 60 mph (95 km/h) and you are walking along the sidewalk, you know not to cross in front of them.

In many cases, though, there is an expectation that whenever you walk out in front of a car on a residential road, it will be able to stop in time. Residential speed limits are not there for that reason; they are there to give the pedestrian a damn good chance of surviving. Otherwise we would still be driving at walking pace with a bloke waving a red flag in front of the car.

So indeed, it may not be the fault of the car, or its driver. As the minority have pointed out, though, we do not have all the facts.

The Need

What does interest me, though, is that people are saying "but we do not need self-driving cars, that is not what a car is there for." Indulge me for a moment and imagine...

You are happily at home one day, playing backgammon with the kids or grandkids, doing DIY, sunbathing, and something horrific happens. You go to the hospital and they cannot save your sight. How do you get around? Would you rather rely on someone to do everything for you, take you shopping, to the doctor, to the bar? Or would you rather be able to get into your self-driving car and tell it to take you there?

My wife went from driving to banned by doing something as outrageous as giving birth. The pressure the pregnancy put on her brain caused her to lose her sight, and she had to hand back her driving licence. She now has to rely on the bus service, if it runs, to get around.

The Future

Why are we really worrying, though? This has been in the pipeline for many, many years and will eventually mean the end of personal cars anyway. The whole system of transport is changing, although government investment is probably the biggest challenge for the future right now.

I think we are closer than we realize.

Imagine if inter-city travel were no longer undertaken by car. You would hail a self-driving car from your smartphone and it would turn up, take you to the nearest interchange, where you would catch a self-driving train to the interchange of your choice, and then you would hail another self-driving car from that hub to your destination.

All of this will run on demand, 24/7, because it is self-driving, self-charging (or, in the case of trains, continuously powered) and paid for through your taxes. Tourists will be expected to pay for theirs through airport/landing/holiday taxes at the point of purchase, and manually driven cars will become the preserve of collectors, who will need a premium licence and insurance to drive them.

So, TL;DR:
Uber accident: let's wait and see
self-driving cars: are the future anyway
manually driven cars: will end up in museums.



posted on Mar, 20 2018 @ 03:34 AM

originally posted by: Blue Shift

originally posted by: RomeByFire
And how many people die per day by car accidents?

Apparently around 3,300 a day. The goal of self-driving cars is actually to reduce this number by linking and controlling all cars. Of course, there will be failures along the way, but the alternative is to just keep chalking these daily deaths up to "acceptable losses," because damned if we don't love our cars.


Kill me before all the nice, fun, and driveable cars are gone.



posted on Mar, 20 2018 @ 03:58 AM
a reply to: Ohanka

Didn't previous idiots, some 130 years ago, advocate banning the newfangled "automobiles" to "protect" the then-dominant horse and carriage industry?



posted on Mar, 20 2018 @ 06:00 AM
a reply to: CriticalStinker

How do we know the person behind the wheel didn't have control of the vehicle?

I didn't read the link you provided, so maybe the story says how they know, but if you could just tell me, that would save me time.



posted on Mar, 20 2018 @ 08:47 AM

originally posted by: RickyD
a reply to: CriticalStinker

And there was no option for the person in the car to...like...hit the f*&%in brakes???

I dare say that person was at the least negligent, or, if there was indeed no option to take over control, the company should be found negligent and get wrecked in court. This is beyond absurd...


This is actually a problem caused by "self-driving cars" with "safety drivers". How long would you be able to sit there doing nothing, paying strict attention to the road, while not actually driving? Seriously, the human mind is not going to be engaged enough to keep focus. The safety driver's attention will begin to wander just from sitting there; in some cases just not falling asleep could be an issue.

Yet that is only one issue. Another is that reaction times will be much slower. A person actually driving a car is already on the pedals and steering wheel, engaged with driving, and so is able to react quickly to a situation. Someone just sitting there as a safety driver will always be slower to react to the exact same situation, because before they can even react they first have to reposition so they are on the pedals and have their hands on the steering wheel, wasting precious seconds when even a split second may be too long. The safety driver has to see the issue, realize there is a problem, and then actually take control of the vehicle; only then can they react. Someone actually driving just has to see the issue, realize there is a problem, and react, without losing the time it takes to take control.

In an aircraft on autopilot the issue is nowhere near as critical, because a pilot has much more time to react. Aircraft are purposely kept extremely far apart compared to cars, and that alone buys time; an aircraft will get collision warnings long before it comes anywhere near the distances cars normally keep from each other, and aircraft don't have to worry so much about things like pedestrians popping out in front of them (they do hit things like birds all the time). It is all that time a pilot has which makes autopilot rather safe: unless something is really wrong, like the instruments being badly off, a pilot will rarely if ever face a split-second, do-or-die emergency, and in that kind of emergency the aircraft is pretty much doomed anyway. For cars, split-second issues can happen pretty much every time you drive, quite possibly many times in one trip. Even if we put proximity warning systems on these cars, they follow each other so closely that by the time any warning went off, chances are there would be no time to actually do anything.
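
To put rough numbers on that takeover penalty, here is a minimal sketch in Python. Every figure in it is an assumption chosen purely for illustration (a 1.5 s reaction time for an engaged driver, an extra 2 s for a safety driver to notice the failure and grab the controls, and a dry-road deceleration of about 7 m/s²), not data from the Uber case:

```python
# Illustrative stopping-distance sketch. All figures below are assumptions
# for the sake of argument, not measured values from any real incident.

MPH_TO_MS = 0.44704      # metres per second in one mph
DECEL = 7.0              # assumed hard-braking deceleration on dry asphalt, m/s^2

def stopping_distance(speed_mph: float, reaction_time_s: float) -> float:
    """Distance travelled during the reaction time plus the braking distance."""
    v = speed_mph * MPH_TO_MS
    thinking = v * reaction_time_s     # covered at full speed before braking starts
    braking = v ** 2 / (2 * DECEL)     # from v^2 = 2*a*d
    return thinking + braking

engaged = stopping_distance(40, 1.5)         # driver already on the controls
takeover = stopping_distance(40, 1.5 + 2.0)  # safety driver must notice, then take over

print(f"engaged driver: {engaged:.1f} m")
print(f"safety driver:  {takeover:.1f} m  (extra {takeover - engaged:.1f} m at full speed)")
```

Under those assumed numbers the takeover delay alone adds roughly 35 metres of travel at 40 mph before braking even begins, which is the heart of the argument above.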

And that is where they will really have to do their investigation: did the driver have enough time to see there was a problem, and then actually be able to do something about it? Remember, that is exactly the reason a lot of people want these self-driving cars: because they CAN react faster to danger than a human. So if the bike just popped out of nowhere, was there enough time for the driver to realize he was going to hit her, and then enough time to take control and avoid her as well? After all, it seems the car did not see her early enough even for the computer to react at the faster speed a computer is capable of, compared with a human. So how could the human react, when the human is not just slower, but made even slower than normal because they were not in control?

Another aspect, beyond the timing, is whether the driver did, or even should have, realized there was a problem in the first place. After all, how does the driver "know" the car is not going to react properly in time to do something about it? Remember, if the car does not start to brake in time, the driver takes time to understand that, then has to take control, and only then can try to stop the car. I think the honest answer is that by the time the driver could see the car was not reacting and had taken over, he had already hit the pedestrian, before he even had a chance to try to stop the car. The only real way around that would be to have these cars massively "over-react", for example by braking far earlier than they need to, to give the driver time to realize it isn't happening and to take control; and if this was a "pop out" situation, even that would not have helped. Or should the driver just hit the brakes himself whenever the car needs to stop or slow down? That renders the entire idea of a "self-driving car" a farce, since the "driver" is actually in control, which of course renders the whole idea of "testing" the system invalid.

Seriously, a fair portion of safe driving is guessing what is going to happen before it does: "Is that car going to try to change lanes into me?" "Is that kid going to run out into the road?" That is a huge issue with computers; they are not good at making those guesses. A computer can only do what it is programmed to do. It does not "guess", it reacts to preprogrammed instructions: if A, then do action 22; if B, then do action 123. That is the biggest problem with these self-driving cars. And at the same time you are asking a person sitting there as a so-called driver not to guess what is going to happen ahead of time, but just to react when the car is not doing what it should be doing.
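
As a caricature of that "if A, then do action 22" style of control (purely illustrative; real driving software uses learned models and probabilistic prediction rather than a literal lookup table), the weakness of pure rules is whatever was never enumerated in advance:

```python
# Caricature of rule-based control: a fixed situation -> action table.
# Anything not listed in advance falls through to a default, which is
# exactly the "unexpected situation" problem described above.

RULES = {
    "car_ahead_braking": "brake",
    "pedestrian_in_crosswalk": "stop",
    "lane_ending": "merge_left",
}

def rule_based_action(situation: str) -> str:
    # A human driver would improvise; a rule table can only fall back.
    return RULES.get(situation, "no_rule_found")

print(rule_based_action("car_ahead_braking"))             # -> brake
print(rule_based_action("cyclist_emerging_from_shadow"))  # -> no_rule_found
```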

A serious question for people who are so gung-ho about these self-driving cars: if I offered you a flight right now, let's say from Florida to the Philippines, and told you that there is no pilot, and in fact no controls at all (like all the hype about self-driving cars having no controls), would you get on that airplane right now? Would you put your children on it without you? Would you even want such aircraft flying over your homes, schools and businesses? If the answer to any of these is no, then how can you possibly want self-driving cars?



posted on Mar, 20 2018 @ 12:21 PM

originally posted by: generik
A serious question for people who are so gung-ho about these self-driving cars: if I offered you a flight right now, let's say from Florida to the Philippines, and told you that there is no pilot, and in fact no controls at all (like all the hype about self-driving cars having no controls), would you get on that airplane right now? Would you put your children on it without you? Would you even want such aircraft flying over your homes, schools and businesses? If the answer to any of these is no, then how can you possibly want self-driving cars?
I'm not gung-ho on self-driving cars, but I'd have no problem flying in a self-flying plane and I'm not sure that's not already happening.

Don't Freak Over Boeing's Self-Flying Plane—Robots Already Run the Skies

“Asiana prohibits the first officer from landing the plane by flying it, it must be automated,” says Moss. “The captain is prohibited from manually flying above 3,000 feet.”

The type of plane plays a role, too. Airbus tends to rely more heavily on automation, giving the computer control unless the pilot overrides it.


I'm more worried about a human pilot making a mistake than about the automation doing so, so yes, I'd fly in such a plane, and in effect I think I already have, except for the takeoff and the part about having no humans to take over the controls.

But what makes air travel different is the air traffic controllers, who make sure the planes don't fly into each other. As a backup, most commercial planes also have collision avoidance technology. There are no such controllers for the roads, and even if there were, they don't control pedestrian traffic. I think AI will take a long time to become as good as humans at dealing with pedestrians, especially when it involves reading their body language and guessing what they might do next. Even I can't always predict that, but I think I have a better chance than AI for the time being, until AI gets much better.




posted on Mar, 20 2018 @ 12:50 PM

originally posted by: Mandroid7
You know what makes good self-driving cars?

Trains

Keep that bs off the streets.

Too many variables to consider that computers can't compute.


I'm sorry, but this is just an ignorant statement. There are many things computers do better than humans right now. It is just a matter of getting the input devices and software perfected.

You act as if humans are infallible. The issue is not whether a self-driving car will ever make a mistake; the question is whether they make fewer mistakes than humans.

Once they do, the argument for banning them does not make any sense - and I guarantee you economics will make them commonplace.



posted on Mar, 20 2018 @ 01:10 PM
TED Radio Hour has a GREAT podcast on this topic.

And here’s a TED talk discussing it:
Self driving cars

My personal opinion is that I will feel much more comfortable if my kids are on the road with more driverless cars than human-driven ones.



posted on Mar, 20 2018 @ 01:12 PM

originally posted by: proximo
The issue is not whether a self-driving car will ever make a mistake; the question is whether they make fewer mistakes than humans.


The real issue is how much control you wish to exert over your own life. Computer-controlled cars will soon be telling you where and when you may drive.



posted on Mar, 20 2018 @ 01:29 PM

originally posted by: Wardaddy454
Kill me before all the nice, fun, and driveable cars are gone.

You can just step in front of a self-driving one.



posted on Mar, 20 2018 @ 01:32 PM
a reply to: proximo

What do you mean exactly by “economics will make them commonplace”?

Already the amount of tech in modern cars means they are much more of a hassle to diagnose and repair when things go wrong. The onboard computer can often be faulty, and as often as not it is the cause of the fault. For instance, one of the “daytime” front lights is out on my van. There’s nothing wrong with the bulb or the wiring, as can be proved when I flick it to dipped mode and it comes on; the issue is obviously with the onboard computer. It would mean taking it to the main dealer and spending lots of money over a non-issue, because my mechanic doesn’t repair onboard computers.



posted on Mar, 20 2018 @ 01:45 PM

originally posted by: surfer_soul
Already the amount of tech in modern cars means they are much more of a hassle to diagnose and repair when things go wrong.

Along with the self-driving aspect, these things will be most efficient in a kind of communal car pool run by a larger company, like the car2go scheme in Vancouver, except that a nearby available vehicle would come to your house by itself, pick you up, drop you off where you wanted to go, then drive itself to the next customer. All payment would be through an app. The parent company would be responsible for repairs. If the car you were "driving" went dead, another one would come pick you up ASAP. The ice in your cocktail wouldn't even have time to melt.

It would be like a cab, but with no stinky cab driver. No one to yell at how much you hate the Eagles.



posted on Mar, 20 2018 @ 02:12 PM
a reply to: Blue Shift

Absolutely no good to me. I need a van with all my tools in it so I can get to and from jobs across the area I cover in my daily work. If self-driving vehicles became mandatory I would need to own one outright, or at the very least hire one for myself personally. It’s already far more cost effective to own a used vehicle than to hire one long term.

I’m also against them replacing taxi drivers, lorry drivers and others whose profession is based on driving. No country needs even more unemployed people, surely?

On top of that, I’ve owned a desktop computer since they first came out, and they are always getting infected with malware or just crashing for some reason or other. I wouldn’t trust one to drive me about and, quite literally, crash on me!



posted on Mar, 20 2018 @ 02:26 PM

originally posted by: surfer_soul
Absolutely no good to me. I need a van with all my tools in it so I can get to and from jobs across the area I cover in my daily work. If self-driving vehicles became mandatory I would need to own one outright, or at the very least hire one for myself personally. It’s already far more cost effective to own a used vehicle than to hire one long term.

I imagine there will be a long transition period before most cars become self-driving, but your personal van will probably be loaded up with all kinds of semi-self-driving features so that you won't be a hazard in traffic.


I’m also against them replacing taxi drivers, lorry drivers and others whose profession is based on driving. No country needs even more unemployed people, surely?

Most technological advances that replace old systems increase the number of available jobs rather than decrease it. Those cars will still need mechanics and designers and programmers and trackers and all the other people needed to make them go. Adapt or die. That's the way we do it on Earth.


On top of that, I’ve owned a desktop computer since they first came out, and they are always getting infected with malware or just crashing for some reason or other. I wouldn’t trust one to drive me about and, quite literally, crash on me!

Sorry, old timer. The world of the future is sure a scary place, innit?



posted on Mar, 20 2018 @ 02:42 PM

originally posted by: Blue Shift
Along with the self-driving aspect, these things will be most efficient in a kind of communal car pool run by a larger company...


If you're trying to sell the concept of self-driving cars--it's not working. No thanks!



posted on Mar, 20 2018 @ 02:44 PM

originally posted by: ChaoticOrder
It's quite clear we don't yet have good enough AI for self-driving cars. They simply cannot handle the countless unpredictable situations that can occur when driving; to account for those you need a kind of general intelligence, which we do not have. We can create a very good approximation of a good driver, but every now and then it will still kill someone, because it doesn't have the general intelligence required to make improvised decisions in unexpected situations.


I think they need to re-examine the way they go about testing these cars. Instead of having the computer drive and an occupant sit there for hours on end trying to pay enough attention to correct a mistake from the AI, have a person drive and let the AI study the person's driving habits. Instead of an active program that is actually controlling the car, just record what the AI thinks it should do the whole time the human is driving. Then the engineers can compare what the human did with what the AI thought it should do, and wherever the AI made a mistake, work on the algorithms for that behavior. That way we don't have the AI killing people before they get it right.
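
That "shadow mode" idea could be sketched roughly as below. The names and data structures here are hypothetical, and a real system would log full sensor data rather than three numbers, but the principle is the same: the human drives, the software only proposes, and the disagreements get reviewed offline:

```python
# Hypothetical "shadow mode" logger: the human drives, the planner only
# proposes, and both are recorded so engineers can mine the disagreements.

from dataclasses import dataclass

@dataclass
class Controls:
    steering: float   # radians, positive = left
    brake: float      # 0.0 .. 1.0
    throttle: float   # 0.0 .. 1.0

def disagrees(human: Controls, planner: Controls,
              steer_tol: float = 0.05, pedal_tol: float = 0.2) -> bool:
    """Flag frames where the planner's proposal differs meaningfully."""
    return (abs(human.steering - planner.steering) > steer_tol
            or abs(human.brake - planner.brake) > pedal_tol
            or abs(human.throttle - planner.throttle) > pedal_tol)

def shadow_log(frames):
    """frames: iterable of (timestamp, human Controls, planner Controls)."""
    return [(t, h, p) for t, h, p in frames if disagrees(h, p)]

# Example: the human brakes hard for a pedestrian the planner never reacted to.
frames = [
    (0.0, Controls(0.0, 0.0, 0.3), Controls(0.0, 0.0, 0.3)),  # agreement
    (0.1, Controls(0.0, 0.9, 0.0), Controls(0.0, 0.0, 0.3)),  # disagreement
]
for t, human, planner in shadow_log(frames):
    print(f"t={t:.1f}s  human={human}  planner={planner}")
```

Every flagged frame becomes a case for the engineers to study, without the planner ever having had control of the car.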



posted on Mar, 20 2018 @ 03:04 PM

originally posted by: starviego
If you're trying to sell the concept of self-driving cars--it's not working. No thanks!

I'm not selling anything. I'm just explaining how it's going to be, with some variations. Predicting the exact future is a little hard, and it usually falls short of our expectations.

But look at it this way... If you're worried about not being in control of the car you're in, thinking that you're gonna get creamed in some way, how do you manage to drive right now, knowing that 99 percent of the cars you're driving next to are absolutely NOT in your control and are probably being driven by some half-blind half-wit with a belly full of booze? If you don't want to ride in a self-driving car, you might consider the advantages of taking all the other death machines out of the hands of the stupid people who are driving them now. Overall, I think it improves your odds of survival.



posted on Mar, 20 2018 @ 03:29 PM
The Space Shuttle used five computers for redundancy, so why don't they use three computers, each made by a different company? It could take two of them agreeing to make the car move and accelerate, but any one of them could put the brakes on. Each would have its own sensors and cameras.

That would make me feel safe!
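
Something like that two-out-of-three arrangement could be sketched as below. This is purely illustrative; real automotive redundancy involves lockstep processors, watchdogs and functional-safety standards rather than a few lines of Python, but it shows the asymmetry described above: acceleration needs agreement from at least two independent units, while any single unit can command the brakes:

```python
# Illustrative 2-out-of-3 voting sketch: propulsion needs a majority,
# while a single dissenting computer is enough to apply the brakes.

def allow_acceleration(accel_votes):
    """accel_votes: one boolean per independent computer."""
    return sum(accel_votes) >= 2       # at least two must agree to accelerate

def demand_braking(brake_requests):
    """Any single computer can force braking."""
    return any(brake_requests)

# Computer B's sensors see an obstacle; A and C do not.
accel_votes = [True, False, True]
brake_requests = [False, True, False]

brake = demand_braking(brake_requests)
accelerate = allow_acceleration(accel_votes) and not brake

print("accelerate:", accelerate)   # False - the lone brake request wins
print("brake:", brake)             # True
```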



posted on Mar, 20 2018 @ 03:43 PM

originally posted by: buddha
The Space Shuttle used five computers for redundancy, so why don't they use three computers?

The flight controller on the Space Shuttle had around 1% of the computing power of an Xbox 360.



