Yeah, the goal is for self-driving cars to drive themselves. Tesla doesn't have those, and they don't claim to. Which is why this story is kind of dumb, and basing legal decisions about self-driving cars on an instance where an idiot who wasn't driving a self-driving car killed himself is also stupid.
That's not to say I'm not all for getting liability worked out and codified. But this story is dumb.
But you've misunderstood what it's for. It's not a self-driving car. It's explained quite clearly in all cases that you're supposed to still pay full attention to the road with auto-pilot engaged. Again, they make it really clear that it's not a self-driving car, and if you're driving on a public road, you have to be in control of your car.
Because they show quite clearly that their autopilot system has nothing to do with self-driving. It keeps you in your lane and can adjust your cruise control on-the-fly, but that doesn't mean it's self-driving and can get you to a destination while you watch a movie and aren't paying attention.
The "autopilot" function is just a combination of adaptive cruise control (matches the speed of traffic), automatic lane centring (helps keep you in the middle of the lane), and automatic braking (detects objects ahead and automatically hits the brakes if you don't).
It still requires a human with their hands on the wheel and feet at the pedals, paying attention and controlling the vehicle.
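To make the division of labour concrete, the three features described above can be sketched as a toy control policy. This is a hypothetical illustration, not Tesla's actual code; every function name and threshold here is invented:

```python
# Toy model of a driver-assist stack: adaptive cruise control,
# lane centring, and automatic emergency braking (AEB).
# All names and thresholds are illustrative, not Tesla's real code.

def adaptive_cruise(set_speed, lead_speed, gap_m, min_gap_m=30.0):
    """Match the lead vehicle's speed when following too closely."""
    if gap_m < min_gap_m:
        return min(set_speed, lead_speed)  # fall in behind the lead car
    return set_speed                       # otherwise hold the set speed

def lane_centring(lateral_offset_m, gain=0.5):
    """Steer back toward the lane centre, proportional to the offset."""
    return -gain * lateral_offset_m        # positive offset -> steer left

def automatic_braking(obstacle_detected, time_to_collision_s, threshold_s=2.0):
    """Brake hard only if an obstacle is detected AND collision is imminent."""
    return obstacle_detected and time_to_collision_s < threshold_s

# The crucial point: if the sensors never flag the obstacle (e.g. a white
# trailer against a bright sky), AEB never fires, which is exactly why a
# human is still required to watch the road.
assert automatic_braking(False, 0.5) is False   # undetected -> no braking
assert automatic_braking(True, 0.5) is True     # detected and imminent -> brake
assert adaptive_cruise(120, 100, 20.0) == 100   # too close: match lead car
```

Note how none of the three functions knows a destination or plans a route: each one only reacts to the immediate situation, which is the gap between driver assistance and actual self-driving.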
Warning : Above post may contain snark and/or sarcasm. Try reparsing with the /s argument before replying.
What the world has learned is that America is never more than one election away from losing its goddamned mind. — Howard Tayler
Guy was all but collecting speeding tickets. Play with the devil and you get burned.
That may be true, but think of all the reasons that cause car-related deaths in the world today. Humans are fallible too. Whether that be because of lack of sleep, intoxication, slow reflexes, poor judgement, or just plain distraction/focus loss, people are also likely to have "bugs in the system."
https://crashstats.nhtsa.dot.gov/Api...ication/812115
About 94% of all crashes are caused by driver error, as opposed to just 2% by a vehicle defect. Vehicle manufacturers do try to make sure their cars are safe, both because of regulations and because an unsafe car won't sell. So sure, there may be the occasional bug, but bugs in a human are much harder to fix than bugs in a machine. As long as the chance of death is lower than with human drivers, especially by as significant a margin as projected, and as long as continuous testing keeps those bugs few and far between, we will benefit from this technology.
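As a back-of-the-envelope check on why that 94%/2% split matters: even under pessimistic assumptions about software bugs, shrinking the driver-error slice dominates. The split is from the linked NHTSA survey; the reduction and inflation factors below are purely illustrative assumptions:

```python
# Back-of-the-envelope: how much room driver error leaves for improvement.
# The 94% / 2% / 4% split is from the NHTSA report linked above; the
# assumed factors are made up for illustration.
driver_error_share = 0.94
vehicle_defect_share = 0.02
other_share = 0.04

# Suppose automation eliminated only half of driver-error crashes while
# DOUBLING vehicle-defect crashes (more software, more potential bugs):
remaining = (driver_error_share * 0.5) + (vehicle_defect_share * 2) + other_share
print(f"crashes remaining: {remaining:.0%} of today's total")  # -> 55%
```

Even with the defect rate doubled, the hypothetical total drops by nearly half, because almost all of the pie was human error to begin with.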
Sadly, I don't believe this will be the last incident of this type. Cars cannot plan for everything, so a human always needs to be able to intervene. I said in a thread about autonomous driving that it can never be fully trusted. But glare should now be part of regular testing, and this death will make autonomous driving safer in the end. One down, infinity to go. Next up: rolling fog or an avalanche? Slow down and let the human figure it out.
edit: I find it strange that glare in itself is a problem, as I thought these types of cars had distance sensors; vehicle height is also a concern for me.
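That "slow down and let the human figure it out" fallback can be sketched as a confidence gate over the sensors. This is a hypothetical policy with invented sensor names and thresholds, not any manufacturer's actual logic:

```python
# Hypothetical sensor-confidence gate: when perception degrades (glare,
# fog, etc.), hand control back to the driver instead of guessing.
# Names and thresholds are invented for illustration.

def should_hand_over(camera_confidence, radar_confidence, threshold=0.6):
    """Request driver takeover when no sensor is confident enough."""
    return max(camera_confidence, radar_confidence) < threshold

def plan_action(camera_confidence, radar_confidence):
    if should_hand_over(camera_confidence, radar_confidence):
        return "slow down and alert driver"
    return "continue assisted driving"

# Glare washes out the camera, and radar returns from a high trailer can
# be ambiguous (it may read like an overhead sign): both confidences low.
assert plan_action(0.2, 0.4) == "slow down and alert driver"
assert plan_action(0.9, 0.3) == "continue assisted driving"
```

The hard part in practice is that the failure case here is a sensor being confidently wrong, which no confidence threshold catches; that is why the human is still the backstop.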
Last edited by Ayla; 2016-07-02 at 03:33 AM.
Well, for one, that is not a self-driving car, nor does it have capabilities anywhere close to that level. As tragic as the death is, I somewhat expect such things to happen even with actual driverless cars. That doesn't mean we should suddenly stop pursuing the technology, IMO. Human drivers cause more than enough needless deaths through erroneous driving.
The wise wolf whose pride is her wisdom isn't so sharp when drunk.
No, they said that the white siding of the truck mixed with the brightness of the reflection interfered with the sensors picking up the truck.
They also said that this was the first death in 130 million miles of automated driving. That number seems very high, but if they say it's true, I can't really say it's not.
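For scale, that figure can be compared against the overall US traffic-fatality rate, which is roughly 1.1 deaths per 100 million vehicle miles in recent NHTSA data (the baseline varies by year, so treat this as an order-of-magnitude check, not a verdict):

```python
# Order-of-magnitude comparison of the quoted Autopilot figure against
# the overall US traffic-fatality rate. The 1.1 baseline is approximate.
autopilot_rate = 1 / 130e6   # 1 death in 130 million Autopilot miles
human_rate = 1.1 / 100e6     # ~1.1 deaths per 100 million miles (US average)

print(f"autopilot: {autopilot_rate * 1e8:.2f} deaths per 100M miles")
print(f"human:     {human_rate * 1e8:.2f} deaths per 100M miles")
# With a single data point the comparison is statistically almost
# meaningless, but the rates are at least in the same ballpark.
```

The honest takeaway is only that 130 million miles per death is not obviously worse than human driving; one fatality is far too few to claim it is better.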
The best part about this incident is that people are screaming "self-driving cars won't work" despite this being the first such death, compared to the countless deaths caused by people controlling cars.
And from the looks of the story, while the car's sensors did fail to detect the truck, the crash still seems to have been ultimately caused by the truck driver.
Originally Posted by Bigbazz
Self-driving car deaths ARE going to happen. The question is, aside from "how do we deal with it?" is, "are these cars still statistically safer than human drivers?" If so, then we should continue to develop and improve this technology.
Putin khuliyo
Tesla Autopilot is not a self-driving car, and they have never claimed or advertised it that way, so the thread title is misleading.
"Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel."
https://www.teslamotors.com/presskit/autopilot
I'm almost inclined to say:
that's why you leave making cars to actual car makers.
almost..
But it surely helps boost the established, experienced manufacturers in that market.
"The pen is mightier than the sword.. and considerably easier to write with."