"Self-driving" technology has existed for a few years. If you're not comfortable with it yet, wait a few more for it to mature. Personally I think it is incredible how far along they already are.
- - - Updated - - -
Technically true, but it's also true that most people grossly overestimate how good of a driver they really are.
We're not talking about a PC operating system capable of running any old application that any old schmuck decides to create. What we're talking about is a much more closed-off system with a much more specific purpose. Sure, there will be bugs to work out, but achieving the desired level of reliability is far more realistic given the limited scope.
We might not get there in 5-10 years, but we'll get there. And I can guarantee that automated cars will be better at driving safely than you. Computers have far more potential to be better drivers, considering their reaction speed is limited only by the speed of electrical signals (roughly 50% the speed of light). They're also capable of making complicated calculations practically instantly, like calculating how far to stay behind the car in front based on current speed; something that only the Rain Man could even come close to doing.
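That following-distance calculation can be sketched roughly like this. The reaction time and braking deceleration below are illustrative assumptions for the sketch, not figures from any real autopilot system:

```python
def safe_following_distance(speed_mps, reaction_time_s=0.1, decel_mps2=7.0):
    """Rough stopping-distance estimate: distance covered during the
    controller's reaction time plus braking distance (v^2 / 2a).
    Both parameter defaults are made-up illustrative values."""
    reaction_distance = speed_mps * reaction_time_s
    braking_distance = speed_mps ** 2 / (2 * decel_mps2)
    return reaction_distance + braking_distance

# At highway speed (~30 m/s, about 67 mph):
print(round(safe_following_distance(30.0), 1))  # ~67.3 metres of gap
```

A computer can redo this continuously, every fraction of a second, as speed changes; a human driver is stuck with rules of thumb like the two-second rule.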
Last edited by Docturphil; 2017-03-08 at 08:52 PM.
Still feels like it did a good job recovering; I myself would probably have fucked up even more. But then again, I would probably have noticed what was happening... Skynet is still not so near.
Not necessarily. Even assuming that all of the components required for the software to make decisions function perfectly and never fail (a huge assumption, mind you), the software will only be as good as people program it to be. If there are rare situations the programmers failed to consider, the software will fail, and judging from the software we have today, rare situations cause failures all the time.
Good driving isn't like racing, where speed and precisely calculating how the car will move are what matter. Good driving is about being able to predict issues before they happen and avoid them. Programming software to deal with unpredictable situations is exceedingly difficult.
Great data for creating a proper self-driving system. Eggs will be cracked to get there.
Considering most large manufacturers already install a good portion of the sensors and software needed for autopilot in their cars, they won't be lobbying against it. Against electric cars, maybe, for a few more years; but against autopilot? I don't think so.
- - - Updated - - -
People can also drive up the wrong side of the fucking highway and become ghost drivers...
Incorrect. You should read up on machine learning. Machine learning basically allows a computer to write its own algorithm based on feedback. That's how the algorithm for the Xbox's Kinect was able to learn to detect human motion patterns; an algorithm created by a computer, and one no human really understands. A developer trying to write that algorithm by hand would probably have taken decades.
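A toy illustration of "writing its own algorithm from feedback": a perceptron adjusts its own weights from labeled examples instead of being hand-programmed with the rule. The training data here (a simple AND rule) is made up purely for illustration; real systems like the Kinect's use vastly larger models and datasets:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights from (input, label) pairs via error feedback."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred           # the feedback signal
            w[0] += lr * err * x1         # nudge weights toward fewer errors
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Nobody tells the program the AND rule; it discovers it from examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

The point is that the "algorithm" the final predictor implements lives in the learned numbers, not in code a human wrote.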
Again, not with machine learning. Teach the system how to teach itself. Start with some simple tests until the computer is able to learn enough to be put into more and more complicated scenarios. Eventually you'll reach a point where the machine is better at driving than we are. By the way, Google is already doing this and they already have a shit ton of data collected from machine learning.
You can be a naysayer all you want, but just wait 10-20 years, 30 at most, and you'll see. It's going to happen, and automated cars will be better drivers than you, me, or any other human.
Dashcam catches rare instance of Auto-pilot crash.
Meanwhile, around the world, humans continue to plow into each other's vehicles at an ever-increasing rate.
I agree the driver is at fault here; however, unless it's fully autonomous, Tesla shouldn't be calling it Autopilot. Probably something along the lines of "enhanced <xyz>". It's very misleading: people may think of the autopilot in an airplane (which isn't really similar either, but can handle everything post-takeoff all the way to an automated landing).
Machine learning isn't magic. Under highly constrained and well-defined circumstances, like a person moving in front of a camera, it can be very effective. Under more complicated and unpredictable conditions it doesn't do nearly as well. The bar you're going for is pretty high, too, considering some drivers go a lifetime without being at fault for an accident.
Pretty clear-cut example of human error. The driver engaged autopilot in a situation where it should not have been engaged and was not ready to take control of the vehicle.
- - - Updated - - -
Tesla makes it very clear that the autopilot feature still requires an alert human driver behind the wheel ready to take control at any moment.
Sure, and that's why it's not 100% machine learning. Still, this sort of thing is exactly what machine learning is good for. And I'm fairly certain Google's automated cars are already safer than human drivers. They probably just aren't available to the public yet because of the potential lawsuits should they miss something. They're going to be testing them for a while still before they're on the market.
Edit: Confirmed, the Google self driving cars are already safer than human drivers. http://bigthink.com/ideafeed/googles...iculously-safe Sooooo yeah. That's the end of that argument.
"We just got rear-ended again yesterday while stopped at a stoplight in Mountain View. That's two incidents just in the last week where a driver rear-ended us while we were completely stopped at a light! So that brings the tally to 13 minor fender-benders in more than 1.8 million miles of autonomous and manual driving — and still, not once was the self-driving car the cause of the accident.”
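Taking the quoted figures at face value, the implied incident rate (incidents the cars were involved in, not at fault for) works out like this:

```python
# Figures quoted above: 13 minor fender-benders over more than
# 1.8 million miles of autonomous and manual driving combined.
incidents = 13
miles = 1.8e6

rate_per_million_miles = incidents / (miles / 1e6)
print(round(rate_per_million_miles, 1))  # ~7.2 incidents per million miles
```

And again, per the quote, zero of those 13 were caused by the self-driving car itself.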
My parents work for the SHA and showed me a video of a car speeding (over 100 mph). When it got within 50 yards of a self-driving car, the self-driving car immediately turned its hazards on and pulled onto the shoulder, and the speeding car then caused a 4-car pile-up.