  1. #61
    I Don't Work Here Endus's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Ottawa, ON
    Posts
    79,231
    Quote Originally Posted by Dacien View Post
    They definitely need to get the programming down to a razor sharp edge, there's no room for error.
    Only if your standards are unreasonable. We don't need self-driving cars to never make an error. We just need their rate of error to be lower than that, on average, of human drivers, by a solid quantifiable margin. As long as they're better than human drivers, the safety needs have been met. As for who'd be at fault in such an accident, you're assuming there WAS fault. A lot of accidents get handed off as being equally at-fault, or there is no fault to be assigned. And even there, it's more about whose insurance company is responsible for covering what damages, nothing more.
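    Endus's "lower rate of error by a solid quantifiable margin" standard can be made concrete with a quick back-of-the-envelope comparison. The human baseline below is the rough US figure (about 1.16 fatalities per 100 million vehicle-miles in 2017); the autonomous fleet numbers are purely hypothetical placeholders, since no comparable real-world figure exists:

```python
# Illustrative comparison of fatality rates, normalized per 100 million
# vehicle-miles (the standard unit for this statistic).

def fatalities_per_100m_miles(fatalities, miles):
    """Normalize a raw fatality count to a per-100-million-mile rate."""
    return fatalities / (miles / 100_000_000)

# Rough US human-driver baseline for 2017: ~37,133 deaths over ~3.212
# trillion vehicle-miles travelled.
human_rate = fatalities_per_100m_miles(37_133, 3_212_000_000_000)

# Hypothetical autonomous fleet: 1 fatality over 500 million miles.
av_rate = fatalities_per_100m_miles(1, 500_000_000)

# One reading of "a solid quantifiable margin": at most half the human rate.
meets_bar = av_rate < human_rate * 0.5

print(round(human_rate, 2), round(av_rate, 2), meets_bar)  # 1.16 0.2 True
```

    The point of the sketch is only that the comparison is a rate-per-mile question, not a "did an accident ever happen" question.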


  2. #62
    Quote Originally Posted by LMuhlen View Post
    This right of way thing is more to try and protect the weaker part of the potential incident. If they didn't have a right of way, drivers could just accelerate and hit jaywalkers on purpose. Most drivers hit pedestrians by accident, so it is working.

    It still falls to the pedestrian to do their part and not put themselves in a dangerous situation where the driver, potentially distracted or without enough time to react, would hit them. Because the pedestrian is the party most interested in avoiding accidents, even if the blame lands on the other side.
    That's true ONLY if the choice is hit or don't hit, and you can do no other action.

    Again: legally, pedestrians generally have the right of way, and drivers assume a lot of fault for preventable collisions. But legally doesn't bring you back from being dead because you decided to play chicken with someone driving a ton of vehicle, so I prefer to defer to right of way of physics - if I can stop easier than someone else that'll occupy the same place, then I will generally slow down or stop to prevent an accident. I'm not going to play chicken with my body vs. a car, nor my car vs. a truck because I will lose 100% of the time.

  3. #63
    Quote Originally Posted by Draco-Onis View Post
    The article states that there was a human in the car, but they obviously didn't have their hands on the controls and didn't react fast enough. Also, we do not know the full details of how it happened; the last time we got an article like this, it turned out the self-driving car was not at fault, but the other driver was. Last but not least, even if this was their fault, considering how long they have been on the road they still have a better track record than human drivers.
    I dunno. I've been driving for over 20 years. Haven't killed anyone yet.

    But, as I said, I don't blame the cars...I'm very much in favour of fully autonomous vehicles replacing human drivers. I just don't think the technology is ready for widespread public use yet.

    Take another look at the second bit I quoted:
    Autonomous cars are expected to ultimately be safer than human drivers, because they don’t get distracted and always observe traffic laws.
    However, researchers working on the technology have struggled with how to teach the autonomous systems to adjust for unpredictable human driving or behavior.
    “The biggest communication problem is we do not listen to understand. We listen to reply,” Stephen Covey.

  4. #64
    Pit Lord Wiyld's Avatar
    10+ Year Old Account
    Join Date
    Jun 2010
    Location
    Secret Underground Lair
    Posts
    2,347
    Quote Originally Posted by Endus View Post
    Only if your standards are unreasonable. We don't need self-driving cars to never make an error. We just need their rate of error to be lower than that, on average, of human drivers, by a solid quantifiable margin. As long as they're better than human drivers, the safety needs have been met. As for who'd be at fault in such an accident, you're assuming there WAS fault. A lot of accidents get handed off as being equally at-fault, or there is no fault to be assigned. And even there, it's more about whose insurance company is responsible for covering what damages, nothing more.
    Musk outlined this nicely in his TED talk late last year. He quantified the target reliability as "unlikely to ever have an accident in a couple hundred lifetimes of regular use" as the point where we can feel comfortable falling asleep at the wheel and letting the car get us there. That is exactly the point...there will never be a point of 100%.....sad stories like this will never be gone completely. Using automation simply offloads the easy presumption of guilt from the driver to /waves hands around 'others'. That is what makes people uncomfortable...they want an easy target to blame...they want to believe that 'I've never hit anyone, clearly I am a superior human, those drivers who HAVE hit someone are clearly inferior humans who deserve to be punished for their inferiority'

    When an automated car hits someone those folks can't feel superior.
    Quote Originally Posted by Gillern View Post
    "IM LOOKING AT A THING I DONT LIKE, I HAVE THE OPTION TO GO AWAY FROM IT BUT I WILL LOOK MORE AND COMPLAIN ABOUT THE THING I DONT LIKE BECAUSE I DONT LIKE IT, NO ONE IS FORCING ME TO SEARCH FOR THIS THING OR LOOK AT THIS THING OR REMAIN LOOKING AT THIS THING BUT I AM ANYWAY, ITS OFFENDS ME! ME ME ME ME ME ME ME ME ME!!!"
    Troof

  5. #65
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,753
    Quote Originally Posted by DSRilk View Post
    You have no IDEA why a woman is dead. It could be because there was a defect in the programming. It could be because the driver started to take over and did something wrong. It could be that she stepped out into traffic at the wrong time. It could be that she was hidden from view, meaning a human would have hit her also.
    You’re right, but for all of what you said in your first reply, I’m going to assume it’s the machine’s fault. If humans die it’s the fault of the A.I., or maybe you’re right again, maybe it was a human error...oh wait.

    You contradicted yourself: this A.I. either is or isn’t capable, that’s the point, and it looks like it may have killed someone. Prove it.
    Milli Vanilli, Bigger than Elvis

  6. #66
    Reforged Gone Wrong The Stormbringer's Avatar
    10+ Year Old Account
    Premium
    Join Date
    Jul 2010
    Location
    ...location, location!
    Posts
    15,419
    Quote Originally Posted by Connal View Post
    This is why I hope, at some point, google, uber, and all the other autonomous car companies start sharing their data. In the long run it saves lives, and prevents a negative public image.

    It is why Tesla released a lot of their patents, to speed up adoption.
    Agreed. Unfortunately, a lot of these things are driven by profits, which makes too many companies unwilling to work together for something this necessary.

    I know, we all wish we were our own personal jet fighter pilot badasses, but we're really not. We should be implementing sensors in and along roads, adding trackers and sensors to pre-existing non-auto vehicles, etc. etc. to make it even safer for autonomous cars. Let me get to the point where I can sleep in my car during a trip, dammit!

  7. #67
    Quote Originally Posted by Evil Midnight Bomber View Post
    I dunno. I've been driving for over 20 years. Haven't killed anyone yet.

    But, as I said, I don't blame the cars...I'm very much in favour of fully autonomous vehicles replacing human drivers. I just don't think the technology is ready for widespread public use yet.

    Take another look at the second bit I quoted:
    It's going to be a weird transition - once it's 100% autonomous cars, accident rates will plummet to near zero. But as long as you have one human driver, they're going to have issues figuring out wtf that person is doing as they try to cut around the rule-abiding cars (or cars abiding by their rulebook, which would likely involve much higher speeds).

  8. #68
    Quote Originally Posted by Flarelaine View Post
    And once again, an accident is caused by a human breaking the rules.
    And...? Yes, people break the rules. If a self-driving car can't deal with that, it's not going to be very useful.

    Quote Originally Posted by iETHOSi View Post
    In all seriousness though. What was this "operator" doing... sleeping?
    There's only a fraction of a second to respond to a situation like this. Even if they are paying attention, it's going to take more time for them to react than if they were the ones driving.

    It goes from something like this:
    OMG there's someone in front of me --> I need to hit the brakes
    to
    OMG there's someone in front of me --> Why isn't the car stopping? --> I need to hit the brakes.
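    That extra mental step has a measurable cost in distance. A minimal sketch (the reaction times and deceleration are illustrative assumptions, not measured values):

```python
# Sketch: extra stopping distance caused by the added "why isn't the car
# stopping?" step. Reaction times and deceleration are illustrative guesses.

def stopping_distance_m(speed_kph, reaction_s, decel_mps2=7.0):
    """Distance covered while reacting, plus braking distance to a stop."""
    v = speed_kph / 3.6                     # km/h -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

# Driver already in the loop vs. a safety operator who must first notice
# that the automation is NOT handling it, then take over.
driver  = stopping_distance_m(60, reaction_s=1.0)
monitor = stopping_distance_m(60, reaction_s=2.5)

print(round(monitor - driver, 1))  # extra metres travelled: 25.0
```

    At 60 km/h, every extra second of "is it going to stop?" hesitation is another ~17 metres of travel before braking even begins.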

  9. #69
    I Don't Work Here Endus's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Ottawa, ON
    Posts
    79,231
    Quote Originally Posted by Wiyld View Post
    Musk outlined this nicely in his TED talk late last year. He quantified the target reliability as "unlikely to ever have an accident in a couple hundred lifetimes of regular use" as the point where we can feel comfortable falling asleep at the wheel and letting the car get us there. That is exactly the point...there will never be a point of 100%.....sad stories like this will never be gone completely. Using automation simply offloads the easy presumption of guilt from the driver to /waves hands around 'others'. That is what makes people uncomfortable...they want an easy target to blame...they want to believe that 'I've never hit anyone, clearly I am a superior human, those drivers who HAVE hit someone are clearly inferior humans who deserve to be punished for their inferiority'

    When an automated car hits someone those folks can't feel superior.
    Another way of looking at it is that you can have different "classes" of accident, from the perspective of the driver of the car;

    Driver error (you, the driver, screwed up).
    External driver error (other drivers make a mistake and hit you).
    Pedestrian error (diving out from between parked cars, etc).
    Mechanical failure (tire blows, axle breaks, etc).

    A self-driving car can, realistically, only address the first of those. It may be able to, down the line, provide for some improved response behaviours to the second two, and provide for better assessment tools to avoid the last, but it can't ever eliminate those.

    And by eliminating the capacity of being distracted, of inattention, of recklessness and carelessness, it SHOULD be able to grossly limit driver error. And instances of accidents due to some other factor, as above, cannot reasonably be blamed on the automated driver system, not unless its response to those was measurably worse than a human driver's likely would have been.

    If you're expecting them to have a 0% accident rate, you're not approaching this reasonably, because the automated driver isn't the only factor.
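    The four classes above can be sketched as a simple taxonomy; the names and descriptions are just shorthand for Endus's list:

```python
from enum import Enum

class AccidentClass(Enum):
    """The four fault classes, from the perspective of the car's driver."""
    DRIVER_ERROR = "you, the driver, screwed up"
    EXTERNAL_DRIVER_ERROR = "another driver made a mistake and hit you"
    PEDESTRIAN_ERROR = "pedestrian darted out from between parked cars"
    MECHANICAL_FAILURE = "tire blows, axle breaks, etc."

# Automation can realistically eliminate only the first class; for the
# others it can at best improve the response or the odds of avoidance.
ELIMINABLE_BY_AUTOMATION = {AccidentClass.DRIVER_ERROR}

remaining = set(AccidentClass) - ELIMINABLE_BY_AUTOMATION
print(len(remaining))  # 3 classes no automated driver can fully prevent
```

    Framed this way, a 0% accident rate is impossible by construction: three of the four classes originate outside the automated driver.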


  10. #70
    Quote Originally Posted by X Amadeus X View Post
    You’re right, but for all of what you said in your first reply, I’m going to assume it’s the machine’s fault. If humans die it’s the fault of the A.I., or maybe you’re right again, maybe it was a human error...oh wait.

    You contradicted yourself: this A.I. either is or isn’t capable, that’s the point, and it looks like it may have killed someone. Prove it.
    I'm honestly not sure what you're trying to say. There's no contradiction at all in what I said. The machine's sensors are far more capable than a human's. That doesn't mean there couldn't be a defect in the software. So yes, it's possible it's the vehicle's fault. However, the software could work perfectly fine, and it could have been caused by a human - either the driver (though I find it unlikely) or far more likely the woman who was already disobeying the traffic laws. There's no contradiction there. My core point in this recent response, however, is that you were making statements implying that the incident alone was enough to discredit automated vehicles, when you in no way have enough information to reach any conclusion at all.

  11. #71
    People need to get used to this kind of thing. The only way to stress test these vehicles is out in the real world. Accidents will happen, people will die. If we want the technology that's the price we pay.

  12. #72
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,753
    Quote Originally Posted by Endus View Post
    Only if your standards are unreasonable. We don't need self-driving cars to never make an error. We just need their rate of error to be lower than that, on average, of human drivers, by a solid quantifiable margin. As long as they're better than human drivers, the safety needs have been met. As for who'd be at fault in such an accident, you're assuming there WAS fault. A lot of accidents get handed off as being equally at-fault, or there is no fault to be assigned. And even there, it's more about whose insurance company is responsible for covering what damages, nothing more.
    No, you’re wrong, Endus. On this you are 100% wrong. This machine needs to do a hell of a lot better than humans. It had better be damn near perfect. If this A.I. can literally make many times the calculations per second compared to a human, then it had better be fucking damn near flawless.

    Human beings have the limitations of being human, stupidity being the most important. But make no mistake, us stupid humans will be the ones to make the call on whether we trust these machines or not. You as well as anyone know that stupidity is powerful and there is already enough prejudice and paranoia to kill any chance of a driverless car before it gets off the ground.

    Which means at best we get assisted A.I. driving. In a perfect world, a logical one, you’d be right, but we don’t live in that world. These cars have to convince people things like this can almost never happen.
    Milli Vanilli, Bigger than Elvis

  13. #73
    Merely a Setback Sunseeker's Avatar
    10+ Year Old Account
    Join Date
    Aug 2010
    Location
    In the state of Denial.
    Posts
    27,126
    Inb4 Republicans pushing for legislation to prevent more autonomous vehicle deaths!
    Human progress isn't measured by industry. It's measured by the value you place on a life.

    Just, be kind.

  14. #74
    Pit Lord Wiyld's Avatar
    10+ Year Old Account
    Join Date
    Jun 2010
    Location
    Secret Underground Lair
    Posts
    2,347
    Quote Originally Posted by Endus View Post
    If you're expecting them to have a 0% accident rate, you're not approaching this reasonably, because the automated driver isn't the only factor.
    Exactly

    Even a decent automated system is likely to be better than the vast majority of people in dealing with all those examples. Instead of calling your dad to ask him if that 'noise' is bad while hurtling down the road at 80 mph, it will make far more intelligent, informed decisions about mechanical issues.

    Really the one and only reservation I have about automation is how the systems can function in a variety of road conditions. Tesla, for example, has been all about proving you don't need RADAR or LADAR like the other manufacturers are using. They want to get away with all line of sight optical. Which is fine, I don't really care...
    but how does that work outside of a nice brightly lit LA street on a clear 70 degree night? What happens when it is -25 out and my car has an inch of snow and ice caked on it and I have to drive down a back road with 0 lighting?

    I know that is a highly specific case and somewhat off topic...in general I couldn't be more excited about the coming robot chauffeur.
    Quote Originally Posted by Gillern View Post
    "IM LOOKING AT A THING I DONT LIKE, I HAVE THE OPTION TO GO AWAY FROM IT BUT I WILL LOOK MORE AND COMPLAIN ABOUT THE THING I DONT LIKE BECAUSE I DONT LIKE IT, NO ONE IS FORCING ME TO SEARCH FOR THIS THING OR LOOK AT THIS THING OR REMAIN LOOKING AT THIS THING BUT I AM ANYWAY, ITS OFFENDS ME! ME ME ME ME ME ME ME ME ME!!!"
    Troof

  15. #75
    I Don't Work Here Endus's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Ottawa, ON
    Posts
    79,231
    Quote Originally Posted by X Amadeus X View Post
    No, you’re wrong, Endus. On this you are 100% wrong. This machine needs to do a hell of a lot better than humans. It had better be damn near perfect. If this A.I. can literally make many times the calculations per second compared to a human, then it had better be fucking damn near flawless.
    On what grounds?

    There's plenty of automated systems that we already use that have serious risk factors and aren't forbidden due to those risks. Hell, cars get into accidents, way more than, say, horses. But we still let people drive cars.

    Human beings have the limitations of being human, stupidity being the most important. But make no mistake, us stupid humans will be the ones to make the call on whether we trust these machines or not. You as well as anyone know that stupidity is powerful and there is already enough prejudice and paranoia to kill any chance of a driverless car before it gets off the ground.

    Which means at best we get assisted A.I. driving. In a perfect world, a logical one, you’d be right, but we don’t live in that world. These cars have to convince people things like this can almost never happen.
    If your argument is that stupid people will fearmonger about wrongheaded bullshit and try and ban these, sure. That'll happen.

    We don't have to listen to the idiots, any more than we do anti-vaxxers.


  16. #76
    Quote Originally Posted by pionock View Post
    We're certainly far from having fully autonomous vehicles.
    A huge problem with this situation, probably stopping it from ever becoming a reality, is the ethics. In a situation with an autonomous vehicle, many people will argue that the vehicle should always look after the owner's well-being above all else. Others will argue that the car should always account for random events you can't control, and be willing to risk the owner's life in situations that warrant it. Which is right? Therein lies the conundrum: they are both right. However, who would want to buy, or even ride as a passenger in, a car that would be willing to risk their life if something uncontrollable happens?
    Quote Originally Posted by scarecrowz View Post
    Trust me.

    Zyky is better than you.

  17. #77
    The Lightbringer Minikin's Avatar
    10+ Year Old Account
    Join Date
    May 2010
    Location
    Canada
    Posts
    3,766
    Quote Originally Posted by X Amadeus X View Post
    No, you’re wrong, Endus. On this you are 100% wrong. This machine needs to do a hell of a lot better than humans. It had better be damn near perfect. If this A.I. can literally make many times the calculations per second compared to a human, then it had better be fucking damn near flawless.

    Human beings have the limitations of being human, stupidity being the most important. But make no mistake, us stupid humans will be the ones to make the call on whether we trust these machines or not. You as well as anyone know that stupidity is powerful and there is already enough prejudice and paranoia to kill any chance of a driverless car before it gets off the ground.

    Which means at best we get assisted A.I. driving. In a perfect world, a logical one, you’d be right, but we don’t live in that world. These cars have to convince people things like this can almost never happen.
    I think it will end more at assisted rather than complete autonomy. For example, entry-level cars nowadays come with not just adaptive cruise but also emergency braking and lane keeping. Higher-level cars, like the Audi A8, Mercedes S-Class, or the new Cadillac Super Cruise, go beyond that, with cross-traffic alert and even crash-avoidance lane changes.

    But I don't think it will ever come to full driverless cars until EVERYONE on the road has a car that can, as standard, have autonomy or at least work with other cars to achieve autonomy. Not everyone can afford a driverless car, or an S-Class. Your car could drive perfectly, but it can only avoid and defy physics so much until the stupidity of another driver impacts it. Once that variable is taken out, then I think they'll get to complete driverless cars. Until then it will be an assisted setup.
    Blood Elves were based on a STRONG request from a poll of Asian players where many remarked on the Horde side that they and their girlfriends wanted a non-creepy femme race to play (Source)

  18. #78
    I don't understand... Do people expect these things to be perfect? It's a tragic accident, to be sure, but I just don't get all the fearmongering.

  19. #79
    Void Lord Doctor Amadeus's Avatar
    10+ Year Old Account
    Join Date
    May 2011
    Location
    In Security Watching...
    Posts
    43,753
    Quote Originally Posted by DSRilk View Post
    I'm honestly not sure what you're trying to say. There's no contradiction at all in what I said. The machine's sensors are far more capable than a human's. That doesn't mean there couldn't be a defect in the software. So yes, it's possible it's the vehicle's fault. However, the software could work perfectly fine, and it could have been caused by a human - either the driver (though I find it unlikely) or far more likely the woman who was already disobeying the traffic laws. There's no contradiction there. My core point in this recent response, however, is that you were making statements implying that the incident alone was enough to discredit automated vehicles, when you in no way have enough information to reach any conclusion at all.
    Yes, and a woman is dead, unlike this machine, and regardless of how stupid she might have been, this would still be the A.I.'s fault, period. And if the machine is as capable as you said, that's even more proof.

    The responsibility for safety is on those who have the greatest ability. As for ending any driverless cars, I'll wait until we know more. But if it were up to me, I'd be suspending all other driverless cars.
    Milli Vanilli, Bigger than Elvis

  20. #80
    I Don't Work Here Endus's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Ottawa, ON
    Posts
    79,231
    Quote Originally Posted by Wiyld View Post
    Exactly

    Even a decent automated system is likely to be better than the vast majority of people in dealing with all those examples. Instead of calling your dad to ask him if that 'noise' is bad while hurtling down the road at 80 mph, it will make far more intelligent, informed decisions about mechanical issues.

    Really the one and only reservation I have about automation is how the systems can function in a variety of road conditions. Tesla, for example, has been all about proving you don't need RADAR or LADAR like the other manufacturers are using. They want to get away with all line of sight optical. Which is fine, I don't really care...
    but how does that work outside of a nice brightly lit LA street on a clear 70 degree night? What happens when it is -25 out and my car has an inch of snow and ice caked on it and I have to drive down a back road with 0 lighting?

    I know that is a highly specific case and somewhat off topic...in general I couldn't be more excited about the coming robot chauffeur.
    A couple things;

    1> Optical sensors can be WAY better than human eyes. Nightvision is a thing. Heck, we could in theory do infrared, too. It just means it's a reception of existing light/radiation rather than a "ping" system like RADAR or LADAR.

    2> Computers are way better than human brains at reliably constructing an image from partial glimpses.

    3> The car would likely slow down as conditions worsen, to maintain the needed reaction distance based on speed. This can be aided somewhat in cases like fog if the sensors are on the front of the car, rather than behind the windshield; there's just a bit less mist to look through.

    If your car has an inch of snow and ice and it can't see, it'll just refuse to move until you clear it or take manual control. Driving down a back road with no light, even no headlights, shouldn't be a problem for the vehicle, with low-light receptors. And if there's headlights, it's much less of an issue, and the computer won't get fooled into focusing on the headlight cone alone, either.

    I mean, I've driven when conditions were so bad I could barely make out the tail lights of the car a couple meters ahead of me; a massive blizzard hit on my way to the in-laws at Christmas one year. We ended up, along with other traffic, creeping along behind each other, keeping one set of wheels on the gravel at the side of the highway. If both wheels hit gravel, go left to get back on the road. If both hit pavement, go right and get the passenger-side wheels back off onto the shoulder. We were driving just something like 10-15 kph, mostly; the remaining half hour of the trip took us over 3 hours, but we made it safe. The computer driver may not have been able to handle THAT, but that's literally happened one time in my life. And if it can't handle it, it just doesn't drive.
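    The "slow down as conditions worsen" behaviour described earlier in the post can be framed as picking the highest speed whose total stopping distance still fits inside what the sensors can currently see. A rough sketch, with illustrative latency and deceleration values:

```python
import math

# Sketch of "slow down as conditions worsen": pick the highest speed whose
# total stopping distance (reaction + braking) fits inside the distance the
# sensors can currently see. Latency and deceleration are assumed values.

def max_safe_speed_kph(visibility_m, latency_s=0.5, decel_mps2=7.0):
    """Solve v*latency + v^2/(2a) = visibility for v, return it in km/h."""
    a = decel_mps2
    # Positive root of v^2 + 2*a*latency*v - 2*a*visibility = 0.
    v = -a * latency_s + math.sqrt((a * latency_s) ** 2 + 2 * a * visibility_m)
    return v * 3.6

# Clear night, fog, near-whiteout: the safe speed drops with visibility.
for vis_m in (200, 50, 10):
    print(vis_m, round(max_safe_speed_kph(vis_m)))
```

    In the blizzard scenario above, a sensing range of a few metres would push the permitted speed down to a creep, which is roughly what the human convoy did by hand.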

