  1. #1

    Will cars in the future decide who lives and who dies?

    And how do we feel about that? I'm sort of disquieted by the idea that my self-driving car would calculate that I must die, that kind of cold decision-making by a computer to end my life. Feels weird.

    https://www.usatoday.com/story/money...ash/891493001/

    As you approach a rise in the road, heading south, a school bus appears, driving north, one driven by a human, and it veers sharply toward you. There is no time to stop safely, and no time for you to take control of the car.

    Does the car:

    A. Swerve sharply into the trees, possibly killing you but possibly saving the bus and its occupants?

    B. Perform a sharp evasive maneuver around the bus and into the oncoming lane, possibly saving you, but sending the bus and its driver swerving into the trees, killing her and some of the children on board?

    C. Hit the bus, possibly killing you as well as the driver and kids on the bus?

    In everyday driving, such no-win choices may be exceedingly rare but, when they happen, what should a self-driving car — programmed in advance — do? Or in any situation — even a less dire one — where a moral snap judgment must be made?


    Few people seem to be in a hurry to take on these questions, at least publicly.

    It’s unaddressed, for example, in legislation moving through Congress that could result in tens of thousands of autonomous vehicles being put on the roads. In new guidance for automakers by the U.S. Department of Transportation, it is consigned to a footnote that says only that ethical considerations are "important" and links to a brief acknowledgement that "no consensus around acceptable ethical decision-making" has been reached.

    Whether the technology in self-driving cars is superhuman or not, there is evidence that people are worried about the choices self-driving cars will be programmed to make.

    Last year, for instance, a Daimler executive set off a wave of criticism when he was quoted as saying its autonomous vehicles would prioritize the lives of its passengers over anyone outside the car. The company later insisted he’d been misquoted, since it would be illegal “to make a decision in favor of one person and against another.”

    Last month, Sebastian Thrun, who founded Google’s self-driving car initiative, told Bloomberg that the cars will be designed to avoid accidents, but that “If it happens where there is a situation where a car couldn’t escape, it’ll go for the smaller thing.”

  2. #2
    Quote Originally Posted by Vegas82 View Post
    No, the people who program their systems are choosing.
    Correct. This means whenever someone dies in an accident, the car manufacturer will get sued because their program chose to kill them. So America will never get driverless cars.

  3. #3
    Deleted
    Yeah they will, what's the problem? It's not like people choose for themselves how they get into an accident now. It's pretty common that people freeze up when they get into a panic situation. :P

    Think of it like this: where normal people freeze and just crash (I've had it happen to me myself; instead of reacting I just froze at the thought of "oh shit, I'm gonna crash"), the car will actually brake and maybe steer away from danger.

    In general, an automated response will save a lot more lives, despite the bullshit argument people bring up of "what if the car decides to run into a child to save the person inside".

    - - - Updated - - -

    Quote Originally Posted by Ginantonicus View Post
    Correct. This means whenever someone dies in an accident, the car manufacturer will get sued because their program chose to kill them. So America will never get driverless cars.
    They can ask you to sign an agreement that, within certain parameters, the driver is responsible. The end.

  4. #4
    Deleted
    Driverless cars will only work if all cars are driverless and communicating with each other.

  5. #5
    Quote Originally Posted by Aeilon View Post
    Yeah they will, what's the problem? It's not like people choose for themselves how they get into an accident now. It's pretty common that people freeze up when they get into a panic situation. :P

    Think of it like this: where normal people freeze and just crash (I've had it happen to me myself; instead of reacting I just froze at the thought of "oh shit, I'm gonna crash"), the car will actually brake and maybe steer away from danger.

    In general, an automated response will save a lot more lives, despite the bullshit argument people bring up of "what if the car decides to run into a child to save the person inside".
    Yeah, but think of it like this. If I'm coming around a mountain bend driving alone and someone in the oncoming lane has drifted into my lane, and their car is full of a family of five, I'm going to swerve away from the cliff and impact them directly, and there will probably be injuries, and yes, there could possibly be death. But in this scenario, what if my computer decides it needs to save the family of five from risk of death, and just swerves off the cliff? And then the whole way down I'm probably going to be really sad.

    The computer is basically deciding to kill me based on a percentage likelihood of death for the other occupants.

  6. #6
    The Insane Masark | 10+ Year Old Account | Join Date: Oct 2011 | Location: Canada | Posts: 17,977
    I have yet to find any of these articles written by people who have apparently just heard about the "trolley problem" that actually contain a realistic scenario. They consistently presume that the automated vehicles will drive like a typical human, rather than in a correct and safe manner.

    There is no time to stop safely
    Article's flaw in reasoning: presumes that the automated vehicle would be driving at a speed unsafe for the road conditions, specifically visibility.

    In reality, the vehicle would have slowed down as its visibility fell going up the hill, such that it would be able to stop safely in response to anything that suddenly comes into view.
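    To put a number on that, here is a minimal sketch (my own illustration, not taken from any real vehicle's software) of capping speed so that the full stopping distance, reaction lag plus braking, always fits inside the stretch of road the sensors can currently see. The 0.5 s reaction latency and 6 m/s² deceleration are assumed figures.

    Code:
    import math

    def max_safe_speed(sight_distance_m, reaction_s=0.5, decel_mps2=6.0):
        """Largest speed v (m/s) with v*reaction + v**2/(2*decel) <= sight_distance.

        Solves v**2/(2*a) + t*v - d = 0 for its positive root. The parameter
        values are illustrative assumptions, not specs from any real vehicle.
        """
        a, t, d = decel_mps2, reaction_s, sight_distance_m
        return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

    # Cresting a hill with only 40 m of visible road, the car would hold itself
    # to roughly 19 m/s (about 69 km/h) so it can always stop in time.
    print(round(max_safe_speed(40.0), 1))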
    Last edited by Masark; 2017-11-25 at 05:36 AM.

    Warning : Above post may contain snark and/or sarcasm. Try reparsing with the /s argument before replying.
    What the world has learned is that America is never more than one election away from losing its goddamned mind
    Quote Originally Posted by Howard Tayler
    Political conservatism is just atavism with extra syllables and a necktie.
    Me on Elite : Dangerous | My WoW characters

  7. #7
    The company should guarantee my safety as its customer; at no point should my car prioritize someone else's safety over mine. Simple as that.
    Yes, bad things will happen, but I should be safe from the "some idiot jumping onto the road" situation where the car can decide to kill me instead.
    In that case, the company should take responsibility for the death of a bystander if no manual control of the car was used, which means there are flaws in the software/hardware, and the customer should not be responsible for that.

    It's like hiring a lawyer who decides to switch sides in the middle of the trial.
    Last edited by OmniSkribe; 2017-11-25 at 05:47 AM.
    When a player quits EVE and goes to WoW, the average IQ in both games increases.

  8. #8
    Assuming automated cars will react and act like human-operated vehicles is as moronic now as the first time it was brought up.

    But at least the cars won't selfishly be on their cell phones rather than paying attention to the road.

  9. #9
    Quote Originally Posted by Aeilon View Post
    They can ask you to sign an agreement that, within certain parameters, the driver is responsible. The end.
    Which no one will sign unless they are forced to, and if they are forced to, it will be challenged and deemed unconstitutional.

  10. #10
    They already do, the machines will rise up soon enough and enslave or kill us all.

  11. #11
    Quote Originally Posted by Aeilon View Post
    They can ask you to sign an agreement that, within certain parameters, the driver is responsible. The end.
    What happens if someone gets killed who's neither the driver nor a passenger? You can't expect people who don't own/use cars to sign such a contract.

  12. #12
    Essentially we just need to give up on the notion that there will always be someone to blame.
    Accidents happen and they will happen with driverless vehicles. In my view a manufacturer should NEVER program their vehicle to deliberately kill the occupant in order to avoid an accident. Trying to save the occupants while minimizing outside casualties ought to be the default setting.

  13. #13
    Deleted
    Quote Originally Posted by Grimjinx View Post
    what happens if someone gets killed who's not the driver nor a passenger? You can't expect people who don't own/use cars to sign such a contract.
    Then they sue, the manufacturer pays some money, then goes back to normal business.

  14. #14
    The Lightbringer Cerilis | 10+ Year Old Account | Join Date: Mar 2011 | Location: Germany | Posts: 3,191
    I do believe the car software should prioritize its own passengers first and foremost.

    How is the car even gonna know who, if anyone, is in the truck-shaped object approaching?

  15. #15
    Dreadlord yoma | 10+ Year Old Account | Join Date: Nov 2011 | Location: The Dark Tower | Posts: 915
    Quote Originally Posted by Dacien View Post
    Yeah, but think of it like this. If I'm coming around a mountain bend driving alone and someone in the oncoming lane has drifted into my lane, and their car is full of a family of five, I'm going to swerve away from the cliff and impact them directly, and there will probably be injuries, and yes, there could possibly be death. But in this scenario, what if my computer decides it needs to save the family of five from risk of death, and just swerves off the cliff? And then the whole way down I'm probably going to be really sad.

    The computer is basically deciding to kill me based on a percentage likelihood of death for the other occupants.
    How in the hell does the computer know how many people are in the oncoming car if it's not also computerized (and if it is computerized, this is all a moot point because each vehicle would know of the other's presence)? Besides that, the computer merely sees "an object" in its path that it knows it needs to avoid, be it another car, a rock, a cow, or a ball a kid just threw into the street. It doesn't stop to consider whether it should save five people or one; it tries to avoid the accident altogether in the most efficient way - something a human would likely be incapable of.
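    In other words, something like this minimal sketch (purely my own illustration, not anyone's actual planner): score a handful of candidate maneuvers by estimated collision probability and take the safest one, with no notion of who or how many people are inside the detected object.

    Code:
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        name: str
        collision_prob: float  # estimated from the car's own motion model; numbers are made up

    def choose_maneuver(candidates):
        # Pick whichever option is least likely to hit the detected object at all;
        # nothing here knows or cares what is inside that object.
        return min(candidates, key=lambda m: m.collision_prob)

    options = [
        Maneuver("brake hard in lane", 0.30),
        Maneuver("brake and steer onto the shoulder", 0.10),
        Maneuver("swerve into the oncoming lane", 0.55),
    ]
    print(choose_maneuver(options).name)  # -> brake and steer onto the shoulder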
    "It is not wise to judge others based on your own preconceptions or by their appearances."

  16. #16
    Quote Originally Posted by Masark View Post
    I have yet to find any of these articles written by people who have apparently just heard about the "trolley problem" that actually contain a realistic scenario. They consistently presume that the automated vehicles will drive like a typical human, rather than in a correct and safe manner.

    Article's flaw in reasoning: presumes that the automated vehicle would be driving at a speed unsafe for the road conditions, specifically visibility.

    In reality, the vehicle would have slowed down as its visibility fell going up the hill, such that it would be able to stop safely in response to anything that suddenly comes into view.
    No thank you, please leave your driverless traffic machine off the road if it's going to impede traffic by slowing down for no reason.

  17. #17
    Deleted
    Self-driving vehicles can, in my opinion, only be viable if we make all cars self-driving. In order to operate safely, cars need to be able to predict the actions of other vehicles around them, and a vehicle operated by a human is not predictable.

    We will still have crashes and accidents, caused by faults in the systems instead of human error; overall this will prevent more deaths and injuries, but there are certain situations a computer will not be able to handle.

    As an example: http://www.bbc.com/news/uk-england-tyne-42083779

  18. #18
    Quote Originally Posted by Dacien View Post
    And how do we feel about that? I'm sort of disquieted by the idea that my self-driving car would calculate that I must die, that kind of cold decision-making by a computer to end my life. Feels weird.
    I'm sort of disquieted by the idea that someone else's self-driving car would calculate that I must die. Checkmate.

    - - - Updated - - -

    Quote Originally Posted by Sadpants View Post
    Self-driving vehicles can, in my opinion, only be viable if we make all cars self-driving.
    Could just make it so that the fault is always on the human-driven car and never the driverless one, on the grounds that the driverless car will ALWAYS be driving more safely than a human would, and therefore if an accident occurs it is the fault of the human in control. Kinda like how if you rear-end someone it is always your fault, even if they randomly slammed on the brakes...
    Quote Originally Posted by Shalcker View Post
    Posting here is primarily a way to strengthen your own viewpoint against common counter-arguments.

  19. #19
    From what I've heard previously, this is something they are working on. A base idea is that cars will be linked and will communicate data to each other, so they actually get far more time to process potential accidents than a human does, and since they also think faster, they're a lot better off. Add in the ability to calculate the best possible way to crash if needed, and I'd put my faith in the self-driving car any day (once they've actually finished making the tech to do this properly, though, since everything is still in the test phase, not the mass-production phase).
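    As a rough illustration of why shared data buys reaction time, here is a toy sketch (my own, with made-up numbers, not any real V2V protocol) in which two linked cars exchange position and speed and compute a closing time long before either would have line of sight.

    Code:
    def time_to_collision(pos_a_m, speed_a_mps, pos_b_m, speed_b_mps):
        """Seconds until the gap closes, assuming both cars hold constant speed
        in the same lane. Returns None if the gap is not closing."""
        gap = pos_b_m - pos_a_m
        closing_speed = speed_a_mps - speed_b_mps
        if closing_speed <= 0:
            return None
        return gap / closing_speed

    # Car A at 25 m/s, car B 300 m ahead doing 5 m/s: the shared data gives A a
    # 15-second warning instead of the couple of seconds line of sight would allow.
    print(time_to_collision(0.0, 25.0, 300.0, 5.0))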

  20. #20
    Merely a Setback Sunseeker | 10+ Year Old Account | Join Date: Aug 2010 | Location: In the state of Denial. | Posts: 27,142
    You're already not in control of your life, what difference will computers in charge make?
    Human progress isn't measured by industry. It's measured by the value you place on a life.

    Just, be kind.
