  1. #1
    Deleted

    Should self-driving cars make moral choices?



    Most people questioned in a study say self-driving cars should aim to save as many lives as possible in a crash, even if that means sacrificing the passenger. But they also said they wouldn’t choose to use a car set up in such a manner.

    The study was conducted by three academics in the Department of Psychology and Social Behavior at the University of California, Irvine. They were exploring the implications of autonomous vehicles for the social dilemma of self-protection against the greater good.

    Participants were asked to consider a hypothetical situation where a car was on course to collide with a group of pedestrians. There were three sets of choices:

    A) Killing several pedestrians or deliberately swerving in a way that would kill one passer-by.

    B) Killing one pedestrian or acting in a way that would kill the car’s passenger (the person who would be considered the driver in a normal car).

    C) Killing several pedestrians or killing the passenger.

    What would you pick? Would you use a self-driving car knowing it would kill you to save others? It seems an easy decision, unless you are the one in danger. (A rough sketch of these choices as code follows below.)

    original source
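
    To make the dilemma concrete, here is a minimal, purely illustrative Python sketch of the three choice sets. Nothing below comes from the study itself: the class, the scenario data (including the assumption that "several pedestrians" means five), and the two policy functions are invented for illustration, just to show where a "save as many lives as possible" rule and a "protect the passenger" rule part ways.

    from dataclasses import dataclass

    @dataclass
    class Option:
        label: str
        pedestrian_deaths: int
        passenger_deaths: int

        @property
        def total_deaths(self) -> int:
            return self.pedestrian_deaths + self.passenger_deaths

    # The three dilemmas described above; the count for "several" is assumed, not given.
    SCENARIOS = {
        "A": [Option("stay on course", pedestrian_deaths=5, passenger_deaths=0),
              Option("swerve into one passer-by", pedestrian_deaths=1, passenger_deaths=0)],
        "B": [Option("stay on course", pedestrian_deaths=1, passenger_deaths=0),
              Option("sacrifice the passenger", pedestrian_deaths=0, passenger_deaths=1)],
        "C": [Option("stay on course", pedestrian_deaths=5, passenger_deaths=0),
              Option("sacrifice the passenger", pedestrian_deaths=0, passenger_deaths=1)],
    }

    def utilitarian(options):
        """Save as many lives as possible, even at the passenger's expense."""
        return min(options, key=lambda o: o.total_deaths)

    def self_protective(options):
        """Never pick an option that kills the passenger; otherwise minimise deaths."""
        safe = [o for o in options if o.passenger_deaths == 0] or options
        return min(safe, key=lambda o: o.total_deaths)

    for name, options in SCENARIOS.items():
        print(name, "| utilitarian:", utilitarian(options).label,
              "| self-protective:", self_protective(options).label)

    The two policies agree on A (swerve, one death instead of several), B is a tie for the pure life-count rule, and they split only on C, where the utilitarian rule sacrifices the passenger and the self-protective rule does not; that split on C is the dilemma the participants were reacting to.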

  2. #2
    Scarab Lord Zoranon's Avatar
    10+ Year Old Account
    Join Date
    Dec 2010
    Location
    Czech Republic, Euro-Atlantic civilisation
    Posts
    4,071
    An interesting topic for ethicists to be sure, but I think the market will settle this quite decisively.

    Simply put, very few people would buy a car that would choose to kill them in order to save some strangers. As such, I expect that the prime directive of self-driving cars will be preserving their occupants.
    Quote Originally Posted by b2121945 View Post
    Don't see what's wrong with fighting alongside Nazi Germany
    Quote Originally Posted by JfmC View Post
    someone who disagrees with me is simply wrong.

  3. #3
    Deleted
    Quote Originally Posted by Zoranon View Post
    An interesting topic for ethicists to be sure, but I think the market will settle this quite decisively.

    Simply put, very few people would buy a car that would choose to kill them in order to save some strangers. As such, I expect that the prime directive of self-driving cars will be preserving their occupants.
    Initially that might be true, but lawmakers might choose to enforce laws of this nature once self-driving cars are the norm. On the other hand, by that time accidents will be much rarer anyway, so there will probably be less need.

  4. #4
    I guess I'll be taking the public transit system in the future then, maximizing my survival by riding with as many people as possible.
    "In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance

  5. #5
    Brewmaster Khadgar's Avatar
    15+ Year Old Account
    Join Date
    Aug 2008
    Location
    Dalaran
    Posts
    1,483
    Isn't the "sticky hood" that Google patented supposed to solve this issue?



    http://gizmodo.com/google-patented-a...ian-1777376162

  6. #6
    Scarab Lord Zoranon's Avatar
    10+ Year Old Account
    Join Date
    Dec 2010
    Location
    Czech Republic, Euro-Atlantic civilisation
    Posts
    4,071
    Quote Originally Posted by Him of Many Faces View Post
    Initially that might be true, but lawmakers might choose to enforce laws of this nature once self-driving cars are the norm. On the other hand, by that time accidents will be much rarer anyway, so there will probably be less need.
    I doubt that; after all, the lawmakers would be passengers in those cars as well.
    Quote Originally Posted by b2121945 View Post
    Don't see what's wrong with fighting alongside Nazi Germany
    Quote Originally Posted by JfmC View Post
    someone who disagrees with me is simply wrong.

  7. #7
    Deleted
    This is a fun discussion to have at parties.

  8. #8
    The car should always ask "what is best for the owner?"

    "This will be a fight against overwhelming odds from which survival cannot be expected. We will do what damage we can."

    -- Capt. Copeland

  9. #9
    Brewmaster Khadgar's Avatar
    15+ Year Old Account
    Join Date
    Aug 2008
    Location
    Dalaran
    Posts
    1,483
    I'd be content just having a Tesla S so I can take naps during traffic jams.


  10. #10
    Or self-driving should be grandfathered into every car, and we should aim to have automated highways, wherein every car connects to a "hive mind" that controls every single car in a fail-safe manner.
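
    As a toy illustration only (nothing like a real traffic system; every name here is made up), the fail-safe part of that "hive mind" idea could be sketched as a coordinator that tells any car it loses contact with to stop itself:

    import time

    class HighwayCoordinator:
        """Hypothetical central controller: cars report in; silence means fail safe."""

        def __init__(self, timeout_s: float = 1.0):
            self.timeout_s = timeout_s
            self.last_heartbeat = {}           # car_id -> last time it reported in

        def heartbeat(self, car_id: str) -> None:
            """Each connected car calls this periodically."""
            self.last_heartbeat[car_id] = time.monotonic()

        def command_for(self, car_id: str) -> str:
            """Fail-safe rule: no recent heartbeat means the car must stop itself."""
            last = self.last_heartbeat.get(car_id)
            if last is None or time.monotonic() - last > self.timeout_s:
                return "pull over and stop"
            return "follow coordinated speed and spacing"

    coordinator = HighwayCoordinator()
    coordinator.heartbeat("car-42")
    print(coordinator.command_for("car-42"))   # recent heartbeat -> coordinated driving
    print(coordinator.command_for("car-99"))   # never registered -> fail safe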

  11. #11
    Deleted
    Quote Originally Posted by Hubcap View Post
    The car should always ask "what is best for the owner?"
    Do you mean the passenger/driver or the actual owner?

    What if the owner is Uber / a taxi company / a company car / public transit / etc.?

  12. #12
    Fluffy Kitten xChurch's Avatar
    10+ Year Old Account
    Join Date
    Jun 2012
    Location
    The darkest corner with the best view.
    Posts
    4,828
    Well, when all cars are automated this won't be an issue, but until then people will probably not like the idea of your car choosing to kill you.

  13. #13
    Deleted
    Quote Originally Posted by Rick Magnus View Post
    Well, when all cars are automated this probably won't be an issue, but until then people will probably not like the idea of your car choosing to kill you.
    You will still have scenarios where this would apply, like a child suddenly running into the street: a human's reaction time would not be enough to swerve out of the way, but a computer's would be.
    Last edited by mmoc982b0e8df8; 2016-06-25 at 05:22 PM.

  14. #14
    People jaywalking into the path of a self-driving car should be allowed to die. It is their negligence that causes the accident. It's not safe for a vehicle travelling at high speed to attempt a sudden change of course. The car should apply its brakes and attempt to make a lane change (if safe to do so) but nothing more.
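
    As a rough, purely hypothetical sketch (not any manufacturer's actual logic), that "brake, change lane only if safe, never swerve off course" rule could look something like this; the function name, inputs, and numbers are invented for illustration.

    def respond_to_obstacle(braking_distance_m: float,
                            distance_to_obstacle_m: float,
                            adjacent_lane_clear: bool) -> str:
        """Occupant-conservative policy: always brake; change lanes only when it
        is safe to do so; never leave the roadway or swerve into other hazards."""
        if distance_to_obstacle_m >= braking_distance_m:
            return "brake"                     # can stop in time, nothing else needed
        if adjacent_lane_clear:
            return "brake and change lane"     # evasive move only when provably safe
        return "brake"                         # otherwise just brake, no risky swerve

    # Example: cannot stop in time and the next lane is occupied -> brake only.
    print(respond_to_obstacle(braking_distance_m=60.0,
                              distance_to_obstacle_m=35.0,
                              adjacent_lane_clear=False))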

  15. #15
    Deleted
    Quote Originally Posted by Nixx View Post
    I think it should function as a normal car would in those situations, rather than deliberately sacrificing anyone.
    Since a computer has to choose, someone deliberately dies in each scenario regardless. The question is more who, and/or how many.

    This is like the classic "a train is heading toward one person or five, do you switch tracks?" problem, but it's becoming real now.

  16. #16
    Fluffy Kitten xChurch's Avatar
    10+ Year Old Account
    Join Date
    Jun 2012
    Location
    The darkest corner with the best view.
    Posts
    4,828
    Quote Originally Posted by Him of Many Faces View Post
    You will still have scenarios where this would apply, like a child suddenly running into the street: a human's reaction time would not be enough to swerve out of the way, but a computer's might.
    Scenarios where that would result in a fatality would probably be few and far between though, unless you just happened to swerve into a light pole or something, but even then it'd be unlikely to end in a death per se, at least for the driver.

  17. #17
    Quote Originally Posted by Him of Many Faces View Post
    Since a computer has to choose, someone deliberately dies in each scenario regardless. The question is more who, and/or how many.

    This is like the classic "a train is heading toward one person or five, do you switch tracks?" problem, but it's becoming real now.
    These scenarios are all bad, and the "ethicists" who peddle them should feel bad.

    Especially since you can't just switch tracks on the fly; this isn't fucking Hollywood. The rails at a junction weigh tons, so operating the track switch takes a considerable amount of time.

    I feel like you should have to have real-world experience to propose ethical scenarios, because these all sound like they were written by a five-year-old.

  18. #18
    If anything, I would say self-driving should only be allowed in designated areas (i.e. highways and interstates here in America). These routes are already generally restricted in ways that keep this issue from arising in the first place: right now, interstates already bar bikes, pedestrians, and farm equipment.

  19. #19
    Deleted
    Quote Originally Posted by Hubcap View Post
    The car should always ask "what is best for the owner?"
    This. Also, I'd change it to "what's best for the people in the car". It would be the only way to make it commercially viable, too.

    Quote Originally Posted by Rick Magnus View Post
    Well, when all cars are automated this won't be an issue, but until then people will probably not like the idea of your car choosing to kill you.
    Even then, I'd probably not drive one, or I'd try to swap in a pirated bootleg AI that would always try to preserve the people in the car, or my property, assuming the person in the way is at fault and shouldn't be on the road.

  20. #20
    Deleted
    Quote Originally Posted by Rick Magnus View Post
    Scenarios where that would result in a fatality would probably be few and far between though, unless you just happened to swerve into a light pole or something, but even then it'd be unlikely to end in a death per se, at least for the driver.
    I'm sure it will be rare, but I'm also sure it will happen regularly enough that it will eventually go to court. Either carmakers solve this, lawmakers solve this, or courts solve this, but I'm sure it's going to come up at some point.
