  1. #21
    Quote Originally Posted by Him of Many Faces View Post
    i'm sure it will be rare, but i'm also sure it will happen regularly enough that it will eventually go to court. either carmakers solve this, lawmakers solve this, or courts solve this, but i'm sure it's gonna come up at some point.
    Laws already exist. A driver is absolved of responsibility when a pedestrian negligently places themselves in the path of a moving vehicle.

    This is just a shit sorry excuse for people to pretend they are smart and have some important reason to make noise.

  2. #22
    Deleted
    Quote Originally Posted by Gheld View Post
    Laws already exist. A driver is absolved of responsibility when a pedestrian negligently places themselves in the path of a moving vehicle.

    This is just a shit sorry excuse for people to pretend they are smart and have some important reason to make noise.
    human driver vs computer driver is way different. you can't expect a human driver to not act in self interest, you can expect a computer driver to. current laws also don't treat computers as legal persons.

    - - - Updated - - -

    Quote Originally Posted by Nixx View Post
    A human has to choose as well, but we wouldn't commonly frame it as deliberately killing anyone. A car that is using the number of people likely to be hurt or killed as the primary determining factor is making a different kind of choice than a human driver generally is.
    true, but humans often don't get enough time to make a choice in these situations because they happen too fast.

  3. #23
    Now, this original question really is an interesting one. At first glance, I can't even say anything about it yet.

  4. #24
    Hoof Hearted!!!
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Earth
    Posts
    2,805
    Why would the car need to swerve in the first place instead of braking and coming to a stop before hitting anyone?
    when all else fails, read the STICKIES.

  5. #25
    Quote Originally Posted by Him of Many Faces View Post
    human driver vs computer driver is way different. you can't expect a human driver to not act in self interest, you can expect a computer driver to. current laws also don't treat computers as legal persons.
    It's not different at all.

    If somebody jumps in front of your car while you are moving, your first priority is to apply the brakes. Your next priority is to check for a safe path to avoid the collision. Your next priority is to avoid the collision, or maintain course if an alternative route can't safely be taken.

    The procedure is this way because it's the safest way.

    A computer suddenly crashing a car into a guard rail is stupid. It's beyond stupid. Debris from the crash could become a hazard for subsequent motorists.

    There's literally nothing smart about that scenario.

    It's so god damn stupid that just calling it stupid doesn't suffice, and yet a word for how stupid it is doesn't yet exist.

    It's narrow-minded and short-sighted, and displays a complete lack of understanding of the science behind the scenario.

    A self driving car should follow the exact same procedure a human driver is trained to follow.
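
    Something like this, as a toy sketch in Python (the function and flag names are made up for illustration, not from any real driving stack):

    Code:
        def respond_to_sudden_obstacle(safe_alternate_path):
            """Brake first; only deviate onto a path already verified as safe."""
            plan = ["apply_brakes"]                # priority 1: always brake
            if safe_alternate_path:
                plan.append("steer_to_safe_path")  # priority 2: avoid, if safe to do so
            else:
                plan.append("maintain_course")     # priority 3: otherwise hold course
            return plan

        print(respond_to_sudden_obstacle(True))   # ['apply_brakes', 'steer_to_safe_path']
        print(respond_to_sudden_obstacle(False))  # ['apply_brakes', 'maintain_course']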

  6. #26
    Quote Originally Posted by Flatspriest View Post
    Why would the car need to swerve in the first place instead of braking and coming to a stop before hitting anyone?
    Because some ethics student wanted to sound smart.

  7. #27
    Quote Originally Posted by Nixx View Post
    I think it should function as a normal car would in those situations, rather than deliberately sacrificing anyone.
    Normal cars don't make decisions, humans driving them do. These cars wouldn't have drivers, so they would be forced to make decisions. It's not deliberately sacrificing someone, it's using a hierarchy of Save driver > Save multiple people > Save 1 person > Save animals > Save property (Or some variation of that)
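
    As a toy Python sketch, that hierarchy is just a ranked lookup (the ranks and maneuver names are invented for illustration):

    Code:
        # rank of each category the car is trying to save; higher = protect first
        PRIORITY = {"driver": 4, "multiple_people": 3, "one_person": 2,
                    "animals": 1, "property": 0}

        def choose_maneuver(options):
            """Pick the maneuver whose cost falls on the lowest-ranked category."""
            # each option: (maneuver, category_that_gets_hit)
            return min(options, key=lambda opt: PRIORITY[opt[1]])

        options = [("swerve_left", "one_person"),
                   ("brake_straight", "multiple_people"),
                   ("swerve_right", "property")]
        print(choose_maneuver(options))  # ('swerve_right', 'property')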

  8. #28
    Fluffy Kitten xChurch's Avatar
    10+ Year Old Account
    Join Date
    Jun 2012
    Location
    The darkest corner with the best view.
    Posts
    4,828
    Quote Originally Posted by Him of Many Faces View Post
    i'm sure it will be rare, but i'm also sure it will happen regularly enough that it will eventually go to court. either carmakers solve this, lawmakers solve this, or courts solve this, but i'm sure it's gonna come up at some point.
    It will be especially hard for people to swallow being put in danger by their own car's actions when they aren't even at fault. It may sound messed up, but I'd rather accidentally run over an idiot that darted out between cars than have their fuck-up cost me my life or get me injured.

  9. #29
    The cars aren't making moral choices, people are. Just like today. Also, the "ethical" decision is absurd. It's pretty much never a "they die or I die" scenario. The vast majority of the time it's "I swerve or someone dies", not "I drive into a brick wall". Additionally, with airbags and seat belts, even driving into a brick wall at a speed where people might be stepping into the road is not remotely close to a death sentence. In the end, people make this call already; they just do it in a panic, without the reflexes to even carry out their decision. Having it decided upon ahead of time, with a device capable of making the maneuvers necessary to apply the decision, is FAR better.

    Lastly, there's no reason the car couldn't be set to the desires of the user. Right now, some people would decide to swerve around a pedestrian and risk going into a ditch. However, some people wouldn't. If we are allowed to make that decision now, there's no reason the car couldn't be designed to let us make that decision. There could also be a manual control, such as grabbing the wheel or hitting a pedal, that would allow us to override any "decision" the car would make. I find this whole debate little more than people panicking about technology - that and people looking for grants. In the end, the decision could easily be the exact same as it is today - left to the individual driver.
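
    In code, that's nothing more than a stored preference plus an override hook; a hypothetical Python sketch (none of these names come from any real vehicle API):

    Code:
        class DriverPolicy:
            def __init__(self, swerve_for_pedestrians=True):
                # set once by the owner, like adjusting a seat
                self.swerve_for_pedestrians = swerve_for_pedestrians

        def pick_maneuver(policy, manual_input=None):
            if manual_input is not None:
                return manual_input             # grabbing the wheel always wins
            if policy.swerve_for_pedestrians:
                return "swerve_and_risk_ditch"
            return "brake_and_hold_course"

        print(pick_maneuver(DriverPolicy(swerve_for_pedestrians=False)))  # brake_and_hold_course
        print(pick_maneuver(DriverPolicy(), "hard_brake"))                # hard_brake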

  10. #30
    Quote Originally Posted by Flatspriest View Post
    Why would the car need to swerve in the first place instead of braking and coming to a stop before hitting anyone?
    This scenario is assuming traveling at highway speeds and the intrusion being abrupt. At least that's how I took it, as that's the only really relevant frame.

  11. #31
    Quote Originally Posted by Torgent View Post
    Normal cars don't make decisions, humans driving them do. These cars wouldn't have drivers, so they would be forced to make decisions. It's not deliberately sacrificing someone, it's using a hierarchy of Save driver > Save multiple people > Save 1 person > Save animals > Save property (Or some variation of that)
    That's an incorrect way of looking at it.

    Save nothing.
    Follow procedures. If people die, boohoo. If somebody comes up with a better procedure, change procedures.

  12. #32
    Old God Mirishka's Avatar
    10+ Year Old Account
    Join Date
    Nov 2010
    Location
    Get off my lawn!
    Posts
    10,784
    Quote Originally Posted by Him of Many Faces View Post


    Most people questioned in a study say self-driving cars should aim to save as many lives as possible in a crash, even if that means sacrificing the passenger. But they also said they wouldn’t choose to use a car set up in such a manner.

    The study was conducted by three academics in the Department of Psychology and Social Behavior at the University of California, Irvine. They were exploring the implications of autonomous vehicles for the social dilemma of self-protection against the greater good.

    Participants were asked to consider a hypothetical situation where a car was on course to collide with a group of pedestrians. There were three sets of choices:

    A) Killing several pedestrians or deliberately swerving in a way that would kill one passer-by.

    B) Killing one pedestrian or acting in a way that would kill the car’s passenger (the person who would be considered the driver in a normal car).

    C) Killing several pedestrians or killing the passenger.

    What would you pick? Would you use a self driving car knowing it would kill you to save others? Seems an easy decision unless you are the one in danger.

    original source
    I think that I'll continue using a car I'm 100% in control of, thanks.
    Appreciate your time with friends and family while they're here. Don't wait until they're gone to tell them what they mean to you.

  13. #33
    Quote Originally Posted by Mirishka View Post
    I think that I'll continue using a car I'm 100% in control of, thanks.
    As long as people who peddle terribad ethical scenarios are programming them, I agree completely.

  14. #34
    Ah, here we go again with the false premise of the instantaneous supercomputer car that has absolute knowledge of its surroundings and information about every passerby and their last 10 generations. This time with the added spice of a class of Hogwarts students in the middle of apparition training.

  15. #35
    Mechagnome
    15+ Year Old Account
    Join Date
    Feb 2009
    Location
    vermont
    Posts
    526
    Reminds me of that Will Smith movie I, Robot. There's a part where Will Smith and a little girl are stuck in a car that is sinking in a river, and the robot saves Will Smith because it calculates he has a slightly higher chance of survival, whereas probably almost all humans would try to save the little girl.

    A car that would make the decision to preserve the most life and sacrifice its occupants probably wouldn't sell very well. And a car that decides to run over 5 kids in the street to save the driver's life also wouldn't sell very well. What if there are 5 people in the car, 5 people in the road, and 3 people in the path the car would swerve into to avoid the people in the road? Could the car calculate things like that?
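
    The counting part, at least, is trivial arithmetic; a toy Python sketch (assuming, generously, that the car could even count people reliably):

    Code:
        # people at risk under each maneuver, per the 5/5/3 scenario above
        outcomes = {
            "brake_and_stay":  5,  # the 5 people in the road
            "swerve_off_road": 3,  # the 3 people in the swerve path
            "crash_to_stop":   5,  # the 5 occupants of the car
        }

        best = min(outcomes, key=outcomes.get)
        print(best, outcomes[best])  # swerve_off_road 3 -- fewest people at risk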

  16. #36
    Quote Originally Posted by Haidaes View Post
    Ah, here we go again with the false premise of the instantaneous supercomputer car that has absolute knowledge of its surroundings and information about every passerby and their last 10 generations. This time with the added spice of a class of Hogwarts students in the middle of apparition training.
    I imagine the car won't be advanced enough to make a bunch of decisions like these until all cars are self driving and the question becomes moot.

    But until then, the prime directive of a self driving car will be to avoid wrecking itself. If somebody jumps out in front of it, it'll prolly just slam the brakes like any other idiot.

  17. #37
    Where is the option to "Maximum Overdrive" as many people as possible?

  18. #38
    Quote Originally Posted by Gheld View Post
    People jaywalking into the path of a self driving car should be allowed to die. It is their negligence that causes the accident. It's not safe for a vehicle travelling at high speeds to attempt a sudden change of course. The car should apply its brakes and attempt to make a lane change (if safe to do so) but nothing more.
    Well done on answering a question that wasn't asked. Maybe get to grips with what the question is before spewing your mind vomit in an attempt to get +1 posts.

  19. #39
    Deleted
    Quote Originally Posted by Torgent View Post
    This scenario is assuming traveling at highway speeds and the intrusion being abrupt. At least that's how I took it, as that's the only really relevant frame.
    If that's the case then scenario A has you kill an innocent bystander because some idiots thought it was fun to have a BBQ on the highway.

    Quote Originally Posted by Deja Thoris View Post
    Well done on answering a question that wasn't asked. Maybe get to grips with what the question is before spewing your mind vomit in an attempt to get +1 posts.
    That IS part of the question.
    The choices always are:
    Kill many idiots and save an innocent bystander, or save many idiots and kill someone who isn't at fault.

  20. #40


    If you swerve or run off the road to avoid something without ascertaining a safer route, you are not making a moral choice.

    Period. You are being stupid.
