A human driver vs. a computer driver is way different. You can't expect a human driver not to act in self-interest, but you can expect a computer driver to. Current laws also don't treat computers as legal persons.
True, but humans often don't get enough time to make a choice in these situations because they happen too fast.
Now, this original question is really an interesting one. At first glance, I can't even say anything about it yet.
Why would the car need to swerve in the first place instead of braking and coming to a stop before hitting anyone?
It's not different at all.
If somebody jumps in front of your car while you are moving, your first priority is to apply the brakes. Your next priority is to check for a safe path to avoid the collision. Your next priority is to avoid the collision, or to maintain course if an alternative route can't safely be taken.
The procedure is this way because it's the safest way.
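That priority order can be sketched as a simple decision routine. This is just an illustration of the procedure described above; the boolean inputs are hypothetical stand-ins for whatever the car's sensors and planner would actually report.

```python
def respond_to_obstacle(can_brake_in_time, safe_alternate_path):
    """Sketch of the trained-driver procedure: brake first, swerve only
    if a safe path exists, otherwise maintain course.
    Inputs are hypothetical booleans standing in for sensor output."""
    actions = ["apply_brakes"]  # first priority: always brake
    if not can_brake_in_time:
        if safe_alternate_path:
            # second priority: avoid the collision via a safe path
            actions.append("steer_to_alternate_path")
        else:
            # no safe alternative: hold the line rather than create new hazards
            actions.append("maintain_course")
    return actions
```

Note that "maintain course" is the fallback precisely because an unvetted swerve (into a guard rail, say) can create a worse hazard than the one being avoided.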
A computer suddenly crashing a car into a guard rail is stupid. It's beyond stupid. Debris from the crash could become a hazard for subsequent motorists.
There's literally nothing smart about that scenario.
It's so stupid that just calling it stupid doesn't suffice, and yet a word for how stupid it is doesn't exist.
It's narrow-minded and short-sighted, and displays a complete lack of understanding of the science behind the scenario.
A self driving car should follow the exact same procedure a human driver is trained to follow.
Normal cars don't make decisions; the humans driving them do. These cars wouldn't have drivers, so they would be forced to make decisions. It's not deliberately sacrificing someone, it's applying a hierarchy: Save driver > Save multiple people > Save 1 person > Save animals > Save property (or some variation of that).
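A hierarchy like that is easy to encode as a ranked list. This is a minimal sketch of the ordering described above, not anyone's actual implementation; the category names are made up for illustration.

```python
# Hypothetical ranking, lowest priority first, mirroring
# Save driver > Save multiple people > Save 1 person > Save animals > Save property.
PRIORITY = ["property", "animals", "one_person", "multiple_people", "driver"]

def choose_outcome(options):
    """Given the category each candidate maneuver would protect,
    pick the maneuver protecting the highest-ranked category."""
    return max(options, key=lambda category: PRIORITY.index(category))
```

So given a choice between protecting one pedestrian and protecting the driver, `choose_outcome(["one_person", "driver"])` returns `"driver"` under this ordering.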
It will be especially hard for people to swallow being put in danger by their own car's actions when they aren't at fault. It may sound messed up, but I'd rather accidentally run over an idiot who darted out between cars than have their fuck-up cost me my life or get me injured.
The cars aren't making moral choices, people are. Just like today. Also, the "ethical" decision is absurd. It's pretty much never a they die or I die scenario. The vast majority of the time it's I swerve or someone dies, not I drive into a brick wall. Additionally, with airbags and seat belts, even driving into a brick wall at a speed where people might be stepping into the road is not remotely close to a death sentence. In the end, people make this call already, they just do it in a panic without the reflexes capable of even carrying out their decision. Having it decided upon ahead of time, and with a device capable of making the maneuvers necessary to apply the decision is FAR better.
Lastly, there's no reason the car couldn't be set to the desires of the user. Right now, some people would decide to swerve around a pedestrian and risk going into a ditch. However, some people wouldn't. If we are allowed to make that decision now, there's no reason the car couldn't be designed to let us make that decision. There could also be a manual control, such as grabbing the wheel or hitting a pedal, that would allow us to override any "decision" the car would make. I find this whole debate little more than people panicking about technology - that and people looking for grants. In the end, the decision could easily be the exact same as it is today - left to the individual driver.
Ah, here we go again with the false premise of the instantaneous supercomputer car that has absolute knowledge of its surroundings and has information about every passerby and their last 10 generations. This time with the added spice of a class of Hogwarts students in the middle of apparition training.
Reminds me of that Will Smith movie I, Robot. There's a part where Will Smith and a little girl are stuck in cars sinking in a river, and the robot saves Will Smith because it calculates he has a slightly higher chance of survival, whereas probably almost all humans would try to save the little girl.
A car that would make the decision to preserve the most life and sacrifice its occupants probably wouldn't sell very well. And a car that makes a decision to run over 5 kids in the street to save the drivers life also wouldn't sell very well. What if there are 5 people in the car, 5 people in the road, and 3 people in the path where the car would swerve off to crash to avoid the people in the road? Could the car calculate things like that?
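In principle, yes, a calculation like that is trivial once you assign numbers; the hard part is that the numbers are guesses. Here is a toy expected-casualty comparison for the 5-in-car / 5-in-road / 3-in-swerve-path question above. Every probability in it is invented for illustration, which is exactly the problem with the whole premise.

```python
# Toy model: compare expected casualties per maneuver. The (group_size,
# fatality_probability) pairs below are made-up numbers, not real data.
OUTCOMES = {
    "brake_and_continue": [(5, 0.9)],            # 5 people in the road, likely hit
    "swerve":             [(3, 0.9), (5, 0.2)],  # 3 in the swerve path, 5 occupants at lesser risk
}

def expected_casualties(maneuver):
    """Sum group size times fatality probability for a maneuver."""
    return sum(n * p for n, p in OUTCOMES[maneuver])

best = min(OUTCOMES, key=expected_casualties)
```

The arithmetic is easy (4.5 vs. 3.7 expected deaths here); whether the car could ever know those probabilities in the fraction of a second available is the real question.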
I imagine the car won't be advanced enough to make a bunch of decisions like these until all cars are self driving and the question becomes moot.
But until then, the prime directive of a self-driving car will be to avoid wrecking itself. If somebody jumps out in front of it, it'll prolly just slam the brakes like any other idiot.
Where is the option to "Maximum Overdrive" as many people as possible?
If that's the case then scenario A has you kill an innocent bystander because some idiots thought it was fun to have a BBQ on the highway.
That IS part of the question.
The choices always are:
Kill many idiots and save an innocent bystander, or save many idiots and kill someone who isn't at fault.
Last edited by mmocdca0ffe102; 2016-06-25 at 05:53 PM.
If you swerve or run off the road to avoid something without ascertaining a safer route, you are not making a moral choice.
Period. You are being stupid.