Milli Vanilli, Bigger than Elvis
I mean, Texas drivers are also just a thing unto themselves.
I'll never forget the road construction that led to me seeing a VW sitting between two concrete blocks that had not yet been connected, because the driver thought they could jump the gap and beat the rest of traffic.
Spoilers: I'm guessing they had to crawl out the side of their car, because it was basically hanging by the bumpers.
Agreed; I thought you were disagreeing with the other person, which looking back is not entirely the case. Maybe the solution one day will be to have self driving cars above walking areas for pedestrians, but sadly we're a ways off from that =]
- - - Updated - - -
You're making the assumption that this woman was crossing a ways down the road, and not literally right in front of the car. Why is that?
This makes NO sense. "No matter how stupid she might have been, this would still be the A.I.'s fault, period"? Let's say you were driving down the road, and someone steps out in front of your car because she's looking down at her phone. You try to swerve, but at 45 MPH there's no physical way your car can slow fast enough, or turn sharply enough, not to hit her. That is NOT your fault, it is not the CAR's fault, it is HER fault. Now take that exact same scenario, except the car is automated. There are plenty of situations where it is physically impossible for a car to avoid a collision, either with another vehicle or with a pedestrian. I don't mean impossible for a self-driving car, I mean impossible for ANY car, no matter who or what is driving. If you can't see this blatantly obvious fact, I guess we're done, because I can't explain it any more plainly.
The NJ Department of Transportation once did a study on trying to improve traffic conditions, and the conclusion wasn't pretty. I think it was titled something like "If you build it, they will come." Essentially, if you build more roads, more vehicles will fill them, worsening the problem. They concluded there wasn't any practical way of dealing with it.
There are far too few details to reach an accurate conclusion as to whether the machine was actually in error, or the pedestrian gave it too little warning to respond appropriately.
Hah! I remember my first week in Austin, someone pulled out in front of me and proceeded to cross over four lanes of traffic to make a left hand turn from the right-hand turn lane. And only a few months after that when a cop almost ran me off the road because he was driving literally along the center line on one of those twisty roads that leads onto MoPac. Ah, Texas, how I miss thee.
There have been plenty of other accidents. But because money rules and loads have been invested, every investigation concludes that it was the people who were at fault.
https://www.reuters.com/article/auto...-idUSL2N1MF1RO
https://www.usatoday.com/story/money...ents/74946614/
https://www.nytimes.com/2017/09/12/b...ving-cars.html
https://www.dmv.org/articles/gm-sees...icle-accidents
https://www.theguardian.com/technolo...nomous-vehicle
That woman was dumb, just standing in the street; she had a death wish.
As usual, people act recklessly and we blame the machines.
There needs to be liability.
Without that, this machine is dead.
It's a liability which is not exclusive to technology, though: there were roughly 4,920 cases of jaywalkers being killed in 2016. Accountability rests on the person who willingly puts themselves in a position to become a victim, provided the vehicle is functioning within its designed parameters, which I am sure is being thoroughly checked and rechecked to make sure it isn't loaded with Death Race firmware.
If this car blew through an active crosswalk at 90 MPH, then I'd say Uber would be accountable without issue, since there would obviously be something wrong. But just because the person is the victim doesn't absolve her of performing an illegal action.
A soldier will fight long and hard for a bit of colored ribbon.
Yes, because the A.I. isn't human, and the reason it's driverless is its supposed ability to do what even humans can't, which should make it safer; well, it looks like that isn't the case. The bar is set much higher for A.I., and it should be, it has to be.
If I had the ability to assess probability like this A.I. and to anticipate like the human I am, then yes, it would be my fault, just as it would be the car's fault if it were being piloted by an A.I.
It doesn't matter; the driverless car has to do far, far, far better. It has to do what seems like the impossible; that's the burden. Humans over machines: human life is paramount over all things.
Milli Vanilli, Bigger than Elvis
Really, that's not a strong argument. For what, an academic setting?
Is that where you think we are, or what this is?
Because it isn't. I am actually arguing for driverless A.I.s; I personally think they would be much better than people. Hell, I'll even go as far as to agree that, realistically, they only have to do better than people.
The problem is that isn't how this works, how any of this works. Driverless cars have to do A LOT better than people; they have to come with an assurance that they can never fail, or almost never. That is what's going to be expected, because driverless cars aren't all those other things you mentioned.
This is going to be at the forefront of people's lives for years to come if parts of this technology become the norm. How this technology is introduced will have a lot to do with how well it's implemented and how people FEEL about it.
This woman was killed, so the next thing to do is treat this with the utmost sensitivity: find out exactly what happened, and spend the time and money to educate people on how it can be prevented.
If the burden of what happened shifts in any way to the dead woman, it will be a huge mistake.
Milli Vanilli, Bigger than Elvis
The liability works like any other case.
If the AI was at fault, then the liability is Uber's.
If the AI functioned correctly, but circumstances were too sudden for it to react and avoid her, then, like a lot of other accidents, it's just a tragic event and Uber is not liable.
If you mean "someone needs to go to jail," that doesn't follow. If someone's in an automated car manufacturing plant, and one of the mechanical assembly robots kills them, it's going to be a question of whether the plant owner had proper safety protocols in place. If they did, and the victim got around them or the arm malfunctioned, it's not the plant's liability. Same difference. This isn't even new ground; the precedent has been laid out for decades.