Only if your standards are unreasonable. We don't need self-driving cars to never make an error; we just need their average error rate to be lower than that of human drivers, by a solid, quantifiable margin. As long as they're better than human drivers, the safety bar has been met. As for who'd be at fault in such an accident, you're assuming there WAS fault. Plenty of accidents are settled as equally at-fault, or with no fault assigned at all. And even then, it's really a question of whose insurance company covers which damages, nothing more.