How about we don't include morality in technological design at all? We can easily design self-driving cars to stop in order to avoid a collision, and assuming everyone is riding in a self-driving car, the car behind them would stop as well, along with the car behind it, and so on down the line. No need for morality in that scenario, because the cars are able to avoid the accident entirely.
I think some people forget why we use the term "accident" to describe a car crash: typically, crashes aren't intentional. The whole idea behind self-driving cars is that a computer doesn't doze off, reacts faster, and can stay aware in 360 degrees around the car in a way humans can't. That way we can achieve a reality in which no accidents occur, because every car is driven with the awareness and reaction time needed to completely avoid any collision.
On the other hand, maybe I would love for the car to ignore the presence of bicyclists in front of me and just run 'em over. Roads are for motor vehicles only, so stay off the fucking road with that slow-ass shit >.< . It especially sucks since my air conditioning broke and it's still summer...