Self-Driving Car Liability
Debates on legal responsibility for accidents caused by autonomous vehicles, weighing liability among manufacturers like Tesla or Waymo, vehicle owners, and drivers/passengers.
Sample Comments
How am I liable for an accident caused by my self-driving car, though?
Recent discussion: "Mercedes to accept legal responsibility for a vehicle when Drive Pilot is active" https://news.ycombinator.com/item?id=30763522
How is it the car's fault when the user switches on self-driving mode?
The "driver" should be responsible. They were the one with most control over what brand of car was used, what conditions it was driving in and the choice to let it drive itself. They have the power to choose cars that don't run people over. Hold them liable and the situation will sort itself out as quickly as is efficient.The situation would be very similar to how it is now, except a lot less people would be dying in car accidents.
It's logical that the manufacturer would bear the liability--assuming the car has been properly maintained, software is current, etc. The driver (passenger really) doesn't have any real agency while the vehicle is driving itself. It's also a sort of unusual situation in that we can reasonably expect an AI that is a far safer driver than humans will still kill people. That's not a normal expectation for products sold to consumers and used/maintained as directed.
It seems obvious it's the car--or more precisely the operator of the car, i.e. Waymo. Certainly, as a passenger, I'm not responsible for anything my taxi or rideshare does--absent specific behavior on my part that causes an accident or other violation of course.
You can blame it on a car even if the car is not self-driving.
The answer almost certainly has to be the manufacturer. I'm sure not responsible if my properly maintained and used self-driving car kills someone. That said, it's a novel area that doesn't have a clear analog to other products today.
If you had a self-driving car, would a malfunction in the car causing a crash be your fault?
Not really, though. Human drivers both crash and avoid crashes all the time. In the instances where a fatal crash happens, there is a party at fault and they are held liable under the law. The same logic applies here. If the software caused the crash by being imperfect, then Tesla is liable. It is doubly bad if Tesla caused more people to rely on the software in a perilous manner by overstating the software's safety/feature-completeness. We could argue all day about an ideal future where...