Self-Driving Car Limitations
Discussions focus on incidents and scenarios in which autonomous driving systems such as Tesla Autopilot and Waymo fail to detect or respond properly to obstacles, barriers, pedestrians, and unexpected road situations, debating their safety, braking behavior, and reliability compared with human drivers.
Sample Comments
Shouldn't autopilot at least stop the car because of the barrier in front of it? Even if it can't figure out the lane configuration?
Car might brake suddenly because it thinks two bicycles are right in front of it?
If the car cannot predict "there's something in the next lane, could be in mine the next moment, better slow down just a bit", it has absolutely NO BUSINESS AT ALL driving around on public roads. Every time I have driven in a city, I have needed to make at least one such minor evasive maneuver because someone suddenly got into my lane, be it a car, a bike, or a pedestrian. This is not even driving 101; if the car is unable to handle that, it is quite literally unf…
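The caution this commenter wants can be phrased as a simple rule: if a tracked object is near the ego lane boundary and drifting toward it, shed a little speed before it crosses over. The Python sketch below is purely illustrative; `TrackedObject`, the lane geometry, and every threshold are assumptions, not any vendor's actual planner logic.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    lateral_offset_m: float   # distance from ego lane centerline, meters
    closing_speed_mps: float  # positive if the object drifts toward the ego lane

# Assumed geometry: standard 3.5 m lane plus a caution buffer beyond its edge.
LANE_HALF_WIDTH_M = 1.75
CAUTION_MARGIN_M = 1.0

def caution_speed(target_mps: float, objects: list[TrackedObject]) -> float:
    """Return a (possibly reduced) target speed given nearby tracked objects."""
    for obj in objects:
        near_lane = abs(obj.lateral_offset_m) < LANE_HALF_WIDTH_M + CAUTION_MARGIN_M
        drifting_in = obj.closing_speed_mps > 0.2
        if near_lane and drifting_in:
            # Shed ~20% of speed: a minor evasive adjustment, not a hard stop.
            return target_mps * 0.8
    return target_mps

# Example: a cyclist 2.1 m off-center, drifting toward the lane at 0.5 m/s.
print(caution_speed(13.9, [TrackedObject(2.1, 0.5)]))  # ≈ 11.1 m/s
```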
You assume it's for safety, but it's more likely a case that the car can't handle at all, so it simply stops. For instance, if there's some kind of minor obstruction, a human may decide to drive over it, move into the oncoming lane, get out and pick it up, or follow an officer's instructions.
There is a road (Norway, Drammen-Svelvik) that I travel once or twice a week in my 2015 Tesla S 70D that often has a car parked on a particular corner. The car is directly ahead of my car and side-on; my car reliably interprets this stationary car as being on the road and on a collision course, and applies the brakes, even though the road actually bends to the left and the stationary car is not in the way. If I didn't know the road, I might well have reacted in the same way.
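The false braking described here is consistent with a collision check that ignores path curvature: a parked car on the outside of a bend is "directly ahead" along the vehicle's heading but nowhere near its planned path. Below is a toy geometric sketch of the difference, with made-up coordinates and widths; this is not Tesla's actual perception stack.

```python
import math

def ahead_along_heading(obj, max_range=60.0, half_width=1.5):
    """Naive check: is the object on the straight-ahead ray? (ego frame:
    x forward, y left, meters)"""
    x, y = obj
    return 0.0 < x < max_range and abs(y) < half_width

def near_planned_path(obj, path, half_width=1.5):
    """Path-aware check: minimum distance from the object to the planned
    path waypoints, so a bend in the road is taken into account."""
    ox, oy = obj
    return min(math.hypot(ox - px, oy - py) for px, py in path) < half_width

# A parked car 40 m straight ahead, on the outside of a left-hand bend.
parked_car = (40.0, 0.0)
# Planned path curving left: by 40 m forward, the ego car is ~6 m to the left.
curved_path = [(d, 0.004 * d * d) for d in range(0, 61, 5)]

print(ahead_along_heading(parked_car))             # True  -> phantom braking
print(near_planned_path(parked_car, curved_path))  # False -> no brake needed
```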
Isn't that the promise of a self-driving car, though? That it can respond faster to dangerous situations?
Driver behavior on the road can be totally unpredictable; intent and observed motion can diverge. I wonder if it's the algorithms or the sensors.
The answer is probably that this isn't real. A self-driving car with better vision, perfect attention, and instant reactions won't get itself into that kind of situation in the first place. And in any case where an accident is truly unavoidable, the answer is simply to hit the brakes, because nothing else will matter anyway.
I believe the normal behavior of self-driving cars in these situations is to brake to avoid hitting the obstacle, giving the safety driver ample time to take over and navigate the situation. Didn't Uber disable auto-braking because it was oversensitive, choosing to keep road tests running rather than pausing until that issue was fixed?
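Automatic emergency braking of the kind this comment assumes is commonly described as a time-to-collision (TTC) trigger: brake when range divided by closing speed falls below a threshold. Here is a minimal sketch under that assumption; the 2-second threshold and the function name are illustrative, not Uber's actual (later disabled) implementation.

```python
def should_emergency_brake(distance_m: float, closing_speed_mps: float,
                           ttc_threshold_s: float = 2.0) -> bool:
    """Brake when projected time to collision drops below the threshold."""
    if closing_speed_mps <= 0:
        return False  # not closing on the obstacle
    return distance_m / closing_speed_mps < ttc_threshold_s

# Obstacle 25 m ahead while closing at 15 m/s -> TTC ≈ 1.7 s: brake,
# which also gives the safety driver time to take over.
print(should_emergency_brake(25.0, 15.0))  # True
```

An "oversensitive" trigger in this framing is a threshold that fires on benign situations (like the parked car on the bend above); disabling the whole check, as the comment suggests Uber did, removes the fallback entirely instead of retuning the threshold.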
Agreed! This seems like an instance where even if the car knew it needed to stop, it couldn't do so in time.