Self-Driving Car Limitations

Discussions focus on incidents and scenarios in which autonomous driving systems such as Tesla Autopilot and Waymo fail to detect or respond properly to obstacles, barriers, pedestrians, or unexpected road situations, with debate over their safety, braking behavior, and reliability compared to human drivers.

Trend: 📉 Falling (0.3x)
Category: AI & Machine Learning
Comments: 6,592
Years Active: 19
Top Authors: 5
Topic ID: #9640

Activity Over Time

2007: 2
2009: 2
2010: 28
2011: 17
2012: 79
2013: 73
2014: 172
2015: 291
2016: 539
2017: 509
2018: 1,163
2019: 539
2020: 407
2021: 683
2022: 547
2023: 710
2024: 424
2025: 392
2026: 15

Keywords

AI, FSD, liveleak.com, PhDs, TWICE, youtu.be, NO, AT, youtube.com, car, road, driver, driving, human, self driving cars, lane, obstacle

Sample Comments

Shouldn't autopilot at least stop the car because of the barrier in front of it? Even if it can't figure out the lane configuration?

codeulike, Jul 24, 2017

Car might brake suddenly because it thinks two bicycles are right in front of it?

Piskvorrr, Mar 22, 2018

If the car cannot predict "there's something in the next lane, could be in mine the next moment, better slow down just a bit", it has absolutely NO BUSINESS AT ALL driving around on public roads. At each and every occasion that I drove in a city, I have needed to take at least one such minor evasive maneuver due to someone suddenly getting into my lane, be it a car, a bike or a pedestrian. This is not even driving 101; if the car is unable to handle that, it is quite literally unf…

jeffdavis, Dec 16, 2015

You assume it's for safety, but it's more likely to be a case that the car can't handle at all and simply stops. For instance, if there's some kind of minor obstruction, a human may decide to drive over it, or go into the oncoming lane, or get out and pick it up, or follow an officer's instructions.

kwhitefoot, Jul 24, 2018

There is a road (Norway, Drammen-Svelvik) that I travel once or twice a week in my 2015 Tesla S 70D that often has a car parked on a particular corner. The car is directly ahead of my car and side on; my car reliably interprets this stationary car as being on the road and that we are on a collision course, and applies the brakes even though the road actually bends to the left and the stationary car is not in the way. If I didn't know the road I might well have reacted in the same way…

8note, Aug 5, 2018

Isn't that the promise of a self-driving car, though? It can respond faster to dangerous situations?

vjust, Jun 9, 2022

Driver behavior on the road can be totally unpredictable. Intent versus observed motion can diverge. I wonder if it's the algorithms or the sensors.

datguacdoh, Mar 3, 2020

The answer is probably that this isn't real. A self driving car with better vision, perfect attention, and instant reactions won't get itself into that kind of situation. And in any other case where an accident is unavoidable, the answer is to just hit the brakes because it won't matter anyway.

shajznnckfke, Sep 16, 2020

I believe the normal behavior of self-driving cars in these situations is for the car to brake to avoid hitting the obstacle, so the safety driver has ample time to take over and navigate the situation. Didn’t Uber disable the auto-braking because it was oversensitive, choosing not to pause road tests until that issue was fixed?

samcampbell, Mar 20, 2018

Agreed! This seems like an instance where even if the car knew it needed to stop, it couldn't do so in time.