Self-Driving Safety Regulation
Comments debate the safety risks, regulatory gaps, and ethical concerns of deploying partially autonomous vehicles like Tesla's on public roads without adequate testing, certification, or oversight, often drawing comparisons to human driver licensing.
Sample Comments
How is it that Tesla (and other companies developing self-driving capabilities) are allowed to deploy their products in the real world? We make people take a test to get a driver's license -- why don't we have requirements for self-driving systems? We require that cars pass routine mechanical inspections -- why isn't the software controlling these cars regulated?
I don't understand the haste with self-driving cars. The technology and the related legal issues (who is responsible for accidents?) are not ready for mass use. Why put cars with half-baked self-driving technology on regular roads and highways with other drivers who didn't consent to this? Why not start from predictable and tightly controlled environments (e.g. the inner territory of a factory or a cargo port), limit their speed to 10-20mph, spend a few years refining the technology, and on
Uh, it literally isn't here. It'll be here when you can let a Tesla off its leash in every form of adverse driving conditions known to man and it consistently outperforms a human driver, including in novel situations. What's being tested is the minimum viable approximation that legislators are willing to tolerate on our roadways.
The automotive industry has essentially been beta testing their "self driving" cars on public roads for the past couple of years. I very much doubt AI researchers are gonna take AI safety seriously (nor are they gonna be forced to) until people start dying.
I'm assuming OP is suggesting that Tesla needs to test these conditions, not the end user on a public road where innocent lives are at risk...I could be wrong though.
Wouldn't it be better to come out with this AFTER they have self-driving perfected?
Reading the article, this doesn't strike me as inappropriate. "Self-driving vehicle technology is not yet at the stage of sophistication or demonstrated safety capability that it should be authorized for use by members of the public for general driving purposes" is probably true. The cars are good at what they do, but that doesn't mean they can handle all sorts of bizarre detours, road construction, rural roads, faded/incorrect markings, conflicting markings, control from officers/construction
It's really quite sad that the regulation of autonomous vehicles has been so slow to come along. Public roads are filled with other drivers, passengers, and pedestrians who did not consent to be part of a large-scale beta test for partial driving automation that could fail at any time. I believe this is a case where self-driving software should be illegal by default until proven safe. Most companies in this industry, thankfully, seem to be moving carefully and rolling out their products conservatively
I really hope this will not be available in Europe for a long time. I don't want to be part of Musk's irresponsible beta testing plan as someone who lives around Tesla cars. I also really hope that there will be a point-based driver's licence, similar to the human one, that applies to these technologies before this kind of experiment is started. If it loses its points, the whole system has to be disabled and go through an audit/licence process to be allowed on public roads again
These results are meaningless: Waymo only drives on controlled, good-weather, urban streets, while the human control group does not. This is not science, this is PR. We need independent investigations. And remember that currently, self-driving cars in California are exempt from traffic tickets: https:/