Are these numbers normalized for the number of cars and car-hours?


No, they are not (https://archive.is/dTInm). And the article leads with an anecdote while burying critical details about that specific accident: the driver had "rigged weights" to keep Autopilot engaged even with his hands off the wheel.

I am not aware of any truly fair and impartial comparison of modern Teslas with other modern self-driving technologies (at whatever level) showing that Teslas are genuinely more prone to accidents. A fair comparison would start by comparing rates rather than absolute counts (and even that is just a starting point), and would correctly determine the model involved, the technologies that were enabled, and what the driver was doing. Most newspapers are chasing clicks, "Tesla Autopilot crashes" headlines get those clicks, and the papers don't have the time to do high-quality investigative journalism.
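To make the rates-versus-counts point concrete, here is a minimal sketch with entirely made-up numbers (none of this is real crash data; fleet_a and fleet_b are hypothetical):

    # Sketch of rate normalization. The figures are invented purely to
    # show why absolute crash counts mislead without an exposure term.
    fleets = {
        # name: (crashes, vehicle-miles traveled)
        "fleet_a": (17, 1.0e9),     # few crashes, few miles driven
        "fleet_b": (400, 100.0e9),  # ~24x the crashes, 100x the miles
    }

    for name, (crashes, vmt) in fleets.items():
        rate = crashes / (vmt / 1e6)  # crashes per million miles
        print(f"{name}: {crashes} crashes, {rate:.3f} per million miles")

    # fleet_a: 0.017 per million miles; fleet_b: 0.004 per million miles.
    # fleet_b looks ~24x worse by raw counts but is ~4x safer per mile.

Either fleet can be made to look damning if you only report the numerator, which is exactly what a headline crash count does.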

(I have no opinions on the matter; I don't own a Tesla, I don't know much about their technology beyond what I read in the press, and I have limited real-life experience with "self-driving": basically just the standard lane keeping and radar cruise control included with my car, which I don't use often and have found prone to all sorts of problems.)


Years before Tesla released AP, Lexus had walled off ACC behind eye tracking.

They weren't offering half the functionality AP claims, yet they still recognized the need to track driver attentiveness more robustly than by looking for steering-wheel torque.

And even after incidents like this, Tesla resisted for years before actually using its cameras for eye tracking.


Tesla does not provide government regulators or research organizations access to the raw data on Autopilot or FSD usage. Despite their claims that FSD is an autonomous vehicle system in beta testing with a safety driver, they deliberately under-classify it to the California DMV to bypass [1] the mandatory reporting requirements for autonomous vehicles under test with a safety driver [2]. As such, they have intentionally made it impossible for any untainted safety analysis to occur.

Since the default assumption when dealing with safety-critical systems is that they are unsafe, and the explicit burden of proof is on the manufacturer to demonstrate safety, we must assume that it is unsafe. And since no untainted safety analysis can occur at this time, any comparison against alternatives is impossible until Tesla stops blocking audits of its usage data. The only thing we can state for certain is that it is unsafe for any customer use, because no untainted proof of safety has been produced.

[1] https://thelastdriverlicenseholder.com/2023/02/17/2022-disen...

[2] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...


No, they are not. However, they do mention that 40,000 people died in car crashes last year.

But the 17 involving a Tesla are where we need to pay attention. LOL


It's even worse for Tesla if you normalize the data, because the rest of the industry also has advanced driving functionality with far fewer accidents or deaths.

It turns out that one of the things that radar is good for is emergency braking regardless of lighting conditions.

It also helps that other carmakers don't oversell their advanced driving features; if anything, they deliberately understate how well the systems work to avoid giving customers the false sense of safety that Tesla does.


Yeah I love this line of argument from the Tesla cult: “But wait people use their Tesla autonomous features more frequently [in conditions the system doesn’t operate well in].”

Like… yeah, that’s called an unsafe system my dude.


Safety of systems isn't binary, so what's the risk profile?

Newer medical treatments carry higher risk, just as experimental transportation methods do.

Everyone gets sick/injured and everyone needs to get around, so some suffering in the name of advancing these endeavors seems both inevitable and tolerable. The question is to what degree?

Controlling access to, and fallout from, these automated driving systems is a temporary priority, since the roads they're being tested on are far more 'public' than an individual's body undergoing a new medical treatment. But the long-term priority must be getting the systems as safe as or safer than human drivers in aggregate. That will happen sooner or later, and I'd rather see it sooner, as long as the cost of the race isn't catastrophe. The only way to advance is to let it learn...


Yep, and when there are a dozen companies taking their obligations to the public extremely seriously, loading their cars with at least as many sensors as it takes to operate safely during the R&D phase, deploying in limited stages rather than directly into the hands of the public, and not marketing their unproven tech as "Full Self Driving," I think it's completely reasonable to single out the one company that's doing none of that.


Tesla's insistence that vision is all you need is one of the root issues.

Humans are able to drive with vision alone in part because we have a high-level conceptual model of the world that we can rely on to fill in the gaps when a straight literal interpretation of vision is wrong or inadequate. We know, for example, that there are not likely to be walls under highway overpasses and that stop signs on billboards are not real signs. This is a "strong AI"-level problem that a car isn't going to be able to solve, so the best answer is to give the car superhuman senses so it doesn't need such a model as much.

Besides, if we're going to have self-driving cars, we want them to be safer than human drivers, and that can only reasonably be achieved with superhuman senses.


> even worse for Tesla if you normalize the data

Where's the data that proves this? Tesla has ~2% of the auto market, but ~0.4% of deaths.


Most higher-end vehicles (and frankly, I don't count Tesla as particularly high end, particularly not the Model 3) have similar stats, for a multitude of reasons, including but not limited to driver experience and time behind the wheel as a function of the vehicle's affordability.


Where's your data to back up your claim?


~2% of new cars isn't anywhere near 2% of cars on the road or miles driven, and newer cars are safer in general.


Miles driven, not market share, is the correct comparison here (and even then you'd want to include many other covariates, such as where the cars are operated, the socioeconomic status of the drivers, and so on) if you really want to compare different vehicles.
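As a back-of-the-envelope illustration of why the denominator matters (every number below is hypothetical; the point is the structure of the adjustment, not the result):

    # Why new-car market share is the wrong denominator. All inputs
    # here are invented; only the shape of the calculation matters.
    new_car_share = 0.02        # ~2% of new-car sales (claimed upthread)
    years_at_that_share = 5     # hypothetical years of selling at 2%
    avg_fleet_age_years = 12    # hypothetical mean age of cars on the road

    # A brand selling 2% of new cars for 5 years, within a fleet that
    # turns over every ~12 years, holds well under 2% of cars on the road.
    fleet_share = new_car_share * years_at_that_share / avg_fleet_age_years
    print(f"approx. fleet share: {fleet_share:.1%}")  # ~0.8%

    deaths_share = 0.004        # ~0.4% of deaths (claimed upthread)
    print(f"deaths share / fleet share: {deaths_share / fleet_share:.2f}")
    # ~0.48 with these inputs, vs. 0.20 for the naive 0.4%-over-2%
    # version: the apparent safety edge shrinks from ~5x to ~2x before
    # even touching miles per car, vehicle age, or driver demographics.

With different but equally plausible assumptions the conclusion can move further still, which is the point about covariates.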



