
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

Hard for me to see this as anything but the driver’s fault. If you drop your phone, pull over and pick it up or just leave it on the floor. Everyone knows, and the car tells you, to pay attention and remain ready to take over.



The argument is that if the driver had been in a different vehicle he would have done just that, pulled over and picked it up, but because he believed the Tesla was capable of driving safely on its own he didn't do so.

Normally I turn the steering wheel when I want to turn my car. If you sold me a car and told me it had technology to make turns automatically without my input then I might let go of the wheel instead of turning it, something I would never have done otherwise. If I then don't turn and slam straight into a wall, am I at fault for trusting what I was sold to be true?

If the driver has heard that their Tesla is capable of autonomous driving, and therefore trusts it to drive itself, there may be a fair argument that Tesla shares in that blame. If it's a completely unreasonable belief (like me believing my 1998 Toyota is capable of self driving) then that argument falls apart. But if Tesla has promoted their self driving feature as being fully functional, used confusing descriptions like "Full Self-Driving", etc, it might become a pretty reasonable argument.


Every time I engage Autopilot in my Model S it admonishes me with a notice in the instrument cluster that I am to keep my hands on the wheel. If I don't make it clear to the car that I am there and holding on, by applying a little rotational force to the wheel at least every fifteen seconds, the car will remind me.

So how does one conclude that the car is capable of driving itself? Or is the version of Autopilot in the car in question different in this respect?

Autopilot is not autonomous driving and isn't marketed as such; Full Self Driving (FSD) is an extra cost option.
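For what it's worth, that nag behavior is essentially a watchdog timer. Here's a minimal sketch in Python, assuming a hypothetical torque-sensor callback and the fifteen-second interval described above; illustrative only, not Tesla's actual implementation:

    import time

    NAG_TIMEOUT_S = 15.0       # assumed: the nag interval described above
    TORQUE_THRESHOLD_NM = 0.5  # assumed: minimum wheel torque that counts as "hands on"

    def monitor_driver(read_torque_nm, show_warning):
        # Watchdog loop: nag whenever the driver goes NAG_TIMEOUT_S seconds
        # without applying a little rotational force to the wheel.
        last_input = time.monotonic()
        while True:
            if abs(read_torque_nm()) >= TORQUE_THRESHOLD_NM:
                last_input = time.monotonic()  # driver is present; reset the timer
            elif time.monotonic() - last_input > NAG_TIMEOUT_S:
                show_warning("Apply light force to the steering wheel")
                last_input = time.monotonic()  # re-arm so the nag repeats periodically
            time.sleep(0.1)  # poll the torque sensor at ~10 Hz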


That was the result of a recall that required Tesla to include that warning. Prior to that, Autopilot did not include any warnings.

This accident occurred in 2019, several years before the recall.


Does it remind you that it can't stop at an intersection, or brake to avoid hitting objects? If it did, then the person might be more responsible. But Elon wouldn't have let the engineers put that in, because it goes against his grift.


They also said that by putting his foot down on the accelerator, he overrode the feature. He might say he didn't know that's how it worked, but then there's even more fault in performing such a dangerous action without knowing how the thing you think will save you actually operates.
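That override amounts to a simple priority rule: any accelerator input from the driver trumps the system's braking request. A rough sketch with hypothetical names, illustrative only and not Tesla's actual logic:

    def resolve_longitudinal_command(driver_pedal_pct, autopilot_decel_cmd):
        # Hypothetical arbitration: driver accelerator input overrides
        # the assistance system's braking request.
        if driver_pedal_pct > 0.0:
            return ("accelerate", driver_pedal_pct)
        return ("brake", autopilot_decel_cmd)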


Except the driver already accepted liability.

Also, this doesn't hold water today, as all new cars have some basic autosteer.


No one else falsely advertises it as "Full Self Driving".


FSD wasn’t even released at that time.


Substitute Autopilot then.

No other auto maker uses similar language. Ford and GM use BlueCruise and SuperCruise, clearly implying an improved kind of cruise control.


From what bits I've seen of the trial, at least a chunk of it was devoted to the meaning of the word "autopilot". Tesla even brought in a linguist to argue their case: https://bsky.app/profile/niedermeyer.online/post/3lunfw2s2is...


Makes sense. If you’re Tesla’s lawyer you need to do whatever you can to try to prove the name wasn’t a promise of functionality it didn’t meet.


Autopilot in an aircraft is basically cruise control, not full self flying.


This is such a pointless talking point, it amazes me how pro-Tesla people have latched onto it.

It doesn’t matter what it means. It matters what people think it means.

This is discussed in tons of other places in these comments, and in every previous story about Autopilot and FSD here on HN.


It also ignores how Tesla promoted "autopilot". Until very recently, tesla.com/autopilot just showed a video saying the driver was only there for legal reasons. Yes, maybe technically they meant FSD (for which it's also a lie, and which has a misleading name as well), but they were definitely mixing the terms up themselves (and I think the video predated FSD).


It’s anti-Tesla people that bring this whole argument up.

Anyone who drives on Autopilot for a few hours immediately learns its limitations; it’s not an enigma like some try to purport.


Nothing in the article says he had just bought the car. Seems likely he had had it long enough to learn its limitations.

Didn’t stop him.


Do you honestly think a negligent driver who blows through a stop sign at 62 mph would have acted differently in a car marketed to have "SuperCruise" rather than "Autopilot"? I have no strong belief either way, and I shed no tears for Tesla, but I do worry about any chilling effect on technologies that will make driving safer than the bloodbath we currently suffer on the roads every day.


Yes.

For one thing, SuperCruise and BlueCruise are limited to mapped uninterrupted highways. He couldn’t activate it on that street.

Not only would that have physically prevented it, but if he had been using those systems he would have known they were limited. That makes it much clearer what their capabilities are.

We know how to fix the “bloodbath”. We don’t choose to. I question whether level-two driving systems help or make things worse, despite having used them myself.

Slower speeds and better-designed roads make a massive difference. We keep speeds high and make giant wide roads/stroads that psychologically encourage high speed even if the marked speed is lower. We don’t punish speeders anywhere near enough. And we don’t build with trees or other large things next to the road; we put the sidewalk there.

Because when a car that’s going too fast makes a mistake, it’s important that it can mow through a pedestrian and survive instead of hitting a big tree and hurting the driver.

Those giant trees next to the road on old streets? They psychologically encouraged drivers not to go as fast, because they don’t feel as safe at higher speeds. We chose to stop doing that.

Lower death rates are 100% doable without modern assist systems of any kind.


The Model S has terrible phone docks. Don't get me started on cupholders; I'll bet people have drink mishaps all the time that affect driving.

I'm actually kind of serious about this - keeping people's stuff secure and organized is important in a moving car.

I'm surprised the touchscreen controls and retirement of stalks aren't coming under more safety scrutiny.

With the new cars without a PRND stalk, how can you quickly reverse the car if you nose out too far and someone is coming from the side? Will the car reverse, or go forward into danger?


What Tesla touchscreen controls are crucial for driving? FWIW, climate is a quick little swipe, arguably easier than turning a dial.


lol, I just mentioned direction of travel? defrost?

"a quick little swipe" seems like you're trying to minimize something that is actually dangerous. Controls like that should be in one location for muscle memory, stateless and able to be invoked without looking at them.

https://fortune.com/2025/03/07/steve-wozniak-says-tesla-is-w...


I’d like to hear that argument.


And why was his mobile phone in his hand to drop, if he was driving? Most states have laws against mobile device usage while driving, and it was never a responsible thing to do even before the laws were enacted.


Perhaps he thought it was safe. After all, he had autopilot.


Sure--dangerous and wrong. Despite that, Autopilot was driving at the time.


The driver at that time was Tesla Autopilot. So yeah, driver’s fault, as the jury said.


1/3 Tesla's fault, 2/3 the operator's


That sounds indistinguishable from “The driver at the time was the motor”


Motors don’t make decisions. Tesla says Autopilot does.


Why would you pull over when you paid top dollar for Autopilot?


Maybe he would have pulled over if the car’s capabilities hadn’t been oversold. Two entities did stupid things: 1) the person, by not waiting to pull over, because of Elon Musk’s false claims, and 2) Tesla, via Elon Musk making those false claims.

It passes a classic “but for…” test in causality.



