> If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.
What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since it was released, with over 3bn miles driven, and there's one case where someone died while fetching his cell phone. You would think that if it were really dangerous, people would be dying in scores.
This is obviously targeted, and the court system should not be playing favorites or going after political opponents.
> Tesla cars come standard with advanced hardware capable of providing Autopilot features, and *full self-driving capabilities* — through software updates designed to improve functionality over time.
> Tesla's Autopilot AI team drives the future of autonomy of current and new generations of vehicles. Learn about the team and apply to help accelerate the world with *full self-driving*.
Now, you could say that can be interpreted multiple ways - which means the copywriter is either incompetent or intentionally misleading. Interestingly, the text from 2019 (https://web.archive.org/web/20191225054133/tesla.com/autopil...) is written a bit differently:
> ...full self-driving capabilities *in the future*...
> > Don't advertise their driver assist system as "full self driving".
> The system involved in this crash was never advertised as "full self driving".
I assume "system involved in this crash" is referring to "Tesla Autopilot"; my reply was to contradict the statement '...*was never* advertised as "full self driving"'.
But your mentioning a date made me curious about when the advert text was changed:
It's a dumb argument anyway. Most normies think autopilot means the plane flies itself; at the very least, they think it flies itself except for takeoff and landing. By the technical definition of an autopilot perhaps Tesla was correct, but not by the colloquial meaning.
It won't happen, but I'd really love to see Tesla's lawyers telling the court "we know the advert text, video, and the term 'Autopilot' are misleading, but they're just, you know, 'corporate puffery'".
- FSD came out in October 2020; I suppose rounding up far enough gets you to "nearly 10 years since", but that literally doubles the actual number.
- There have been a lot more than one incident. This is one court case about one incident.
- An insane number of accidents have been reported; does it only matter to you if someone dies? Far more than one person has died in an accident involving a vehicle equipped with FSD.
- Your comment is obviously targeted and disingenuous.
So to answer your question of what one might want to come out of it, perhaps another recall where they fix the system or stop making false claims about what it can do.
> It's been nearly 10 years since it was released, with over 3bn miles driven, and there's one case where someone died while fetching his cell phone. You would think that if it were really dangerous, people would be dying in scores.
And how many times did humans have to take over and save themselves and others from the car killing or injuring them? Tesla won't tell us these numbers; guess why? The tech might be safe with a human backup driver, but so far you need a human paying attention to save himself from the car's bugs/errors/glitches, etc.
I really hate these bullshit safety claims pulled from someone's ass. It's like me trying to convince you to be operated on by an AI doctor by claiming: "It's better than an old, drunk doctor; it only killed a few people when the people supervising it weren't paying attention, and otherwise it was very safe. We can't tell you how many times real doctors had to do the hard work while our AI doctor only did the stitching - those numbers need to stay secret. But trust us, the human doctors who have to intervene are only there because of the evil laws; it could do the full job itself. We wouldn't call it a Fully Competent Doctor if it couldn't perform all expected tasks."
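To make that concrete, here's a rough back-of-envelope sketch of why supervised miles tell you so little. Every input below is invented purely for illustration - the real intervention counts are exactly the numbers Tesla won't publish:

```python
# Hypothetical sketch: how an attentive human driver can mask a system's
# true failure rate. All inputs are invented for illustration; Tesla does
# not publish real intervention counts.

miles_driven = 3e9            # fleet miles claimed on the system
mistakes_per_mile = 1e-7      # assumed rate of dangerous mistakes by the system
driver_catch_rate = 0.999     # assumed share of mistakes the human catches in time

dangerous_mistakes = miles_driven * mistakes_per_mile
visible_incidents = dangerous_mistakes * (1 - driver_catch_rate)

print(f"Dangerous mistakes by the system: {dangerous_mistakes:,.0f}")  # 300
print(f"Incidents visible in the stats:   {visible_incidents:,.1f}")   # 0.3
# The per-mile record looks stellar, but the safety came from the human
# backup, not the software - and without the intervention numbers you
# can't tell the difference.
```

With those made-up inputs the fleet statistics look great even though the system made hundreds of dangerous mistakes; only the disengagement data would reveal which is true.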
I went into a Tesla dealership nearly 10 years ago to take a look at the cars, and the salespeople were telling me - in no uncertain terms - that the cars were fully self-driving.
I knew that was complete nonsense, because I knew people who worked on Tesla's self-driving software, but that's how Tesla salespeople were selling the cars.
10 years ago Tesla didn't even have Autopilot. All they had, as far as I can tell, was lane departure warnings, speed alerts, manual cruise control, some sort of automatic parking, and low-speed summoning on private property.
Could the dealer have been referring to the automatic parking or the summoning?
Autopilot launched in 2014. "Full Self Driving" has been offered as an upgrade since 2016. Musk has been saying that fully autonomous driving is just around the corner (1-3 years away) since 2015.
He's been saying it was just around the corner for a long time, but it wasn't until 2020 that they actually made FSD available, in beta, to a limited number of customers.
They were selling FSD as an option all the way back in 2016. They were telling customers that it would launch imminently.
The pattern here is constantly hyping self-driving as something that is basically ready, and at the dealership I went to, they went a step further and claimed full self-driving was already a reality.