Putting your money where your mouth is, is probably the only way to obtain acceptance for semi-autonomous driving systems.
I imagine airplane manufacturers have long been on the hook if their autopilot engages Kill All Humans mode, despite the best efforts of Boeing's lawyers to blame some of the humans their software killed.
Autopilots in airplanes have nothing to do with autonomous driving.
Airplane autopilots are so simple that they can be implemented with entirely analog circuits. They are simple feedback control loops, each operating on a single scalar variable per axis.
There is ample time for the pilot to correct any “mistakes” the autopilot makes.
There is no expectation that the autopilot operates independently of the pilot’s supervision and judgement.
In terms of decision-making and autonomy and liability - an airplane autopilot is really analogous to cruise control in a car, nothing more. It operates the flight controls to maintain simple parameters chosen by the pilot, and that’s all.
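To make the "feedback loop on a single scalar per axis" point concrete, here is a minimal sketch of a proportional altitude-hold in that spirit. All numbers and names are illustrative, not taken from any real avionics:

```python
# Illustrative sketch of a single-axis autopilot loop: a plain
# proportional controller on one scalar (altitude error), with
# deliberately limited control authority. Not real avionics code.

def altitude_hold_step(target_alt_ft, current_alt_ft, gain=0.01,
                       max_pitch_deg=10.0):
    """Return a pitch command (degrees) proportional to altitude error,
    clamped to a modest authority limit the pilot can always override."""
    error = target_alt_ft - current_alt_ft
    pitch_cmd = gain * error
    # Clamp: the autopilot only nudges, it never demands extreme attitudes.
    return max(-max_pitch_deg, min(max_pitch_deg, pitch_cmd))

# 500 ft below the selected altitude -> a gentle 5-degree climb command.
print(altitude_hold_step(10_000, 9_500))  # 5.0
```

The cruise-control analogy is visible here: the system maintains one parameter the pilot selected, and the authority clamp is why there is "ample time" to correct its mistakes.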
AFAIK automatic landing systems just get the plane into the right 'landing parameters range', but won't take action if anything unforeseen happens (like a blocked runway).
Right, the Garmin Autonomí Autoland system is intended only for emergency use, to get the airplane on the ground if the (single) pilot is incapacitated. It's not intended for routine autonomous flight.
I think the earlier comments are referring to CAT III autoland systems using technologies like ILS, which are used routinely and are good for 0 ft visibility: that is, an approach in cloud all the way down to the ground.
On top of that, aviation autopilots operate in one of the most controlled and stable environments. Ground based individual traffic is the polar opposite of that.
As aviation people eloquently put it: https://en.wikipedia.org/wiki/Big_sky_theory I think there should be a correlative 'small road theory', if only to underscore the difference.
Except the general public seem to massively over-estimate how capable aircraft Autopilots are, and then expect Tesla's Autopilot to live up to their inflated expectations.
The general public doesn't fly planes, so to them every flight is getting on a plane, falling asleep, and magically teleporting from one city to another. They have no perspective on just how much an autopilot doesn't do.
I'm talking solely about liability when the system that you sell as being able to operate on behalf of a human gets it wrong.
Comparisons of the systems is irrelevant to the very simple point I made - putting your money where your mouth is garners trust.
As for your claim about autopilot being simple, and humans having ample time to correct its mistakes, well, two 737 MAXs and one Air France flight are very straightforward counterarguments.
Yeah, that's not autopilot. FLCS, fly-by-wire, drive-by-wire are all different things. Autopilot is really specific. If you want to discuss this productively, you should probably learn the difference.
I apologise that I got the precise terminology wrong when I used a term that most humans understand to mean "when the computer controls stuff your plane is doing on your behalf".
My biggest regret about getting it wrong is that it gives pedants something to fixate on and split hairs about, in lieu of engaging with the actual point.
That's true, but some aircraft are now equipped with TCAS and GCAS which will automatically seize partial control from the pilot and maneuver to avoid some types of crashes. So far I think those systems have been 100% reliable but the manufacturers would be liable if they caused a crash.
Lots of disclaimers. Plus, what happens if the system disengages and you crash five seconds later because you weren't expecting the handover? Technically they are not responsible anymore, it seems.
From the article:
"...Handing over driving responsibility completely requires extremely particular circumstances. Right now, Drive Pilot can only engage at speeds under 40 mph (60 km/h in Germany) on limited-access divided highways with no stoplights, roundabouts, or other traffic control systems, and no construction zones. Eligible roads must be mapped by Mercedes for Drive Pilot use (similar to GM SuperCruise); the automaker has already mapped every such highway in Germany, and most of those in Nevada and California. The system will only operate during daytime, in reasonably clear weather, without overhead obstructions. Inclement weather, construction zones, tunnels, and emergency vehicles will all trigger a handover warning. And no, you can't close your eyes or go to sleep while it operates..."
> Technically they are not responsible anymore it seems.
No, technically whether they are liable or not reverts to normal principles of product liability, rather than their special acceptance of liability.
But manufacturers' liability for harms caused by their products, while varying in detail from jurisdiction to jurisdiction, tends to be pretty broad in most of the West (including the US, despite its otherwise weak consumer laws).
Are there even any limited-access divided highways in California where speeds under 40 mph are safe? I'm not sure what they would have mapped.
As the default speed limit for such a highway would presumably be 65 mph (and traffic usually much faster), anywhere with such lowered speeds would presumably have special circumstances that would probably make it unsuitable.
The guaranteed 10 second handoff is impressive, however.
"[...] Unlike all currently available driver-assist systems, Drive Pilot is designed to give drivers a 10-second warning before switching off; engineers had to make sure that, in every situation, the system would safely and faultlessly hand over control. [...]"