
I haven't seen anyone post this yet, but I strongly suspect that the fact she was pushing a bicycle impaired the machine-learning-driven AI's ability to determine she was a pedestrian. It reminds me of the kangaroos confusing the AV programs being tested in Australia. A human paying attention might have been wary of a person pushing a bike in the shadows, but for all we know the algorithm thought she was a bush or something, because her profile was distorted by the bike. There is a lot to speculate about, but machine learning isn't as smart as we humans tend to believe it is, and it nowhere approaches the kind of general intelligence required to respond appropriately to all of its inputs the way an attentive human could.
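To make the speculation concrete: if a perception stack emits per-class confidences and gates decisions on a threshold, a blended silhouette can fall through the cracks. A toy sketch with invented names and numbers, emphatically not Uber's actual pipeline:

    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str        # e.g. "pedestrian", "cyclist", "vegetation"
        confidence: float  # 0.0 to 1.0

    PEDESTRIAN_THRESHOLD = 0.7  # hypothetical decision threshold

    def sees_pedestrian(detections):
        # True only if some hypothesis is *confidently* a pedestrian.
        # A person pushing a loaded bike may split the score across
        # several classes, so no single one clears the bar.
        return any(d.label == "pedestrian" and d.confidence >= PEDESTRIAN_THRESHOLD
                   for d in detections)

    # A blended silhouette: plausible per-class scores, none decisive.
    readings = [Detection("pedestrian", 0.40),
                Detection("cyclist", 0.35),
                Detection("vegetation", 0.25)]
    print(sees_pedestrian(readings))  # -> False: effectively invisible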


Weird scenarios that can't be predicted are important to consider. Waymo presentations like to talk about a situation they ran into where the car needed to stop because it encountered an old woman in a wheelchair chasing a turkey with a broom. As I understand it, most of these companies sensibly say "unknown weird thing in/near road means stop the car."

ooo, there's video of the turkey wheelchair broom chase: https://www.theguardian.com/technology/video/2017/mar/16/goo...
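That "unknown weird thing means stop" policy could be as simple as this toy sketch (class names and safety margin are my own assumptions, not any company's real stack):

    KNOWN_CLASSES = {"vehicle", "pedestrian", "cyclist"}
    STOP_DISTANCE_M = 30.0  # hypothetical safety margin

    def plan_action(obj_class, distance_m):
        # Can't classify it and it's close? Don't gamble -- stop.
        if distance_m < STOP_DISTANCE_M and obj_class not in KNOWN_CLASSES:
            return "stop"
        return "proceed"

    print(plan_action("wheelchair_turkey_broom_chase", 12.0))  # -> stop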


A person pushing a bike is far, far from a "weird scenario", though.


It may depend on whether the test cars are used on routes with lots of homeless people. Around the Mountain View test center I know of a handful of regularly seen homeless people with loaded bikes in downtown Los Altos/Mountain View, but it's nowhere near the level of San Francisco/San Jose/Oakland/Los Angeles/Anaheim/San Diego.


A person pushing a bike is quite a common scenario, and it falls to the Uber engineers to account for it.


To a computer that's never encountered this scenario before, it might be.


Then that is the fault of Uber for not exposing it to this extremely common scenario beforehand.


She had a bunch of shopping bags hanging off the front. I imagine that could change how it was identified.


> As I understand it, most of these companies sensibly say "unknown weird thing in/near road means stop the car."

That's not entirely true. A Tesla, for instance, won't stop if it detects a stationary object on the highway, assuming it's a false positive from the algorithm.

https://www.wired.com/story/tesla-autopilot-why-crash-radar
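As I read the article, the reason is that radar constantly returns stationary clutter (overpasses, signs, roadside junk), so systems tuned against phantom braking gate out anything whose speed over the ground is near zero. A toy illustration with made-up numbers, not Tesla's actual logic:

    EGO_SPEED_MPS = 30.0  # ~67 mph

    def keep_radar_return(closing_speed_mps):
        # A stationary object ahead closes at exactly the ego speed,
        # so its speed over ground is zero -- the same signature as
        # an overpass or a roadside sign.
        ground_speed = EGO_SPEED_MPS - closing_speed_mps
        return abs(ground_speed) > 1.0  # made-up clutter threshold

    print(keep_radar_return(30.0))  # stopped object ahead -> False, filtered out
    print(keep_radar_return(5.0))   # slower car ahead -> True, tracked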


That sounds rather dangerous considering stationary objects do appear on highways from time to time.


Given that another comment says she "suddenly" crossed three lanes of traffic and was on the far side of the road when she was hit, my suspicion is that the car thought she was riding the bike. Of course, all of us computers know that bikes always go over 5 mph, so surely she'd make it across. Why brake?

I want self-driving cars when they are safe. But Uber's clearly established cavalier attitude toward human beings shows it can't be trusted with them. I'm disappointed that politicians thought it could.


I don't understand this line of reasoning at all, but it's getting repeated a lot on HN. Is the AI expected to hit cyclists, but not pedestrians? Shouldn't it consider any moving object to be a hazard?


It may not have had good enough information to tell that she was a cyclist at all, and instead interpreted her to be something static/non-human altogether (in parent's example, a bush). Of course, without seeing the data its sensors gathered, and what that data registered as, it's all speculation. The problem is with misidentification, not bad priorities.


I wouldn't be surprised if there was some kind of classification problem involved. But wouldn't the AV by default treat a solid object on the road as a hazard to brake for? There's no possible way to specifically train for every variation of thing that could be encountered on a city street (imagine Halloween, for starters).


It was a 3-lane highway. It wouldn't really make sense to brake if the system detected an immobile object in a different lane. The problem is more that the system failed to detect the woman moving between lanes.


But the system has the ability to detect whether an object is moving in parallel (i.e. within its own lane) vs. across lanes, right? That would seem to be fundamentally necessary in everyday conditions, e.g. day traffic in which cars are switching/merging lanes.

If the problem is that the system didn't detect the woman moving between lanes, then that seemingly contradicts the police statement that the victim moved quickly enough to surprise the AV and its driver.
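To make the distinction concrete, here's a minimal sketch (coordinates, threshold, and interfaces all assumed by me) of how a tracker could flag cross-lane motion from two timestamped positions:

    def lateral_speed_mps(y0, y1, dt):
        # y is position across the road (metres); x would be along it.
        return (y1 - y0) / dt

    def is_crossing(y0, y1, dt, noise_floor_mps=0.5):
        # noise_floor is a hypothetical allowance for sensor jitter
        return abs(lateral_speed_mps(y0, y1, dt)) > noise_floor_mps

    # Someone walking a bike across lanes at ~1.4 m/s (ordinary walking
    # pace), observed over half a second:
    print(is_crossing(y0=7.0, y1=6.3, dt=0.5))  # -> True: flag as a crosser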


I don't see how this is relevant. The car should avoid hitting ANYTHING. It shouldn't have to recognise what the object is if it's in the path of the car.
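In principle that check doesn't need a classifier at all; something like this toy geometry test (my own numbers, nobody's real stack) would do:

    def should_brake(points_ahead, ego_speed_mps=17.0, lane_half_width_m=1.5):
        # Crude stopping distance: v^2 / (2a), with a ~ 6 m/s^2 hard braking.
        stopping_m = ego_speed_mps ** 2 / (2 * 6.0)
        # Brake for *any* solid return inside the swept path, whatever
        # the classifier thinks it is.
        return any(0.0 < x < stopping_m and abs(y) < lane_half_width_m
                   for x, y in points_ahead)

    print(should_brake([(18.0, 0.4)]))  # return 18 m ahead, in path -> True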



