They may do it with enough accuracy to stay in business, but not enough to provide quality content streams for children.
My comment above seems a bit unpopular, but I really don't think YouTube is deliberately trying to show inappropriate content to kids. I think it's happening despite their efforts to do something about it. In other words, they are trying and failing. Maybe I'm giving them too much credit, but you've got to remember that a lot of YouTube employees have kids too.
My (uninformed and possibly wrong) guess is that YouTube wanted to maintain the fiction that, to the best of their knowledge, everyone using the service was the person who signed up, so that it could argue it was not intentionally showing videos to children. When YouTube knows kids are using the app, it's supposed to be doing lots of things to protect their privacy (which it has not been doing).[1]
Building an algorithm to guess when children were likely using YouTube on their parents' account, in order to show age-appropriate ads, would show that YouTube does know that kids are using the app. So my (again, uninformed and quite possibly wrong) guess is that YouTube did not try and fail at creating such an algorithm; they deliberately decided not to try.
FB is good at behaviorally targeting ads. Google is good at search ads (90%+ of its revenue). Google sucks at behavioral ads. Source: I manage lots of ad spend for many companies.