
> but website stats is one of the biggest losses in modern web dev

That seems a bit overly dramatic. If you look at your log files these days, you'll see that a high percentage of the traffic is just scrapers, bots, and scripts probing for wp-admin and the like.

Collecting this information somewhere else (instrumented backend, client-side script) makes a lot more sense now that you can filter out the noise more easily. There are also very light client-side scripts like https://plausible.io, which strike a nice trade-off between privacy and useful information while not being heavy.
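
A rough sketch of the kind of filtering I mean (Python, assuming nginx's combined log format; the path and user-agent lists here are just illustrative, nowhere near complete):

    import re

    # Matches nginx "combined" format: request line, status, bytes,
    # referer, user agent.
    LOG_LINE = re.compile(
        r'"(?P<method>\w+) (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+ '
        r'"[^"]*" "(?P<ua>[^"]*)"'
    )

    # Obvious noise: vulnerability scans plus self-identified bots.
    NOISE_PATHS = ("/wp-admin", "/wp-login.php", "/xmlrpc.php", "/.env")
    NOISE_AGENTS = ("bot", "crawler", "spider", "curl", "python-requests")

    def is_noise(path, ua):
        ua = ua.lower()
        return path.startswith(NOISE_PATHS) or any(a in ua for a in NOISE_AGENTS)

    views = 0
    with open("/var/log/nginx/access.log") as f:
        for line in f:
            m = LOG_LINE.search(line)
            if m and not is_noise(m.group("path"), m.group("ua")):
                views += 1

    print(f"page views after filtering: {views}")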


Request logs are a source of truth for certain metrics. A tracker service may get you more metadata than the request line in the logs, but something like 25% of internet users run an ad blocker, which often blocks trackers as well (I block Plausible). Not seeing metrics for bots, scrapers, and x% of your users can really skew certain numbers.
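
Back-of-the-envelope with made-up numbers to show how much that skews a tracker-only view:

    # Hypothetical: logs count 10,000 human page views, and 25% of
    # visitors run a blocker that eats the tracker script.
    log_views = 10_000
    adblock_rate = 0.25

    tracker_views = log_views * (1 - adblock_rate)
    missing = 1 - tracker_views / log_views

    print(f"tracker reports {tracker_views:.0f} views ({missing:.0%} missing)")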


25% seems way too high unless you're talking about a website aimed at a tech audience. Especially on mobile, not many people install an ad blocker; I would be surprised if it's more than 1% globally.


I'm not sure about the number, but I was quoting https://www.statista.com/statistics/804008/ad-blocking-reach...


Plausible is not lightweight. Its codebase is large for the problem it's solving. JavaScript is not the way to do analytics.
