
Wasn’t part of the drive to client-side analytics an effort to improve data quality, in particular to differentiate bots from humans and to measure actual human traffic without getting caught by caches along the way?

If you use something like Cloudflare, you can also get some of that server-side logging back.

And Netlify and Vercel both have first-class analytics features.



> Wasn’t part of the drive to client side analytics an effort to improve data quality, [...]

Interesting, I did not know that narrative. But what I can tell from subjective experience is that bots aren't so much of a problem with client-side analytics. counter.dev filters them out by not logging very short page views, by the way. For me the bigger challenge with client-side analytics is not being able to track clients that are using a tracking blocker. Which I guess is the end user's right to use (I even use uBlock Origin myself). But if you start missing roughly 50% of page visits, it becomes an issue for website owners. The data does not need to be detailed, just accurate enough.
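Roughly the idea, as a minimal sketch (the 5-second threshold and the /track endpoint here are made up for illustration, not the actual counter.dev code):

    // Only log the page view if the visitor sticks around for a few seconds.
    const DWELL_MS = 5_000; // made-up threshold

    const timer = setTimeout(() => {
      // sendBeacon survives the page being closed mid-request
      navigator.sendBeacon("/track", JSON.stringify({ page: location.pathname }));
    }, DWELL_MS);

    // If the visitor leaves before the timer fires, no hit is logged,
    // which drops most bots and accidental clicks.
    addEventListener("pagehide", () => clearTimeout(timer));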

Web analytics on hosters... yeah, if they fit your use case then great, but for me that is vendor lock-in and I would avoid it if possible. Web analytics is more or less a topic of its own that I'd prefer to leave to a specialised solution. But obviously I am biased haha.


Since then, most bots will have abandoned libcurl and moved on to something like headless Chrome to get around bot mitigation techniques, so the playing field has evened out significantly.
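To illustrate: a scraper along these lines executes the page's JavaScript just like a real visitor, analytics snippet included, so it shows up in client-side analytics too (a minimal sketch using Puppeteer; the URL is a placeholder):

    import puppeteer from "puppeteer";

    // Unlike a plain curl request, a headless-Chrome fetch runs the
    // page's JavaScript, including any analytics beacon on it.
    const browser = await puppeteer.launch({ headless: true });
    const page = await browser.newPage();
    await page.goto("https://example.com", { waitUntil: "networkidle0" });
    console.log(await page.title());
    await browser.close();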


And ubiquitous HTTPS has dramatically cut down on caches that sit in the middle, so you only really have to worry about the impact of the browser cache on your analytics.
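And if the browser cache does skew your numbers, one common mitigation is to serve the tracking request itself as uncacheable (a sketch in Node; the /track endpoint and port are made up, not any particular product's API):

    import { createServer } from "node:http";

    // Answer the tracking endpoint with Cache-Control: no-store so the
    // browser cache can never swallow a repeat hit.
    createServer((req, res) => {
      if (req.url?.startsWith("/track")) {
        // ...record the hit server-side here...
        res.writeHead(204, { "Cache-Control": "no-store" });
        res.end();
        return;
      }
      res.writeHead(404);
      res.end();
    }).listen(8080);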



