Is there any way to clean up access logs for statistics purposes?
from hylobates@jlai.lu to selfhosted@lemmy.world on 03 Mar 13:08
https://jlai.lu/post/33978623

Hi everyone,

I’m currently selfhosting a website that has quite an audience (~2 000 unique visitors/day) and I’m trying to measure that audience more precisely.

I just want to have a simple report of the most viewed pages, the most popular browser, etc… very very basic stuff.

I want to avoid client-side solutions, as they can and will very easily be blocked, rendering my effort completely useless. I ran Matomo until 2023, and it registered less than half the visits shown in my access logs.

I tried GoAccess, but it picks up a lot (and I mean A LOOOOOT) of Chinese/Indian/Russian bots that are pretty difficult to filter out (if you have a method, please share it, I'm very curious!).

Is there any way you’re aware of to have decent stats without invading the privacy of my visitors or counting bots?

#selfhosted


y0din@lemmy.world on 03 Mar 13:37

You could take a look at AWStats.

I used it years ago when I was running my own web server, and it worked well for exactly this type of use case: generating basic reports directly from server access logs (most viewed pages, browsers, referrers, status codes, etc.) without relying on any client-side tracking.

It processes standard web server logs (Apache, Nginx and similar), so there is no JavaScript involved and nothing for visitors to block. That also means your numbers will generally align much more closely with what the server actually handled, compared to client-side tools like Matomo.

From a quick glance it looks like development activity may have slowed after 2023, but log formats have been relatively stable for a long time. If you run it as an offline analyzer (generate static reports from logs and do not expose the CGI interface publicly), the security surface is minimal.

Regarding bots: since it works purely on access logs, you will still need to filter at the log level. Typical approaches include:

  • Enabling and maintaining bot filtering rules (AWStats has built-in bot detection lists).
  • Pre-filtering logs (e.g., via fail2ban, reverse proxy rules, or a separate sanitized log file used only for stats).
  • Excluding requests by user-agent and/or known ASN/IP ranges before generating reports.
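The pre-filtering approach can be sketched with standard tools before any analyzer ever sees the log. This is a minimal illustration, not a complete bot list: the user-agent patterns, probe paths, and file names below are assumptions you would tune for your own traffic.

```shell
#!/bin/sh
# Build a tiny sample access log (combined log format) for demonstration.
cat > access.log <<'EOF'
1.2.3.4 - - [03/Mar/2025:13:08:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (X11; Linux x86_64) Firefox/124.0"
5.6.7.8 - - [03/Mar/2025:13:08:01 +0000] "GET /robots.txt HTTP/1.1" 200 64 "-" "Mozilla/5.0 (compatible; AhrefsBot/7.0)"
9.9.9.9 - - [03/Mar/2025:13:08:02 +0000] "GET /wp-login.php HTTP/1.1" 404 0 "-" "python-requests/2.31"
EOF

# Drop requests whose user-agent matches common crawler/scanner substrings
# (case-insensitive), then drop probes for paths only bots ever request.
grep -viE 'bot|crawl|spider|python-requests|curl|wget' access.log \
  | grep -vE '"(GET|POST) /(wp-login\.php|xmlrpc\.php|\.env)' \
  > access.stats.log

# Feed the sanitized log to your analyzer, e.g.:
#   goaccess access.stats.log --log-format=COMBINED -o report.html
wc -l < access.stats.log   # only the genuine visitor line remains
```

Run from cron before report generation, this keeps a separate sanitized log used only for stats, so the raw log stays intact for debugging and security review.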

It is not a perfect solution, but if your goal is simple, privacy-respecting, server-side statistics without client-side scripts, it is a relatively lightweight option worth considering.
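For the built-in filtering route mentioned above, AWStats exposes directives in its config file for skipping hosts and user-agents and for tuning robot detection. A rough fragment, with illustrative values I have not verified against a current release:

```
# awstats.conf (values are examples, not recommendations)
SkipUserAgents="REGEX[bot] REGEX[crawl] REGEX[spider]"
SkipHosts="127.0.0.1"
LevelForRobotsDetection=2
```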

wreckedcarzz@lemmy.world on 03 Mar 16:49

AWStats

“are you in good hands?”

(only people from the states will get this)