JavaScript tracking punks given a thrashing by good old-fashioned server log analytics

Netlify this week whipped the covers off its take on dealing with the rise of ad blockers in analytics – do it on the server. With many analytic tools requiring tracking pixels, cookies or JavaScript in the webpage, the arms race in blocking the things – either unintentionally through ad-blockers or intentionally because of …

  1. Nick Kew

    Serverside analytics have been around forever. If asked for an analysis tool twentysomething years ago, my first suggestion would've been Perl, and it's still a good answer today. People make it difficult for themselves because they can't (or won't) grasp HTTP's statelessness, and go to ever more absurd and futile lengths to fight against it.
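
    For a flavour of how little is needed, here is a minimal sketch of the idea in Python rather than Perl; the Common Log Format is standard, but the log path and the "top ten pages" report are assumptions for illustration:

        import re
        from collections import Counter

        # Common Log Format: host ident user [time] "request" status bytes
        LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3}) \S+')

        hits = Counter()
        with open("/var/log/apache2/access.log") as log:  # assumed path
            for line in log:
                m = LOG_LINE.match(line)
                if m:
                    method, path, status = m.groups()
                    if method == "GET" and status == "200":
                        hits[path] += 1  # tally successful page fetches

        for path, count in hits.most_common(10):
            print(f"{count:8d}  {path}")

    No cookies, no client-side JavaScript: everything this needs is already in the server's own log.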

    1. Azerty

      Webalizer is BACK!

      The first thing I thought when I read this: Webalizer. Its website is still up: http://www.webalizer.org/ Grab it while you can (the site mentions "please use Netscape").

      1. Naich

        Re: Webalizer is BACK!

        I still use it.

      2. Hstubbe

        Re: Webalizer is BACK!

        AWStats is what I used for a long time. Always fun to see decades-old tech being marketed again as the next big thing by those young whippersnappers...

  2. Drew 11

    First Google killed off server-side analytics with their "free" service. Then you had to open a Google Webmaster account to access the data. Then they started telling you what your website should look like and have on it in order to rank high. And here we are now with most websites crammed full of shitty JavaScript and visual wankery, along with "Google Tag Manager", "Google Fonts" etc etc as more ways to slurp users' data.

    Jesus H Christ on a bike.

    Returning to server-side analytics would be the first step in decreasing Google's grip on everyone's throats (users AND webmasters).

  3. NetBlackOps

    I can live with this.

  4. lvm

    Genius. Sheer genius. It's like the 80s all over again. I always said: if you want to get web usage stats, don't pass the load to clients' browsers. Parse your frigging logs; it's the only acceptable way to do it.

    1. Warm Braw

      What people want (at least some of them) is to know how much of their advertising spend in campaign X resulted in additional sales. That's quite hard to do simply with server log files since you need to correlate information from several sources. Of course, third parties could provide you with their log files that relate to your visitors and you could merge the two - but that would be no better from a privacy point of view than doing it all in the browser.
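
      To be fair, tagged first-party links plus your own logs do get you aggregate campaign counts; it's attributing an individual sale that needs the cross-source correlation. A rough sketch of the aggregate half, where the ?campaign= parameter, the page names and the log path are all hypothetical:

          import re
          from collections import Counter
          from urllib.parse import urlparse, parse_qs

          LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "GET (\S+) [^"]*" \d{3} \S+')

          landings = Counter()  # landing-page hits per campaign tag
          orders = 0            # hits on the order-confirmation page

          with open("/var/log/nginx/access.log") as log:  # assumed path
              for line in log:
                  m = LOG_LINE.match(line)
                  if not m:
                      continue
                  url = urlparse(m.group(1))
                  tags = parse_qs(url.query).get("campaign")  # hypothetical ?campaign=x
                  if tags:
                      landings[tags[0]] += 1
                  if url.path == "/order/complete":  # hypothetical page
                      orders += 1

          print(dict(landings), orders)

      This says which campaigns brought people in and how many orders there were, but not which order came from which campaign; closing that gap needs a first-party session identifier or the third-party log merge just mentioned.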

      Personally, I think it was a great mistake to allow content in web pages from a third-party source - it's a privacy and security nightmare - and there should come a point in the future when it's disabled by default. After all, it's historically been assumed that your advertising spend is a bit of a lottery.

      1. phuzz

        What's even trickier to find out is how much of their advertising spending ended up causing potential customers to say "these adverts are too fucking annoying, I'm never buying anything from them again!".

        I suspect most marketeers aren't interested in finding out that bit of data.

      2. Olivier2553

        There is a simple way to solve both problems with one change: host the ads on your own website. You can control the ads you display, there is no cross-site content and, obviously, you have all the logs you want.

  5. RyokuMas

    Cry me a god damn river

    "Marketeers, however, may be in for a disappointment."

    See title.

  6. depicus

    I gave up on Google Analytics years ago and search my own logs for errors, mainly through Logwatch, which sends a nice list of errors and not-founds so I can adjust my personal site if a file is missing, or more likely spot a rise in scanners looking for haxkers.php, which I can then add to the redirect list to send them off to the FBI's web site.
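
    If Logwatch isn't to hand, the not-found tally is a few lines of scripting anyway. A sketch of the same idea, not the actual setup described above; the log path and the "more than five hits" threshold are assumptions:

        import re
        from collections import Counter

        LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "\S+ (\S+) [^"]*" (\d{3}) \S+')

        not_found = Counter()
        with open("/var/log/apache2/access.log") as log:  # assumed path
            for line in log:
                m = LOG_LINE.match(line)
                if m and m.group(2) == "404":
                    not_found[m.group(1)] += 1

        # A path hammered repeatedly is usually a scanner probing for known
        # holes rather than a genuinely missing file: a candidate for the
        # redirect list.
        for path, count in not_found.most_common():
            if count > 5:  # assumed threshold
                print(f"{count:6d}  {path}")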

    Sadly, marketing departments have more control over most companies' web sites, and pretty pictures are more important than functionality.

  7. Anonymous Coward

    I'm just going to say

    If you are some massive company with your own IT, marketing and sales departments *and* the internet is a major sales channel (so not the likes of Dassault or BAe), then you probably have some justification for using advanced so-called "analytics" tools, including client-side JavaScript and such. I can accept that.

    What irks me is when every sentient entity with a web site, plus their pet, insists on piling Google's spyware onto it. They are not going to use it; even if they use it, they are not going to understand it; and even if they understand it, they just don't have the resources to do anything effective about it. The only one who benefits from this is Google. They just create extra work for you, so you end up spying on your website visitors on their behalf.

  8. Ben Tasker

    "But the likes of Matomo are heavy on the features and costly. While Matomo starts at around $59/month for its "Essential" package and 300,000 page views monthly, Netlify is $9/month for 250,000 page views a month."

    Have I missed something?

    Matomo is OSS and free.

    They do have premium packages and whatnot, but the free version will parse your logs quite happily all the same. In fact I've used it to do just that (though not any more).

  9. Zippy´s Sausage Factory

    Webalizer, AWStats and Analog... all still in use, still open source... but the marketroids all like the shiny... "but this must be better, it has Google written on it"...

  10. -v(o.o)v-

    The article refers to a Matomo cloud service. The Matomo software itself (formerly called Piwik) is open-source and free. We are using it for several million daily page views; it works quite well with some tweaks.

  11. Rich 11

    "use all those log files generated on the web server itself. After all, unless you have more nefarious aims in mind, that information should be all developers need to keep things humming along."

    I did that for ten years; then, a dozen years ago, Marketing decided they wanted prettier graphs, so it was all dropped in favour of Google Analytics. If they came back to me now and said they wanted to go back to log files, I'd laugh in their faces.

  12. JoeySter

    This isn't news. This is old skool. The reason people moved on from these methods is that they're not the easiest or most reliable methods.

    They are, however, often illegal if you do anything more than analytics. There are millions of data breaches each day from tracking based on identifiers such as IP addresses.

    For example, I live in a household with shared internet: one modem with one IP address provides internet to many devices. I buy a t-shirt at home on my desktop; my flatmate then goes to a website and sees adverts for the same brand of t-shirt because he has the same IP address. That's a data breach.

  13. bpfh

    Old school, but in an ad-blocked world...

    What your server served is a good measure of what is happening, but analysing that data, while it can be fun, is way outside the comfort zone of a lot of agencies and smaller sites, who may not even know that these logs exist. It's easier to plop in a JS script and get that data from Google (who then know everything about your business, especially if they decide that they want to move into your space...), or even just have random content served via Google via GTM, because the teams that "own" the site are not the teams that design and operate it, and marketing don't talk to the techies...
