Server-side analytics have been around forever. If asked for an analysis tool twenty-something years ago my first suggestion would've been Perl, and it's still a good answer today. People make it difficult for themselves because they can't (or won't) grasp HTTP's statelessness, and go to ever more absurd and futile lengths to fight against it.
JavaScript tracking punks given a thrashing by good old-fashioned server log analytics
Netlify this week whipped the covers off its take on dealing with the rise of ad blockers in analytics – do it on the server. With many analytics tools requiring tracking pixels, cookies or JavaScript in the webpage, the arms race in blocking the things – either unintentionally through ad blockers or intentionally because of …
COMMENTS
-
Thursday 11th July 2019 00:15 GMT Drew 11
First Google killed off server-side analytics with their "free" service. Then you had to open a Google Webmaster account to access the data. Then they started telling you what your website should look like and have on it in order to rank high. And here we are now with most websites crammed full of shitty JavaScript and visual wankery, along with "Google Tag Manager", "Google Fonts" etc etc as yet more ways to slurp users' data.
Jesus H Christ on a bike.
Returning to server-side analytics would be the first step in loosening Google's grip on everyone's throats (users AND webmasters).
-
-
Thursday 11th July 2019 08:23 GMT Warm Braw
What people want (at least some of them) is to know how much of their advertising spend in campaign X resulted in additional sales. That's quite hard to do simply with server log files since you need to correlate information from several sources. Of course, third parties could provide you with their log files that relate to your visitors and you could merge the two - but that would be no better from a privacy point of view than doing it all in the browser.
Personally, I think it was a great mistake to allow content in web pages from a third-party source - it's a privacy and security nightmare - and there should come a point in the future when it's disabled by default. After all, it's historically been assumed that your advertising spend is a bit of a lottery.
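Warm Braw's first paragraph is the crux: the access log alone doesn't tie ad spend to sales, so you have to join hits together yourself. A minimal Python sketch of that join, assuming a hypothetical setup where campaign landings carry a utm_campaign parameter and the server logs a first-party session id; the sid field, the /checkout/complete path and the log layout are all invented for illustration:

```python
# Hypothetical attribution pass over a single server's own access log.
# Assumes lines like: "GET /landing?utm_campaign=x HTTP/1.1" ... sid=abc123
# Lines without a logged session id are skipped entirely.
import re
from collections import defaultdict
from urllib.parse import urlparse, parse_qs

HIT = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+".*?sid=(?P<sid>\w+)')

campaign_of = {}             # session id -> campaign that first brought it in
sales = defaultdict(int)     # campaign -> completed checkouts

with open("access.log") as log:
    for line in log:
        m = HIT.search(line)
        if not m:
            continue
        path, sid = m.group("path"), m.group("sid")
        query = parse_qs(urlparse(path).query)
        if "utm_campaign" in query:            # tagged landing from the ad
            campaign_of.setdefault(sid, query["utm_campaign"][0])
        elif urlparse(path).path == "/checkout/complete":
            sales[campaign_of.get(sid, "organic")] += 1

for campaign, count in sorted(sales.items()):
    print(f"{campaign}: {count} sales")
```

This only works at all because the landing hit and the checkout hit land in the same first-party log; the moment the conversion happens on someone else's site, you're back to the multi-source correlation problem the comment describes.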
-
-
Thursday 11th July 2019 07:35 GMT depicus
I gave up on Google Analytics years ago and search my own logs for errors instead, mainly through Logwatch, which sends a nice list of errors and not-founds. That lets me fix my personal site if a file is missing, or spot the more likely case: a rise in scanners looking for haxkers.php, which I can then add to the redirect list that sends them off to the FBI's web site.
Sadly, marketing departments have more control over most companies' web sites, and pretty pictures are more important than functionality.
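The 404-mining depicus describes needs very little code even without Logwatch. A rough Python sketch, assuming a standard combined log format; the log path and the "more than five hits" threshold are illustrative, not from the comment:

```python
import re
from collections import Counter

# Matches the front of a combined-format access log line.
COMBINED = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" (?P<status>\d{3})'
)

not_found = Counter()
with open("/var/log/nginx/access.log") as log:
    for line in log:
        m = COMBINED.match(line)
        if m and m.group("status") == "404":
            not_found[m.group("path")] += 1

# Paths hammered repeatedly are usually scanners (think haxkers.php)
# and are candidates for the redirect or block list.
for path, hits in not_found.most_common():
    if hits > 5:
        print(f"{hits:4d}  {path}")
```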
-
Thursday 11th July 2019 11:02 GMT Anonymous Coward
I'm just going to say
If you are a massive company with your own IT, marketing and sales departments *and* the internet is a major sales channel (so not the likes of Dassault or BAe), then you probably have some justification for using advanced so-called "analytics" tools, including client-side JavaScript and such. I can accept that.
What irks me is when every sentient entity with a web site + their pet insists on piling Google's spyware onto it. They are not going to use it; even if they use it, they are not going to understand it; and even if they understand it, they just don't have the resources to do anything effective about it. The only one who benefits from this is Google. They just create extra work for you so that you end up spying on your website visitors on their behalf.
-
Thursday 11th July 2019 11:03 GMT Ben Tasker
But the likes of Matomo are heavy on the features and costly. While Matomo starts at around $59/month for its "Essential" package and 300,000 page views monthly, Netlify is $9/month for 250,000 page views a month.
Have I missed something?
Matomo is OSS and free.
They do have premium packages and whatnot, but the free version will parse your logs quite happily all the same. In fact I've used it to do just that (though not any more).
-
Thursday 11th July 2019 12:44 GMT Rich 11
use all those log files generated on the web server itself. After all, unless you have more nefarious aims in mind, that information should be all developers need to keep things humming along.
I did that for ten years, then a dozen years ago Marketing decided they wanted prettier graphs so it was all dropped in favour of Google Analytics. If they came back to me now and said they wanted to go back to log files I'd laugh in their faces.
-
Thursday 11th July 2019 13:33 GMT JoeySter
This isn't news. This is old skool. The reason people move on from these methods is that they're not the easiest or most reliable.
They are, however, often illegal if you do anything more than analytics. There are millions of data breaches each day from tracking based on identifiers such as IP addresses.
For example, I live in a household that has internet. One modem with one IP address provides internet to many devices. I buy a t-shirt at home on my desktop. My flatmate then goes to a website and sees adverts for the same brand of t-shirt because he has the same IP address. That's a data breach.
-
Thursday 11th July 2019 15:24 GMT bpfh
Old school but in an ad-blocked world...
What your server served is a good measure of what is happening, but analysing that data, fun as it can be, is way outside the comfort zone of a lot of agencies and smaller sites, who may not even know that these logs exist. It's easier to plop in a JS script and get that data from Google (who then know everything about your business, especially if they decide they want to move into your space...), or even just have random content served via Google via GTM, because the teams that "own" the site are not the teams that design and operate it, and marketing don't talk to the techies...
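For the smaller sites bpfh mentions, the distance from "these logs exist" to useful numbers is shorter than it looks. A minimal sketch (Python here, though the Perl suggested at the top of the thread would do just as well) of a top-pages report from a combined-format log; the file name, the asset filter and the crude bot filter are all assumptions:

```python
import re
from collections import Counter

# Pulls path, status and user agent out of a combined-format log line.
COMBINED = re.compile(
    r'"GET (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

SKIP = (".css", ".js", ".png", ".jpg", ".ico")   # static assets, not pages
pages = Counter()

with open("access.log") as log:
    for line in log:
        m = COMBINED.search(line)
        if not m:
            continue
        path, status, agent = m.groups()
        # Count only successful page views from things that don't call
        # themselves a bot; real bot filtering is of course harder.
        if status != "200" or path.endswith(SKIP) or "bot" in agent.lower():
            continue
        pages[path.split("?")[0]] += 1

for path, hits in pages.most_common(10):
    print(f"{hits:6d}  {path}")
```

No cookies, no pixels, no third parties: everything it reads was already sitting on the server.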