Eyes bigger than stomachs (as my mum used to say)
If the security services want to store every GET request generated, have they actually worked out the logistics?
When I last worked in schools, back in 2006/7, we had a proxy server that kept logs of the type required: 1,500 users generated a 250MByte log file daily, in round numbers. Bearing this in mind, let's do some working out.
The population of the UK in 2015 is 64 million, in round numbers. If 95% of the population uses the internet, that gives a figure of 60.8 million users.
Using the log figure I mentioned above, a daily file of (60.8*10^6) * (250*10^6)/(1.5*10^3) bytes is generated, which comes to roughly 10.1 TBytes per day, or around 3,700 TBytes (3.7 PBytes) over the year.
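For anyone who wants to check the working, here's a rough back-of-envelope sketch in Python. The inputs are just my school figures and the round-number population above, nothing official:

```python
# Back-of-envelope estimate of nationwide proxy-log volume.
users_sampled = 1_500                  # users behind the school proxy
log_bytes_per_day = 250e6              # ~250 MBytes of logs per day for those users

uk_population = 64e6                   # 2015 population, round numbers
internet_users = uk_population * 0.95  # assume 95% are online -> 60.8 million

bytes_per_user_per_day = log_bytes_per_day / users_sampled
daily_total = internet_users * bytes_per_user_per_day
yearly_total = daily_total * 365

print(f"Daily:  {daily_total / 1e12:.1f} TBytes")   # ~10.1 TBytes
print(f"Yearly: {yearly_total / 1e15:.2f} PBytes")  # ~3.7 PBytes
```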
From experience, searching the 250MByte log files for details of sites that pupils had visited was a pain, & could easily take half a day.
Storage-wise, this may not be a problem, but searching such an amount of data is going to be a headache, & somehow I don't think log parser is going to be much use. Then there is the problem of getting all the log files into the same format.
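To give a flavour of what a brute-force search looks like, here's a minimal sketch, assuming a hypothetical space-delimited proxy log of the form timestamp, client IP, method, URL (the field layout is my invention, real proxy formats all differ, which is rather the point):

```python
# Brute-force scan of a proxy log for URLs containing a search term.
import sys

def search_log(path, needle):
    """Yield (timestamp, client_ip, url) for every request whose URL contains `needle`."""
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            fields = line.split()
            if len(fields) < 4:
                continue  # skip malformed or differently-formatted lines
            timestamp, client_ip, _method, url = fields[:4]
            if needle in url:
                yield timestamp, client_ip, url

if __name__ == "__main__":
    # usage: python search_log.py access.log example.com
    for hit in search_log(sys.argv[1], sys.argv[2]):
        print(*hit)
```

Even at a sequential read speed of 100 MBytes/s, scanning one day's worth (roughly 10 TBytes) like this would take something like 28 hours per query, before you even start worrying about the differing formats.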
One final thing: take my internet usage. Sometimes I tether my computer to my mobile phone, other times I use the local library & connect to the citywide network (best of luck unraveling that), & then sometimes I use family members' internet connections. So putting together a comprehensive browsing history for me could be a nightmare :)