surely it is obvious
Surely it is obvious that this is a really inefficient way of doing this. AVG should just run a central indexing server that does the scanning, and have all the clients running AVG connect to it to check whether search result URLs are flagged as problematic. That way a site only gets hit once in a while, instead of once per user per search.
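To make the suggestion concrete, here is a minimal sketch of that centralised design, with the server stubbed out as a local function (the hash set, endpoint, and URL are all hypothetical, not anything AVG actually ships): the server scans sites once and keeps a set of hashes of flagged URLs; each client just hashes a URL, asks the server, and caches the verdict, so the site itself is never fetched by clients at all.

```python
import hashlib

# Hypothetical central flag list: the server scans sites itself and
# stores hashes of URLs it found to be problematic.
FLAGGED_HASHES = {
    hashlib.sha256(b"http://bad.example/payload").hexdigest(),
}

def server_is_flagged(url_hash: str) -> bool:
    """Stand-in for the central indexing server's lookup endpoint.
    In a real deployment this would be a network call, not a set lookup."""
    return url_hash in FLAGGED_HASHES

def client_check(url: str, cache: dict) -> bool:
    """Client-side check: hash the URL, ask the central server once,
    and cache the verdict so repeated searches cost nothing extra."""
    h = hashlib.sha256(url.encode()).hexdigest()
    if h not in cache:
        cache[h] = server_is_flagged(h)
    return cache[h]

cache = {}
print(client_check("http://bad.example/payload", cache))  # True  (flagged)
print(client_check("http://good.example/", cache))        # False (clean)
```

The key point of the design is that only the central server ever touches the sites being checked; clients exchange nothing but hashes, which also avoids leaking users' full search results to the server.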
Maybe there is some flaw in this that I haven't thought of, but if it isn't practical, AVG should just scrap the whole thing, as it really is irresponsible.