Meanwhile, in California
Google is trying to crowd-source its blacklist. However, I think they're being too polite about it. They've done a lot of work to make mouse-over previews work ("work" isn't exactly the right word), but how much more would it take to put a red X-shaped button to the left of each result that, for your current search session, makes that domain or page disappear? I'd love it if I didn't have to keep adding "-site:xxxxxxx" terms to the search bar to get rid of the crap. Seriously, if enough users did this to enough sites, would there be any reason for Google to continue to index them?
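For the curious, here's a rough sketch of the workaround I mean: keep a personal exclusion list and bolt it onto every query. The google.com/search URL format is real, but the blocklist contents and the function name are my own made-up examples.

```python
from urllib.parse import quote_plus

# Domains I never want to see again -- purely illustrative examples.
BLOCKLIST = ["example-content-farm.com", "example-scraper.net"]

def build_query_url(query: str) -> str:
    """Append a -site: exclusion for every blocked domain to the query."""
    exclusions = " ".join(f"-site:{domain}" for domain in BLOCKLIST)
    return "https://www.google.com/search?q=" + quote_plus(f"{query} {exclusions}")

print(build_query_url("best laptop reviews"))
# https://www.google.com/search?q=best+laptop+reviews+-site%3Aexample-content-farm.com+-site%3Aexample-scraper.net
```

Typing that by hand for every search is exactly the tedium a per-session X button would eliminate.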
That's the real question. Does Google HAVE to index a site just because it's on the Internet? And does it have to index a scraper site whose only real content is links to the underlying sites it scraped?
If Google blacklisted the handful of domains where these vertical search sites live, it would not have to tweak or expose its ranking algorithm, since NONE of those sites' data would ever reach that algorithm.
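To make that concrete, here's a toy sketch of what I mean by filtering before ranking: a blacklist check at crawl time, so blocked domains never reach the index at all. Everything here (function names, domains) is hypothetical and obviously not Google's actual pipeline.

```python
from urllib.parse import urlparse

# Hypothetical domain blacklist -- checked before anything gets indexed.
BLACKLISTED_DOMAINS = {"example-content-farm.com", "example-scraper.net"}

def should_index(url: str) -> bool:
    """Drop blacklisted domains at crawl time, before ranking ever sees them."""
    host = urlparse(url).hostname or ""
    # Match the domain itself and any subdomain of it.
    return not any(host == d or host.endswith("." + d) for d in BLACKLISTED_DOMAINS)

crawled = [
    "https://example.org/real-content",
    "https://www.example-scraper.net/copied-page",
]
index_queue = [u for u in crawled if should_index(u)]
print(index_queue)  # ['https://example.org/real-content']
```

The point is that the ranking algorithm never has to be touched; the junk is simply gone before ranking starts.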
While I'm on a roll, what's to prevent Google from buying Websense? Forget controlling companies' outbound internet access; they'd be controlling the world's internet access. So long as they leave all of the categories available to be checked or unchecked by the user/parent/[entity paying for the internet connection], I don't care how they rate sites. Just give me some well-defined category buttons to eliminate shit from their search results.
OK, I'm done. Thanks for reading.