Bork!
Bork bork, bork bork bork bork.
Paris, bork bork bork bork bork.
Google's famously stark home page was a happy accident. It was designed by company co-founder Sergey Brin, and as legend has it, stark was his only option. "We didn't have a web master," Brin says. "And I don't do html." But nearly ten years on, the world's largest search engine still believes in that no-frills landing page, …
> As you all know, an ordinary Google search gives you just ten
> results. You can't get more without loading another page.
An ordinary search for me returns 100 results. Perhaps someone needs to look a bit deeper in the preferences page beyond setting the language to "Bork, bork, bork!"
Argh, the Register comments are just awful: geeks explaining that how *they* do it works far better than anything a billion-dollar company understands.
Most people (read: the vast, vast majority of people) never go into the Google preferences page. They search, find their result and go. So yes, 'power users' can tweak all the way to 100 results per page, but she was explaining why the default is 10, and will stay at 10.
My Google page just keeps loading more results as I scroll down. Hence no increased latency before the first results appear.
It's not the default, mind you; it's an option in the "customise google" add-on for Firefox called "Stream search result pages". But I really don't see how arranging searches like this could be a bad thing overall.
"After increasing the number of first page results to 30 for a group of guinea pigs, Google watched as the number of searches dropped by 20 per cent. 'It turns out that it takes us longer to produce 30 results per page,' Mayer explained. 'And that latency drove the decline.'"
Yes, that must be it. It couldn't be that with 30 results per page instead of 10, the user found what they needed without having to go to the third page (for the patient) or without modifying the search (hence running a new search) when they didn't find a suitable result on the first page. No, that couldn't be it. It must be because of the latency.
Am I the only one hearing a voice saying "Shut up with the BS and have the balls to speak the truth -- fewer results per page means more pages loaded, which in turn equals more advertising displayed"?
"more pages loaded, which in turn equals more advertising displayed"
My first thought as well. There are quite a number of online magazines that do the same thing - offering tiny bits of the article surrounded by ads and you have to click through to next bit and on and on. When I see that I just leave the site - permanently.
"Shut up with the BS and have the balls to speak the truth"
Or offer a BS translation page.
Bork Bork
Sorry Alastair, but I can't see clicking on a 'Preferences' link that is right next to the field where you enter your search query as some sort of 'deep throat' hacking activity. I wonder what percentage of searches return more than 10 results - I bet Google knows!
As for why 10 is the default: well, obviously returning 100 results uses almost 10x the computing power and bandwidth (adding to Google's costs); it increases latency, which may discourage some usage (although I don't personally notice any qualitative difference); but most importantly it reduces the number of 'sponsored links' displayed, which seem to be capped at 8 per page no matter how many results you choose to display (reducing Google's revenue).
I've no complaint to make about this - it's Google's train set and they can play with it in whatever way they like. (As long as they're not 'evil', of course :)
Every field interprets Occam's Razor slightly differently. Edinburgh Uni's AI dept taught it thusly:
Always work on the simplest possible hypothesis that fits the observed data.
When you find a case that doesn't fit the hypothesis, then (and only then) revise (and potentially complicate) the hypothesis.
It's likely that there will be more than one "simplest" hypothesis that fits the observed data -- these form the boundary of a "search space" of possible hypotheses. Occam's Razor then "shaves" this search space down until a near-optimal hypothesis is found.
In this way, we avoid processing overly complex formulae in favour of simpler ones. This allows us to iterate through the first few generations of learning systems quite quickly.
Eventually the theory may become too complicated and you just say "close enough". I don't know whether Mr of Occam would approve of this final step.
(off the top of my head)
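The iterate-and-revise procedure described above can be sketched in code. A minimal toy illustration (my own example, not anything from those lecture notes): learn a 1-D interval hypothesis from labelled points, starting with the simplest hypothesis that fits and widening it only when an observation contradicts it.

```python
def fit_interval(examples):
    """Learn the tightest interval [lo, hi] covering all positive points.

    Start with the simplest hypothesis (a single point) and revise it
    only when a positive example falls outside it: Occam's Razor as an
    incremental learning loop.
    """
    lo = hi = None
    for x, is_positive in examples:
        if not is_positive:
            continue                 # negatives never force a revision here
        if lo is None:
            lo = hi = x              # first positive: simplest covering hypothesis
        elif x < lo:
            lo = x                   # observation doesn't fit: widen minimally
        elif x > hi:
            hi = x
    return lo, hi


# Points labelled positive/negative; the learner only generalises when forced.
data = [(2, True), (5, True), (9, False), (3, True), (7, True)]
print(fit_interval(data))  # (2, 7)
```

The key Occam property is that the hypothesis is never made more general than the data demands; each revision is the smallest change that restores consistency.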
entia non sunt multiplicanda praeter necessitatem
Colloquialish translation: things should not be unnecessarily complex. More properly "things should not be multiplied more than is necessary" - so a five-bladed razor is probably right out.
Whether William of Ockham actually said this is open to debate: it's not in any of his writings.
I agree, that was my first impression. Also, regarding this gem:
>>"Then she turned to Google Maps. When this AJAX-ified service originally launched, the maps were significantly larger. But Mayer and company soon realized that users were more likely to load smaller images. 'When we reduced the size of the page by 30 per cent, the number of map requests increased by 30 per cent.'"
So, it couldn't possibly be that because the window is smaller, the user has to make more requests in order to move around the viewport and find what he is looking for. Of course not.
-dZ.
The article states:
After increasing the number of first page results to 30 for a group of guinea pigs, Google watched as the number of searches dropped by 20 per cent. "It turns out that it takes us longer to produce 30 results per page," Mayer explained. "And that latency drove the decline."
Um, I hope she has better evidence... I'd say people just found what they wanted in fewer page loads. The extra load time for 30 results versus 10 is negligible.