Google tweaks search results with mystery site speedometer

Google is now using site speed - "how quickly a site responds to web requests" - as part of its criteria for ranking links on its world-dominating search engine. In a December blog post, Google indicated that such a change was in the cards, and the company formally announced it on Friday with another post …


This topic is closed for new posts.
  1. This post has been deleted by its author

  2. Anonymous Coward

    this is a title

    "What about sites that post lots of photos on their pages or use complex services that take longer to load? What about all the sites that use advertisement[s]?"

    fuck 'em

    "Adding a performance requirement to the web as we know it stands it on its head."

    ... yeah... if by "requirement" you mean "vague incentive" and if by "the web" you mean "one single website" and if by "stands it on its head" you mean "does practically nothing" then you're not far off the mark.

    1. Tom Samplonius


      @AC: "What about sites that post lots of photos ...?"

      This doesn't matter. What most people do not realize is that browsers do not fetch complete pages, but individual files. Google only fetches the HTML file, and never fetches the images.

      @jeanl: "I prefer Google invest some valuable time to invent the wheel to warn people which searched links are potential dangerous"

      Google already does, except they just bury dangerous sites. The GoogleBot detects all sorts of dangerous sites and doesn't even index them.

      @Camilla Smythe: "Wonder what Phorms DPI multiple re-direct cookie spooging bucket of vomit does to site load times?"

      Nothing that Google can see. Phorm is deployed close to the users, not at the hosting servers. Google is peered with most large hosting networks. And your crappy Phorm-using ISP probably just wishes it knew how to host content.

      @Da Weezil: "Google is becoming less and less useful as time goes by. First and foremost - I want my results relevant to my search term, not corrupted by paid placings or speed ratings"

      As someone who has spent time trying to climb the search ranks... there is too much relevance right now. There are thousands and thousands of pages that match each common search term, so there needs to be a tie-breaker, or something to give a better site an edge over thousands of others. And as for the paid links, their prominence hasn't changed, so "less useful as time goes by" is simply false. Besides, it is freetard thinking to believe a multibillion-dollar infrastructure can be supported without making any money. I guess by anyone except MS, but that is because all of your Windows and Office revenue is shoveled into Live Search. Oops, it is called Bing now. Maybe it will make money someday.

      1. Camilla Smythe

        Oooooooooh Get You

        Signed up for an El Reg Commentard Account Have We?

        Or are we just busting the supplied system to get our post up at the 'top'?


        "Wonder what Phorms DPI multiple re-direct cookie spooging bucket of vomit does to site load times?"


        "Nothing that Google can see. Phorm is deployed close to the users, not at the hosting servers."

        El Reg

        "Presumably, Google will measure site speed - at least in part - via Google Toolbar, the browser add-on installed on user machines across the globe."


        "As someone who has spent time trying to climb the search ranks...."


        I might have zero credibility[1] but you have a seriously small inverted penis.

        [1]Old days but D-Moz 2 days after the request. Page rank 4[2]

        [2]Oh, that will be zero credibility still.

        You want to get with the picture some time. The new SEO is PhormOIX SEO. I hope you get in tune with your new flavour of snake oil because it is not happening any time on my watch.

        Hand, Fail.... Kent will need quality burger flippers sometime soon. Whatever.

      2. Anonymous Coward

        You just don't get it

        @Tom: The idea of comments is to link them to the posts, not list them all together at the top of the page, BEFORE the originals.

        "As someone who has spent time trying to climb the search ranks..." If your attempts to do this are as good as your commenting, then I'm not surprised you're failing.

    2. Anonymous Coward


      ""What about sites that post lots of photos on their pages or use complex services that take longer to load? What about all the sites that use advertisement[s]?"

      fuck 'em"

      Does that include sites like el reg?

  3. Da Weezil

    Worthless Google

    Google is becoming less and less useful as time goes by. First and foremost - I want my results relevant to my search term, not corrupted by paid placings or speed ratings, the most important thing to me when I search is RELEVANCE. Has Google forgotten that word?

    Too many sites I visit take time to load the Google stuff... the good thing with Opera is that with the right settings you can see the various bits of a site being called... it's usually the Google crud that makes it wait. Another example of the web being screwed up by one company with too much influence.

  4. Anonymous Coward


    "Web design as currently practiced is hereby DEAD. Flash becomes poison..." - yes, sounds good to me?

  5. Camilla Smythe

    Phorm Anyone?

    Wonder what Phorms DPI multiple re-direct cookie spooging bucket of vomit does to site load times?

  6. Bruno Girin

    More "signal"?

    Surely there's a point where combining too many signals just results in a lot of noise?

  7. Anonymous Coward

    You mean what you should have been doing all along?

    "Web design as currently practiced is hereby DEAD. Flash becomes poison - lots of funny little blank pictures to build up a page's appearance will ensure no one ever sees it."

    Web design as currently practiced is only dead if the way you practice web design is shit. You shouldn't need to change your design to load faster because it affects your Google rankings now.

    No, if you were a competent webmaster you would have been doing that all along because it affects YOUR USERS.

    1. Anonymous Coward

      Well said that man

      Well said. Too many so-called "web designers" are nothing more than graphic artists with little or no knowledge of designing properly for the web, taking into account accessibility and usability.

      1. Steve Roper
        Thumb Up


        As a long-time web developer myself my primary emphasis has always been on speed. If a home page doesn't load in less than TEN SECONDS on an average connection it's too slow. So my policy is to avoid the use of Flash, animated GIFs or excessive graphics on the home page. Too many sites have pretty animated Flash header bars and inline objects that do nothing for functionality and just waste the user's time. And any web developer who knows anything about Nielsen stats and the Top Ten Elements of Bad Web Design would have known about the 10-second home page limit for years now.

        I'm glad to see Google finally take this into account when rating websites. Speed IS important, and it's gratifying to finally see it'll be reflected in search results.

  8. Nick Stallman

    Not complaining

    You won't see me complaining about this.

    Faster sites can only be good, and it's only a weak 'signal' anyway.

  9. jeanl

    waste of investment

    Instead of doing that, I prefer Google invest some valuable time to invent the wheel to warn people which searched links are potentially dangerous attack sites, with a reddish highlight or icon attached to them.

    1. Martin Smith 2

      They already do!

      @jeanl They already do that. They have links in the search results saying "This site may harm your computer"

  10. Pablo

    I'm for it.

    Maybe this will get webmasters to finally give some thought to bandwidth instead of cramming all kinds of junk into their layouts.

  11. jim 45
    Thumb Down


    This has nothing to do with search relevance - it's all about Google providing a "great user experience", i.e. self-promotion. The idea is to keep users on the Google page longer, clicking more search links, and hopefully looking at more ads.

  12. da_fish27
    Thumb Up

    +1 on this google.

    As long as it is done sensibly, and isn't that important a factor, I think it's a good idea.

    I just checked my page in google's webmaster labs -> site performance and the advice it gives is very useful, I really felt stupid when I saw that I was loading the same file from two different URLs.

    I think this will make webmasters at least think about their page loading speed.

    Funny thing though, Google suggested I remove Google Friendconnect; according to them I should minimize DNS lookups :D

  13. Herby

    They need to compare loads for browsers too!

    Many sites have different pages for IE and FF. If they were REALLY smart, the longer load times for the IE versions would show up as well.

    Maybe a comment on the results page that says "This page takes longer to load in IE". It might help show that IE really isn't that good a browser.

    Of course the flash garbage (and time wasting Javascript) being excised would help everyone. It might even lead to better designs! We can only hope!

    1. Gary Turner

      Er, um

      "Many sites have different pages for IE and FF."

      That would indicate less than competent developers. Hacks and work-arounds for IE<8 are simple and unobtrusive. There is no need, whatsoever, for having a separate page for each browser.

  14. Steven Knox
    Thumb Up


    "Web design as currently practiced is hereby DEAD. Flash becomes poison - lots of funny little blank pictures to build up a page's appearance will ensure no one ever sees it."

    Flash has been poison for quite a while -- and lots of funny little blanks have always ensured I don't see a page more than once. Good CSS and HTML with a few strategic images can do all the styling any web page needs.

    Frankly if this takes even 1% of the bloated, high-on-gimmick, low-on-content sites off the web, I'll consider it a roaring success.

  15. Anonymous Coward

    Google Spider?

    Send a raw text blast.

    I'm pretty sure lots of sites already do something similar.

  16. drfreak

    Jobs would love this quote:

    "Web design as currently practiced is hereby DEAD. Flash becomes poison - lots of funny little blank pictures to build up a page's appearance will ensure no one ever sees it."

    A-men! Especially the last sentence.

  17. Solomon Grundy

    Tools? Where are They?

    I see a lot of potential positive developments with this policy. However, as policy makers know, you have to provide empirical evidence of the success/failure of any policy. At present the 'Google-centric' tools available simply do not provide that evidence.

    Another concern is that this is a lead-in to Google hosting services/hosting partners. If it gets to the point where ads are run saying "Hosting Provider 'X' makes your site get better search results" then it has gone too far. There are too many opportunities to exploit this new policy.

  18. skeptical i
    Thumb Up

    As one of the last dialup users on the planet ...

    ... I welcome any effort to get webmeisters to actually READ their code and edit it with a chainsaw, if need be. Granted, sites for videographers will of course be loaded with "sample" product, but too many websites contain pipe-clogging stuff that is simply irrelevant crap. Site "designers" could at least have thumbnail GIFs linking to the bloat ("click 'here' to see a video of poodles dancing the macarena at our vendor booth" or whatever) so those with no time or bandwidth for the eyecandy can give it a miss.

    1. Anonymous Coward

      You'd Like My Code, Then

      If you stick with the basics, then you never need to go back to them.

      I've been having designers sneer at my high-ranked sites for quite a while...

      Mine's the Formula One jacket...

  19. JohnG

    Google's choice

    If users find that Google's search results are not relevant and other engines provide more relevant results, then they will undoubtedly switch and Google would fall by the wayside like so many previous search engine champions. It's Google's choice to select what gives them the best balance between results that users want to see and results that generate revenue.

    Personally, I'd be happy to see faster web sites with less unnecessary active content. I particularly don't understand why the B2B sections of some corporate websites are choked with so many animations, videos and the like that it takes ages to actually get to where you can select products. Have they considered that some of their customers might be stuck with a corporate desktop image using old versions of browsers, Java and Flash?

  20. Anonymous Coward


    Google recommends this to improve my site performance....

    Minimize DNS lookups

    The domains of the following URLs only serve one resource each. If possible, avoid the extra DNS lookups by serving these resources from existing domains:

    * Go to URL

  21. Damien Thorn

    backward tech

    With modern load balancing, faster servers, more appropriate request handling in the core software, and larger pipes on hosting, this is unlikely to affect many websites.

    However, what you will start to see is robots.txt files disallowing the Googlebot, and sites with tiny landing pages, designed to be merely 1 KB in size, welcoming users to the site.





    <body>Welcome, click <a href="<?php echo $load_bal_url; ?>">here</a> to enter the slow site</body>


    simple really.

    1. Allan George Dyer
      Paris Hilton

      You're wrong...

      The tiny landing page has little (no) indexable content, so it won't show up for relevant searches. Anyone who wants their pages to show up in Google searches is not going to tell the googlebot to go away.

      I can only see this improving the value of google searches to me, I want relevant information quickly, not relevant info surrounded by masses of slow junk.

  22. Graham Marsden

    The anti-slashdot effect...?

    The slashdot effect makes sites slow to respond because of the number of people visiting them.

    So will this new nonsense from google mean that such sites will suddenly disappear off the search listings because they're too popular...???

  23. heyrick Silver badge

    Heading towards a two-tier Internet?

    Those who can afford the fastest servers on the biggest fibre optic backbones... and those who can't.

    The deep irony is most of the stuff I search for is found on the second, for the big flashy expensive services are, more often than not, style over substance.

    As has already been pointed out, I care not a damn for speed. My web is plenty fast enough (even with a lowly one megabit) now I've killed all of the advertising. What matters, and what matters A LOT is relevance.

    If Google stops returning relevant results, I will stop using Google. It's a pretty easy equation. And I am sure the makers of my OS would be more than happy to suggest the equally ridiculously named "Bing". I search for answers, not to cum in my pants over blinding speed. I bet half the delay is my own computer dealing with the fancy scripting and all those separate requests for the inclusions, style sheets, logos, photos, blah-de-blah.

    Speed isn't everything. Relevance is.

    1. Anonymous Coward


      It seems to me a clever way to promote cloud services, which I suspect Google are either selling or have available integrated with some ad system...

  24. James 100

    Flash in the bin

    I suspect this is really aimed at the handful of search results I've found which take truly ridiculous times to load (30+ seconds) or time out entirely, rather than whether your site takes 1 second or 3 to load. When I'm designing sites or pages, I do put effort into making the page efficient; when viewing them, I get irritated by slow load times, particularly when it's due to inefficient design.

    Yesterday, I happened to visit Ebuyer's site - then wait for it to crawl down an ADSL connection. Checking the source made the reasons obvious: the CSS alone was spread across at least NINE different files on the same host, with bits commented out as well. That was as far as idle curiosity about their appalling load time took me, but no doubt the rest of the structure was as bad.

    Oh, and of course a lot of "buttons" on their site were actually Flash objects for some bizarre reason, when images would have achieved the same much more efficiently - nice and obvious for me, thanks to click2flash.

    As a prospective customer, that lousy site experience WOULD make me choose Scan or Dabs instead if possible - it's only logical for Google to apply the same tie-breaker. If I'm searching for a product all three carry, the faster sites *are* the better results!

    Cruft like Flash and bloated fragmented objects (CSS/JS) cannot die quickly enough - and if Google and Apple can apply a bit more pressure in this direction, good. I might not like everything either company does, but this particular change is a very good one for their users and for efficient web designers as well.

    1. M Gale

      Yes and no.

      Everything you've said so far is spot on. However, "Cruft like Flash and bloated fragmented objects (CSS/JS) cannot die quickly enough."

      I'd much rather a site have all its styling in a nice neat little .css file that my proxy can cache and not bother downloading again for the rest of the day.

      Anyway, what's it to Google if a site loads slow? If they want to be useful, rather than putting a site that may be good-but-slow right down in the search rankings, why can't a five or ten-star speed ranking appear next to the site name?

  25. Mike Cardwell
    Thumb Up


    Those of us who actually know what the hell we're doing and spend time making our websites respond quickly and perform well will benefit from this. Screw everyone else.

  26. Anonymous Coward

    meaningless until...

    they take the average load time as determined by their Googlebots, divide it into the average load time as determined by users not identified as Googlebots, multiply this by 10, and divide the page rank by the result.

    I run Adblock and NoScript, so I don't get most ads, and get no Flash/JavaScript... except for the VERY FEW sites that have their own domain names pointing at advertisers' IPs. Of course, once I realized what they were doing, I just modified Adblock to block their advertising subdomain.
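    As a rough sketch, that proposed penalty (with hypothetical numbers and a made-up function name) works out like this:

```python
def adjusted_rank(page_rank, bot_load_time, user_load_time):
    # "Divided into": the user-measured load time divided by the
    # Googlebot-measured load time, then multiplied by 10; the page
    # rank is divided by the result, punishing slow real-world pages.
    ratio = user_load_time / bot_load_time
    return page_rank / (ratio * 10)

# A site that loads for real users exactly as fast as it does for the
# bot still gets its rank divided by 10 under this scheme.
print(adjusted_rank(5.0, 0.4, 0.4))  # 0.5
```

    Note the scheme is harsh by design: even a perfectly honest site loses a factor of ten of rank, so only the relative ordering would survive.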

  27. gav_taylor

    shoot themselves in the foot

    I'm not surprised by this announcement; they have been hinting at it for months.

    I can see this causing a lot of people to drop Google Analytics though, as the most common problem reported by Google Site Performance in Webmaster Tools is with analytics tracking.

    If sites start dropping down in SERPs and the only issues reported in Webmaster Tools are with analytics, they will see a lot of people removing it...

    1. Anonymous Coward


      Same here - that and Amazon. Associate income is hardly worth bothering with these days anyway, so the final coffin nail, I reckon. Google's own ads, CSS and various gubbins I let them serve in relation to gsearch etc don't seem to figure in the DNS nags, however; strange they've factored in these and not urchin.

      A little bit evil, naughty anyway, keeping it all in Labs while they gathered the benchmarks over the past few months.

  28. Neil Stansbury
    Thumb Up

    "They obviously load slower than a plain HTML site"

    "They obviously load slower than a plain HTML site"...

    Hmm there's a webmaster who doesn't get SEO

    Search engines have only ever loaded HTML - CSS, JS and any other MIME types are completely ignored. There is no semantic content in a CSS or JS file, and any semantic content in an image should be asserted by the HTML page calling it.

    I would guess any speed analysis will be computed on the request response time, and possibly the load time. I think this is quite reasonable, and probably a good idea.

    1. da_fish27
      Thumb Down

      @Neil Stansbury : You and a few others are wrong

      Yes, search engines only load HTML, but Google would be retarded to measure speed using this.

      And they don't. If they did, it wouldn't stop Flash at all, since the HTML for embedding a Flash animation on the page is very small.

      Moreover, if you try the Google webmaster tools, they show suggestions about much more than the HTML, which accounts for only a very small fraction of the page load time nowadays anyway.

  29. Anonymous Coward

    Suggestion: Google should auto-demote web-scrapers

    There are plenty of prominent Google results for (primarily Chinese-based) web scrapers. These web scraper sites basically perform wholesale copying of entire content from various forums and host it again with their own advertising. Google could easily cross-check the hits, figure out which is the original, and demote the copycats to page 20.

    For those suffering this outrage, add the following 'code snippet' to your website: "June 4th incident, Tiananmen Square, Free Tibet, Free Tibet, Free Tibet, naked donkeys"
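    A toy sketch of that cross-check (hypothetical URLs; real duplicate detection would use fuzzier matching such as shingling rather than exact hashes): for each group of identical pages, treat the earliest-crawled one as the original and flag the rest.

```python
import hashlib

def demote_copycats(pages):
    """pages: list of (url, first_crawled_unix_ts, body_text).
    Returns the set of URLs judged to be copies: for each group of
    pages with identical content, every URL except the earliest one."""
    earliest = {}   # content hash -> (timestamp, url) of best original so far
    copies = set()
    for url, ts, body in pages:
        h = hashlib.sha1(body.encode("utf-8")).hexdigest()
        if h not in earliest or ts < earliest[h][0]:
            if h in earliest:
                copies.add(earliest[h][1])  # previous holder becomes a copy
            earliest[h] = (ts, url)
        else:
            copies.add(url)
    return copies
```

    The earliest-crawl-date heuristic is crude - a scraper indexed before the original would win - but it illustrates why the cross-check is "easy" in principle.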

  30. dave 93
    Big Brother

    Duh! It's always been part of the secret sauce

    The faster your page loads, the more people see it, and the more popular it gets. No?

    (Assuming that the site is worth looking at in the first place, of course ;-)

  31. Anonymous Coward

    Umm... if you don't like Google, then don't use it.

    Many comments here are framed like their authors believe they have a "European Human Right" (sic) for Google to provide them with the search results in exactly the way they want.

    Err, you don't. Google owes you nothing. Just like Facebook, Spotify or any other free service on the net. You don't like it? Tough.

    And as for the effect of Flash becoming "poison", that's got to be the best internet news of the decade.

  32. Anonymous Coward
    Thumb Up

    Definitely for it

    A fantastic idea.

    For those websites that use all these extra ad servers and analytics with the resultant extra slow DNS lookups, the web designers should (a) consider whether their customers are best served by such extra unwanted rubbish and (b) step into the modern async world and learn Ajax. Most people using a half decent browser run with an ad blocker/tracker blocker enabled anyway.

    A special circle of hell should be reserved for anyone who codes a website with a slow to download Flash intro page with no obvious clue how to immediately skip it (apart from the back button - my preferred option).

  33. John Burton

    Good stuff google :)

    This is good stuff.

    All else being equal I'd much prefer a link to a speedy site than one which takes ages to load.

    Who wouldn't?

  34. John Savard

    Hooray for Google!

    Of course, there is one caveat.

    If I'm searching for an upcoming movie by name, I want the top result to be the official studio site. Even if it does start up with a slow Flash animation. I think they may need to tweak this so that the weighting of site speed is nonlinear - pages ranked highly by other criteria should be less affected by this than pages with low rankings.

  35. Sparx

    Sounds alot like..

    Google are giving a clear bias toward whatever hosting they will be doing; surely the 'fastest' servers will be in Google's own network across the globe? That being the case, it's just a matter of marketing the fact, and boom, before you know it peeps will be clambering over themselves to get some Google Server action to help them clamber up the rankings.

  36. Winkypop Silver badge
    Thumb Up

    Yay for fast web pages

    I like em quick and clean.

    Upon demonstrating a new site I had built, I was accused, by your typical over-paid consultant, of "not making it complicated enough".

    I wore that as a badge of honour for years!!

  37. TeeCee Gold badge

    Here's a possible problem.

    You run a site selling stuff in one country. You're a small operation with one or two small servers. Being a careful soul, you back everything up daily, and you do this when all your customers are tucked up in bed during the wee hours, as it canes performance and you don't want to inconvenience anyone.

    You've worked your socks off, dancing to the Google tune, to get up the search rankings, and business is picking up as a result of your efforts. Then your site gets spidered by Google at 3:10 AM local time.............

    1. Steve Roper

      You can solve that one

      by taking the site offline while backing up, replacing the index.php or whatever with a simple one-page HTML file containing a description of your products and services with all your keywords, along with a message that the site is down for maintenance, for your customers to see what's going on and for the Googlebot to spider, but with no links. Then switch it back once you're done backing up. The Googlebot revisits your site on a regular basis, and the odds of it turning up twice during backup are pretty low. (if your backup takes half an hour then that would be 1 in 48).

      I do this with our sites anyway, because we don't want the odd night-owl customer roaming around putting things in their cart, buying things, and changing the databases while we're in the process of backing them up - or having the customer experience a slow and unreliable website while we're doing so!
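      A minimal sketch of that swap (hypothetical file names and paths; assumes the web server simply serves whatever `index.php` currently is):

```python
import os
import shutil

def start_backup_window(site_root):
    # Park the real front page and serve the keyword-rich, link-free
    # maintenance page in its place while the backup runs.
    live = os.path.join(site_root, "index.php")
    os.replace(live, live + ".parked")
    shutil.copyfile(os.path.join(site_root, "maintenance.html"), live)

def end_backup_window(site_root):
    # Restore the real front page once the backup completes.
    live = os.path.join(site_root, "index.php")
    os.replace(live + ".parked", live)
```

      `os.replace` is an atomic rename on the same filesystem, so visitors (and the Googlebot) never see a missing front page mid-swap.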

  38. Anonymous Coward

    Welcome back the 15 year old websites...

    ...from the days when we designed for dial-up users. But it's only one of over 200 factors Google uses, and they say it only affects the ranking of 1% of sites, so what's the fuss?

    Suppose my killer site attracts so much traffic that server response times go up - so Google downgrades it? - but then it'll have less traffic, so its response time improves - and so on...

    Good news for those who don't buy their hosting purely on price but check response times too. I've seen small business websites on servers shared with thousands of others - and seriously slow.

  39. BristolBachelor Gold badge

    Better thing for Google...

    ...would be to return search results that contain my search terms!!

    Even when I prefix words with a "+" I get pages that don't contain the word. Click the "cached" version and Google even admits that the page doesn't contain it!

    Also, lots of pages are returned because the GOOGLE ad on the page contained my search word! Why the hell does Google load the ads when they crawl the page, and then index them?

    Lastly, get rid of all those spam pages.

    1. Anonymous Coward

      Amen to that one... drives me nuts. As I've said before, if I search for 'donkey dwarf pron', that's exactly what I want- imagine my disappointment when all I get is pages about dwarf donkeys!

      But seriously, this is a major annoyance. I imagine they're targeting the 'average' user who doesn't know how to craft a decent search query. In other words they assume I don't know what I'm doing, and guess that I'll be happy if most of the words are there, even if the one vital word is ignored! And I wish they'd stop including variations of my words too. It gets tiresome having to put quotes round everything.

  40. Gordon Pryra
    Thumb Up

    Here's what you webmasters should ALWAYS have strived for

    "On average, pages in your site take 0.4 seconds to load (updated on Apr 10, 2010). This is faster than 99% of sites."

    This is a site that is almost 100% high-res images.

    Good web design should be aimed at users' ability to actually... use it.

    Flash sites always look shit anyway; if you had to have a link that said "bypass this", then why do you think your visitors wanted to see it in the first place?

  41. CD001

    A bad thing?

    "Web design as currently practiced is hereby DEAD. Flash becomes poison - lots of funny little blank pictures to build up a page's appearance will ensure no one ever sees it."

    It's said like that's a bad thing? Surely using blank pictures for layout has been dead for the best part of a decade? Even using tables for the overall layout of the page should be avoided (listen to a website through screen-reading software for the reason why).

    Frankly I'd be quite happy if "web design as currently practiced" really DID die the death - well at least that practised by the kind of people that use images for layout purposes or wrap entire sites in Flash with no HTML equivalent.

  42. Michael 77
    Paris Hilton

    Long tail sites?

    How about the 'long tail', where Google has been out-performing Bing? Small, slow sites that have seemingly generated good revenue for Google:

    "For Microsoft, focusing on the head instead of the "long tail" meant that it returned popular queries but failed to satisfy less common queries. The long tail of queries ended up yielding more sizable traffic and therefore more money for Google over the last 11-plus years."

    Paris because she wouldn't forget any kind of tail, would she?

  43. Richard Conto
    Thumb Up

    Hurrah for the Web!

    Hurrah for the web!

    But I suspect that this will only affect sites that are using overloaded databases as backends, or use flash or complicated and indirect javascript to drive their pages. Unfortunately, it won't be enough to discourage Flash or overworked Javascript entirely.

    The whinging of the flash obsessed web developers is a joy to hear.

  44. frank 3
    Thumb Down

    affects all sites

    Some web applications are compiled at runtime, in a way that has an impact on the first load of a page but speeds up the app thereafter. So if no pages are in the server cache because it's not a very frequently accessed site, or it's indexed at 4am while mostly used during the day, then you are going to see a real performance hit that may not reflect the average user's experience.


  45. Ole Juul

    About time

    Things are looking up.

    "For experienced Google-watchers, this means Google has thrown web design as we know it into the trashcan,"

    That's where trash goes isn't it?

  46. Sam Liddicott
    Thumb Up

    An end to 250K news sites

    I'm driven nuts following links on my mobile phone (nice and fast), but most load quarter-of-a-megabyte pages which contain a small story split over two or three pages. So I never read beyond the first page.

    Why do I have to download 250K on a GPRS link (twice) in order to read 10 small screenfuls of text?

    Now I can set google on to them - excellent!

