Is Google fudging search rankings to benefit pages that embed YouTube vids? Or is this just another ‘bug’?

In yet another indication that Google uses its domination of the search engine market to benefit its own services, SEO experts have noticed an unusual built-in advantage involving its YouTube video service. In an analysis earlier this year, content delivery experts Swarmify dug into a puzzling change to Google’s PageSpeed Insights …

  1. Pen-y-gors

    Veeery interesting...

    But stupid.

    Never mind YouTube. My first thought was to embed an entire webpage in an iframe - but after a few microseconds realised that wouldn't work, because there would then be no search terms recognised as there would be no content in the actual page! But is there a compromise? Load a basic chunk of plain-vanilla text with the critical phrases, suitably non-displayed, and then put the rest of the page into an iframe? Be interesting to experiment, until it gets blocked.

    How I long for the old days when content was king and SEO meant having useful terms in an <h1> tag and in the first few K of the page.
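
    The experiment described above might look something like this. (A sketch only, with made-up file names; note that hidden keyword text is classic cloaking, exactly the sort of thing that gets a site penalised, as the commenter anticipates.)

    ```html
    <body>
      <!-- Crawlable plain-vanilla text with the critical phrases,
           "suitably non-displayed" by pushing it off-screen -->
      <div style="position: absolute; left: -9999px;">
        Critical search phrases the crawler should see go here.
      </div>

      <!-- The rest of the page lives in an iframe (hypothetical URL) -->
      <iframe src="/real-content.html" width="100%" height="2000"
              title="Main page content"></iframe>
    </body>
    ```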

    1. Dave314159ggggdffsdds Silver badge

      Re: Veeery interesting...

      Content is still king. SEO is... Well, I'd say homeopathy but it's considerably worse, since SEO trickery can get you blacklisted by Google.

      The reliable way to top search rankings is to have the content searchers are looking for. But scammers can't make money out of telling you that. What your website needs is a handful of magic beans.

    2. NATTtrash

      Re: Veeery interesting...

      How I long for the old days when content was king and SEO meant having useful terms in an <h1> tag and in the first few K of the page.

      And then we're not even talking about the interpretation that your objective search engine will do when it presents you those objective search results... Making it different for you, me, and Mabel of a certain persuasion... Only here to help...

      Or as you can see in "The Social Dilemma": Google employees seem to use Qwant for their searches :D

  2. Potemkine! Silver badge

    I see flying pigs in the sky

    They don't do evil, so it's necessarily a bug, right?

  3. redpawn

    G's revised motto

    See no Evil, Speak no Evil, be a Weevil.

  4. Mike 137 Silver badge

    "How I long for the old days when content was king and SEO meant having useful terms in an <h1> tag"

    How I long for the days when web sites had content that could be read without "styles" getting in the way and images that could be viewed without running masses of untrustable scripts.

    I have nothing against good presentation - or even fancy presentation - but we've got to the point where presentation has taken over completely at the expense of content, and this is already making a high proportion of web sites inaccessible to people with some disabilities and those with older kit.

    Just for example, the last time I looked, the Register loaded about 1,100 lines of styles to format a news item page of just over 10k text characters of content (a theoretical average of almost one style per word), and on many web sites a simple query form can't be used unless JS is enabled, merely because the submit button uses JS instead of an HTML submit. However, the worst case I've encountered is the National Cyber Security Centre web site, the entirety of which is a JS app. You can't even see contact information with JS disabled. Despite JS being the primary vector for drive-by infections and many other client-side breaches, the official national guidance agency for cyber security obviously thinks this is "the way forward".

    The changing state of SEO is merely a symptom of the general trend towards mindless introduction of complexity.
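
    The submit-button complaint above is easy to illustrate (a sketch with a made-up endpoint):

    ```html
    <!-- A plain HTML form submits perfectly well with JS disabled: -->
    <form action="/search" method="get">
      <input type="text" name="q">
      <button type="submit">Search</button>
    </form>

    <!-- Whereas this common pattern is dead weight without JavaScript: -->
    <form id="f" action="/search" method="get">
      <input type="text" name="q">
      <a href="#" onclick="document.getElementById('f').submit(); return false;">Search</a>
    </form>
    ```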

    1. LosD

      Re: "How I long for the old days when content was king..."

      (Original title became too long with Re: and the quotation)

      I, for one, have no issue with the web requiring JavaScript. But it needs to be safe, and the generated HTML should always be available and understandable to screen readers (both are mostly up to the developer).

      Dumping old clients is a feature to me, though.

      1. tiggity Silver badge

        Re: "How I long for the old days when content was king..."

        "But it needs to be safe" - problem is, you cannot easily tell if it's "safe"

        "Dumping old clients is a feature to me, though."

        Sites should not hammer an old PC; there's no reason someone should have to fork out lots of cash for a new PC just because some JavaScript jockeys create lots of CPU-hammering dross

        I run old PCs (why chuck them before they break?) - and script blockers are my friend

        1. Anonymous Coward
          Anonymous Coward

          Re: "Sites should not hammer an old PC..."

          I will go a step past that and say a single browser tab should not trash a MODERN PC either.

          Sadly, all the current browsers are prone to letting one background tab load the system up to maximum CPU load. One badly coded page will gobble up gig after gig of memory. There are no real user-facing methods to limit a page to reasonable memory or CPU use. As a result we had to deal with lunacy like Coinhive, and a generation of trashed laptop batteries, all because of crap web programmers building crap websites, and browser manufacturers that all seemed to graduate from the "users are losers" school of management.

    2. I am the liquor

      Re: "How I long for the old days when content was king..."

      In defence of El Reg, this site is one of the minority that still works completely fine with javascript disabled. Even these comments.

      1. Anonymous Coward
        Anonymous Coward

        Re: "How I long for the old days when content was king..."

        "In defence of El Reg, this site is one of the minority that still works completely fine with javascript disabled. Even these comments."

        Indeed!

        Not only can you read and comment on elreg with javascript disabled, you can also disable all cookies and use your browser in "Private browsing" mode and STILL be able to post comments. Try it!

        I refuse to enable javascript to view a webpage.

        If I can't view your site with javascript blocked I go elsewhere.

    3. Dave559 Silver badge

      CSS styles

      I think that a large part of the issue is that stylesheets have by necessity (but also unfortunately) become much more complex and larger nowadays as they have evolved to include style rules which render page content and layout differently, depending on the screen size of the browsing device.

      When stylesheets first became commonplace, there was a brief trend for some sites to offer users the choice of different styles to choose from, such as light, dark, high contrast, etc, which was kinda nice, but I guess the multiple maintenance overhead soon put an end to that.

      There was (is?) also the capability for browsers to optionally apply user override styles so that you could avoid the more searing color combinations of some sites, or improve accessibility with more readable or larger fonts, etc, but I'm not sure whether anyone other than the Opera of old ever took that sort of functionality seriously, sadly (perhaps there might also be some Firefox add-ons around which could do similar, if they ever survived the Great Purge of the Extensions…)?

      On a related note, the excellent Dark Reader add-on is definitely worth a mention: it lets you apply either a 'dark mode' or more off-white (thankfully, death to #FFFFFF!) 'parchment mode' style to any website.
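
      For the record, Firefox does still support the user-override styles mentioned above, via a userContent.css file in the profile's chrome/ folder (after setting toolkit.legacyUserProfileCustomizations.stylesheets to true in about:config). A sketch, with example.com standing in for the offending site:

      ```css
      /* userContent.css: per-site override to tame a searing
         colour scheme and enlarge the body text */
      @-moz-document domain(example.com) {
        body {
          font-size: 1.2rem !important;
          background: #fdf6e3 !important; /* death to #FFFFFF */
          color: #333 !important;
        }
      }
      ```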

    4. JetSetJim

      Re: title is too long now... mutter mutter... "...<h1> tag"

      I would love it if there was a way of forcing websites to define what the minimum JS domain includes are for it to function. e.g. for El Reg, just need to allow theregister.com, but can block doubleclick and google-analytics. NoScript is great, but when you actually *do* have to use a website that bloats many domains worth of JS, it's a pain guessing which one is the one that enables a button.
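
      There is an existing mechanism that points in this direction, though it declares what a site is *allowed* to load rather than the minimum it *needs*: the Content-Security-Policy header. A hypothetical policy for a site that only requires its own first-party scripts, blocking everything third-party:

      ```http
      Content-Security-Policy: script-src 'self'
      ```

      The catch, of course, is that it's enforced against the site's own declaration; nothing stops a site from whitelisting a dozen analytics domains in the same header.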

      1. Anonymous Coward
        Anonymous Coward

        Re: title is too long now... mutter mutter... "...<h1> tag"

        > but when you actually *do* have to use a website that bloats many domains worth of JS, it's a pain guessing which one is the one that enables a button.

        At that point I just navigate away. If I'm *really* bothered and can find their contact details I'll send an email or give them a call, otherwise I'll just assume that if that particular website didn't exist my life wouldn't be any different anyway.

      2. Anonymous Coward
        Anonymous Coward

        Re: title is too long now... mutter mutter... "...<h1> tag"

        And of course there is the possibility that having to reload the page so often gets you locked out of it entirely. For example, I've been told there's a dozen javascripts that want to run in my browser for somesitexyz.com and I know half of them are trash, so it's disable-reload on each of the others to see what is truly necessary and what is not, and by the time I've narrowed it down I get a "we're getting unusual traffic from your network, are you sure you're not a bot?" message, which prompts more of the same because they always use the cursed google puzzlebox-style captcha which never works and of course requires more page reloads and gets another "unusual traffic" message. It's like Kafka vacationing at Fawlty Towers.

    5. NATTtrash

      Re: "How I long for the old days when content was king...

      I agree with you, on many things, but do keep something in mind. A lot of the "baggage" that is introduced with for example contact forms and stuff, is because otherwise your days will be filled with miserable, non-productive spam fighting and stuff. So yes, you got a point and I too long for "do-one-thing-and-do-it-well" and Mosaic days sometimes... But don't slip into a "it-is-all-rubbish" because it isn't THAT black and white (I think)...

      1. Anonymous Coward
        Anonymous Coward

        Re: "How I long for the old days when content was king...

        But the problem with people using Google CAPTCHAs is that it means that, yet again, GRUgle gets to know which websites you are visiting.

        And I really wouldn't be surprised if, having wormed themselves into the form page that way, their script also scans and exfiltrates everything that you type into the form, too (only "to better improve your advertising experience" and definitely not for anything more nefarious, definitely not, what's that buzzing on the line?).

        Has anyone ever thoroughly audited Google's scripts (and those of other, similar, third parties) to check exactly what they're doing? There surely must be a great academic research project out there for someone with the time and skills to do so.

        (Don't get me wrong, the principle of CAPTCHAs is a good one, but it needs to be done in a trustworthy way.)

        1. NATTtrash
          Coffee/keyboard

          Re: "How I long for the old days when content was king...

          I'm sorry, but how does some (protective) measurements/code (.js, <? php) automatically equal Google CAPTCHA stuff? Are you telling me that on a professional platform like ElReg sifting out crud on your forms equals "let's-see-what-Google-has-for-code-so-I-don't-have-to-use-my-brain-and-can-go-on-looking-in-the-Google-play-store-for-games-to-waste-my-time-while-I-am-sitting-on-the-loo-playing-games-and-keep-making-easy-money-producing-poo" behaviour? Typing code yourself is too exhausting, but we do just manage copy-paste? Oh dear. Whatever happened to the world? The end is nigh...

          1. Anonymous Coward
            Anonymous Coward

            Re: "How I long for the old days when content was king...

            [Same AC as the above AC here, but not the same AC as the one before that]

            My response was about Google CAPTCHA because it was a reply in a thread specifically about "the cursed google puzzlebox-style captcha", as your reply also appeared to be, as it didn't mention any alternative solutions that people should maybe use instead.

            I definitely agree with you that competent web developers would do far better to look into more trustworthy solutions than just instinctively going for whatever shiny lure and bait that Google seems to be waving at them (look, I even said that in my previous comment), but your original comment didn't make it clear that you yourself were also taking into account alternative solutions other than the Google CAPTCHA that we were all grumbling about.

            1. NATTtrash
              Pint

              Re: "How I long for the old days when content was king...

              Aren't all these ACs confusing... :D

              I get it. Here, have one... The world is saved!

  5. SloppyJesse
    Happy

    Blockers

    TL;DR Google speed test developers assume everyone uses blockers for ads and videos

  6. mark l 2 Silver badge

    It would be no surprise to see Google giving preferential treatment to websites with embedded YouTube videos; just look at how a search on Google will usually bring back videos from YouTube as the top result.

    I have actually found iframes to be a dangerous technology that really should be disabled by default, with an opt-in if you need it. I have seen several websites embedding a 1-pixel iframe on a page, which can load content without it being shown to the end user, and this can be used to track you even when you are using other tracking-blocking techniques. Or it could even load up illegal content which would then get stored in your internet cache, with a record of you visiting the site logged by your ISP. And you would never know until it was too late.
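
    The tracking pattern described above looks roughly like this (a sketch; the domain and parameter are invented):

    ```html
    <!-- An effectively invisible 1x1 iframe: the third-party page loads
         (along with its cookies) without the visitor seeing anything -->
    <iframe src="https://tracker.example/beacon?uid=12345"
            width="1" height="1" style="border:0"></iframe>
    ```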

  7. Gene Cash Silver badge

    Benefit YouTube?

    Try to find a page that hosts a Vimeo video. G'wan, I dare ya.

  8. Version 1.0 Silver badge

    Farting Hippos

    This is a very useful story, El Reg - I'll add a new video to the corporate web site.

    1. Falmari Silver badge

      Re: Farting Hippos

      Ah, Google's official reply to the findings.

      Full of shit and hot air. :)

  9. Androgynous Cupboard Silver badge

    iframe and probably object/embed too

    All three tags embed the content as a sub-document - they have separate DOMs - so it's understandable, from a certain point of view. In particular the size of these elements is known when the element is declared, rather than when the content is loaded, which means the layout of the document won't change as a result of the iframe/object/embed loading, which is very significant for rendering speed.

    In fact, on a hunch I had a look at the original analysis. The animated GIF was specified without a width or height, which means - in contrast to the iframe - the layout would be required to change after the content was loaded. The same for the video tag. If you're analysing page layout speeds, both approaches are known to be slower.

    I expect had they put a "width" and "height" attribute on their gif or video, there would have been no difference. So I don't think there's anything to see here at all.
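
    The difference is easy to see in markup (a sketch; VIDEO_ID and the file name are placeholders):

    ```html
    <!-- No dimensions: the browser must reflow the page once the file
         arrives and its real size is known -->
    <img src="animation.gif" alt="demo">

    <!-- With explicit dimensions - or an iframe, whose size is always
         declared on the element - the layout is stable before loading
         finishes -->
    <img src="animation.gif" alt="demo" width="640" height="360">
    <iframe src="https://www.youtube.com/embed/VIDEO_ID"
            width="640" height="360" title="Embedded video"></iframe>
    ```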

    1. _andrew

      Re: iframe and probably object/embed too

      I'm in favour of anything that down-rates pages that have different-size advertisements that make the text that I'm trying to read jump up and down as they change and force page reformats. Not that I use search or ratings to get to most of the web, but if big-G can use its influence on the ones that do, then I'm for it.

  10. Jason Bloomberg Silver badge

    Clickbait

    "Is Google fudging search rankings to benefit pages that embed YouTube vids? Or is this just another ‘bug’?

    In fact, what was happening, the team found out, was that anything loaded onto the page using an iframe tag, such as an embedded YouTube video, was simply discounted from Google’s PageSpeed tool altogether.

    Well done El Reg - You made a Clickbait Victim out of me. Hope you are proud of yourselves.

  11. fidodogbreath

    Unless I'm misunderstanding something, this seems to be documented behavior. From Google's description of the First Contentful Paint metric (emphasis in original):

    "FCP measures how long it takes the browser to render the first piece of DOM content after a user navigates to your page. Images, non-white <canvas> elements, and SVGs on your page are considered DOM content; anything inside an iframe isn't included."

    1. Anonymous Coward
      Anonymous Coward

      Documenting how the biased ranking works does not un-bias the algorithm

      The issue still stands, and after getting slapped on the wrist by the EU it has just shifted to less overt tactics. It has been caught over and over, and will continue to seek to abuse its position like every other large tech monopoly (lookin at you, Facebook). Much like a police consent decree, we will have to force it to adjust its activity when outside researchers or auditors find these little slants.

      That will only take another 10 years or so based on prior cases winding their way through the endless US appeals process.
