Google shakes empty YouTube piggy bank

Google is still searching under the sofa for a remote control that might, one day, help the company make money out of its unprofitable YouTube video sharing site. In the meantime, the ad broker is hoping to convince more copyright holders to sign up to its service. At the moment YouTube only has about 1,000 media makers on its …


This topic is closed for new posts.
  1. Joefish

    So, let me see if I've got this right,

    They do scan videos for infringement of the rights of media copyright holders, but when it comes to physical and verbal abuse, invasion of privacy, obscene images or just general illegal activity they can't possibly be responsible for reviewing content in any way whatsoever?

    1. cyborg

      They're looking for the fingerprints

      Presumably whatever this technology involves is some sort of machine friendly way of working out what a video is of from the video stream itself.

      That doesn't readily apply to more esoteric human concepts such as, " a video of bullying."

      1. Joefish

        Maybe so,

        and it's pretty obvious they'd use a largely automated scanning solution. It also comes as no surprise that it's a system that dumps the onus on the original copyright holder to comply.

        But in this case they are clearly putting themselves forward as arbiters of what they host and display, so the argument that they're not responsible for what their users do - that they're just a passive host of this material - is undermined by their own actions.

        1. I didn't do IT.
          Big Brother

          Copyright <> Morals

          The difference here is intellectual property rights vs. societal norms.

          Having a mechanism in place where copyright holders can submit hashes to match against submitted works from the public and mark possible infringements is not necessarily a bad thing - as long as derivative works and public use rights are not stamped out.

          Trying to enforce societal norms against "questionable" material through an automated set of rules would be a pointless endeavor.

          Whose societal rules do you use? You CANNOT say the "best" one or yours just because you know them - what works for you might be completely intolerable to your neighbor, and vice versa. And NO, YOU CANNOT say "use country controls", because it must be more granular than by country or county - even if ALL your neighbors are in other countries. This would construct a "least common denominator", and we all know how well THAT works (in case you don't - NOT AT ALL). :(

          By staying out of the subjective realm of morals, they keep themselves from taking the express lane to become complete gits. Odd from Google, eh?

          So, sorry - you will all have to retain your right to personal responsibility and accountability. Google is worried about selling advertising based on your habits, NOT (yet) in being the nanny that says what you can and cannot see based on morals. Isn't that why you have NuLabor?
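          The hash-registry mechanism described above can be sketched in a few lines. (Everything here - the registry, names, and `check_upload` - is a hypothetical illustration, not Google's actual system; the three policies mirror the block/track/monetize options the article mentions.)

          ```python
          # Hypothetical Content ID-style lookup: rights holders register
          # fingerprints of their works along with a chosen policy, and each
          # upload is matched against that registry.
          REGISTERED = {
              "a1b2c3": ("ACME Films", "monetize"),  # fingerprint -> (owner, policy)
              "d4e5f6": ("ACME Films", "block"),
              "778899": ("ACME Films", "track"),
          }

          def check_upload(fingerprint):
              """Return the rights holder's chosen policy for a matching
              fingerprint, or allow the upload by default."""
              owner, policy = REGISTERED.get(fingerprint, (None, "allow"))
              return owner, policy

          print(check_upload("d4e5f6"))  # ('ACME Films', 'block')
          print(check_upload("zzz999"))  # (None, 'allow')
          ```

          Note the design point raised in the comment: the onus is on the rights holder to populate the registry; anything unregistered sails through by default.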

      2. The Islander

        Valid point but

        it does perhaps reflect their intent / investment choice, i.e. put $'s into such a system rather than develop "something" to check for abuse. Not an easy one to crack. I presume they would shy away from sampling as it would open the can o' worms even more.

        I am reminded of gun / car / bow (& arrow) / other debates where manufacturers of potentially harmful products distance themselves from the use of those products. It often falls to society / legislatures to impose controls, on first sale or indeed general use. Such laws & conventions can take a painfully long time to arrive.

        Denying any contribution towards abuse in a situation where - knowingly - no control is exerted is blinkered and raises questions over company philosophy.

  2. Neoc

    Wrong maths?

    "...Content ID offers three general policies... According to Wiseman "most people" (33 per cent of the 1,000 partners currently signed up) choose the fat dollar option."

    Correct me if I'm wrong, but 33% is 1/3... and if you only have three options, then 1/3 of people choosing one of three options is *not* "most people". It's an even spread.

    I also wonder how the system is supposed to work - if it *is* using hashes, all I have to do is change the compression settings or fiddle with the resolution and - hey presto - I have a file which will not produce the same hash as before.

    1. I Am Fledge
      Thumb Up

      Making a hash of it all

      @Neoc - Regarding your last paragraph: I believe people do actually try that, which explains why I've seen a fair few videos of 'films / TV / music videos / etc.' which have been horizontally flipped so everything is mirrored.

      I imagine they use multiple hashes over the most likely tricks these 'cunning' souls are likely to perform. But as with everything, some will slip through the net.
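      The gap between a byte-level hash and a content fingerprint can be shown with a toy "average hash" over pixel values (a deliberately simplified stand-in - this assumes nothing about YouTube's real fingerprinting): re-encoding nudges every byte, so a cryptographic hash changes completely, while a perceptual hash survives small perturbations. As noted above, a hash this simple would still be fooled by mirroring the frame.

      ```python
      import hashlib

      def average_hash(pixels):
          """Toy perceptual 'average hash': one bit per pixel, set when the
          pixel is at or above the frame's mean brightness. Small uniform
          noise rarely flips bits, so the hash tolerates re-encoding."""
          flat = [p for row in pixels for p in row]
          mean = sum(flat) / len(flat)
          return ''.join('1' if p >= mean else '0' for p in flat)

      frame = [[10, 200], [30, 220]]          # tiny grayscale "frame"
      # simulate re-encoding: every pixel value shifted slightly
      reencoded = [[p + 3 for p in row] for row in frame]

      # cryptographic hashes of the raw bytes differ completely...
      h1 = hashlib.sha256(bytes(sum(frame, []))).hexdigest()
      h2 = hashlib.sha256(bytes(sum(reencoded, []))).hexdigest()
      print(h1 == h2)                                        # False

      # ...but the perceptual hash is unchanged
      print(average_hash(frame) == average_hash(reencoded))  # True
      ```

      Real systems presumably compare many such features (and distances between hashes, not exact equality), which is why only the cruder tricks like mirroring ever slip through.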

  3. Anonymous Coward
    Thumb Down

    Greed!! :(

    All I can read is - Us license providers are greedy, we do not care much about creativity and spreading our content as much as possible, all we care about is making money. Screw human creativity, go for worthless paper currency!


