Google open sources JPEG assassin

Google has open sourced a new "lossy" image format known as WebP — pronounced "weppy" — claiming it can cut the size of current web images by almost 40 per cent. CNet revealed the format with a story late this morning, and Google soon followed with a blog post describing the technology, which has been released as a developer …

COMMENTS

This topic is closed for new posts.
  1. E 2

    Think of the HDD makers!

    -40% size for similar image quality!?

  2. Tony Green

    Internet Explorer support

    Probably by about 2020?

    1. Il Midga di Macaroni
      Gates Horns

      Optimistic much?

      *Transitional* support by 2020, full support projected for 2035. And then dropped in 2021.

      1. Mike Richards Silver badge

        Not forgetting

        It'll be Microsoft WebP (for Windows) format only.

  3. Anonymous Coward
    WTF?

    Jpeg2000

    What about Jpeg2000 - much smaller sizes, available years ago. Used by nobody.

    1. Mephistro
      Linux

      [Paste title here]

      Jpeg2000 is encumbered by lots of patents. If this format ever gets widely used, patent trolls will have a field day.

      1. Ken Hagan Gold badge

        Jpeg2000 legal situation

        Baseline Jpeg2000 is explicitly royalty and licence free, but the patent holders haven't put their IP into the public domain.

        This is not significantly different from GPL-ed software, which is explicitly available for use and modification, but the copyright remains with the authors precisely so that the agreement that permits free use can be backed up with legal force if necessary.

        Talk of submarine patents is just FUD.

    2. Dave Bell

      Used in Second Life

      Jpeg2000 is the image format used by Second Life--it has a few advantages--but it suffers from some poor-quality code in open-source graphics codecs. One advantage is that you don't have to download the whole file to get a lower-resolution version of the image. But the image can end up as just the low-res version in one corner of the full-size bitmap.

      Get smaller files with the same image quality, and the advantages for this sort of graphics-intensive net gaming are obvious. But there would be a huge amount of data to translate, in any existing game, and there's always some quality loss. There are alternatives to Second Life, currently using compatible software. But if this lives up to the hype, is the compatibility worth it?

    3. Charlie Clark Silver badge
      Welcome

      JPEG 2000 is very nice* but

The patent owners have thus far refused to put it in the public domain. My guess is that JPEG 2000 is the benchmark for Google's new format, which they might actually be using to encourage opening of the JPEG 2000 format.

      * JPEG 2000 is particularly nice if you have text in your images as text suffers so heavily from artefacts in JPEG.

    4. heyrick Silver badge

      Smaller sizes?

      I Wiki'd JPEG2000 to see a comparison. I noticed there are potential lurking patent issues (doesn't everything these days? <sigh>) which could be an impediment. However, the thing that amused me was the example comparison image... which was exactly the same size as the standard JPEG.

      The software I use for doing my JPEGs (PhotoImpact5 - old but reliable) allows you to play with the JPEG type (progressive/standard), the image colour coding (4:2:2 etc) and the quality. The size of my JPEGs depends upon what I'm prepared to sacrifice, and for some things (monochrome scans of letters) you can get away with quite a bit.

      Perhaps this is why JPEG2000 didn't take off? Maybe it was additional complication without enough returns to make it worthwhile?

      1. Charlie Clark Silver badge

        JPEG 2000 compression ratio

I get stuff to be about 50% smaller again than JPEG with J2000. But as JPEG is already pretty good, I guess it was diminishing returns coupled with the patents that prevented take-up. Pity, because it really does produce better quality at lower sizes.

  4. Anonymous Coward
    Thumb Down

    Ahh, how we missed you, Cade!

    "It's no secret that Google is on a mission to make the web faster — in any way it can."

    What a bunch of bastards!

    "The faster the web, the more cash Google rakes in."

    First, that's not strictly true. There's a point where faster loading hardly matters, because you can only read and see so much so quickly. At that point, increased compression or throughput mainly helps increase options for the web developers, not increase the number of served ads.

    At any rate, so what? Google is 'raking in cash' - they're a business. In case you haven't brushed up on your economics lately, that's what businesses -do-. By gratuitously ripping on a company for making money while improving the net for everyone, users/customers or not, all you do is make yourself look like a petulant kid shoving over the chessboard. And you also reduce the credibility of any legitimate criticism of Google that shows up on El Reg, because the very fact that your articles are published in the form they are suggests an inherent bias.

    1. Anonymous Coward
      Grenade

      Wait a second

      El Reg? Biased? Who would have thought... Anyway, in regards to your first point. You're forgetting that Google is forking out tons of cash to pay off its bandwidth usage. If they can reduce the amount of bandwidth services such as Google Maps and Google Images use then they will be saving a pretty penny or two.

      1. PsychicMonkey
        Pint

        i thought

        google had bought some massive backbones themselves to avoid bandwidth costs? or did I dream that?

        1. Anonymous Coward
          FAIL

          RE: i thought

          And are those backbones connected to Average Joe's home? No.

      2. M Gale

        I thought once you get to Google's size...

        ...you don't so much buy bandwidth, as negotiate a peering agreement.

        1. Oninoshiko
          Coat

          you still buy bandwidth, but...

          No, you still buy bandwidth, you just start to call it "dark fiber". You do stop worrying about the "Gb" and start thinking in terms of "strands" and sometimes "(Coarse/Fine) wave division multiplexing." You also start to spend lots of time worrying about "idiots with backhoes."

Mine's the one with the OTDR in the pocket.

    2. Anonymous Coward
      Thumb Up

      Wow!

      14 up and 13 down... I'm not sure I've seen such a widely reviled *and* agreed-with post on here. And *I* got to write it!

      *clasped hands / anime girl happyface*

  5. solid gold suleyman
    Paris Hilton

    Ahh, yes

    If the quality is as good but 40% smaller, I'm all for it. It will cut the size of my pr0n partition. I don't care that Schmidt, Page, and Brin want to take over the world and probably have tracking codes inserted in the image format. Paris cuz, well, you figure it out!

    1. ThomH Silver badge

      It's not really 40% though, is it?

      A bunch of JPEGs, PNGs and GIFs were converted to the new format and, across the lot of them, they saw an average 39% saving. A meaningful test would have been to compare it to JPEG only, since otherwise an unknown proportion of the argument is the senseless "we switched from lossless to lossy and saved a lot of space, hence our lossy format is the best format".
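ThomH's objection is easy to see with toy numbers (made up here purely for illustration, not Google's actual figures): a couple of lossless-to-lossy conversions with large savings drag the headline average well above the typical JPEG-to-WebP gain.

```python
# Toy, made-up figures: 8 JPEG->WebP conversions with modest savings and
# 2 PNG->WebP conversions where dropping losslessness saves a lot.
savings = [0.15] * 8 + [0.90] * 2

# The headline "average saving" is double the typical JPEG-only gain.
average = sum(savings) / len(savings)
jpeg_only = sum(savings[:8]) / 8

print(f"average saving: {average:.0%}, JPEG-only saving: {jpeg_only:.0%}")
```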

  6. Grumpy Fellow
    Pint

    Lena/Lenna required

    It is pointless to discuss a new image compression algorithm without side by side (uncompressed / compressed) pictures of Lena for reference.

  7. Pirate Dave
    Pirate

    I dunno

    Most JPGs already suck, especially at smaller sizes. Compressing them with ANOTHER layer of lossy compression doesn't seem like a good idea. At least not to an old fogey like me. But then, I think most of the video quality on youtube is unbearably bad compared to standard-def TV, so what do I know? Smeary, indistinct, grainy pictures - that's why we got broadband 10 years ago, innit...

    1. blackworx
      Boffin

      Not quite

      "Most JPGs already suck, especially at smaller sizes. Compressing them with ANOTHER layer of lossy compression doesn't seem like a good idea"

      Unless you're directly transcoding a JPEG then that's not how it happens.

      Most of the video on YouTube is cack because it's shot through the tiny plastic lens of a £100 cameraphone/digicam at a lower resolution and framerate than standard def TV and with on-the-fly video/audio compression performed by a tiny processor. This probably goes a long way to explaining why it's not quite as good as standard def TV. The compression algorithms themselves are not necessarily at fault, as in this case they are very much limited by the amount of processing power available.

    2. Real Ale is Best
      Boffin

      Video quality on youtube

      Video quality on YouTube is entirely due to the authoring and mastering processes used prior to upload.

      I recently uploaded some footage taken with broadcast HD cameras, down converted to DVD size, encoded using WebM and uploaded to YouTube. The quality is actually really impressive.

      Smeary, indistinct, grainy pictures - that's camera phones and webcams, that is.

Plus, I think you miss the point. It isn't a case of compressing already-compressed JPEG images. It's about compressing newly authored images - that's the way you keep the quality.

      To gain popularity, it will have to get into the camera market, and that will depend on how efficient the codec is. Jpeg is trivial to produce a low power hardware codec for, this being essential for small cameras, phones etc.

    3. James Hughes 1

      Did you read the article?

      It's not another layer of compression, it's a different sort of compression. Compression techniques have improved greatly since jpeg was developed - it's not surprising that there are now better ways of doing it to the same quality level.

      1. John H Woods Silver badge
        Thumb Up

        Can you please ...

        ... post the URL?

      2. Pirate Dave
        Pirate

        @James Hughes

        yes, I read it. See my response to bilgepipe. Now perhaps there is more to Google's idea than what was distilled into the article, but judging by the article, it looks like Google is trying to further compress JPGs.

      3. Pablo

        Easy mistake

        One quote in the article does make it sound like this would be an additional layer of compression for JPEG.

        "Google decided to figure out if there was [sic] a way to further compress lossy images like JPEG to make them load faster"

        I nearly came to the same conclusion when I first read it.

      4. Ken Hagan Gold badge

        Did you read the linked blog?

        Actually, it *is* another layer of compression. From the blog...

        "We expect that developers will achieve in practice even better file size reduction with WebP when starting from an uncompressed image."

I read your comment and agreed with it before I read the blog, so I'm as surprised as you are, but I suppose this makes sense in context. After all, in the majority of cases, web sites no longer have the uncompressed image, so "Can it squeeze my existing JPEGs?" is a fair question.

        The blog also links to a gallery of comparison images: http://code.google.com/speed/webp/gallery.html. (No Lena, for copyright reasons apparently.)

    4. Bilgepipe
      FAIL

      Fail

      Reading comprehension Fail. You're not compressing JPEGS - JPEG has nothing to do with this.

      1. Pirate Dave
        Pirate

        @bilgepipe

        my failure of comprehension is generally well understood, but perhaps you should learn the basics of reading before spewing your bilge.

        From the article:

        "Some engineers at Google decided to figure out if there was [sic] a way to further compress lossy images like JPEG to make them load faster,"

        "Google has tested the format by re-encoding 1,000,000 existing web images, mostly JPEGs, GIFs, and PNGs, and it saw a 39 per cent reduction in average file size."

      2. FoolD
        Stop

        Hasty commentarding?

        @Bilgepipe & James Hughes 1

I suspect he was referring to people using the command line app mentioned in the article to convert existing JPGs to this new format; in which case the already lossy pictures will lose even more detail.

        ie. think twice before converting your JPG pr0n collection to this new format, just to save space...

      3. JohnG Silver badge

        Compressing JPEGs

        "Reading comprehension Fail. You're not compressing JPEGS - JPEG has nothing to do with this."

Oh really? Are you saying that all your current photos are in WebP and your camera(s)/mobile phone(s) produce photos in the WebP format? If not, then it may not be Pirate Dave who has been inflicted with a lack of comprehension.

    5. heyrick Silver badge

      @ Pirate Dave

There are two things with YouTube. Firstly, YouTube transcodes your input into its own format (another level of quality loss), plus it seems to transcode to a lowish (800kbit?) bitrate, so you can see on-screen that the colours are blotchy and flattened. I know, from having uploaded a 1800kbit XviD from a 2500kbit H.263 source recorded from HQ video using a Neuros OSD.

      However, the truly terrible videos are from cheap cameras and mobile phones. I'm not sure there's an excuse as my small Agfa digital can do decent looking MJPEG video at something like 820x560 (shame the sound recording is awful). My Nokia 6230i, on the other hand, is just awful. I can barely tell the difference between its 3GP high quality and the low quality, and given the blocky mess that is the result, I'm not sure the word quality even factors into it.

      In short, while YouTube introduces its own problems, most of the cack on YouTube looked like that before YouTube got its hands on the video!

FWIW, I think us old timers (who remember what a clear analogue picture looked like) will always be slightly disappointed. Yeah, it's cool that I can watch the video of my choice in realtime from places like YT and Vimeo. It's cool that I can store nearly 200 hours of video on DVD-Rs standing in a pile as tall as a single L750 tape (3h15m). It's cool that we can now have a billion channels with nothing worth watching on any of them. And it's cool that we can fill a ridiculously large screen with a picture with sufficient resolution that you don't see the individual pixels. The flip side? If you know what an MPEG macrocell artefact looks like, you'll see them EVERYWHERE. Blu-ray/HD suffers horribly from this, going by the demos in supermarkets.

      1. Pirate Dave
        Pirate

        @heyrick

well, my point about youtube wasn't so much about the actual quality of the clips as it was that such a level of quality is apparently considered "acceptable" to a large part of the Internet. My kids watch videos on youtube that are barely discernible. It's like 20 years ago when we had those crappy Autodesk Animator flicks. Then we got Quicktime and MPEG videos, and things were much better. Now Youtube and Flash are lowering the standards again.

        1. heyrick Silver badge

          @ Pirate Dave

          I think the level of "acceptable" is a trade-off between available bandwidth and how much you want to see the content. Don't forget that services such as YouTube are designed for in-situ viewing (which is why add-on software is required for downloading from such services). Because of this, YouTube needs to choose something of acceptable quality which isn't going to saturate your connection. I'm on a 2 mbit link and most things non-HD come through in real-time (my little netbook can do HD video, just not *H.264* in HD, too intensive).

But is this new? Remember in the good old days it was "acceptable" to use a VHS-C video camera to record a ciné screen (and many of them were fixed at a sync rate different to cinema projection, leading to flickering) and, no, you didn't watch that tape. If you were lucky you saw a copy of the copy of that tape, which was so degraded it was barely discernible. What's new is that back then you needed to be friends with the video shop guy. Now it's open to anybody who is able to use a web browser and type the immortal words "cute kittens". :-)

          Perhaps in the future, when we *all* have 100 mbit connections (I won't hold my breath!), we'll see a return to video of a better level of quality; but given that minority satellite channels are choosing lower bandwidth per channel in order to squeeze in more channels, I won't hold my breath for that either...

      2. StooMonster
        Coat

        MPEG macrocell artefacts

        @heyrick: " If you know what an MPEG macrocell artefact looks like, you'll see them EVERYWHERE. Bluray/HD suffers horribly from this, from the demos in supermarkets."

Err... HD DVD almost exclusively used Microsoft's VC-1 codec, and occasionally H.264; not a disc in MPEG-2.

        Blu-ray is almost exclusively H.264 for movie encoding, it was only the really early discs that used MPEG-2 extensively.

        So having a bit of trouble wondering why you're seeing MPEG-2 macrocell artefacts, when these formats use variable block sizes ... maybe you're seeing them EVERYWHERE when they don't really exist. ;)

        1. Charles 9 Silver badge

          He said MPEG, not MPEG-2

          And H.264 is also known as MPEG-4 AVC, which means H.264 is still MPEG (MPEG-4 Part 10).

  8. James Henstridge

    title

    Looking at the sample images, the ones where the new format sees the biggest improvements are those with large areas of solid colour or simple gradients (or close enough).

    I guess that isn't too surprising since JPEG essentially treats each 16x16 pixel block independently, so there are easy wins for any format that takes a more high level view.
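The block-independence point can be seen with a toy sketch (illustrative only: real JPEG applies a DCT and quantisation per block, not a plain mean, but the seam at block boundaries arises the same way).

```python
# Toy illustration (not real JPEG): approximate each 16x16 block of a
# smooth horizontal gradient by its mean value, mimicking how
# block-independent coding discards correlation across block boundaries.
W = H = 32
BLOCK = 16

# Horizontal gradient, 0..255 across 32 columns.
image = [[round(x * 255 / (W - 1)) for x in range(W)] for _ in range(H)]

def block_mean_encode(img, block):
    """Replace every block x block tile with its mean (a crude one-coefficient code)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = [img[y][x] for y in range(by, by + block)
                              for x in range(bx, bx + block)]
            mean = round(sum(tile) / len(tile))
            for y in range(by, by + block):
                for x in range(bx, bx + block):
                    out[y][x] = mean
    return out

coded = block_mean_encode(image, BLOCK)

# The smooth gradient becomes two flat slabs with a visible seam
# between columns 15 and 16 - a blocking artefact.
seam_jump = coded[0][BLOCK] - coded[0][BLOCK - 1]
print("jump at block boundary:", seam_jump)
```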

  9. Alastair 7

    Great, except

    40% smaller? Fantastic. Except that if we used 'Weppy' images on my site we'd need MORE space, because we'd be storing both JPEG and 'Weppy'. It's going to be a long, long time before anyone can retire JPEG.

    1. Anonymous Coward
      Stop

      Nah, wait till Opera/Firefox/Safari/Chrome

      support it in 6 months or so, then leave IE users with a broken image.

      That's the reverse of what bad websites have been doing for years, getting it working in IE and calling it a day.

      1. Miek
        Linux

        Yeah But ...

        Try explaining this to ordinary website users that cannot distinguish "The Internet" from that "Big Blue E" and see how long you want IE users to be delivered a broken site.

        1. Anonymous Coward
          Happy

          @Miek

          No, just do a browser sniff and if it's IE then display an upgrade link to Firefox or Chrome instead of the image.

          1. Alastair 7

            Uh

            "No, just do a browser sniff and if it's IE then display an upgrade link to Firefox or Chrome instead of the image."

            Unfortunately not an option, because we want our web site to actually make money. Shame.

            1. Anonymous Coward
              Stop

Isn't there something about supported formats in the HTTP header?

              And couldn't you just transcode on the fly and send whatever format the client says they can handle?
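There is: browsers can list the image types they accept in the HTTP Accept request header, and a server can choose a format per client. A deliberately naive sketch (the function name and crude substring check are illustrative, not anything from the article; real servers should parse q-values properly):

```python
# Naive content negotiation: serve WebP only when the client's Accept
# header mentions it, otherwise fall back to JPEG.
def pick_image_format(accept_header: str) -> str:
    if "image/webp" in accept_header:
        return "image/webp"
    return "image/jpeg"

print(pick_image_format("image/webp,image/*;q=0.8"))  # image/webp
print(pick_image_format("image/png,image/*;q=0.8"))   # image/jpeg
```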

    2. LoD
      Stop

      Space, yes...

...bandwidth, no.

  10. Sorry that handle is already taken. Silver badge

    Re-encoding PNGs?

    "Google has tested the format by re-encoding 1,000,000 existing web images, mostly JPEGs, GIFs, and PNGs, and it saw a 39 per cent reduction in average file size."

    Well it wouldn't surprise me if applying lossy compression to 24bit PNGs resulted in a reduction in file size. It would surprise me if it didn't.

    How about you give us something unambiguous Google?

    1. TeeCee Gold badge

Re: Re-encoding PNGs?

      Maybe even worse than that. The weasel word in that quote is "mostly". You don't need more than a small percentage of those 1,000,000 images to have started life as BMPs to make a 39% *average* reduction look distinctly unimpressive.

      1. Snow

Re: Re-encoding PNGs?

You should perhaps not assume that a short article such as this one contains all of the information that Google released, or indeed that anything not marked as a quote from Google actually is one.

        The information in the article seems to be a distillation of the information which Google has actually provided on their website on WebP (including a breakdown of how many PNG/JPEG/etc):

        http://code.google.com/speed/webp/docs/c_study.html

      2. Piers

        Yeah - that's what I thought...

If you're using GIFs or PNGs, you're probably doing it because a) you want some form of transparency or b) (for PNGs) you actually want *lossless* compression - for example fountain fades, or text that is crisp, artifact-free, and has specific colour rendition.

        So using pngs or gifs *at* *all* as part of your experiment is null and void. What are the figures without those types? Oh, not so good eh? I see...

    2. Ian Yates
      WTF?

      See message body

Agreed. It'll be interesting to see this in independent hands.

      The "mostly JPEGs, GIFs, and PNGs" is meaningless: re-encoding a JPEG is daft, GIFs and PNGs are solving a different problem (lossless), and "mostly" just undermines the whole thing.

      Show us a JPEG and WebP of an actual RAW photo with filesize comparisons.

      I'm not a naysayer, but their use of stats is awful.

      Anyone else feel like we're back in the 90s, with the image format wars... fun times.

  11. JaitcH
    Happy

How long until Irfan releases a plug-in?

    One of my favourite graphics tools, Irfan, usually comes up with a new version, or a plug-in, for events like this.

I wonder if Jobs will allow WebP to run on his stuff?

    1. Bilgepipe
      Troll

      Troll

      Read it again - Google are already working to add this to the open-source WebKit browser - you know, the one used by the satanic Jobs and his evil cult of baby-eating monsters.

      Learn to read, instead of rushing to troll and making a fool of yourself.

      1. hexx

        you mean

the one Jobs and his evil cult developed and made open source?

        1. This post has been deleted by its author

        2. Anonymous Coward
          WTF?

          @you mean

          FAIL

          Apple did not open source webkit and they didn't develop half of it. They took the existing open source khtml engine and forked it. Additionally most of the subsequent work on webkit has been done by third parties and not Apple.

      2. jonathanb Silver badge
        Troll

        Re: Troll

        Yes, and google have made the ogm video format available for Webkit, however it doesn't meet with jobsian approval so Safari can only view mp4 videos.

  12. Anonymous Coward
    Anonymous Coward

    But can it do animated gifs?

    I guess it should be able to do animation being based on a video codec, but can it I wonder?

  13. Justin Clements

    Google....

    Solving a problem that didn't exist.

    1. everyone is moving to faster connections all the time, so a few (porn) pictures downloading quicker (not that you would notice) is going to make no difference.

    2. never have I thought "these graphics are taking an age to download".

    3. Bandwidth is becoming bigger for video, so the savings in this new format are negligible overall. The internet is becoming a broadcast medium for video, so who cares that a webp image is going to load a fraction of a second faster?

    1. DZ-Jay

      Re: Google....

      Although I agree with you in that this is a solution to a non-existent problem for the majority of the world, it is a problem in Google's realm.

Perhaps they have realised that hoarding everything from everyone forever may not be as practical as originally thought, and that their data centers do have storage capacity limits.

      -dZ.

    2. Kubla Cant Silver badge

      But the problem does exist

      You're right about video, but the other noticeable trend is the explosion in mobile browsing.

      If you've browsed the net over GPRS, or even 3G, the thought "these graphics are taking an age to download" will be at the forefront of your mind. It looks like the mobile operators are unwilling or unable to increase bandwidth rapidly to meet demand, so smaller graphics could be a win in the short/medium term.

      1. Justin Clements

        erm...

        Mobile phone providers already do compress images on their connections.

        Also, you can crank up jpeg compression to anything you want. Jpeg isn't a straight compression, there is a scale, and if you crank it to 1% you will end up with 1 or 2 very large pixels on the screen.

The poster above posed an interesting theory that this is about saving Google money in storage costs, so it's not exactly altruistic of Google to release this, when the prime beneficiary would be themselves.

        1. Anonymous Coward
          Grenade

          Google - Altruistic?

Whatever made you assume that anything Google does is altruistic? Of course Google stands to gain *something* from this. Personally I'm not quite sure what, but I'd bet my house on somebody from Google having already done a Business Case presentation or fifty.

  14. DavyBoy79
    Stop

    At first I was excited

As a photographer, the thought of being able to use a newer format that offers a reduction in file size when delivered to the customer seems great... but... it uses the YUV "colour space" and not a derivative of sRGB (or Adobe RGB). This means two things: firstly, conversion from a native format into a format not well suited to still photography; and secondly, storing in a format that favours brightness over colour means that many photos will look poorly coloured, especially at edges where there is a strong contrast between colours.

    So for the casual punter, this will probably be ok, but for the discerning eye, I'll stick to JPG thanks!

    1. Anonymous Coward
      Coat

      RAW anyone !!!

      "So for the casual punter, this will probably be ok, but for the discerning eye, I'll stick to JPG thanks!"

as a photographer with a discerning eye, I think I will stick to RAW and PSD files.

      I only ever use jpegs if a customer requires it for a website, even then I will suggest a PNG may be better.....

Mine's the one with the pocket full of SDHC cards

      1. Paul RND*1000
        Coat

        Let the digital rot begin!

        So when can we add JPEG to the list of "files I used to be able to read which are lost forever"?

        Screw RAW, mine's the one with the rolls of film in the pocket... ;-)

    2. StooMonster
      Grenade

      Logic fail

      @DavyBoy79: "So for the casual punter, this will probably be ok, but for the discerning eye, I'll stick to JPG thanks!"

      So, you object to Google's new format because it uses the YUV colour space, and will use JPEG instead.

      You know that the first step of JPEG compression is to convert RGB to YUV colour space, right? It's more efficient to compress the chroma (hue) and luminance (brightness) separately, as the chroma can be compressed much more than the luminance.
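For reference, that first step in its standard JFIF/BT.601 form looks like this (a sketch only: real encoders clamp to 0..255 and typically use fixed-point arithmetic):

```python
# The RGB -> YCbCr conversion StooMonster describes, using the standard
# JFIF/BT.601 coefficients.
def rgb_to_ycbcr(r, g, b):
    y  =       0.299    * r + 0.587    * g + 0.114    * b  # luminance
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b  # blue chroma
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b  # red chroma
    return round(y), round(cb), round(cr)

print(rgb_to_ycbcr(255, 255, 255))  # white -> (255, 128, 128)
```

Greys carry all their information in Y, which is why the chroma channels can be subsampled and quantised much more aggressively than the luminance.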

  15. jai
    FAIL

    No support for transparency yet?

no transparency support means this is pretty much only useful for Flickr and the Google Images search page. for everyone else on the web, we'll continue to use PNGs and GIFs

    1. blackworx

      Eh... no.

      That's not a fail. In fact I'd say the fail is in your comprehension.

This is photographic compression we're talking about, not bitmap graphics. Of course "the rest of us" (whatever that means) will continue to use PNGs and GIFs... for the purposes PNG and GIF were designed for (i.e. not photos).

  16. rahul
    Grenade

    What exactly does Google have to do...

... so that people will stop believing its every action is "evil?" Reminds me of xkcd's "Password Reuse" strip.

So, it's done a bad thing by open-sourcing this format? Here's something it COULD have done to generate revenue: held the format closed source, or required an encoding license or so, then used its ubiquity to allow the format to display in browsers, then ensured that only its ads are in the fast-loading format. Anyone else trying to use the "fast loading" format would have to pay Google a license to do so. What would you write in this case?

Spewing your vitriol at every Google development is neither fair nor reporting. Yes, it's trying to leverage its revenues; at least it is open about it. You don't like it, use Bing. Do you think they do not collect your data? Or hey, here's a thought: go Facebook. I'm SURE they are better than Google.

    1. JimC

so people stop believing its every action is evil...

      Well I could make a list. Just for a start

      - Pay creators a reasonable share of the advertising revenue

- Introduce a takedown system on YouTube and the like that doesn't put the onus for everything on the people they are ripping off

- stop trying to rip off creators by attempting to get as much as possible labelled as orphan works

- provide a proper, non-onerous opt-out for Street View, including a democratic vote by communities not to be involved at all

      - stop coming up with BS about collecting data "by accident" and then not deleting it

      I'm sure plenty more could be added... go for it folks.

    2. Justin Clements

      Easy

      > What exactly does Google have to do so that people will stop believing it's every action is "evil?"

Stop telling people that they aren't evil for a start; stop trying to compete with everyone else and concentrate on their core stuff; stop pretending that their brand is bigger than it is; stop doing a book deal to prevent others from competing in the same market; stop copying everyone - and for heaven's sake will they stop trying to beat Facebook - as I (and the rest of the planet) aren't about to sign up to a new social network just to make Google richer. Deal with it.

      In fact, if they just stopped becoming the 21st century Microsoft, that would be a real bonus.

      1. Anonymous Coward
        Jobs Horns

        Re: Easy

        <quote>In fact, if they just stopped becoming the 21st century Microsoft, that would be a real bonus.</quote>

        Yes, because that is Apple's job.

  17. JDX Gold badge
    Thumb Down

    re:How about you give us something unambiguous Google?

    It's open-source, try it yourself you dummy.

  18. Anonymous Coward
    Anonymous Coward

    It won't go anywhere...

    ... at least not quickly.

Microsoft still has the lion's share of the browser market and, given their track record, they won't adopt this anytime soon, if at all.

JPEG as a lossy format is good enough for the job. Many years back, we thought we'd see the end of GIF images when PNG started to become prevalent, but they are still around on the web - it also took Microsoft until IE7 to support alpha transparency in PNG files!

    I'd like to be wrong, but experience tells me I'm not.

    1. Charlie Clark Silver badge

      What do we want?

      "Gradual change!"

      When do we want it?

      "In due course!"

PNGs are now replacing GIFs around the web (look at the comment icons on this site), but they will never replace the legacy GIFs. This isn't a problem, as you don't have to pay royalties to read GIFs, only to write them. Actually, hasn't the patent already expired?

      The same is likely to be true of a new lossy bitmap format. However, with CSS3 media queries that shouldn't be too much of a problem to implement. And if savings of 30% per photo are possible then uptake is likely to be pretty good. Yes, I know this means storing two versions of the same image, but disk space is less of a problem than bandwidth.

    2. Timothy Slade
      Thumb Down

      @It won't go anywhere...

      Yes it will.

      Opera are on board.

      Opera provide the browsers for the greatest number of mobile devices (Bing for the numbers, I can't be bothered to), the Wii, and a bunch of other web capable consumer devices.

      It's particularly relevant to mobile and users with low bandwidth connections, say in the 'developing world', where (IIRC) a majority of internet connections are mobile (don't think smartphone, though, think Nokia candybar style phones).

      1. Alastair 7

        I know you'll be annoyed

        But the basic suggestion that a technology is going somewhere because *Opera* are on board made me giggle heartily.

        Yes, they have a lot of mobile browsers, but that won't make or break the game.

    3. John Gamble

      The PNG Situation

      I did wonder about Google's choice of format to improve. Had I guessed based on my own browsing, I would have picked PNG as the format to work on. As other commentators have said, new compression techniques have been found in the intervening years, and PNG didn't quite manage to replace GIF due to its lack of animation.

      If it's true that Google Maps uses JPEGs, that obviously explains a lot. And I suppose sites like flickr and photobucket impose an increasing cost to cache these days. Anyone have more information on this?

      1. Charles 9 Silver badge

        PNG

        PNG is overtaking GIF for lossless still pictures because it's not patent-encumbered and provides better compression (Deflate vs. LZW). Although newer lossless compression schemes have emerged (such as LZMA), most require more resources to implement (LZMA, for example, is memory-intensive) and are a poor fit for web browsers with tight resource limits.
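        As a rough illustration of that trade-off (a sketch in Python, whose standard library happens to ship both codecs; the sample bytes are made up, not real image data):

        ```python
        import lzma
        import zlib

        # Stand-in for raw image rows: repetitive, highly compressible bytes.
        data = bytes(range(256)) * 400  # ~100 KB

        deflated = zlib.compress(data, level=9)   # Deflate, the scheme PNG uses
        lzma_out = lzma.compress(data, preset=9)  # LZMA: better ratios, more memory

        # Both are lossless: the original bytes come back exactly.
        assert zlib.decompress(deflated) == data
        assert lzma.decompress(lzma_out) == data
        print(len(data), len(deflated), len(lzma_out))
        ```

        Both shrink this input dramatically; the point is that LZMA's extra ratio is paid for in decoder memory, which matters in a browser.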

        As for animated PNGs, they're still arguing over the matter. It's currently between MNG and APNG, and it may take a while longer for the dust to settle. Plus there's the possibility of the whole thing being moot thanks to things like Flash and web video.

    4. Hans 1
      Boffin

      e t t l i

      That might be the reason why png did not replace GIF.

  19. san1t1

    smaller images mean less bandwidth

    And that is all.

  20. Anomalous Cowherd Silver badge
    Happy

    Good stuff

    I work with image codecs an awful lot, and this is a really good thing IMHO - a patent-free alternative to JPEG2000.

    JPEG also uses YUV under the hood - most (all) lossy compressors do, as it makes sense to allocate more bits to luma than chroma. It's web-focused, lossy and intended as the final step in the chain, so converting from sRGB to YUV is no biggie.
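    For the curious, a minimal sketch of the full-range BT.601 RGB-to-YCbCr transform that JFIF/JPEG uses (Python; the sample pixel values are made up):

    ```python
    def rgb_to_ycbcr(r, g, b):
        """Full-range BT.601 RGB -> YCbCr, as applied by JFIF/JPEG before
        the DCT, so the encoder can spend more bits on luma (Y) than chroma."""
        def clamp(v):
            return max(0, min(255, round(v)))
        y  =  0.299    * r + 0.587    * g + 0.114    * b
        cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
        cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
        return clamp(y), clamp(cb), clamp(cr)

    print(rgb_to_ycbcr(128, 128, 128))  # mid-grey -> (128, 128, 128): neutral chroma
    print(rgb_to_ycbcr(255, 0, 0))      # pure red -> strong Cr component
    ```

    Chroma sits at 128 for greys, which is what lets the encoder subsample Cb/Cr aggressively without the eye noticing much.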

    You're not going to see transparency in this sort of codec, because it's for photos and continuous tone images. Using JPEG for anything else is wrong, just wrong - use 8-bit PNG.

    The glaring omission for me is that some useful chunks are missing. Embedding an ICC profile would have been nice, an option for grayscale or CMYK would have been great, and chunks for resolution and RDF/XML metadata would have been a no-brainer. Given how metadata-oriented Google is, these are pretty surprising omissions, especially as JPEG can already do some of this.

    As for solving a problem that doesn't exist, duh. Try asking an ISP what a 10% reduction in bandwidth use would save them. If they'd added alternative colour spaces I'd also suggest asking a professional photographer how many GB of photos they have scattered around, but I think they may have missed that trick.

    1. DuncanL

      RIFF

      "The glaring omission for me is there are some useful chunks"

      It's a RIFF stream - you can put in any chunks you like, "ID3" etc. Getting everyone to support and expect them is another issue though....
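      A rough sketch of how simple that is on the wire (Python; the "META" FourCC and payload here are invented for illustration, not part of the WebP spec):

      ```python
      import struct

      def riff_chunk(fourcc: bytes, payload: bytes) -> bytes:
          """Serialise one RIFF chunk: FourCC, little-endian 32-bit size,
          payload, then a pad byte so the next chunk starts word-aligned."""
          assert len(fourcc) == 4
          chunk = fourcc + struct.pack("<I", len(payload)) + payload
          if len(payload) % 2:
              chunk += b"\x00"
          return chunk

      # An invented metadata chunk that could ride alongside the image data:
      print(riff_chunk(b"META", b"Author=me"))
      ```

      Readers that don't recognise a FourCC can just skip `size` bytes, which is why arbitrary chunks are cheap to add - and why getting anyone to *expect* them is the hard part.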

    2. DavyBoy79
      Thumb Up

      Fair point

      Didn't know that JPEG uses YUV under the hood. As for embedding ICC profiles... surely, because it's a RIFF file, that's really easy to do, given what RIFF actually allows you to store - lots of random shite in chunks.

  21. Kubla Cant Silver badge
    Headmaster

    Sic???

    "Some engineers at Google decided to figure out if there was [sic] a way to further compress lossy images like JPEG to make them load faster".

    I'm struggling to understand the relevance of "[sic]" in this sentence. It's normally used in a smart-arse way to flag up errors of spelling or grammar.

    There's a split infinitive, but that comes after the "[sic]", and a split infinitive is generally regarded as a minor error of style at worst.

    I suppose the "was" that precedes the "[sic]" should be subjunctive rather than indicative. You'd have to be a dedicated grammar Nazi to sic up on that, though.

    1. Sarah Bee (Written by Reg staff)

      Re: Sic???

      I don't know how I lived without this place when I was on holiday. I really don't. It enriches me.

      1. blackworx
        Heart

        Thank god you're back

        I was starting to get paranoid at the number of rejected comments while you were gone. I knew it couldn't have been me as the comments in question were imho no more idiotic or snarky than my usual tripe. I was starting to wonder if there'd been a secret policy change.

      2. Justin Clements
        Happy

        Re: Sic???

        2 weeks in therapy is not a "holiday".

    2. handle

      Pedantry

      If the author is going to be that pedantic about the use of a subjunctive, he really ought to have noticed that later in the same sentence, a far worse crime is committed - the use of an adjective (faster) instead of an adverb ([more] quickly).

    3. John H Woods Silver badge
      Happy

      If I WERE a grammar Nazi ...

      ... I'd point out that the use of [sic] simply means that the source material is quoted exactly and not paraphrased by the author. It is not usually taken to indicate a judgement by the author on the quality of the original prose: it's just intended to stop people piping up "You mean WERE" or "You split your infinitive".

      Of course, I agree with you that it is acceptable to occasionally split infinitives and, if I was asked I'd also be relaxed about people not using subjunctive forms. But I do find it ironic that in an attempt to avoid comments about the finer points of style in a quoted section, the author has attracted a comment about the way they've quoted it!

    4. Anonymous Coward
      Grenade

      Praise be..

      ..that I wasn't the only person whose first thought was to rush to the comments section to express confusion over this.

      I would argue that [sic] isn't really appropriate for grammatical issues anyway, given that it of course stands for "spelling is correct". In the case of direct quotations surrounded by quotation marks, [sic] is only relevant to show that you've not introduced a typo yourself.

      I'll be honest, if I'm quoting something with spelling errors or grammatical atrocities in it and the errors are themselves not relevant to the context of the conversation/piece, I usually just fix the errors, rather than behave like an arse.

      Perhaps the original quotation did have a spelling error and it got removed when the author's spell checker was run against the piece :p

  22. Piro
    Pint

    JPEG artifacts...

    Are the bane of my life - spotting them in images where JPEG should never have been used, text-heavy banners for example.

    Then again, this isn't solving that. PNG already did that, but somehow people fail to understand that different image compression techniques are applicable to different types of images.

    1. JimC
      Happy

      I envy you...

      If JPEG artifacts are a big enough deal to be the bane of your life, you must have an easy time of it compared to most of the rest of us...

  23. elewton

    Where is the perceived negativity?

    Some responders are defending Google as though it were being attacked. Where is that happening? The article seems quite friendly to me.

  24. techmind

    JPEGs vary enormously anyway

    JPEG specifies the encoded data stream and how to decode it. Exactly what psychovisual model you use to reduce the data is up to you. The best modern JPEG encoders (Adobe Photoshop's "Save for Web" is pretty good) get a far better ratio of real quality to byte size than the encoders of 10 years ago.

    Even if the new algorithm is better, that's not to say that a lot of the images on the web couldn't be made smaller and/or better by re-coding from the uncompressed source using a newer JPEG encoding algorithm.

    In my experience of 3G, it's the compression steps applied by the mobile operator which SLOW DOWN the loading. That and the 100ms+ latency.

  25. Eric Teutsch

    Related article --> has Microsoft and Google mixed up!

    http://www.tgdaily.com/software-features/51815-microsoft-promises-to-speed-up-the-web-with-new-image-format

  26. JDX Gold badge
    Grenade

    @Timothy Slade

    Hilarious. The whole web is just hanging off Opera's every decision so they can chase the decades-ahead-but-not-used-by-anybody-for-some-strange-reason "titan" of browsers.

    It's not just MS - FF and Safari need to come onboard too.

    Are Google getting support into their own code, or the shared base used by other browsers?

    1. Charles 9 Silver badge

      Safari and Mozilla are already in.

      Safari uses WebKit, so they'll be among the first to implement WebP.

      And as the article states, Mozilla is already collaborating on the project, so they're interested, too. Once the standard's nailed down, expect an update to include support for it (perhaps even Mozilla's helping to push the standard in time for the 4.0 release).

  27. Dick Emery
    Grenade

    The only place I can see this being useful...

    ...is on a certain notorious image board :)

  28. someone up north
    Happy

    weppy ? it can all end up in tears

    weppy - a good name,

    just hope that it does not all end up in tears!

  29. F Seiler
    Stop

    about the ratio "saved"

    Comparing the size of two lossy compressed images is totally meaningless unless you can show that both have the same visual quality, or at least quote something like an SNR figure... hell, even compressing one using the other as source is totally meaningless.
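    As a rough sketch of the kind of figure meant here (Python; PSNR over made-up pixel values - real comparisons would lean on perceptual metrics, but the principle is the same):

    ```python
    import math

    def psnr(original, compressed, peak=255.0):
        """Peak signal-to-noise ratio in dB between two equal-length
        sequences of 8-bit pixel values; higher means closer to the original."""
        assert len(original) == len(compressed)
        mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)
        if mse == 0:
            return math.inf  # identical images
        return 10 * math.log10(peak ** 2 / mse)

    print(psnr([10, 20, 30], [10, 20, 30]))            # inf: identical
    print(round(psnr([10, 20, 30], [11, 21, 31]), 1))  # ~48 dB: off by one level
    ```

    Quoting "40% smaller" only means something when the two files sit at the same point on a quality scale like this.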

    Anyway, the ratio of 40% sounds not too far off. If you took out-of-the-box h.264 and applied all the tools applicable to still images, you'd probably end up with a similar figure compared to JPEG, at the same "computed visual quality". No surprise there... it's almost 20 years since JPEG was born (well, also almost 10 since h.264, but still a 10-year difference...).

    Looking at JPEG2000, it also claims around that margin over JPEG. Frankly, I think it performs worse than JPEG at high quality. And while details are somewhat better preserved at qualities you really don't want your eyes to suffer through, it looks like shit there. IOW, it is technically worthy progress and may have some merit at the lowest quality levels, but no thanks.

    Now, this WebP... it's derived from VP8. Aren't On2's video codecs based on... wait... wavelets, like, um, JPEG 2000? Regardless of whether that is actually true or just a misunderstanding of mine, On2 videos damn well look like they have exactly the same crappy degradation behaviour as wavelet-compressed images do. Maybe it is technically not exactly wavelet, but its failure mode looks a lot more like typical wavelet than typical block-transform. It just doesn't look pleasant.

    I hope "40%" is just not enough. While h.264 got away with "50% off MPEG-2" - basically because the industry needed a new video format to sell "HD" - I hope Google doesn't yet have the power to force this on everyone. I mean, I perfectly recognise JPEG is old and by now world+dog could do "better" compression-wise, but just because Google says "40%" I don't think it's worth it. After all, JPEG 2000 faceplanted with wavelets; if it weren't for YouTube/Flash crap-league videos, nobody would even know what a wavelet-encoded video looks like; h.264 isn't wavelet even though the tech "existed" at the time, and the next MPEG codec doesn't smell like it's based on wavelets either. To me that hints at "wavelet compression isn't ready for humans to look at".

    Sorry for the tirade. I like the wavelet concept technically (e.g. for computer image analysis like feature recognition), but my eyes don't like waveletted images at all.

    1. gratou

      @Seiler

      >I hope "40%" is just not enough. While h.264 got away with a "50% off MPEG-2" .....

      If images can be further compressed by 40%, surely video can do a lot better than 50% seeing that it also benefits from compressing along the time line.

  30. Christopher E. Stith

    GIF is lossy for photos

    Unless your camera only takes pictures in 256 colors, you're losing a great deal of data in a GIF. It is primarily useful for commercial graphics and illustrations. Just because it encodes every pixel does not mean it's keeping all the data. PNG can be used lossily or losslessly, even though it keeps every pixel, because you can optionally downcode to a small number of color bits per pixel before encoding.
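    A toy sketch of what that downcoding loses (Python; the 3-3-2 palette and sample pixel are invented for illustration):

    ```python
    def to_332(r, g, b):
        """Quantize a 24-bit RGB pixel to an 8-bit index in a fixed 3-3-2
        palette (3 bits red, 3 green, 2 blue) - a crude 256-color downcode."""
        return (r >> 5) << 5 | (g >> 5) << 2 | (b >> 6)

    def from_332(idx):
        """Expand the palette index back to RGB; the discarded low bits
        are gone for good - that's the data loss being described."""
        return ((idx >> 5) << 5, ((idx >> 2) & 0b111) << 5, (idx & 0b11) << 6)

    pixel = (200, 100, 50)
    print(from_332(to_332(*pixel)))  # (192, 96, 0): not the pixel we started with
    ```

    Real GIF encoders pick an adaptive palette rather than a fixed one, but the round trip is just as irreversible.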

  31. foxyshadis

    Yet another rehash like Microsoft's WDP

    So Microsoft comes up with an image format based on adding overlapped blocks to JPEG to get rid of blocking and shrink images at the same quality, and they are roundly castigated for attempting to embrace and extend to a proprietary format - even after releasing the spec, the reference encoder, and a royalty-free license. Google is praised for doing essentially the same thing, but with a much more complex encoder that will never be fit for most hardware users. Various image codecs have been built off h.264 as well, but none have taken off, and all have so far been pale imitations of JPEG2000 at low data rates, and barely equivalent at higher ones. What makes Google think that its own will get any more traction outside of Google Images, when most sites will just use the bog-standard good-enough format that everyone can read, instead of storing two or three copies just for a few "advanced" readers?

    At least one h.264 image codec has the advantage that you can use it with flash, which world+dog already has.

    The in-progress h.265 codec is the only one that can surpass it so far - at the cost of 15-minute encoding times for each image.

    1. Charles 9 Silver badge

      Microsoft didn't go all the way.

        They made WDP royalty-free, yes, but they never allowed others to tinker with it. This is the difference-maker with Google's effort. They're going all the way and OPEN-SOURCING the entire works (and if they use the same license as WebM, it'll be based on the well-recognized BSD license). This means it's open season, and anyone with good ideas can improve on it.

This topic is closed for new posts.

Biting the hand that feeds IT © 1998–2020