
Open Source?
Anyone notice a rather surprising omission from the list of programs to have, or soon have, support for WebP? GIMP - which is open source!
It seems it's largely closed source and corporate entities that have made the moves so far.
Google has announced that Gmail and Picasa as well as its Chrome browser are now using WebP, the image compression format it open sourced last fall in an effort to replace the aging JPEG standard. With a Friday blog post, the company also said that it has made several improvements to the technology since it was first unveiled …
"It seems it's posters like you who give the internet a bad name".
Have you asked the authors of the "image editor too complicated to use" why they haven't added webp yet?
webp uses existing open source libraries very heavily to provide a truly compelling format: bitmap images are both significantly smaller and of better quality than comparable JPEGs or even JPEG 2000, especially if they contain text.
I was thinking the same as the OP - GIMP's format support is pretty good. I imagine WebP will appear in time.
As for being "too complicated to use", I think GIMP's alright really. I'm not even a graphics guy, but I find it fairly straightforward to use when I'm doing the odd bit of editing. If the layout is too unlike Photoshop for you, try GimPhoto.
It's OK, I will explain it to you, since it is obvious that you have been living outside of modern society for some years.
In many countries, especially the United States Of America, they have things called "patents" -- this is a kind of "bagsy" people can have on something that they think they may make money from at a later date -- a kind of bet if you will. So, people who disagree with this kind of blatant abuse of power tend to look to other ways of doing things, which aren't covered by these so-called patents.
Oddly, Google prefer to move away from patents -- but to do so they have to register patents then "open" them.
With this type of strategy google will own the world's data and the world will be a worse place. But nobody else seems to care, so I wouldn't worry your pretty little head about it.
Licensing. Not only do the JPEG formats include patentable tech held by third parties, but all of the JPEG formats are "owned" by the Joint Photographic Experts Group itself, and we are simply "allowed" to use those formats until the group decides that it no longer wishes to "allow" that (like CompuServe and Unisys did with the GIF format back in 1995, and like Larry Ellison over at Oracle is doing with Java and MySQL, for example).
Other than via patents, you cannot "own" a format. AFAIK only JPEG 2000 is encumbered, and only in a very small and avoidable (albeit at a size penalty) subsection of the image format. Google's main push is because this should outperform (or at least match) JPEG 2000.
Unisys never "owned" the GIF format, they owned a patent on the compression scheme used in it. All documentation I have seen indicates that Unisys was unaware of the compression scheme being used by GIF until it was already widely deployed (remember, in the compuserve days, not everyone was online all the time like now, so this is plausible at the vary least).
Oracle may be being a dick, but Sun always said that the Java patents were there to guard against incompatible Java implementations. In fact, Sun used this identical tactic when Microsoft was pushing an incompatible Java implementation. Their perspective (which seems to be somewhat accurate) is that Google is pushing an incompatible implementation. The 600 lb gorilla is that Oracle won't permit them to run the certification suite, which is the only problem I have with Oracle in this case (although there are other reasons to dislike Oracle).
I'm not aware of anything going on with MySQL, other than one of its former creators who in retrospect finds the GPL not permissive enough. Also not something I can blame Oracle for.
Not as of the latest public release, and we're expecting the next iteration from ISO in the next few months, so I wouldn't count on it for a while, at best. PDF can already embed JPX (aka JPEG 2000) compressed images, so I'd guess Adobe will leave it there unless this offers a marked improvement.
probably hits the nail on the head. The whole MIME-type support in HTTP does allow browsers to specify what documents they can handle, so good web servers should be able to tailor the files served to suit the user. If this means format shifting or resampling at lower quality, then the server must know how to do those things and support obviously needs to be added for new formats as they appear.
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
All HTTP servers need to set the MIME type on their responses. Without webp in the magic list, Apache will set application/x-unknown or similar, leaving it to the browser to try and work out what to do with the content, presumably by reading the first few bytes. With webp in the magic list, Apache can set image/webp and the browser knows straight away what to do. Not essential, but certainly helpful.
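For what it's worth, the Apache side of that is just a one-line mapping from the file extension to the MIME type, along these lines (a sketch only; where you put it depends on your setup, e.g. httpd.conf, a vhost, or .htaccess):

AddType image/webp .webp

With that in place Apache stops falling back to the unknown-type guess for .webp files and labels them image/webp.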
When the web browser requests the contents of an <img> tag, it sends an HTTP request including a header like
Accept: image/png, image/jpeg, image/bmp, image/svg+xml;q=0.5
The web server can (as others have mentioned) use this to provide the image in the 'best' format that the browser can render, bandwidth considerations included. The optional q= parameter is an indication of how acceptable the MIME type is to the browser, 0 being unacceptable and 1 being ideal.
A conformant web browser should ignore the file extension in the <img> tag's URL and instead use the MIME type sent in the server's response to decide the format of the image (and all other media, for that matter).
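If it helps make the q= business concrete, here's a rough Python sketch of the server-side selection; the header value, format list, and function names are made up for illustration, not taken from any particular server:

# Rough sketch of Accept-header content negotiation for images.
# The header value and the list of available formats are made-up examples.

def parse_accept(header):
    """Return a {mime_type: q_value} dict from an HTTP Accept header."""
    prefs = {}
    for part in header.split(","):
        fields = part.strip().split(";")
        mime = fields[0].strip()
        q = 1.0  # q defaults to 1 when omitted
        for field in fields[1:]:
            name, _, value = field.strip().partition("=")
            if name.strip() == "q":
                q = float(value)
        prefs[mime] = q
    return prefs

def choose_format(accept_header, available):
    """Pick the available MIME type the browser rates highest (q > 0)."""
    prefs = parse_accept(accept_header)
    ranked = sorted(available, key=lambda m: prefs.get(m, 0.0), reverse=True)
    return ranked[0] if prefs.get(ranked[0], 0.0) > 0 else None

accept = "image/webp, image/png;q=0.9, image/jpeg;q=0.8"
print(choose_format(accept, ["image/jpeg", "image/png", "image/webp"]))
# prints image/webp - the server would then send that version of the image

(Real Accept headers also contain wildcards like image/* or */*, which this toy version simply ignores.)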
"I see no reason why they would avoid this one."
Because they have very few developers and have been crawling along with improvements to the core over the last few years, let alone finding the time and manpower to add new image formats which will, at best, only complement adequate existing formats.
An open source app not being able to implement an open source format or make use of any existing Google code isn't a healthy sign.
I mean OK, sure, once _all_ browsers support it, it might have a chance, but seriously, images don't take up much bandwidth anymore. When loading a website, the images often barely add to the loading time. Advertisements often add a lot more, as they often require a domain name lookup.
So you might get 40% less traffic for images, which might amount to less than a percent of total traffic, but you lose compatibility. That's a fairly bad tradeoff at the moment.
Just look at the example of PNG. It has only just surpassed GIF. It's the only image format in widespread use that didn't exist before the WWW became popular. It took 15 years for it to become popular. Maybe in a decade or so Google's format will have reached a similar popularity, but it's too early for it now.
Webp is visibly superior to JPEG while creating much smaller files. As it uses the same algorithms as WebM, you can also hope for better optimisation, including the hardware acceleration that is just around the corner. Webp is also a container format, so it can do lossless compression, very efficiently, if desired, so you can use the same format for your graphics as for your photographs, effectively streamlining your workflow.
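The lossy/lossless switch only matters if your tools expose it, of course. As a rough illustration, a Pillow build with WebP support can write both modes from the same source image (the file names here are placeholders, not anything from the article):

# Sketch: same source image written as lossy and as lossless WebP.
# Requires a Pillow build with WebP support; "photo.png" is a placeholder.
from PIL import Image

img = Image.open("photo.png").convert("RGB")
img.save("photo_lossy.webp", "WEBP", quality=80)        # lossy, small file
img.save("photo_lossless.webp", "WEBP", lossless=True)  # lossless, larger file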
Adoption of PNG was slow partly because it was initially hamstrung - no animation and no transparency - and so offered little incentive to content producers who didn't need to worry about a Unisys tax. Also, PNG was introduced shortly before Microsoft won the browser wars and killed innovation. It survived as a good replacement for BMP and TIFF and later became the web 2.0 designer's darling because of the alpha channel. Now we're seeing at least two new browser versions per maker per year, with fairly rapid take-up for all apart from IE on the back of security issues.
webp is a good candidate for adoption, which is why it's already in Opera's Turbo stack, and my *guess* is that it will make it into Google's PageSpeed HTTP server module, allowing transparent, feature-driven (WURFL) rollout. And while bandwidth may be cheap and plentiful for many of us, page load speed still matters, especially on those pesky mobile devices, so the operators will want to plug it in as well. And which webmaster in their right mind won't take a 40% reduction in page load time for free?
It may not make much difference to the end user but a 40% reduction in image size is a hell of a difference in bandwidth for the web servers of a high-traffic site or for ISPs who shift massive volumes of data every day.
I agree that it'll take time for it to become popular but it won't become popular at all if they don't start now.
I've wondered why JPEG 2000 hasn't really taken off. It may be partly patent encumbrance and partly that it is just too slow and costly to encode/decode. Microsoft developed a faster encoding that is similar to JPEG 2000, but I don't know the details of it.
WebP uses DCT block coding, like JPEG and MPEG. It almost doesn't pay to get too clever about "better" encoding, because the DCT has been optimised so intensely, to the point of influencing processor instruction architectures.
That would be "HD Photo" or "Windows Media Photo", which was standardised as JPEG XR, if memory serves correct.
I haven't seen any example files floating around, though - so I can't easily compare it to other formats, or otherwise dissect a sample in the name of curiosity/science...
No, really. Save a file as WebP at the highest quality, then try saving the same file as a JPG, matching the file size. The JPEG will be much, much worse. Set the JPEG to the same quality instead, and it'll come out a whopping 66% bigger (yes, that's how percentages work, for those who see 66% and are confused by the fact that it's not 40% =)
Essentially, WebP isn't just "different", it really is unequivocally "better".
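If you'd rather check that than take my word for it, here's a rough Python/Pillow sketch of the same-quality comparison (assuming a Pillow build with WebP support; the file name is a placeholder, and the quality scales aren't strictly identical across encoders, so treat the numbers as indicative):

# Sketch: save the same image as WebP and as JPEG at a nominally equal
# quality setting, then compare file sizes. "original.png" is a placeholder.
import os
from PIL import Image

img = Image.open("original.png").convert("RGB")
img.save("test.webp", "WEBP", quality=90)
img.save("test.jpg", "JPEG", quality=90)

webp_size = os.path.getsize("test.webp")
jpeg_size = os.path.getsize("test.jpg")
print("WebP: %d bytes, JPEG: %d bytes (JPEG %.0f%% bigger)"
      % (webp_size, jpeg_size, 100.0 * (jpeg_size - webp_size) / webp_size))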
Grab yourself a copy of the WebP Photoshop format filter, and a copy of FilterMeister - open any random "real" image (let's say a raw photograph of good ol' nature, or a crisp Illustrator-exported vector graphic), save it to highest-quality JPEG, close the file, reopen the original, save it to highest-quality WebP, close the file, and then open all three.
Create two new images, copy the original image's layer to both, and then copy the JPG layer to one and the WebP layer to the other. Invert the colors on both layers, and set their opacity to 50%. We now have a difference result. Flatten both images; job's a good'un.
So that's the first thing we really care about: can you see any artifacting? I've run this test many times, and my puny human eyes can't. Not without first zooming in a lot, anyway. But let's assume that this isn't enough to convince you, because what should be {127,127,127} could be {126,128,129}, right? Maybe you care about that difference. You shouldn't, but maybe you do, so let's see how strong the difference really is.
Fire up FilterMeister and write a small color ramper - for all three channels (R, G, B; neither WebP nor JPEG officially supports an alpha channel), set it up so that if a value is off from the 127 mark, it gets strongly boosted, say fifteenfold, with a cap at 0 and 255. Now you can see the difference in how JPEG and WebP do their magic - WebP actually uses much nicer spells, with less intense differences between the original and the lossy-encoded image.
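The same invert-and-boost trick can be approximated without Photoshop or FilterMeister. Here's a rough Python/Pillow sketch that takes the absolute per-channel difference (rather than the 127-centred composite described above) and amplifies it fifteenfold; the file names are placeholders:

# Sketch: visualise compression artefacts by amplifying the difference
# between the original and the lossy file. File names are placeholders.
from PIL import Image, ImageChops

original = Image.open("original.png").convert("RGB")
lossy = Image.open("test.webp").convert("RGB")    # or "test.jpg"

diff = ImageChops.difference(original, lossy)     # 0 wherever pixels match exactly
boosted = diff.point(lambda v: min(255, v * 15))  # fifteenfold boost, capped at 255
boosted.save("difference_boosted.png")            # dark = faithful, bright = artefacts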
This really is a fantastic image format, and long overdue. The only tarnish on its reputation is that the world hates "new" formats. Thankfully, the company that wants to push it also happens to make one of the more popular browsers. And you can be sure the code for enabling WebP in Chrome will find its way into WebKit, followed by the Mac supporting WebP. Photoshop already supports it with the download of a single file, and suddenly its future looks nice and bright - a superior image format alternative to JPG that makes super tiny files? About bloody time.
Mine's the one with the WebP filter on a USB stick attached to my key ring.
Is it not about time we moved on from this miserly pursuit of making things as small and as bad as we can get away with, and moved towards luxurious lossless compression? JPG and its ilk were fine back in the day, but these days, with our 50Mbit connections (or just 8Mbit via my mobile) and 2TB hard drives, I don't really see the point in skimping like that.
You quite assuredly would want the best compression you could get if you were paying the bandwidth bill for a website that transmitted billions of images per day.
Like Google.
More interesting is the green angle. Are compressed images unconditionally better because they eat less electricity in all the network routers? Or are they worse because they need extra CPU cycles to create them in the first instance? How many retransmissions equals one compression in CO2 emissions? (Sorry, no answers).
Maybe it's because not everyone is as fortunate as you to have a high-speed connection and a fat hard drive. Maybe it has something to do with the way that websites are no longer constructed carefully with a view to loading times, but instead with a 'chuck it in, everyone is on ADSL2 at least' attitude.
Our own eyesight uses lossy compression. We don't view entire images and process every detail instantaneously. If we did, optical illusions wouldn't work. Why should we distribute photographs using lossless compression when at first glance the human eye can't even tell the difference? We save bandwidth and provide acceptable results with lossy compression. Man-made inventions often are inspired by biological adaptations - lossy compression is just a digital example.
Open source or not, if Google pack their browser with "standards" then they are going to fragment the web as surely as Microsoft did when it churned out VML, ActiveX and VBScript. Every other browser will have to choose to implement these things or not, and even those that do will be playing catchup.
It's nice to have new proposals for video, pictures and protocols, but they should go through the proper channels for approval. They shouldn't be fast-tracked, and they definitely shouldn't be under the stewardship of a single vendor.
I can think of two main reasons for compressing an image more for a given quality. One is that you have limited storage, the other that you have limited bandwidth to transfer the image. These days storage is incredibly cheap so the only people who would benefit from such an improvement would be those with very limited bandwidth. As such there is a small and shrinking market for this.
Storage and bandwidth cost companies money, especially those whose business model rides on making tiny slivers of advertising revenue from huge amounts of storage and bandwidth that are free to the end user. As the article explained, Google's servers send out a lot of images each day, and using a better format will have a cumulative effect on their running costs.
As for end users and bandwidth, you may have a fast landline broadband connection, but many people in the world use extremely slow public/shared wireless networks or dialup in less developed countries. They'll all benefit for many years to come. That's why companies like Opera are today pouring resources into developing proxy networks to speed up connections, something AOL was criticised for by the same kind of techies in the 90s when everyone was on dialup.
PNG is a replacement for GIF, and so is suited for line art or simple gradients, etc. It's very poor at compressing photos or more complex computer graphics. Even images of text can be better compressed with JPG sometimes, once you apply a few Photoshop effects, glows and shadows, etc.
PNG is used far more today than it ever has, but there continues to be a lot of inertia and ignorance keeping GIF going. And photo-like imagery is probably more prevalent on the Web than line art, so there are fewer images added to the Web each day that require the type of compression PNG delivers.
There's certainly a need for something good to replace JPG. The problem with Internet connections getting faster is that web pages continue to bloat proportionally. It's not uncommon to see web pages that contain 1MB of HTML, CSS, Javascript, images, Flash, etc, especially on sites that embed feeds from several social networking sites and third-party news feeds. That makes the Web as slow to the end user as it's always been. As Websites will probably continue to become graphics-heavy, WebP may well help speed things up (as will making good use of SVG and embedded fonts etc).