OMG lolkatz
People who post funny cat pics have no idea what Mozilla/Firefox is.
Click on the 'E' for internets!
Mozilla has announced a new open-source JPEG encoding library, which it says could significantly reduce the amount of network traffic used by the web. Dubbed "mozjpeg," the library incorporates image-shrinking algorithms from the public domain JPEG-optimizing utility jpgcrush into the encoder itself, resulting in images that …
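For anyone wanting to kick the tyres: mozjpeg builds as a drop-in libjpeg replacement and ships the usual cjpeg command-line encoder, with the jpgcrush-derived optimisation built into the encoder itself, per the announcement. A minimal sketch driving it from Python; it assumes a mozjpeg-built cjpeg binary is on your PATH, and the file names are placeholders:

```python
import subprocess

# Re-encode a PPM source with mozjpeg's cjpeg. -quality and -outfile are
# standard libjpeg cjpeg flags; mozjpeg applies its jpgcrush-style scan
# optimisation in the encoder itself (per the announcement above).
subprocess.run(
    ["cjpeg", "-quality", "75", "-outfile", "out.jpg", "in.ppm"],
    check=True,
)
```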
The problem with modern encoders is that they rely on the DCT method (the D stands for Dog).
This has been scientifically proven to not optimally optimise the compression of amusing felines.
I propose a new standard: KitPEG which has custom routines to properly deal with whiskers, fangs and mad staring eyes.
I would have called the standard Kitten Squasher, but, whilst technically correct, that would probably get me in trouble.
Slight error: DCT is Discrete Canine Transform and is a cruder version of the more elegant Fast Feline Transform (FFT), used for smoother, purrier cat pictures.
The web has certainly moved on from the earlier days when it was dominated by the somewhat limited Giraffe Image Format and the rather more limited appeal of Porn N' Gerbils files.
AOL had a proprietary encoder they called ART, which gave much better compression than JPEG for most things. When the AOL app went away, so did any use of that encoder. I think that whoever owns AOL now (Huffington?) ought to just release that patent to the public domain, and the W3C ought to adopt it into HTML.
Why not simply charge a tax "per image/video" to the website host? Then we would see a dramatic decrease in the number of images used.
Images have their place in the web, that is undeniable, but unfortunately today the majority of images are either catporn, porn or publicity (sometimes even publicity for more porn).
As for those responsible for providing bandwidth hogs with suitable material, Google would without a doubt be in the number one position: YouTube, YouPorn and Google-generated advertising…
OK, the article is about JPEGs, but JPEG usage is tiny in comparison to the bandwidth hogging that video generates.
This already happens; sites pay for bandwidth from ISPs or peering partners, with enormous enterprises like Google creating (or buying) their own globe-spanning physical infrastructure to handle the bulk of it, while local peering arrangements give a short 'hop' across the Internet.
To take your Google/streaming video example, Youtube videos would be shared across Google's own nigh-transplanetary fibre network between their local data centres, then popped onto the network of your ISP or their suppliers.
So yeah, it takes a lot of bandwidth, but it has its own dedicated links. So in terms of bandwidth used, an email sent between two residential email servers from, say, London to Mountain View uses more 'internet' bandwidth than a hundred YouTube cat videos. The emails in that example are being carried by the 'public' internet rather than a company-specific network.
Also, why would the web host pay? That'd mean you could put them out of business with a botnet pulling image or video files from their web server. The cost would have to be borne directly by the consumer.
And why just web sites? What if it was an FTP site? A Gopher site? P2P or torrent traffic?
WHO would the web host pay? Obviously, I'd be happy to personally take a few hundred million a year from Internet Taxes, but realistically it'd be a Government who'd do this. And at least one of them wouldn't, giving them a great market advantage for enticing high-bandwidth businesses to set up shop with them.
What I'm getting at is that what you described is at best already implemented and at worst a terrible idea that'd lead to the bankruptcy of various small companies.
People who are so short of bandwidth that jpegs are a problem will be very unlikely to run video, so your leap from an article about jpegs to video is bordering on gibberish.
Furthermore, you seem to be unaware of the likes of Adblock, which gets rid of all those inline adverts, video or not, that you object to.
The thing is that taxing things on the internet makes no sense anyway. You could only apply your tax to sites based in the UK, which would of course mean that UK-based websites would find it hard to compete on a global stage. So everybody that could afford to offshore their websites would do so, and everybody who couldn't would be in trouble. So the web-hosting business in the UK would suffer, as would smaller companies who couldn't afford to shift their web operations offshore.
BTW, are you sure you're not Ed Miliband? A tax seems to be his solution to every problem.
Calm down guys, eat some Weetabix and all will be fine.
What I am really getting at is just how stupid this idea really appears to be. Why bother reducing JPEGs when they represent no problem, when there are far more areas that need to be addressed first?
BTW: Google does not provide the connection to my home, my ISP does. So if all of my ISP's users (my neighbours) are downloading torrents, Netflixing, viewing porn or watching X Factor videos, then the cabinet at the end of my street gets choked. It doesn't matter squat diddly whether Google has private networks or not, it is the last mile that counts. Like the majority, I don't live in an area where FTTH is prevalent.
>so your leap from an article about jpegs to video is bordering on gibberish.
No, because I am making an attempt, admittedly a poor attempt at humour, to surface areas which "need" to be covered rather than areas that make almost no impact whatsoever. Remind me again how many average JPEGs equal one reasonably sized video… and you call that gibberish.
Most, as in *most* of the UK population. Down at the old man's urban house there is only about 2Mbps downstream, but I don't have trouble loading image-heavy sites. Up here in our little rural South Pennine village we get ten times that bandwidth, so obviously we don't have a problem. The days when images were slow to load were the days when broadband of 512kbps was considered high speed. Only a minority have to suffer that sort of speed these days.
If you're on the move, 3G and later make the situation pretty much the same. Most people have decent bandwidth most of the time.
If you happen to have low bandwidth then you have my sympathy, but despite what certain parts of the media would have you believe you are part of a shrinking minority.
Because low bandwidth is a problem for only a minority these days, who exactly is going to bother encoding their JPEGs with this tool?
People in cities do tend to be smug about their 5Mbps (and up) connections. Rural wireless here in Canada tends to be 1.5Mbps, and I understand that large areas of the USA are the same. There are a few percent on this continent who are still on dialup, but that is often by choice, so they don't have great expectations. The world is much bigger than the UK and NA though.
I'd have to agree that JPEGs are not much of a problem. A few years ago when I was doing dialup, me and the missus were both able to surf at the same time. Of course we now pull down lots more, but the rural 1.5Mbps "high speed" will still allow audio streaming, YouTube and surfing all at the same time. High-definition video streaming is not going to happen in much of NA any time soon, though.
Except of course that for most of us bandwidth is no longer a problem. Bit late coming this one, like a decade.
Exactly what are you doing on this site? It's quite obvious you do not work in IT, because then you'd realize that it does matter from the point of view of the ISP and the server operators.
Browsing Google+ (yes, some people do actually use it!) I find rather a lot of animated GIFs that appear to take up a lot of bandwidth, and also have a habit of autoloading. I always feel that said GIFs are actually larger than the video stream they were built from...
GIF is a lossless standard like PNG. There's no wiggle room for better compression without discarding data or changing the format; a GIF is a GIF. At this point, it's pretty unlikely that a new animated image format will catch on. Anything you can't do with a GIF, people are probably going to do with a canvas.
People are already doing clever stuff with compressing the motion bit of animated gifs by defining the smallest part of the image that needs to be updated with each frame. Most video clip to GIF software does this.
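The frame-differencing trick described above is easy to sketch. A minimal example using Pillow (the frame file names are hypothetical, and this is just the idea, not how any particular GIF encoder is implemented): diff two consecutive frames and keep only the bounding box of what actually changed.

```python
from PIL import Image, ImageChops

# Hypothetical consecutive frames from a decoded video clip.
prev = Image.open("frame1.png").convert("RGB")
curr = Image.open("frame2.png").convert("RGB")

# getbbox() on the pixel-wise difference returns the smallest rectangle
# containing every pixel that changed between the two frames.
bbox = ImageChops.difference(prev, curr).getbbox()
if bbox:
    patch = curr.crop(bbox)  # only this region needs storing for the frame
    print("changed region:", bbox, "instead of full frame", curr.size)
else:
    print("frames are identical; the frame can be dropped entirely")
```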
Animated PNG is superior in just about every way. Smaller files and, unlike GIF, it can handle more than 256 colors.
Not that it gets much use. Microsoft refuses to include support in IE. They dragged their feet for years about supporting non-animated PNG. It's company policy never to support any open standard unless it has become so popular as to leave no option. Apple is no better.
Just because it's lossless doesn't mean there can't be more than one way to represent the same data. Consider file compression formats (e.g. ZIP, RAR). Those are obviously lossless and it's quite possible to compress the same data to different degrees. I'm not saying there actually is room to improve GIF encoders, but it's not impossible that there could be.
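That point is easy to demonstrate with zlib (the Deflate library behind ZIP and PNG): the same input compressed losslessly at different effort levels produces different sizes, and every one of them decompresses back to identical bytes. A quick sketch with synthetic data:

```python
import random
import zlib

random.seed(42)
# Mildly compressible synthetic data: random draws from a small alphabet.
data = bytes(random.choice(b"abcdef  ") for _ in range(100_000))

for level in (1, 6, 9):
    out = zlib.compress(data, level)
    assert zlib.decompress(out) == data  # lossless in every case
    print(f"level {level}: {len(out)} bytes")  # same data, different sizes
```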
Exactly. I've found that many animated GIFs can be reduced in size by 'intergif'.
Good point. But Mozjpeg doesn't seem very exciting. That 10% saving was with purely photographic images. The jpegs on most web pages are large coloured blocks, like the Register logo at the top of this page. That stuff already compresses into almost nothing anyway.
Also, any tighter compression algorithm is likely to use more CPU at the client end.
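For what it's worth, the claim above that flat coloured blocks already compress to almost nothing is easy to sanity-check with Pillow (the colour and dimensions here are arbitrary):

```python
import io
from PIL import Image

# A flat 400x100 block of solid colour, roughly logo-shaped.
img = Image.new("RGB", (400, 100), (200, 30, 30))
buf = io.BytesIO()
img.save(buf, "JPEG", quality=75)
print(len(buf.getvalue()), "bytes for", img.width * img.height, "pixels")
```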
What about any of the other numerous wavelet and other compression schemes that were demoed endlessly, usually via browser plug-in, in the 90s but never adopted by any major browser as a standard? The tech to do far better compression has been around for a very long time, and considering we're now in 2014 there must be some of it unencumbered by patents.
I can remember some of the plug-in demos on a Pentium II machine running Win98 were a bit slow to decode, but the images were remarkably small compared to the JPEG version. Any current platform should be able to easily eliminate that slowness.
Software patents are probably the biggest problem nevertheless. JPEG2000 (first introduced in 2000) might perhaps come into widespread use in 10 years or so, when it becomes obviously unencumbered, even by submarine patents, and in all jurisdictions. And that is only the basic barebones version of the system, later enhanced versions of the standard are encumbered by later patents.
See how successfully patents promote the progress of the useful arts!
Yes, the cases of JPEG2000 and mp3PRO show it doesn't quite pay; neither rights holder has made a fortune off those two, and in fact they have stomped them into near oblivion. And now, as transfer speed, disk space and bandwidth are decent, these two are pretty much irrelevant anyway.
...
Well, JPEG2000 would still be pretty useful, as open source, that is.
Mozilla forgot one other item as well. JPEGs are compressed in phases. The first translates the RGB image to a different colour space, YCbCr (sketched below). The encoder then reduces the image down depending on configuration (remember, JPEGs lose information by design), and finally hands it off to a general data compression codec.
That codec? Arithmetic.
No, it's not LZW (as in GIF), not BZIP, not even GZIP's Deflate (LZ77 plus Huffman), hell it's not even Huffman! It's plain arithmetic encoding, and anyone who knows their compression codecs knows plain ol' arithmetic encoding SUCKS.
To demonstrate that it sucks, the folks behind StuffIt (the Mac compression tool) put in a custom JPEG compressor, which took the JPEG, undid the arithmetic encoding, and recompressed it with Deflate. When it was uncompressed, it just un-Deflated it, re-applied the arithmetic encoding, and passed it off as the original JPEG. It got something like 10-15% savings.
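For reference, the colour-space phase described in the post above is short enough to write out. A sketch of the full-range JFIF RGB-to-YCbCr transform; real encoders also clamp the results to 0-255:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range JFIF RGB -> YCbCr: the colour-space phase before the DCT."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

# Pure red: luma is low, red-difference chroma pegs near the top.
print(rgb_to_ycbcr(255, 0, 0))  # approximately (76.2, 85.0, 255.5)
```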
If so, why don't JPGs compress worth a damn? You can take a folder with (say) 2GB of JPG images and compress the whole thing with your choice of compression software, and you are usually lucky to get enough compression to be worth the bother or the extra CPU cycles. Well, that was certainly so last time I tried it, and I tried several different compression methods without seeing any difference worth mentioning between them, though it was a few years back now. Has something changed?
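Nothing has changed, and the observation is easy to reproduce. A sketch that JPEG-encodes some noise in memory with Pillow and then gzips the result; the entropy-coded JPEG payload already looks nearly random, so a general-purpose compressor gains next to nothing:

```python
import gzip
import io
import random

from PIL import Image

random.seed(0)
# Build a noisy greyscale image and JPEG-encode it in memory (no file needed).
img = Image.new("L", (256, 256))
img.putdata([random.randrange(256) for _ in range(256 * 256)])
buf = io.BytesIO()
img.save(buf, "JPEG", quality=75)
jpeg_bytes = buf.getvalue()

# gzip barely dents the already entropy-coded JPEG data.
gz = gzip.compress(jpeg_bytes, compresslevel=9)
print(f"{len(jpeg_bytes)} -> {len(gz)} bytes")
```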
> hell it's not even Huffman! It's plain arithmetic encoding
Actually the JPEG standard allows both Huffman and arithmetic coding, and most JPEGs use Huffman, because of concerns about arithmetic coding patents (another case of software patents hindering progress!), and also because using arithmetic coding does not improve the compression very much, and is slower.
>It's plain arithmetic encoding, and anyone who knows their compression codecs knows plain ol' arithmetic encoding SUCKS.
I suspect you misremembered the relative quality of Huffman and arithmetic codings. Arithmetic coding is considered superior to Huffman coding. Please see the Wikipedia article on JPEG, which claims files that use arithmetic coding are about 5–7% smaller.
Compatibility. If you did that, the JPEG wouldn't open in an unmodified browser or viewer, which means it'd be effectively unusable on the internet. New formats like that are always in a chicken-and-egg situation: no-one can use them until all major browsers support them, and there's no reason for the browser developers to support a format no-one uses. See JPEG2000, an intended successor to JPEG which has been stuck in this situation for years, complicated further by being patent-encumbered in a way which makes it legally very difficult for open-source software to support it anyway.
Didn't there used to be an altsrc attribute? So those that couldn't display WebP could just show the JPEG. I'm sure it wouldn't be too hard to knock up an Apache module that could stick that in and re-encode on the fly if needs be, if traffic is so important.
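Server-side negotiation already works today without any new tag: browsers that can decode WebP say so in the HTTP Accept header. A minimal sketch of the idea using Flask; the route and file paths are hypothetical:

```python
from flask import Flask, request, send_file

app = Flask(__name__)

@app.route("/images/<name>")
def image(name):
    # Chrome and Opera advertise WebP support via the Accept header.
    if "image/webp" in request.headers.get("Accept", ""):
        return send_file(f"static/webp/{name}.webp", mimetype="image/webp")
    # Everyone else gets the plain JPEG.
    return send_file(f"static/jpeg/{name}.jpg", mimetype="image/jpeg")
```

In production you'd also want to send a Vary: Accept response header so caches don't hand the WebP version to a browser that can't decode it.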
So whilst it is good to get a 10% saving, aren't they just polishing a steam train?
Improved compression (once) --> reduces bandwidth of every download x every jpg using that encoder.
And it's open source.
This is very much in the spirit of the founders of the web. Thumbs up.
I wonder if El Reg, as a tech site, already uses this JPEG-cruncher tech?
"I have already updated my website to serve WebP, and I set the alt-tag to highlight that missing images are a deficiency in the users browser, and that using Chrome or Opera is recommended."
Way to get people to like your site. Not.
Are you really so egotistical that you consider your site so important that people will change browsers just to look at it? Much more likely that they'll think "stuff you" and not bother with your site.
If you like WebP then by all means use it, but the smart way of doing things is to serve WebP to Chrome/Chromium browsers and do things the normal way for browsers that can't, rather than dictating to your users.
Funny thing is, not so long ago it was Firefox that attracted saddos like you. No matter what browser I've used and whatever browser I've been told to use, I have NEVER changed browsers just to view a website.