Frankly, JPEG XL is much more attractive than the patent-larded monstrosity that is AVIF, but I get their point. Every new format, especially a niche one, increases the already large attack surface of the browser. Several of the exploits used by NSO's spyware came from bugs in the image-format parsers used by iMessage, for example, and you can bet the JPEG XL implementation is nowhere near as robust and battle-tested as the JPEG or PNG ones (which still deliver a steady stream of CVEs despite their maturity).
Still no love for JPEG XL: Browser maker love-in snubs next-gen image format
Browser makers Apple, Google, Microsoft, and Mozilla, alongside two software consultancies, celebrated a moment of unity and common purpose on Thursday with the announcement of Interop 2024, a project to promote web browser interoperability. The process began last year by gathering proposals for web technologies that group …
COMMENTS
-
Saturday 3rd February 2024 12:00 GMT Tomato42
As a casual graphics user and photographer, I can say this is the first time I've heard of JPEG XL. I had heard of WebP (and seen it used "in the wild"). So, what is the benefit of JPEG XL over WebP?
On the other hand, I see JPEG 2000 in the graveyard; what makes JPEG XL anything other than another format that will simply join it there?
-
Sunday 4th February 2024 21:51 GMT Nick Ryan
JPEG-XL is royalty free - as in no patents. If you had used "your favourite search engine" before you posted, you would have come across the following:
JPEG (the definitive holders of this format's specification): "Overview of JPEG XL"
"Provides a free and open source, royalty-free JPEG XL reference implementation, also available on Github."
"Unlike some other modern formats, JPEG XL is not encumbered by patents nor does it require proprietary software. The reference software, libjxl, has a permissive open source license and is a production-ready library that can be (and already has been) integrated into a variety of image-related software."
"JPEG XL is a royalty-free raster-graphics file format that supports both lossy and lossless compression."
@anonymous coward: Given the definitive statements above, can you explain your claim that JPEG-XL was stopped because of patents? The statements above make it very clear that this is not the case whatsoever. The only "stopped because of patents" that is likely with JPEG-XL is an industry player blocking it because it does not use any of their patents.
-
Monday 5th February 2024 14:54 GMT imanidiot
Royalty free doesn't mean no patents!!! It just means no one is CURRENTLY asking for money to USE the patents involved. And that is a massive trap that no-one is willing to step into.
"what a pretty image format you have there, and are so heavily reliant on. Would be a shame is someone where to enforce payment for those patents now wouldn't it... How about we make a special deal? Just for you?"
-
Monday 5th February 2024 07:51 GMT CheesyTheClown
Cool... but
JPEG is a very simple format, with the exception of the headers (more accurately the trailers), which are a nightmare. It's a simple early DCT-based encoding built on 8x8 macroblocks. Implementing a cleanroom JPEG decoder is straightforward.
JPEG's primary form of loss comes from quantization and "zig-zag truncation": the image is transformed from the spatial domain to the frequency domain (in itself a lossless, or nearly lossless, operation), which allows high-frequency detail to simply be dumped from the macroblocks, and the remaining coefficients - which follow a roughly Gaussian distribution - to be represented in coarse steps rather than exact values.
The "compression" aspect in the sense that after we're done just arbitrarily dumping data is based on quite old and inefficient entropy codings.
Most other DCT compressions could easily implement the feature you're talking about and achieve similar results. As a matter of fact, it would require no changes to the standards, and with only a little work you could run some mathematical magic to merge 8x8 macroblocks into larger macroblocks with negligible loss. The only thing required to do what you're suggesting is an abortive decoder which decodes purely to the frequency domain (therefore pre-iDCT) and then passes that data to the compressor of any other standard. The one requirement is that the decoders for those codecs make use of the standard (thanks to JPEG) DCT coefficients for the process.
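To make the quantization and zig-zag step above concrete, here's a minimal TypeScript sketch for a single 8x8 block of DCT coefficients - purely illustrative, with a caller-supplied quantization table rather than the spec's Annex K tables:

```typescript
// Illustrative sketch of JPEG's quantization + zig-zag step for one 8x8 block.
const N = 8;

// Zig-zag order visits low frequencies first, so "truncation" is just
// dropping the tail of this sequence before entropy coding.
function zigzagOrder(): number[] {
  const order: number[] = [];
  for (let s = 0; s < 2 * N - 1; s++) {        // walk each anti-diagonal
    for (let i = 0; i < N; i++) {
      const j = s - i;
      if (j < 0 || j >= N) continue;
      order.push(s % 2 === 0 ? j * N + i : i * N + j); // alternate direction
    }
  }
  return order;
}

// Quantization: represent each coefficient in coarse steps (the lossy part).
function quantize(coeffs: Float64Array, qTable: number[]): Int32Array {
  const out = new Int32Array(N * N);
  for (let k = 0; k < N * N; k++) out[k] = Math.round(coeffs[k] / qTable[k]);
  return out;
}

// Keep only the first `keep` zig-zag coefficients: crude high-frequency dumping.
function truncate(quantized: Int32Array, keep: number): Int32Array {
  const out = new Int32Array(N * N);
  zigzagOrder().slice(0, keep).forEach(idx => { out[idx] = quantized[idx]; });
  return out;
}
```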
Here's the kicker... JPEG-XL is a monstrosity of a code nightmare. Implementing all the fantastic features which JPEG-XL's baseline requires involves a tremendous amount of code complexity. A hand-written, sandboxed, safe implementation of such a decoder could be very time consuming. Where a good cleanroom JPEG decoder would require a few days to create, a JPEG-XL implementation could take a year. Also, without extensive mathematical analysis, I'm concerned the entropy coder could have obscure malicious table jumps which could be quite problematic in the security world.
Finally, there's the performance issue. JPEG-XL isn't nearly as bad as JPEG-2000 (which is a truly amazing, if impractical, standard), but coding the large amount of hand-tuned assembler needed to implement the standard is going to take some time.
There just isn't a strong enough use case for JPEG XL to justify its existence outside of niche areas such as newspaper photo archives.
-
Saturday 3rd February 2024 13:06 GMT Anonymous Coward
It’s gold plated HDMI for geeks.
Images are small. Storage is large. People don’t care. Your new image format has no relevant end user features.
* poetry is not my strong suit.
Ok - maybe an 8k, 10bit image compression feature is needed over jpeg. You get the point, though? JPEG, PNG, mp3, h264, 640k is enough for anyone…
-
Saturday 3rd February 2024 13:48 GMT Yorick Hunt
Re: It’s gold plated HDMI for geeks.
Indeed, the great masses are content with 320x240* images (palettised 8-bit, no less) just big enough to squeeze into a social media post for nanosecond viewing during their daily scrollfest.
Those who care about their media though will want something as close to source quality as possible - and those who pay for storage themselves (as opposed to their parents paying for it) also want to maximise the use of that storage.
* correction: 240x320, because the iPhone generation isn't capable of taking or viewing images in landscape orientation.
-
Saturday 3rd February 2024 14:55 GMT Ragarath
Re: It’s gold plated HDMI for geeks.
It's not storage that's the problem. It's transmission & quality.
Smaller size = faster transmission. But at the moment, getting that smaller size means you lose quality. In the area I work in, I would be lambasted if the quality were not good. And even though you are pointing out 240x320, it's not really that. Yes, that's the virtual size, but the screen usually has more pixels than that - a lot more. So if you don't want the image to look crap, you have to size it for high-PPI devices: at least double the dimensions are needed, and file size therefore gets larger.
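As a rough sketch of that sizing rule (the @1x/@2x file-naming scheme here is hypothetical; real markup would use the img srcset attribute to do this declaratively):

```typescript
// Minimal sketch: pick an image variant matching the device's pixel density.
function pickImageUrl(base: string): string {
  // devicePixelRatio is 2 or more on most phones and "Retina" screens
  const dpr = Math.min(Math.ceil(window.devicePixelRatio || 1), 3);
  return `${base}@${dpr}x.jpg`; // hypothetical naming: photo@2x.jpg etc.
}

const img = document.createElement("img");
img.src = pickImageUrl("photo"); // e.g. photo@2x.jpg on a high-PPI screen
img.width = 320;  // layout size in CSS pixels stays the same;
img.height = 240; // only the source resolution doubles or triples
document.body.appendChild(img);
```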
-
Saturday 3rd February 2024 15:43 GMT steelpillow
It's not about choosing
Browsers have to render whatever is frikkin' out there in numbers, be it WebP, AVIF, JXL, or whatever.
If the makers try to distort the market through denial of service, some other browser will come along and toast them.
Anybody still using Mosaic, Netscape, IE or Edge? No? There's a reason for that....
-
Saturday 3rd February 2024 21:31 GMT ldo
A Little Less Conversation, A Little More Action
All this aggravation ain’t satisfactioning me
https://www.youtube.com/watch?v=Zx1_6F-nCaw
What? Wrong JXL?
-
Saturday 3rd February 2024 23:54 GMT rcxb
Write a javascript conversion library
We're in the era of JavaScript. Most sites don't work properly without JS; at least most won't load images. So why not write a JPEG-XL to PNG/JPG conversion routine in JS if you want to use the format? It will save you the bandwidth and be compatible with all browsers, with no need to wait on browser-maker support.
-
Monday 5th February 2024 15:47 GMT Mike 137
Re: Write a javascript conversion library
",,, eliminate most of the images you are sending ..."
Or at least adjust them to the size allotted to their use on the web page. I've given up counting the multi-megabyte images, thousands of pixels in height and width, that must be downloaded to render in a space typically no larger than 1024x768 on some web page. OK, keep the master image at full camera resolution, but resample server side, not client side, to fit the page. The bandwidth saving from this would far outweigh any gains from improved compression, and it would also save comparable space on the web server.
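A minimal sketch of that server-side resampling, assuming Node.js with the "sharp" library (the paths, dimensions, and quality value are examples):

```typescript
import sharp from "sharp";

// Resample a full-resolution master down to the space it will occupy on the
// page, without touching the master itself.
async function resizeForPage(masterPath: string, outPath: string): Promise<void> {
  await sharp(masterPath)
    .resize(1024, 768, { fit: "inside", withoutEnlargement: true }) // never upscale
    .jpeg({ quality: 80 }) // re-encode at a sane web-delivery quality
    .toFile(outPath);
}

resizeForPage("masters/photo.jpg", "web/photo.jpg").catch(console.error);
```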
-
Sunday 4th February 2024 10:04 GMT Pete Sdev
Re: Write a javascript conversion library
at least most won't load images
I've not tested your assertion. Given the BS I've seen done by FE monkeys, it could well be true. However, images have been supported since before HTML v2. While background images may be loaded via JS (yuck), the sites I've worked on use the standard img tag for semantically relevant images. In my part of the world accessibility is a consideration, and the requirements are getting stricter.
Doing format conversion with JS is, on the face of it, a clever idea, though I suspect it would flatten the battery of mobile users and is thus not viable.
-
Sunday 4th February 2024 21:58 GMT Nick Ryan
Re: Write a javascript conversion library
Doing image conversion using JavaScript is a ridiculous idea. Kind of clever, but totally and utterly pointless and, in reality, quite a detriment. Which means that some idiot developer with no real clue about anything will include it straight away in their already JavaScript-laden abomination of a website (that breaks every accessibility and usability principle)... because it's new and shiny.
-
Monday 5th February 2024 16:49 GMT Brewster's Angle Grinder
0. The routine to do the actual conversion has to be WASM.
1a. Do the conversion, on the fly, in a service worker. So the page requests `./image.jxl?convert=jpg` and the service worker hooks this and converts it into JPEG. This is the best solution, modulo service workers not being well suited to this kind of filtering.
1b. Fetch the binary data on the UI thread, convert it into a format recognised by the browser, then put it in a `Blob` and use `URL.createObjectURL()` - a sketch follows below. You can still use an ordinary `img`; you just have to update the `src` based on some agreed strategy. (We put canvas-rendered stuff into images in a similar way.) If you want it in CSS, it's possible to enumerate the style sheets, find the URLs, and replace them.
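A minimal sketch of option 1b, assuming a JXL decoder compiled to WASM exposes a decodeJxlToPng() function (a hypothetical name - libjxl built for WASM would play this role) and that images carry a data-jxl attribute as the agreed strategy:

```typescript
// Hypothetical WASM-backed converter: JXL bytes in, PNG bytes out.
declare function decodeJxlToPng(bytes: ArrayBuffer): Promise<Uint8Array>;

async function swapInJxl(img: HTMLImageElement): Promise<void> {
  const jxlUrl = img.dataset.jxl; // agreed strategy: <img data-jxl="...">
  if (!jxlUrl) return;
  const bytes = await (await fetch(jxlUrl)).arrayBuffer();
  const png = await decodeJxlToPng(bytes);     // the WASM-side conversion
  const blob = new Blob([png], { type: "image/png" });
  img.src = URL.createObjectURL(blob);         // browser renders the PNG as usual
}

document.querySelectorAll<HTMLImageElement>("img[data-jxl]").forEach((img) => {
  void swapInJxl(img);
});
```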
In my opinion, the performance hit is probably not worth it. And the more images you have (i.e. the closer you are to a setup where bandwidth/storage gains are significant), the bigger the performance hit becomes. But it can be layered into existing setups.
-
Sunday 4th February 2024 06:45 GMT Anonymous Coward
Put it on cruise control, lie back, lay off, and reap the dividends.
Rabin, the former software engineer, recalled one manager saying at an all-hands meeting that XXXXX didn’t need senior engineers because its products were mature. “I was shocked that in a room full of a couple hundred mostly senior engineers we were being told that we weren’t needed,” said Rabin, who was laid off in XXXX.
-
Monday 5th February 2024 07:49 GMT Richard 12
Re: Once again Google self Interest
No replacement for IPv4 could ever become common until IPv4 becomes too expensive to use or doesn't work anymore (likely both at the same time).
I can't change until you change, so why should either of us change?
"Backwards compatible" is simply impossible at the IP layer because the problem is insufficient bits available. The only options are to wrap or convert at a gateway - and quite a bit of the Internet is already doing that.
-
Monday 5th February 2024 16:32 GMT Anonymous Coward
Re: Once again Google self Interest
IPv4 has optional extension data features built into the protocol, so there could have been an evolution. For example, they could have started by just extending existing IP addresses - 123.123.123.123.x.x.x.x - adding the next 32-bit subnet as a packet extension. Then everyone could start using IPv6 addresses without upgrading anything...
There's obviously way more to IPv6 than that, but jamming the features into IPv4 would allow people to start using it straight away. Then eventually the full protocol would be adopted.
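Purely as an illustration of that hypothetical scheme - the type/length/data layout matches the real IPv4 options format, but the option type value here is made up:

```typescript
// Build a made-up IPv4 option carrying an extra 32-bit "subnet".
function buildExtendedAddrOption(extraSubnet: [number, number, number, number]): Uint8Array {
  const opt = new Uint8Array(6);
  opt[0] = 0x9e;           // hypothetical option type
  opt[1] = 6;              // option length in bytes, including type and length
  opt.set(extraSubnet, 2); // the extra 32 bits: the trailing x.x.x.x
  return opt;
}

// 123.123.123.123 stays in the normal header; 10.0.0.1 rides in the option.
const option = buildExtendedAddrOption([10, 0, 0, 1]);
```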
-
Monday 5th February 2024 15:51 GMT StrangerHereMyself
WebP
I read an article on Hacker News which claimed that WebP (Google's competitor to JPEG XL) doesn't compress better than normal JPEG for similar picture quality, and produces weird artifacts if you opt for higher compression.
In other words: there's very little reason to adopt these new standards.
-
Monday 5th February 2024 19:12 GMT Davonious
Re: WebP is deprecated
I use the heck out of WebP, and the only reason is that it's space efficient. I've tested thousands of images, especially images under 500 KB, and in almost every case I save 30%+ on a straight PNG-to-WebP conversion. And that's even with the PNG images being preprocessed by a lossless compression optimizer. Heck, the margin of difference would be even greater in WebP's favor without that preprocessing step.
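For what it's worth, a minimal sketch of that kind of measurement, assuming Node.js with the "sharp" library (file names are examples):

```typescript
import { statSync } from "node:fs";
import sharp from "sharp";

// Convert a PNG to lossless WebP and report the size difference.
async function pngToLosslessWebp(pngPath: string, webpPath: string): Promise<void> {
  await sharp(pngPath).webp({ lossless: true }).toFile(webpPath);
  const before = statSync(pngPath).size;
  const after = statSync(webpPath).size;
  const saved = (100 * (1 - after / before)).toFixed(1);
  console.log(`${pngPath}: ${before} B -> ${after} B (${saved}% smaller)`);
}

pngToLosslessWebp("icon.png", "icon.webp").catch(console.error);
```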
-
Thursday 8th February 2024 17:04 GMT hx
Re: WebP is deprecated
I mean, would you rather have an image file format that serves the needs of photographers and visual artists, or one that serves the storage and processing needs of Google - designed without any outside consultation, by a company that was only able to force people to use WebP through coercion, thanks to its global monopoly on both search and browser tech and the Giant Carrot of better search rankings? Can you even trust Google to do the right thing, or even to care about WebP tomorrow? They have proven to be unreliable, and there you are, using their image format. If that's the case, I hope you follow the most efficient route on Google Maps.
-