Google has always pushed for faster-loading webpages. That agenda may hide some sinister plot, but as someone on a shoddy internet service in the boondocks, I'm in favour of anything that reduces web payload size. Or if sites like Facebook kept the same file sizes but used less heavily compressed JPEGs, that would also be great (though that's not what I'd expect from them!).
Originally posted by tduell:
Saw some discussion about this on the Darktable list. It seems that it is slow and requires a lot of memory, 300 MB per 1 Mp of image if I remember correctly, so it may not be such a great breakthrough.
That's mentioned here as well:
Google’s Free Encoder Drops JPEG Size By 35%, But Maybe There’s Something Better Already… | SLR Lounge

But it's an added memory and time cost only during the initial encoding, so it's probably not a big deal in practical use on a modern desktop or laptop, especially if we're talking about prepping files for webpages: at 300 MB per megapixel, a web-sized 2 Mp image needs only around 600 MB of RAM. That cost might make adoption by an entity like Facebook prohibitive, though, unless they can offload the encoding work to the person uploading.
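Since the encoder ships as a standalone command-line binary, wrapping it in a script for batch web exports is straightforward. Here's a minimal Python sketch, assuming the guetzli binary is installed and on your PATH; the filenames and quality value are just illustrative:

```python
import subprocess
from pathlib import Path

def guetzli_encode(src: Path, dst: Path, quality: int = 95, memlimit_mb: int = 6000) -> None:
    """Re-encode one image with Google's guetzli CLI.

    Assumes `guetzli` is on PATH. It accepts PNG or JPEG input;
    --quality must be >= 84, and --memlimit caps memory use (in MB),
    failing the encode rather than exceeding the limit.
    """
    subprocess.run(
        [
            "guetzli",
            "--quality", str(quality),
            "--memlimit", str(memlimit_mb),
            str(src),
            str(dst),
        ],
        check=True,  # raise CalledProcessError if guetzli fails
    )

# Hypothetical example: a web-sized export stays well under the
# ~300 MB per megapixel memory estimate quoted above.
guetzli_encode(Path("export_1600px.png"), Path("export_1600px.jpg"), quality=90)
```

The --memlimit flag is also why the memory appetite is more of a batch-planning concern than a hard blocker: guetzli fails cleanly instead of swamping the machine.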
It will be interesting to see whether imaging programs end up adopting it smoothly, or whether easy-to-use plugins start turning up. It's hard to beat free.