Google has devised a new image encoder that dramatically reduces the size of JPEG images without any noticeable dip in quality. The Internet giant's research team has managed to combine excellent compression density with high visual quality. The company's similar effort in 2014 shrunk images by 10 percent, but its latest development is an open-source algorithm called Guetzli, which encodes JPEG files that are 35 percent smaller than those produced by current encoders.
Smaller file sizes might seem an arcane tech worry, but they are essential for fast-loading sites. The Mountain View tech company has a vested interest in decreasing the time it takes to load websites and services, and one way of achieving that is by reducing the size of images on the internet.
For firms storing and serving images, the goal is to keep the image files as small as possible, and Google's new encoding algorithm will keep everyone happy. CNET notes that previous attempts to reduce image files include Big G's WebP and RAISR, and Microsoft Corporation's JPEG XR; however, their success has been limited by the ubiquity of JPEG support.
In 2010, Mountain View launched WebP, which offered better compression than JPEG, but the format still has not been widely adopted. Most sites today still rely heavily on JPEG, the common lossy compression format, along with GIF and PNG images.
The advantage of employing Google's new encoding algorithm is that the pictures are still regular JPEG files, so they remain compatible with almost every application and browser that exists. According to the Verge, the tech giant claims that Guetzli images are of higher quality than similar and even larger JPEGs built with other methods.
Google’s new JPEG compression algorithm serves up pictures that look great, but their file size is 35 percent smaller. The idea is not to replace the popular method of lossy compression for digital images, but to tweak its settings to decrease the likelihood of noticeable problems when files are squeezed.
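The tradeoff Guetzli tunes — trading encoder settings for smaller files at a given visual quality — can be sketched with any standard JPEG library. The snippet below uses Python's Pillow library (an assumption for illustration; this is not Google's encoder, which searches for perceptually optimal settings rather than exposing a single quality knob) to show how the ordinary JPEG quality setting shrinks file size:

```python
# Sketch of the JPEG quality/size tradeoff using Pillow, NOT Guetzli itself.
# Guetzli searches for encoder parameters that shrink the file while keeping
# perceptual quality high; here we simply vary the standard quality knob.
from io import BytesIO
from PIL import Image

# Build a simple gradient test image in memory (row-major pixel order).
img = Image.new("RGB", (256, 256))
img.putdata([(x, y, (x + y) // 2) for y in range(256) for x in range(256)])

def jpeg_size(image, quality):
    """Encode the image as JPEG at the given quality and return the byte count."""
    buf = BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    return buf.tell()

high = jpeg_size(img, 95)  # near-lossless settings: larger file
low = jpeg_size(img, 70)   # more aggressive compression: smaller file
print(f"quality 95: {high} bytes, quality 70: {low} bytes")
```

Lowering the quality setting shrinks the file, but below a certain point blocky artifacts appear; Guetzli's contribution, per the article, is finding settings where the size drops without the artifacts becoming noticeable.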
Forbes notes that 35 percent is a pretty substantial difference, although in the grand scheme of things Internet users might not notice it. Users might not gain much from Google’s Guetzli algorithm, but the tech giant certainly will. The Internet giant constantly scans the Web to cache pages, index new content for its search engine, and serve up ad banners. Only a few firms in the world process as much data from the Web each day as the Alphabet subsidiary.
Google revealed Thursday that Guetzli, its compression algorithm that cuts image file sizes by 35 percent, is currently in its testing stage. The only downside to the algorithm seems to be that it takes longer to produce a compressed image than libjpeg does. The search behemoth, however, says that since the image files are so much smaller and there is no loss in quality, the tradeoff is worth it.
By Anila Maring
Photo Courtesy Google