Here's why you have to deal with so many annoying webPs now
Can't find a dang jpeg to save your life? Here's why webPs are taking over.
Don't tell whoever it is that signs my checks, but at least half my job is downloading images off the internet. I download a lot of images and put them on this website. Just a few years ago, I pretty much never had to worry about re-saving an image in a different file format. It was just the occasional super-high DPI 4K png, or whatever, that needed a little shrinking down. I had no idea what a terrible future awaited me—a future defined by rampant, annoying webPs.
These little pests are everywhere! Why are they everywhere? And when the heck did this happen?
According to Google Trends, which tracks the popularity of terms people are searching for, webP really started to take off in late 2021, but it's been a thorn in our sides a lot longer than that. Here's Foone tweeting about how annoying webPs are back in 2019:
"no one has ever tried to save an image and gone 'oh, it's a .webp!' and had that be an excited, positive emotion." (May 3, 2019)
The magical(ly crappy) thing about webPs is that they seem to be simultaneously ubiquitous and also barely supported. Countless times I've tried to upload a webP to Twitter or PC Gamer's content management system only to have it rejected. Photoshop didn't officially support webP until February 2022. For a couple years I resorted to using MS Paint to convert webPs to jpegs. Could I have downloaded a plugin or something to make my life easier? Yes. Did I stubbornly resist because I didn't want to let webPs win? Obviously yes.
WebP, like most things that start out as good ideas and then oopsie-doopsie ruin the internet, comes from Google. Google announced webP way back in 2010, actually, as a more efficient image compression format than jpeg based on its own open source VP8 video codec. Remember webM videos, the briefly popular format for gifs-but-video? Those use VP8. The VP8 video codec is pretty much dead now, though, with the more popular (though less efficient) H.264-based mp4 winning out.
At this point you're thinking what I'm thinking, right? Google was salty that VP8 didn't catch on, so it pushed webPs in a twisted act of revenge? This is definitely what happened, I say with zero evidence to back up this accusation.
Google has been trying to weasel webP into wider adoption for years. It switched its Google+ app to webP back in 2013, the equivalent of putting slightly sleeker sails on a ship with a jagged hole in its hull. Firefox and Safari added webP support in 2019 and 2020—I have to imagine begrudgingly, since they held out for nearly a decade after Google introduced the standard.
Here's the particularly insidious bit about how this file format has become so common: you're not running into webPs all over the internet because all the people who make websites decided they wanted to use it. Most of the images you see served as webPs were actually uploaded as jpegs or pngs, and are still stored that way on the server. So… how the hell does that work?
Most traffic on the internet today travels through CDNs, or content delivery networks. Once upon a time, you may have hosted a website on a server in your home, and the traffic to and from that server would've had to make a long, long trip if someone was trying to reach it from across the world. CDNs are server clusters all over the world that cache data, dramatically shortening that travel time (Cloudflare has a good, more detailed breakdown). Caching this data both speeds up website loading and reduces the bandwidth strain on your actual web host. By their nature, CDNs are all about optimization. Now you can probably see where this is going.
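If it helps to picture it, here's a toy sketch of the edge-cache idea in Python. Every name and number in it is invented for illustration; real CDNs are enormously more sophisticated than a dictionary:

```python
# Toy model of a CDN edge cache sitting between users and a far-away origin.
# The latencies and URLs here are made up purely for illustration.

ORIGIN_MS = 300  # pretend round trip to the distant origin server
EDGE_MS = 20     # pretend round trip to the nearby edge server

cache: dict[str, bytes] = {}

def fetch_from_origin(url: str) -> bytes:
    # Stand-in for the slow, long-distance request to the actual web host.
    return b"<original image bytes>"

def serve(url: str) -> tuple[bytes, int]:
    """Return (body, total latency in ms) for a request arriving at the edge."""
    if url in cache:
        return cache[url], EDGE_MS     # cache hit: short local trip only
    body = fetch_from_origin(url)      # cache miss: pay for the full trip
    cache[url] = body                  # remember it for the next visitor
    return body, EDGE_MS + ORIGIN_MS

print(serve("/hero.jpg")[1])  # 320 -- the first visitor pays for the long haul
print(serve("/hero.jpg")[1])  # 20  -- everyone after gets the cached copy
```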
Amazon Web Services and Akamai, two of the biggest CDN providers, can automatically convert images to webPs and serve those newly compressed images when you land on a page. There are lots of tools that website builders can use to implement this same functionality, too. Every time you visit a website, it does a quick negotiation with your browser to see what sorts of files it supports, so if you're on Safari, Firefox or any Chromium browser, it'll now essentially say "send me a webP, please!" and a lot of websites are now built to happily fulfill that request, saving themselves a bit of data and a few milliseconds in the process.
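Under the hood, that "negotiation" is just an HTTP request header: a browser that understands webP advertises it in Accept (Chrome sends something along the lines of image/avif,image/webp,image/*,*/*;q=0.8 when requesting images), and the server or CDN picks a file to match. Here's a minimal sketch of that decision; the function name and file paths are hypothetical, not any real CDN's API:

```python
def pick_variant(accept_header: str) -> tuple[str, str]:
    """Pick which stored file to serve based on the browser's Accept header.

    Returns (path, content_type); both paths are hypothetical examples.
    """
    if "image/webp" in accept_header:
        return "cache/hero.webp", "image/webp"  # browser said webP is welcome
    return "originals/hero.jpg", "image/jpeg"   # fall back to the source jpeg

# A modern browser that advertises webP support gets the converted file:
print(pick_variant("image/avif,image/webp,image/*,*/*;q=0.8"))
# -> ('cache/hero.webp', 'image/webp')

# An older client that never mentions webP gets the original jpeg:
print(pick_variant("image/jpeg,image/png,*/*;q=0.8"))
# -> ('originals/hero.jpg', 'image/jpeg')
```

One wrinkle: because the same URL can now return different bytes depending on who's asking, caches have to account for that header too, which is exactly what the HTTP Vary: Accept response header is for.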
Which is… kinda nice, if you're on a slow internet connection, or stuck with a data cap, or care about website loading speed above all. But it's created a real mismatch in how we use digital images, as CDNs and photo editing software and uploading tools now have very different priorities. A lot of websites will serve you a webP, but a whole lot fewer will actually let you upload one. If, like me, you care more about the convenience of using an image after you've downloaded it than saving a few kilobytes of bandwidth, it's pretty annoying that those source jpegs and pngs are being kept just out of reach.
That inconvenience feels like it's somewhat poisoned the well for webPs, which is actually a shame. We don't use jpeg because it's the best way to compress images—jpeg is old as hell, and webPs can absolutely look better while being smaller files. But until using them stops being a pain in the ass, every website that serves them to me while holding a jpeg hostage is my sworn enemy.