You might have read some tutorials or articles explaining the importance of images and how to optimize them to deliver a great experience to your users. But does this work really translate into wins for your company? In this talk, you'll see real-world examples of the positive impact that image optimizations can have on metrics that your bosses and clients care about. You will walk away from this talk with compelling data and useful tools to help you get buy-in and support for this important user experience work at your company.
Allison McKnight is a software engineer on the Performance team at Etsy, where she helps the people who build Etsy reason about the performance of their features, creates tools that reveal how code changes affect site performance, and checks IRC regularly for any bear gifs that may have wandered in.
Hi. I'm Allison McKnight. I'm an engineer on the web platform team at Etsy and I'm really excited about everything that we've been learning about today. There have been some great talks earlier with a ton of really good content, and if you're anything like me, you're excited to take what you've learned back to the company where you work and implement these performance improvements so that you can get performance wins on your images. But you might have some questions, like: how much will this affect the experience that users see on my site? You might also have some doubts, like: how am I going to convince other people at my company, who didn't come to this daylong conference just on images, that it's worth it to invest in these image optimizations?
I'm going to address these things in this talk today. Before we dive in, I'll give you a little bit of background. I work on the web platform team at Etsy. Etsy operates marketplaces where people from around the world can connect to make, sell and buy unique goods. At Etsy, we use images to showcase all of the unique items that our sellers make. The web platform team at Etsy provides and maintains the foundation on which product teams at Etsy can build out the site. This includes infrastructure and tools that help these teams understand and reason about the performance of their products.
"how much will this affect the user experience that my users see on my sites?"
I've been working on performance for about four years at Etsy and I'm hoping to share some of what I learned during those four years with you today, to help you better understand the impact that image performance has on your users' experience. I also want to share some tips and tricks that you can use to encourage everybody else at your company to get on board with the idea of making image optimizations on your site.
All right, we’re going to cover all of this in three steps. First, I’m going to share with you how performance is a really essential part of the user experience. With that background, we’re going to explore the impact that images have on the performance of pages today and finally, we’ll explore how image performance impacts not only the user experience on your site but also, the business metrics that your company cares about. As we do this, we’re going to go through some case studies that demonstrate that link between performance and business metrics but I’ll also point out some techniques that you can take away from these studies so that you can convince everybody else from around your company that you need to focus on image performance.
I'm going to start out by emphasizing this point: performance is user experience. If you're looking to build a product that your users are comfortable with, you need to consider performance as a part of that picture. I have a case study that's going to drive that point home. This particular case study comes from Tammy Everts' book, Time is Money, which goes in depth into a lot of different case studies that make this link between business metrics and performance. If you enjoy what I'm sharing today, you should check it out. I'll have info up about that at the end of the talk.
The company that ran the study was Radware, and Radware was interested in better understanding the emotional impact that a slow experience had on users. For this experiment, they got two groups of users and had them complete a series of transactions on a couple of mobile sites. The control group experienced all of these mobile sites at a normal speed, but the experimental group had an artificial 500-millisecond network delay added to their experience. The users in the second group didn't realize that they were experiencing a slower version of the site.
After these users completed their purchases, they did exit interviews with the researchers, and the researchers took the words that the users said in those interviews and generated word clouds so that they could understand the sentiments the users felt when they were using these sites. This is the word cloud for just one of the sites that the users looked at, and this is the word cloud for the normal-speed group of users. We can see immediately that a big takeaway was that the site was easy to use, so that's great. We do see a couple of negative words here. Users found the site ugly and confusing, and even at its normal speed, they thought it was a little bit slow. Generally, though, the users were having a straightforward experience as they tried to make a purchase on the site.
Now, let’s look at the word cloud generated for the same site but when users were experiencing 500 milliseconds of added delay. We see immediately that they noticed the site was slower. The word slow is quite large there but let’s take another look at some more of the negative words that users had to say. We see a lot. Users found the site frustrating, inelegant, complicated and hard to navigate. Somebody even said that the site was tacky which has like nothing to do with performance. Even though the only difference between these two groups of users was the speed of the site that they were using, the users who experienced that slower site are generating about three times more negative words than users on the control.
The words don't always cite performance. A lot of these words have to do with brand rather than just a slow experience. Now, word clouds are probably not your idea of the highest form of science, right? Even so, these findings do suggest that performance impacts users' connection to your site and your brand beyond just frustrating them with a slow experience.
The reason for this is that users, humans, need the tools that they interact with to be fast. They need this on a neurological level. To illustrate this, we're going to take a look at a set of numbers that show how we perceive different load times when we're interacting with a machine. These numbers can apply to a wide variety of situations: for example, loading the page for the first time, updating shipping costs when a user changes the destination of a package, or loading 10 more comments at the bottom of your favorite cat video.
If a change renders in 100 milliseconds or under, it will appear instantaneous to the user. If a change takes between 100 and 300 milliseconds to render, the user is going to notice a small delay. After that, if a task takes more than one second to render, the user is probably going to make a mental context switch, switching from thinking about whatever they were trying to accomplish on your site to thinking about something else. Finally, after 10 seconds, the user is probably going to abandon the task. They're going to close the tab or the browser and forget about it.
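To make these thresholds concrete, here's a small sketch that classifies a measured render time against the commonly cited response-time buckets. The bucket names are my own shorthand, not standard terminology:

```python
def perceived_speed(render_ms):
    """Classify a render time against the classic response-time thresholds."""
    if render_ms <= 100:
        return "instantaneous"    # user perceives no delay at all
    elif render_ms <= 300:
        return "small delay"      # noticeable, but tolerable
    elif render_ms <= 1000:
        return "noticeable wait"  # user senses the machine working
    elif render_ms <= 10000:
        return "context switch"   # user's mind wanders to something else
    else:
        return "task abandoned"   # user likely closes the tab

print(perceived_speed(80))    # instantaneous
print(perceived_speed(250))   # small delay
print(perceived_speed(5000))  # context switch
```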
At this point, you might be wondering how these numbers have changed as our technology has evolved over time. These numbers were most recently published in 2013 in Ilya Grigorik's book High Performance Browser Networking. Before that, they were published in 1993 in Jakob Nielsen's book Usability Engineering. Very similar numbers were published in both of these books. Nielsen's numbers were actually based on two studies, performed in 1991 and 1968.
Now, we're looking at a 40-year period where the recommendations for ideal response times when a user is interacting with something have been pretty consistent. In fact, we can go back even further. One of these numbers pops up as far back as 1740. At this time, people were trying to study humans' visual short-term memory. They did this by taking a glowing coal and attaching it to a wheel that they would spin around. They wanted to see how quickly that wheel would have to complete a rotation in order for humans to see a fiery circle instead of just a coal spinning around on a wheel.
They found that this happened when the wheel completed one rotation in 100 milliseconds. This demonstrated that our short-term visual memory lasts for about that long. Check it out: that's the same as the threshold that we're aiming for in order for a change to appear instantaneous to a user. Since these recommendations have remained more or less the same for 40-plus years, I think it's safe to say that they're baked into our wiring. It's really on a neurological level that humans need the tools that they use to be fast.
If we want to create an experience that allows users to seamlessly and comfortably interact with our product, we need to use these numbers to inform and measure our success. Ideally, we want to aim for this range where we’re getting a change to the user in under 300 milliseconds so that they might experience a short delay but they won’t get to the point where they're switching to think about something other than what they're trying to do on your site because science has shown us that if we’re asking them to wait longer, nobody has time for that.
So far, we've seen that performance does matter to our users. The speed of an application influences their perception of and comfort with your product. We need to focus on performance as a part of the user experience. If our goal is to deliver that fast experience to our users, you might be wondering: do images really matter that much when it comes to performance? My guess is that you already have an idea that they do, because you are at a day-long conference on images, but you might need to make this point to your co-workers or boss so that you can focus on image performance.
In this section, we'll take a look at some data to help us understand the impact that images have on the performance of your site. For some background before we start off, I want to take a look at an example image so we have some numbers in our heads to compare to for the rest of the talk. This is a hero image that we're using on the Etsy homepage right now, the hero image being that banner image that you display prominently so the user can get a sense of your brand.
We serve a couple of different sizes of this hero image to fit the user's screen. I took a look at just two of the sizes that we serve, a 900-pixel-wide version and a 1440-pixel-wide version. We can see the different weight in kilobytes of each of these images. I ran a couple of tests to see how long it would take to load each of these images in a few different conditions. First, on desktop, we see that the smaller of the images takes 40 milliseconds to load and the larger takes 140 milliseconds. Then, on mobile, on a 3G connection, the smaller image takes 82 milliseconds while the larger image takes 317 milliseconds.
We can see that just loading this larger hero image on mobile is already pushing us over that boundary of a small perceptible delay. Now, keep in mind that this is data from two particular images. Your mileage is going to vary. Even if you're looking at images that have similar dimensions, the weight of an image is going to change based on things like the content of the image, what file type you're using and whether or not compression has been applied. Similarly, download time will change for images with similar weights, depending on things like the user's connection. All the same, hopefully this gives us some context to point back to as we walk through the rest of the talk.
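As a rough back-of-envelope, download time scales with image weight and connection throughput, plus per-request latency. Here's a hypothetical estimate with made-up numbers, ignoring real-world effects like TCP slow start and caching:

```python
def estimated_download_ms(size_kb, bandwidth_mbps, rtt_ms=0):
    """Very rough estimate: transfer time plus one round trip of latency."""
    bits = size_kb * 1024 * 8  # image weight in bits
    transfer_ms = bits / (bandwidth_mbps * 1_000_000) * 1000
    return transfer_ms + rtt_ms

# A hypothetical ~70 KB hero image on a ~1.6 Mbps 3G link with 100 ms RTT
print(round(estimated_download_ms(70, 1.6, rtt_ms=100)), "ms")
```

Halving the image weight roughly halves the transfer portion of that time, which is why serving appropriately sized images matters so much on mobile.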
At this point, we want to understand how images contribute to webpages today and how they contribute to the performance of these pages. To do this, I looked at some data from the HTTP Archive, which has been mentioned in a couple of talks earlier. The first thing that I wanted to know was, for a typical webpage, how much of that page is made of images. Somebody showed this chart before but I'm going to share it again because it really is important.
"On average, images make up 64% of our page weight..."
Because images make up such a large part of webpages today, they're kind of a low-hanging fruit of performance optimizations. They make up so much of the page that anything you can do to optimize them will have a big impact. At this point, you might be wondering how big the wins are that you might see when you're doing some image optimization. You know that it will affect 64% of your page, but can you really get that much savings?
We can explore one optimization that illustrates how that might go. Let's take a look at lossless image compression. Just as a refresher, lossless compression is applying an algorithm to an image that reduces that image's file size in kilobytes but doesn't actually make any visual changes to the image. It allows you to ship the same visual image to the user, but you can do it more quickly because you're delivering fewer bytes.
I wanted to understand how pages across the internet today are using lossless compression, and to what extent they are using it. For this research, I took a look at the Alexa top 1,000 pages that are tracked by the HTTP Archive. The HTTP Archive records things like the image requests that each page made. For each page, I was able to download all the images that the page had requested, run them through some lossless compression algorithms and then compare the original size of each image to its size once it had been losslessly compressed.
When I did this, I found that 15 out of the 1,000 pages had fully optimized all of the images on their site. That is, GIFs, JPEGs and PNGs. That's really exciting, great for those 15 pages, but what's going on with the other 985 pages, right? I looked at the data for those other 985 pages. I found that the median savings that those pages could have had, if they had applied lossless compression consistently to all the images on their homepage, was 58 kilobytes.
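The core of that analysis, comparing original and losslessly compressed sizes and then taking the median savings per page, can be sketched like this. The page names and sizes here are made up, standing in for the real HTTP Archive data:

```python
from statistics import median

# Hypothetical data: per page, a list of (original_kb, compressed_kb) pairs
pages = {
    "example-a.com": [(120, 90), (41, 41), (300, 250)],
    "example-b.com": [(80, 60), (15, 12)],
    "example-c.com": [(500, 380)],
}

def page_savings_kb(images):
    """Total kilobytes a page would save with consistent lossless compression."""
    return sum(orig - comp for orig, comp in images)

savings = [page_savings_kb(images) for images in pages.values()]
print(median(savings))  # median savings across all pages, in KB
```

In the real study, the compression step itself would be done with dedicated lossless tools (for example, optimizers for PNG, JPEG and GIF) before the sizes are compared.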
For some reference, if you remember that larger hero image that we looked at earlier, that image was 41 kilobytes. This means that half of the Alexa top 1,000 pages could have saved an entire hero image's worth of bytes if they had just used lossless compression. It seems like there's definitely potential for significant image savings if we put in the effort to optimize our images. On top of that, in this case, there really aren't any trade-offs that we have to make. This optimization only improves the user's experience.
Image optimization opportunities really are out there. At this point, you're probably wondering how much of a difference these optimizations will make in the user experience, how your users will feel these changes. I have another example that I think is going to help us understand that. You've heard about srcset. Again, srcset allows you to deliver the correct size of image to a user based on that user's device size. In 2015, a couple of employees at Etsy realized that there was a page on the site that we hadn't applied srcset to. We could start using srcset to take the images on that page and deliver the correct size of image to each user depending on that user's device.
This would save page weight and speed up load times, especially on mobile, where the network is a little bit poorer and the devices are smaller, so we can get away with a smaller image. They implemented these changes and used Chrome DevTools in their desktop browser to see what kind of load time changes showed up in the download times for those images once they were served at the correct sizes.
We see here the original full-size image that we were sending to all devices, regardless of the device's size. Using srcset, we were able to send smaller and smaller images to smaller and smaller devices. You can see that we actually saved a lot of image weight by optimizing the image to fit the size of the device on which it would be displayed. We can compare this to the load time of each image. Up here, we see that originally, the full-size image would take 651 milliseconds to download. Once we'd cut that image weight about in half, it only took 126 milliseconds. The smallest image was now loading in about 49 milliseconds. These are some really good savings.
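srcset itself is just HTML markup on the `<img>` tag. As an illustration, here's a hypothetical helper that builds a srcset attribute value from a base URL and the widths you serve; the `?w=` URL pattern is made up and not how Etsy's image URLs actually work:

```python
def build_srcset(base_url, widths):
    """Build a srcset attribute value with one candidate per image width."""
    return ", ".join(f"{base_url}?w={w} {w}w" for w in sorted(widths))

print(build_srcset("https://example.com/hero.jpg", [1440, 480, 900]))
```

The resulting string would go into something like `<img src="..." srcset="..." sizes="100vw">`, letting the browser pick the smallest candidate that fits the device.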
These savings do affect the load time of your images and your pages and your users do notice that. When the load time for a page is a little bit slower, your users are less likely to engage with your site. I have another study from Etsy that kind of demonstrates this. In 2014, the mobile web team at Etsy tried this experiment where they added 160 kilobytes of images to the search page. They did this by loading some additional images off of the screen where the user can't see them. The only difference between what we normally would give to the users and what the users were seeing in this experiment was that the page was a little bit slower.
When they looked at the data, they found that the users in this experimental group with the extra page weight had a 12% increase in bounce rate over the users in the control group. This is really big. The users who experienced the slower load times were less engaged. We can see here that images really are not small potatoes. They make up a pretty big part of today's webpages and there are really good performance wins that you can get by implementing optimizations on your images. In order to start optimizing your images, you're going to need support. Specifically, you'll need buy-in from your boss and you'll need help from your teammates to get the job done. How are you going to convince the people at your company that performance should be a focus?
In order to do this, you need to show other people at your company that image performance means business performance. Beyond the user experience, I'm sure that your company has a purpose. Probably you sell something or maybe you're trying to maintain or increase readership. There's some business metric that your company cares about. Linking performance metrics to the business metrics that higher-ups at your company care about is a great way to get buy-in for doing this kind of performance work on your site.
Now, we have kind of a chicken-and-egg problem. I recognize that there was another chicken-and-egg slide earlier today. Anyway, I'm kind of wondering who did it better. Maybe we can add something to the poll. I don't know, maybe not. The chicken-and-egg problem that I'm talking about is: how are you going to show your bosses and your manager the metrics that demonstrate that link between business metrics and performance if you can't get their support to do the work to find that data in the first place?
Luckily, there are a lot of findings out there that other companies have shared with us that demonstrate that link between performance and business metrics. You can also use some tips and tricks that those companies have used to help get the rest of your teammates interested in doing this performance work. In this section, we're going to go through a couple of case studies that demonstrate that link between performance and business metrics. As I go through each study, I'll point out the techniques that you can take away and use at your company to get the interest of your co-workers.
By the way, all the case studies that I'm going over in this section will be linked at the end of the talk, and I'll also tweet out a link to the slides and a link to the resources later, so you'll be able to reference this after the talk. All right, let's dive in. This first case study was performed by the Financial Times. The Financial Times was about to start work on a new version of their website, but before they did that, they wanted to understand the impact that performance had on their key business metrics. Specifically, they were wondering how performance impacted the number of articles that users read. They needed to do this because that was one of their key metrics, and if they had that number, they could tie it to revenue and therefore know how performance affected revenue.
To perform the study, users who visited the Financial Times were split into groups. The control group experienced the site at its normal speed, while the experimental groups experienced a one-second, a two-second or a three-second added delay. They ran this for a week and got some results. After seven days, they found that users in the one-second bucket read almost 5% fewer articles than users in the control. In the three-second bucket, users read about 7.2% fewer articles than users in the control. The Times was more interested in the long-term effects of performance on readership because, hopefully, they have readers who go back again and again to read their pages, so they decided to keep the experiment running for 28 days.
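A common way to run this kind of experiment is to assign each user deterministically to a bucket, so the same user sees the same delay on every visit. Here's a minimal sketch of that idea; the bucket sizes and hashing scheme are my own illustration, not how the Financial Times actually implemented it:

```python
import hashlib

BUCKETS_MS = [0, 1000, 2000, 3000]  # control, plus 1 s, 2 s, 3 s added delay

def delay_for_user(user_id):
    """Hash the user ID so the same user always lands in the same bucket."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return BUCKETS_MS[int(digest, 16) % len(BUCKETS_MS)]

print(delay_for_user("user-42"))  # stable across visits for this user
```

The server would then sleep for that many milliseconds (or otherwise delay the response) before serving the page, and log the bucket alongside engagement metrics.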
After 28 days, they saw that in every bucket, one second, two seconds and three seconds, users were reading fewer articles than the control. Additionally, in the three-second bucket, they saw that users continued to read fewer and fewer articles over time as they continued to be exposed to poor performance. The technique that you can take away from this is to appeal to the need for a better understanding of the link between performance and the business metrics that your company cares about.
In this case, the need was emphasized because the knowledge would inform the work that the team was about to start. As you go back to your company, if you're about to start work on a new product or if you're cleaning up an old feature, you can keep this in mind. If you appeal to this need for a better understanding of how performance impacts your business, you can get your team members more excited about doing this kind of research by explaining how having that knowledge will help you plan your time, so that you can balance performance against other considerations as you work on that product.
All right. Our next case study comes from Mozilla. In 2010, developers from Mozilla were looking to increase the number of downloads that they saw of the Firefox browser. They knew that the page on which users could download Firefox was pretty slow. Then, when they compared it to the same page for the Chrome browser, they saw that the entire Chrome download page rendered in the same amount of time that it took to show just the header for the Firefox download page. That was pretty bad.
When they saw this, engineers did some research and found some optimizations that they could do to shave 2.2 seconds off of the load time for this Firefox download page. When they did this, they saw the download rate increase by 15.4%. This ended up mapping to an additional 10 million downloads of the Firefox browser annually. The technique that we can take away from this is to compare your product to your competition’s product. If you show other people on your team that the competitor’s product is a lot faster than yours, then you're really emphasizing that that competitor is getting all of the benefits of good site performance while your site is not. This is a surefire way to get your boss interested in letting you do performance work.
It's bonus technique time. I have one other technique to share that ties in nicely with comparing your product to your competitors'. That technique is: show, don't tell. A tool that we can use to do this is WebPagetest, which Steve went through a little bit during his talk. Specifically, he used the comparison view; in Steve's talk, he was comparing the normal load time of a site to how long it took to load if one of your blocking resources just never showed up.
Here, we can also use the same comparison filmstrip to see how long it takes to load your site versus your competitor's site. Rather than just saying, "Oh, the Google page finishes loading in three seconds whereas Bing just keeps going," we can show people that. It does a better job of helping them feel that difference.
Another feature that WebPagetest has that you can use to do something similar is the record video view. Hopefully, we’re going to do another switch through because I want to show you this video. At Etsy, we've used this tool in WebPagetest to record videos of our site loading in a number of different conditions. We’ll load different pages on the site on cable and compare that to a mobile view or we’ll load it locally here in our data centers in New Jersey and then, try to load it from around the world and see what that difference is.
We displayed these videos on a dashboard that's on a monitor on a wall in our office so that, as they're walking around, Etsy employees can stop and look at it and really feel the difference between these experiences. We could alternatively just display the numbers, that it takes so many seconds to load here versus there, but that doesn't help people really feel the difference. I think that this is important because at work, we're spoiled. We have these shiny new devices and we have a great internet connection, so we lose track of what our users around the world are seeing.
All right. I'm actually really excited about this next case study. Has anybody heard of YouTube Feather? No? Okay. Yeah, a couple. Okay. I'm really excited to share this with the rest of you. It's one of my favorite performance stories of all time. This study comes from YouTube. A while ago, the YouTube video watching page was pretty big, page-weight-wise. One of the engineers at YouTube was complaining that if somebody could build an entire clone of a 3D video game that ran in a browser in under 100 kilobytes, which does exist by the way, then YouTube really had no excuse for having such a large, heavy page.
Hearing this, another engineer on the team decided that maybe he could take a stab at it. He wanted to try to get a version of the YouTube video watching page that weighed in at under 100 kilobytes. After a good amount of work and some manual tweaking, this engineer got a version of the site that weighed just 98 kilobytes. They went ahead and pushed that version of YouTube out. This lightweight version was called YouTube Feather.
They waited for a week or so and saw the data come in. When they took a look at it, they were really surprised, because they had expected the load times for this lightweight version of the page to be a lot shorter than the load times for the normal version, but actually they were really long, something like two minutes on average. This was really perplexing because the page seemed fast, but when they looked at the numbers, it was really slow.
After a lot of confusion, they were able to split the performance data out by geographical location. They discovered that the Feather version of the page was getting a disproportionate number of views from areas with really low connectivity. What had happened was that these users, before YouTube Feather launched, would have had to wait 20 minutes on average just to load the page to watch one video. That was basically impossible; nobody was going to do that. When Feather shipped, they were finally able to load a video in a reasonable amount of time, only two minutes. This allowed YouTube to reach new users in areas of low connectivity that it hadn't been able to reach before.
As Chris, the engineer who worked on this project, put it, large numbers of people who were previously unable to use YouTube were suddenly able to. Having this lighter version of the page really increased the reach of the product. Something neat that we can take away as a technique from this is to issue a challenge to yourself or to your team to achieve a performance goal. The puzzle of trying to accomplish a stretch goal that might be kind of challenging and new is going to get other members of your team excited to find new ways to improve the performance of your site.
Next, we're going to talk about search engine optimization. Search engine optimization, or SEO, is kind of the science of getting your keywords just right so that when a user searches for a product that you're selling in Google or Bing, your site appears, hopefully, close to the top of the search results. SEO is a pretty big focus for a lot of e-commerce companies because ranking high in search results is an important first step in helping potential buyers find your site.
In 2010, smartfurniture.com had been focusing on SEO for a pretty long time but still hadn't been able to break into those top search result rankings. In that same year, Google announced that they were going to start considering page speed as part of the search ranking algorithm, so that pages that loaded more quickly would rank higher in search results than pages that loaded more slowly.
Smart Furniture had been trying for years to break into those top slots, so they jumped on this. They focused on the performance of their site, and by speeding up their site, they saw their key search terms rise in the search results, so that more users were able to find smartfurniture.com when they were looking for a new couch. Because of this bump in search rankings, Smart Furniture enjoyed 20% more traffic to the site through search. They additionally saw a related bump in sales, which is pretty nice for them.
Something that we can take away from this is to link performance to another opportunity that is already regarded as profitable at your company. Smart Furniture already knew that high search rankings were important to their business. When performance became a factor in that, they were ready to focus on performance. By tying performance to your business's existing goals, you can make a case for a focus on performance. This will work really well if you're able to tie performance to a metric that your company specifically cares about.
I have a tool that can help you find statistics about the specific metrics that you're interested in at your company. This is a site called WPO Stats. WPO Stats is basically a repository of different statistics that link performance to a variety of metrics. Here are some of the metrics that you can find statistics about on this site. Something that's really neat about WPO Stats is that they accept submissions, so if you go back to your company and run an experiment that ends up linking performance to a business metric you care about, you should post it on this site so that the rest of us can learn from it.
Now, since we’re talking about metrics, a couple of you in the room might have noticed that I've been talking about a lot of business metrics, but none of them are business metrics that are super related to money, so, like, where are those stats? Those stats are also out there. I just wasn’t going over them because I don’t have any good technique stories to tie into them, but here are a couple of examples. Autoanything.com in 2010, it’s a site where you can buy car stuff, they were able to improve their page load time by 50% and saw a 12% increase in their sales. That’s big.
This one’s pretty interesting. Obama for America, when Obama was campaigning, they did a redesign of the donation page and were focusing on performance for that redesign. They were able to ship a new page that was 60% faster and they saw a 14% increase in donation conversions. Now, 50%, 60%, that’s pretty big. What about smaller ones? A study done by Glasses Direct found that just a one-second increase in page load time led to a 7% drop in conversion rate for their users. Yes, performance does go back to money if that’s what your business is most interested in.
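To make a stat like that concrete, here is a minimal sketch of the arithmetic. The traffic, conversion, and order-value numbers are made up for illustration; they are not from any of the companies mentioned:

```python
def revenue_impact(visitors: int, conversion_rate: float,
                   avg_order_value: float, cr_drop_pct: float) -> float:
    """Monthly revenue lost when conversion rate falls by cr_drop_pct percent.

    Illustrative arithmetic only; all inputs here are hypothetical.
    """
    baseline = visitors * conversion_rate * avg_order_value
    degraded = visitors * conversion_rate * (1 - cr_drop_pct / 100) * avg_order_value
    return baseline - degraded

# A hypothetical shop: 1M visits/month, 3% conversion, $40 average order.
# A one-second slowdown costing 7% of conversions (the Glasses Direct figure):
loss = revenue_impact(1_000_000, 0.03, 40.0, 7.0)
print(f"${loss:,.0f} lost per month")  # $84,000 lost per month
```

Even at these modest made-up numbers, one second of extra load time is a five-figure monthly cost, which is the kind of framing a budget conversation understands.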
To wrap up, I have a couple more techniques that I wanted to share that I learned from my time at Etsy. The first of these is to hold yourself accountable to your stakeholders. I have a story that goes along with this. At Etsy, the people who sell on Etsy rely on our site being fast so that they can run their businesses effectively. In 2011, the performance team at Etsy published our first site performance report. This report contained load time numbers for the five most important pages on the site, and there was one obvious outlier. The home page took about a second to load, and that was just on the backend, so just generating that page on the backend took a second.
The homepage team, when we published this data, saw this and realized that they needed to focus on performance and make a change in order to help our sellers continue to run their businesses. Performance is important to them. By the time the second performance report was published three months later, the homepage team had been able to improve that load time by 65%, making the homepage one of the fastest pages on the site. By holding ourselves accountable to our users, our stakeholders, we gave that team the motivation that they needed to focus on performance and set aside some time to make these optimizations.
My last technique that you can take back with you, and maybe my favorite, is to celebrate performance. I think we’re pretty good at this at Etsy. We keep an eye out for people who are doing performance work around the company, and when we see that happening, we celebrate it by sharing what that team is doing with the entire company. Here’s an example of our Performance Hero dashboard. When somebody is able to make a performance improvement to the site, we celebrate by taking a goofy picture and showing graphs of how much faster they made the site. We love seeing those cliff drops. By celebrating performance, we’re shining a light on work that’s already being done and encouraging everybody around the company to be continuously thinking about performance.
In this talk, we’ve explored how performance is an essential part of the user experience and we’ve studied how much of an impact images have on the performance of pages today. We’ve seen real-world examples that link performance to business metrics, a large range of business metrics that your company might care about. As you take what you've learned from all of the talks today back to your company, I encourage you to put all of that knowledge into the context of your company. By making performance relevant to your company’s goals and priorities, you can convince the rest of your team that a focus on performance will help your business to pursue those goals. All right, that is all I have for you. Here is the link if you want to check out some of the stats that I shared today, and I’ll tweet out a link to these sites later.
MC Travis Wright:
Excellent. Allison McKnight, great job.
MC Travis Wright:
It was awesome. All right, do we have any questions for Allison? Speak now or forever hold your peace. Questions? All right. You think you'd get off the hook that easy? Are we going to let Allison off the hook that easy?
I noticed earlier, you mentioned that WebP isn’t a huge priority for you right away. If web performance is so important, what’s the reason behind not implementing something like WebP?
Yeah. That’s a great question. There are a lot of things that our team has on our plate, and we’ve actually been kind of small recently, so the number of things that we’ve been able to tackle has been limited, although we are now a bit of a larger team, so hopefully that’ll come up soon. Another really interesting thing about images specifically is … At Etsy, we love data. We want to measure every single thing ever. We do this in part by running A/B tests every time we make any change to the site so that we can see how that change impacts a whole variety of metrics that we track like conversion, bounce rate, favoriting rates, other engagement metrics.
With images, right now, it is not super easy to run an experiment that shows us that link because in order to do that with the A/B testing framework that we have right now, I think we would have to backfill like a ton of images. Normally, when we’re talking about images at Etsy, we’re talking about the images that sellers upload of the products that they’re selling, and there are a lot of those because we’ve been around for more than 10 years, so we have a lot of images. It’s something that I've been thinking about a lot because there are some experiments that I and other people that I work with are really excited to run. Hopefully, we’ll figure out a way around that somehow so that we can experiment with a technology like WebP and get data from that.
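The kind of image-format experiment described could be bucketed like this minimal sketch. The 50/50 split, the function names, and the hash-based assignment are my assumptions for illustration, not Etsy's actual A/B framework:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str = "webp_images") -> str:
    """Deterministically place a user in control or treatment (50/50).

    Hashing (experiment, user_id) keeps the assignment stable across
    requests without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

def image_format(user_id: str, accept_header: str) -> str:
    """Serve WebP only to treatment users whose browser advertises support."""
    if "image/webp" in accept_header and assign_bucket(user_id) == "treatment":
        return "webp"
    return "jpeg"

# Browsers that support WebP advertise it in the Accept request header:
fmt = image_format("user-123", "image/webp,image/apng,*/*")
print(fmt)  # "webp" or "jpeg", but stable for this user across requests
```

Because the assignment is a pure function of the user id, both groups can be compared on conversion, bounce rate, and other engagement metrics without any backfill of the bucketing itself; the backfill problem the answer mentions is re-encoding the existing image corpus, which this sketch does not address.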
MC Travis Wright:
Yeah, last one. A quick comment and a question with respect to the … You mentioned the images getting larger year by year. Even worse is the fact that the number of requests isn’t even increasing, so you can do that math. That’s bad news. Now, here’s the question. You mentioned downloading all the images from the top 1,000, 10,000 sites and then doing a lossless optimization.
Why did you go lossless?
That’s another good question. Lossless compression … All right. Coming from my background, again at Etsy, our sellers do rely on us running the site well so that they can sell things and run their businesses. Something that we’ve learned is that they really care a lot about the quality and the crispness of their images so lossy compression is something that I’m pretty conscious about. It does change the quality of an image and it’s hard to really understand without having a human look at it what that threshold is where you're like, “Good, you're cutting out bytes but the image still looks great.” Then, at some point, the image just doesn’t look good at all.
The optimizations that I ran on those images are kind of like, at the least, you could have saved this number of bytes. It could be that if those sites also used lossy compression, they could save a lot more of their image weight, but I don’t have the authority to make that decision of where that boundary is, and also there’s no way I was going to, with my human eyes, look at the images for 1,000 websites. Does that make sense?
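The key property of a lossless pass is that the decoded output is byte-identical, so there is no quality threshold for a human to judge. A minimal sketch of that idea, using zlib (the DEFLATE compressor PNG itself is built on) in place of a real image optimizer:

```python
import zlib

def lossless_savings(raw: bytes, current_level: int = 1, best_level: int = 9) -> int:
    """Bytes saved by recompressing the same data at a higher effort level.

    Both streams decompress to byte-identical output, so this is the
    'at the least you could have saved this many bytes' kind of number:
    nobody has to eyeball image quality.
    """
    loose = zlib.compress(raw, current_level)
    tight = zlib.compress(raw, best_level)
    # Lossless by construction: both round-trip to the exact same bytes.
    assert zlib.decompress(loose) == zlib.decompress(tight) == raw
    return len(loose) - len(tight)

sample = b"performance matters " * 500
print(lossless_savings(sample), "bytes saved, zero quality lost")
```

Real lossless image tools (optipng, jpegtran, and the like) work on the same principle: they repack the existing pixel data more efficiently rather than throwing any of it away, which is why the savings are a safe floor.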
Good stuff. You get one more question. You get the bonus question.
I noticed in the public data that you showed that the elasticity for conversions was actually dropping as you gained performance, but in the opposite direction, you actually lose a lot more conversions. Your team, or your boss, can always make the argument that, “Well, we’ve already gained all the low-hanging fruit that we can get. It’s hitting an asymptotic upper limit to what conversions we can get.” How do you get around that, especially considering that it’s really hard to run an A/B test where you're showing the increase in performance for a few …
Yeah, okay. That’s a great question about kind of coming up against that point where you're getting diminishing returns. If your boss is making that argument, I’m not sure I have a ready answer for you right now, but maybe I’ll think about it and we could talk later. In terms, though, of running an experiment to get data where you're slowing down your site, that’s an interesting thing that you’ve brought up. We’re considering maybe doing something like that at Etsy in the future.
The way that we’re thinking about this is that it’s like you are slowing down the site, it’ll make performance worse, it’ll make user engagement metrics worse, we’re probably going to lose money but it’s like a research cost. You're gaining a lot of valuable information because it’s not just that you understand that performance impacts business metrics. It’s that you understand how much performance impacts business metrics. You can use that to reason about what your focus is going to be and how you're going to balance your focus between working on performance and working on other things.
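A slowdown experiment like that can be sketched as a tiny request wrapper. The delay amount and the even/odd bucketing here are assumptions for illustration, not how Etsy would actually run it:

```python
import time

def added_delay_ms(user_id: int, delay_ms: int = 200) -> int:
    """Even user ids form the slowed 'treatment' group; odd ids are control."""
    return delay_ms if user_id % 2 == 0 else 0

def handle_request(user_id: int, render_page) -> str:
    """Deliberately slow the treatment group before rendering, so the
    conversion gap between the groups prices out what latency costs."""
    delay = added_delay_ms(user_id)
    if delay:
        time.sleep(delay / 1000)  # the 'research cost' paid on purpose
    return render_page()

print(handle_request(1, lambda: "homepage html"))  # odd id: control, no delay
```

Comparing conversion and engagement between the delayed and undelayed groups gives you the how-much number the answer describes: not just that performance affects business metrics, but the rate at which it does.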
MC Travis Wright:
Great stuff. Thank you so much, Allison McKnight. Give her another big round of applause. Excellent stuff. Thank you.