If the internet were a country, it would be the world’s sixth biggest polluter. The internet consumes a lot of electricity: 466 TWh per year, to be precise. That’s more than the entire United Kingdom (300 TWh)!
Carbon emissions are generated all over the place, from data centers to our personal devices. The average website produces 1.76 grams of CO2 per page view. For a website with 10,000 monthly page views, that's 211 kg of CO2 per year. These are staggering numbers and they will only go up, as the internet is growing at a frightening rate. To make the internet more sustainable, web developers have three areas to focus on:
- Design and content
- Front-end development best practices
- Server architecture choices
In this piece I focus on what web developers can do to lower the carbon footprint of their projects. By web developers I mean designers, front-end developers and solution architects.
In this post I will teach you how to reach sustainability goals and attract more customers. Websites with high performance, a clear content strategy and good SEO are easier to find. They get people to what they are looking for faster. This all means: less usage of devices and servers. Which in turn means: a more sustainable web.
You can also check out this article as the talk I did for ImageCon 2020.
Let’s kick this off!
Are you ready to jump in? We will go over the three subjects mentioned above and for each I will provide tips and tricks. There are many more considerations that could lower our collective carbon footprint; we are not dealing with a simple problem here. For now we will tackle the low hanging fruit. Fixing these issues gets you to a lower carbon footprint than ~95% of the websites out there.
Do this first
Go to https://www.websitecarbon.com/ and put in one of your pages. I’ll wait… Exciting!
My personal website produces 0.14 g of CO2 per page visit. If I had 10,000 monthly page views (I wish), that would come down to 16.58 kg of CO2 emissions per year. This is roughly the amount of carbon a tree absorbs in one year.
How did you do? If your score is worse than mine (which is not even in the top 10% of all sites), continue reading.
1. Design and content
Bad accessibility and unclear UX are big offenders on websites with a high carbon footprint.
Proper usability lowers a website's carbon footprint. An easy to navigate journey helps users find what they are looking for faster. By spending less time searching, users consume fewer pages. When you find what you are looking for quickly, you use your device less: less CO2 emissions! Captain obvious.
But why do so many websites have such terrible user experience? Often it is because clients project their internal processes onto the end user. A good UX designer / storyteller is worth their weight in gold (or carbon).
Funny right? Websites and apps nowadays are actively trying to keep us engaged as long as possible! I’m guessing Facebook and Google aren’t the best for the environment if you look at it from this perspective. They do try to mitigate this by being carbon neutral/positive in their office buildings.
Make sure your SEO is on point and that the accessibility of the site is at least WCAG AA compliant. The simplest (blunt) tool is the Audits panel in Chrome Developer Tools. This doesn't get you all the way to compliance, but hiring a specialized agency is not in the cards for everyone.
One last note: make sure you convey information with as little content as possible. Adding 20 pictures of the same cute kitten likely has the same effect as adding only two. The extra images increase the page weight and cause the device to emit more CO2 than needed.
Quick tips that help you with SEO and “findability”:
- Use a canonical URL on every page. Even when it points to itself
- Have a good title and description for each page
- Use a sitemap.xml
- Use schema.org micro data
- Have correct open graph data
- Have no 404 or 500 errors
- Use HTTPS
- Use semantically correct HTML
- Use the correct page outline
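To make the list concrete, here is roughly what most of those tips look like in the `<head>` of a page (the URLs and text are placeholders):

```html
<head>
  <title>Cute kittens - Example Site</title>
  <meta name="description" content="A short, unique description of this page." />
  <!-- Canonical URL, even when it points to itself -->
  <link rel="canonical" href="https://www.example.com/kittens/" />
  <!-- Open Graph data for social sharing -->
  <meta property="og:title" content="Cute kittens" />
  <meta property="og:description" content="A short, unique description of this page." />
  <meta property="og:image" content="https://www.example.com/img/kitten.jpg" />
  <!-- schema.org structured data -->
  <script type="application/ld+json">
    { "@context": "https://schema.org", "@type": "Article", "headline": "Cute kittens" }
  </script>
</head>
```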
The rest is up to you. Research a bit on how to achieve these things and you’ll be fine.
On the accessibility side of things, make sure to be WCAG AA compliant at the code level. Do not use different markup for small and large screens. Mobile first is important for users but also for search engines. Content should be the same across devices.
Use skip links for components that require lots of keyboard traversal. Also make sure the website is easy to use when zoomed in. A lot of users zoom to 200% in their browser. This means your CSS properties should be relative (em, rem and %). Due to the zoom they are effectively using your website in small screen mode. Make sure your website is keyboard accessible in its "mobile" view.
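A quick sketch of what "relative CSS properties" means in practice (the selector and values here are illustrative):

```css
/* Relative units let the layout scale with browser zoom and the
   user's font-size preference. */
html { font-size: 100%; }  /* respect the user's default, don't hardcode px */
.card {
  max-width: 40rem;        /* scales with the root font size */
  padding: 1.5em;          /* scales with this element's font size */
  width: 90%;              /* scales with the viewport */
}
```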
But my business model depends on engaged users that never leave my site!
Really? Are you Facebook? If so, at least follow development best practices and make the right architecture choices. This will lower the emissions as much as possible. More about this below.
2. Front-end development best practices
Performance has a huge influence on how much CO2 your website emits per page view. The average page on the web emits about 1.76 grams of CO2 per page view. Only three short years ago the average was around 4 grams per page load. The web has come a long way since then. Nowadays a well optimized page can emit as little as 0.1 grams.
Who are the biggest offenders?
The biggest offenders for poor performance are the things you can't control. Think of third party libraries, analytics trackers, personalization engines, embeds and media assets. The more you show on a page, the more connections need to be made. Connections to sources you can't control are problematic. They might not use green energy hosting, or they send over a bunch of stuff you likely won't need. More stuff to load means more CO2 emissions on your page.
Eliminate as many external dependencies as possible and optimize the ones you decide to keep. Do you actually need that 1 MB photo slider or can you write one yourself in 300 lines? Check if you need all that bloat. The same goes for frameworks and libraries like Bootstrap, moment.js and lodash. We generally only use 5-10% of those kinds of libraries.
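As an illustration of "write one yourself": moment.js is often pulled in just to format a date, which the built-in Intl API handles fine for simple cases. A minimal sketch:

```javascript
// Format a date with the built-in Intl API instead of shipping
// an entire date library for one format call.
function formatDate(date, locale = 'en-GB') {
  return new Intl.DateTimeFormat(locale, {
    day: 'numeric',
    month: 'long',
    year: 'numeric',
  }).format(date);
}

console.log(formatDate(new Date(2020, 0, 15))); // "15 January 2020"
```

Zero bytes of library code over the wire, same result for the common case.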
If you insist on using a library, check if it supports modern import/export syntax. With the black magic of Webpack's tree shaking you can then bundle just the parts you need rather than the whole library.
Generally you don't need highly detailed user tracking on a page. The same goes for client side personalization scripts. Their features are rarely implemented correctly and they add a LOT of bloat to the page. Do it right or don't do it at all. Don't go half baked and in return get a slow website. In my career I have never seen a client implement these things in a way they were useful to them.
One last note: things like custom web fonts can hurt page performance. Try to stick to web safe fonts, or only load the specific character sets you need for the custom font. With Google Fonts you can load a font with only the specific characters you need. Smaller font file, less data over the wire, less CO2 emissions.
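For example, the Google Fonts css2 API accepts a `text` parameter that subsets the font to just those glyphs. A small helper to build such a URL (the font family and text here are just examples):

```javascript
// Build a Google Fonts css2 URL that only ships the glyphs we need.
function subsetFontUrl(family, text) {
  const params = new URLSearchParams({ family, text, display: 'swap' });
  return `https://fonts.googleapis.com/css2?${params}`;
}

// The returned stylesheet only covers these 13 characters.
const url = subsetFontUrl('Lobster', 'My Site Title');
```

Handy for a decorative font that only ever renders your site title.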
Use progressive enhancement.
Progressive Enhancement is the principle of starting with a simple foundation. On top of this foundation we then add enhancements if the visitor's device can handle them.
Progressive Enhancement is the opposite of Graceful Degradation. Graceful Degradation is the journey from complex to simple. Progressive Enhancement is the journey from simple to complex.
Progressive Enhancement needs less code thanks to its simple base. Graceful Degradation starts with all the feature code and adds extra code on top to make the website work on less able devices. Hence, more code.
By using Progressive Enhancement the user has less code to download. Less data over the wire means a lower carbon footprint for the website.
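A minimal sketch of the idea in JavaScript: everyone gets the working baseline, and an enhancement is only switched on when the browser proves it supports the feature (the feature and enhancement names here are illustrative):

```javascript
// Progressive enhancement: start from a working baseline and add
// features only for devices that can handle them.
function enhancementsFor(supports) {
  const enhancements = ['static-html']; // the baseline everyone gets
  if (supports.intersectionObserver) enhancements.push('lazy-images');
  if (supports.serviceWorker) enhancements.push('offline-cache');
  return enhancements;
}

// An old device downloads only the baseline; a modern one gets extras.
const modern = enhancementsFor({ intersectionObserver: true, serviceWorker: true });
```

The point: the baseline ships no enhancement code at all, instead of shipping everything and patching it down.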
There is more!
Progressive enhancement is the base of Progressive Web Apps (PWA), a term Google coined a while back. As mentioned above, Progressive Enhancement serves the simplest experience first and enhances it based on the features of the end user's device.
That simplest experience is: show content, on a small screen, without an internet connection. This is why PWAs use service workers to control the network stack of the browser. What if a user comes back a second time and you don't even need to go to the network to fetch your assets? That's pretty low carbon, right!?
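A cache-first service worker sketch (the cache name and asset list are placeholders; the decision logic lives in a plain function so it can be tested outside the browser):

```javascript
// sw.js: serve repeat visits from the cache instead of the network.
const CACHE_NAME = 'static-v1';
const ASSETS = ['/', '/styles.css', '/app.js'];

// Only same-origin GET requests are candidates for the cache.
function shouldCache(request, origin) {
  return request.method === 'GET' && new URL(request.url).origin === origin;
}

// Browser-only wiring; skipped when this file runs under Node.
if (typeof self !== 'undefined' && 'caches' in self) {
  self.addEventListener('install', (event) => {
    // Put the core assets in the cache on the first visit.
    event.waitUntil(caches.open(CACHE_NAME).then((cache) => cache.addAll(ASSETS)));
  });

  self.addEventListener('fetch', (event) => {
    if (!shouldCache(event.request, self.location.origin)) return;
    // Cache first: a repeat visit never touches the network for these files.
    event.respondWith(
      caches.match(event.request).then((hit) => hit || fetch(event.request))
    );
  });
}
```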
Optimize media and its delivery
Images and video are complicated, but they are also key to getting a webpage's message across.
What you have to do is fairly simple as a concept. Optimize assets to have the smallest file size, the right file type and the correct resolution. Do this for each context they are used in.
But executing that concept seems to be a daunting task, even in 2020. On one side we have Google Stadia, which streams 4K games at 60fps to the browser without much input lag. On the other side you see websites serve 4 MB PNG files, all scaled up and ugly looking.
Why can't web developers and CMS builders help the content editor out when it comes to images and video? I'm baffled by the lack of knowledge. Media is arguably the most important part of commercial websites. Why is media management not the center point for most CMS platforms? What would the Nike website be without image assets and video? Not much...
These are the things I have to explain on every project:
- image mime types
- responsive images
- srcset, sizes, media
- alt attribute
- picture tag
- ways of rendering/streaming of video
And the worst part is: I always have to look it up myself. And we still don't always get it right. Sigh.
Ok, rant over.
This post is not about the details of image and video management. I'll instead give some quick wins that can help you along the way.
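Still, for reference, the responsive image pieces from the list above fit together roughly like this (the file names, widths and breakpoints are placeholders):

```html
<picture>
  <!-- Modern format for browsers that support it -->
  <source
    type="image/webp"
    srcset="kitten-480.webp 480w, kitten-960.webp 960w"
    sizes="(max-width: 600px) 100vw, 50vw"
  />
  <!-- Fallback with width-based candidates -->
  <img
    src="kitten-480.jpg"
    srcset="kitten-480.jpg 480w, kitten-960.jpg 960w"
    sizes="(max-width: 600px) 100vw, 50vw"
    alt="A kitten playing with a ball of yarn"
  />
</picture>
```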
Quick tips for media optimization
If you want to do it yourself, install some applications on your computer that help you to optimize assets before uploading. For video you can go the ffmpeg route but it's very fiddly and for the geeks among us. I have dabbled in this for a while but I gave up. Too many options for too many different codecs...
The best way to approach all this is to use a third party service. A few examples I have used: Cloudinary, TwicPics, ImgIX or Akamai Image Manager.
These services have feature sets to highly optimize your image and video assets. Besides serving the right image file type for the requesting browser, they can also transform your images on the fly by changing the URL.
Cloudinary can even do Photoshop-like transforms in the URL, for both video and images. I once used Cloudinary to align 5,000 product images that were all shot differently. They had fake shadows added in PSD files and their layout was all over the place.
We converted all the images to the same product image template: the product in the middle, without shadows, with ample white space around it. We did this using the Cloudinary URL API.
The assets were stored as high quality but served hyper optimized to the users in their context.
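To give an idea of the URL-based approach, here is a helper that builds a Cloudinary delivery URL (the cloud name and asset id are placeholders; `w_600`, `f_auto` and `q_auto` are real Cloudinary transformation parameters for width, automatic format and automatic quality):

```javascript
// Build a Cloudinary delivery URL with on-the-fly transforms baked
// into the path. Changing the transform list changes the rendition.
function cloudinaryUrl(cloudName, publicId, transforms = ['f_auto', 'q_auto']) {
  return `https://res.cloudinary.com/${cloudName}/image/upload/${transforms.join(',')}/${publicId}`;
}

const url = cloudinaryUrl('demo', 'shoes.jpg', ['w_600', 'f_auto', 'q_auto']);
// → https://res.cloudinary.com/demo/image/upload/w_600,f_auto,q_auto/shoes.jpg
```

One stored master asset, many hyper-optimized renditions, no intern army.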
Imagine if we had to use an army of interns to do this. Lots of meetings, computers, phone calls and weeks of work would have caused lots of CO2 emissions. By using a service like this we saved money and we were much friendlier to the environment.
Lazy load all the things
Think about lazy loading whatever you can. Lazy loading means that you only load a piece of content or code when the user actually needs it. If a user does not need to see an asset, why render it? Fewer bytes over the wire means a lower carbon footprint. You should lazy load images, videos, iFrames and all third party scripts. Pay attention to GDPR related scripts though: those need to be on the page at load.
There are a bunch of use cases here:
- Load the items when the page is ready
- Load them when the user scrolls down
- Load them when the user opens a panel where the item needs to be shown
Use the Intersection Observer API to check if the thing you want to load intersects with the viewport. If so, load it.
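A minimal sketch of the scroll-based case, assuming images carry their real URL in a `data-src` attribute (the callback is a plain function so the logic is easy to test outside a browser):

```javascript
// Lazy-load images: swap data-src into src only once the image
// actually scrolls into view.
function onIntersect(entries, observer) {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    entry.target.src = entry.target.dataset.src; // start the download now
    observer.unobserve(entry.target);            // load once, then stop watching
  }
}

// Browser-only wiring; skipped when running under Node.
if (typeof IntersectionObserver !== 'undefined') {
  // Start loading a bit before the image enters the viewport.
  const io = new IntersectionObserver(onIntersect, { rootMargin: '200px' });
  document.querySelectorAll('img[data-src]').forEach((img) => io.observe(img));
}
```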
This post is not explaining how to do this but there is a lot of information about this on the web. Google is your friend.
3. Server architecture choices
We have used centralized monolithic systems for the last twenty years. These systems are a lovely bucket of everything you need in one place. Think of Adobe AEM, Wordpress, Sitecore, Drupal, etc.
Monolithic systems like these are “always on” and keep state at all times. This means that the servers never stop consuming power, even if you do not need them. Costly and very unfriendly to the environment.
A decentralized system has different parts of the application defined as agnostic pieces. If you don’t use a piece, you don’t pay for it as it is turned off.
In case of JAMstack, the dynamic part of the project only happens at build time. The result is a completely static website. No state keeping servers are needed to run the project as the website is 100% static and it is hosted on the CDN edge.
Hosting on the CDN edge means that there is no origin server the CDN copies the files from. Origin servers are the state keeping things that are always on.
If your site is a bunch of static files, you can deploy them to the CDN edge with ease. Scaling the website now just means putting the files in more places. Nothing more needed. You can forget about load balancers with multiple application servers behind them... Those sticky sessions when users move between servers were a b*tch to deal with anyways!
Most websites that tell stories, show products or display the written word can be on the JAMstack. There is no reason for them to use a super fancy expensive monolith like Sitecore or Adobe AEM. Imagine the money we could save!
I can already see my colleagues shake their heads in disappointment when they read this.
Of course things are different if you deal with complex projects. If you have highly dynamic pages and 200 websites under one umbrella, you need a solid platform.
But even for that use case, check out https://uniform.dev/. It makes a JAMstack website from your complex project. It reduces hosting costs, CDN bandwidth and the sites get way faster. No more runtime .net rendering baby!
Serverless sounds funny, right? Everything needs a server to run. The term serverless is used in the context of decentralization: there is no main origin server that keeps track of everything.
Serverless functions are agnostic stateless pieces of code that you use as microservices.
If they are not used, they are off and you don't pay for them. They go to sleep. Serverless functions generally do not know about each other's existence. They do one simple task, like transforming data or posting a tweet. You can also chain serverless functions together. This way you can create a whole architecture in the cloud without much effort.
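A serverless function can be as small as this. The handler style below is the common cloud-function shape (the exact event format varies per provider; this one is simplified):

```javascript
// A stateless serverless function: one small job, no state kept
// between invocations. When nobody calls it, nothing runs.
async function handler(event) {
  const name = (event.queryStringParameters || {}).name || 'world';
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
}
// In a real project this function would be exported and deployed
// behind an HTTPS endpoint by the platform.
```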
The elephant in the room
Server parks are crazy big and use A LOT of energy. From processors to GPUs to cooling systems. It's pretty crazy. Make sure to choose a green hosting company or CDN provider.
It's actually not that hard to be energy efficient as a developer. If you make sure to follow general development best practices you have conquered the base of producing low carbon websites. Main topics: UX, performance, accessibility, reducing bloat, media management and architecture. Also choose a hosting provider and CDN which run on green energy.