Businesses all over the world are moving towards a greener future. Steel factories look at green steel, dairy companies offer plant-based alternatives, office roofs get solar panels, and cloud providers make sustainability pledges. However, even though every company nowadays uses software and hardware in some form, the IT landscape is rarely part of those sustainability plans, despite data centres having substantial carbon emissions (more than aviation, and growing!). Sustainability is also not top of mind when software and hardware developers talk about non-functional requirements; they think about performance and reliability, but not so much about carbon efficiency.
I believe software engineers have a largely untapped opportunity to contribute to a greener world. And a big bonus: as we green our IT landscape, the applications in it also likely become cheaper to run, with better performance and resilience. Win-win!
In this article, we will take three steps into the world of green software engineering, with concrete examples along the way. The three steps are visualised below.
1. The Software Application
Running our applications causes carbon to be emitted. We will probably never be able to emit nothing at all, but we can work to minimise the amount of carbon emitted per unit of work. There are essentially two approaches to making applications more energy-efficient.
The first approach is to simply make our application consume less energy. How to achieve this exactly depends on your application. For web applications, we could look at the network communication between the frontend and the backend and identify three options:
- Reduce the size of network packets by minifying the web assets or removing unused CSS definitions.
- Reduce the distance those packets travel by introducing a CDN (Content Delivery Network).
- Reduce the number of API calls needed to render the page.
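To give a flavour of that last option, here is a minimal sketch, assuming a hypothetical backend that exposes both a per-user endpoint and a batched `ids` filter; three round trips collapse into one:

```python
import requests

BASE_URL = "https://api.example.com"  # hypothetical backend


def fetch_users_one_by_one(user_ids):
    # Naive approach: one HTTP round trip (and its network packets) per user.
    return [requests.get(f"{BASE_URL}/users/{uid}").json() for uid in user_ids]


def fetch_users_batched(user_ids):
    # Batched approach: a single request, assuming the API accepts an `ids` filter.
    params = {"ids": ",".join(str(uid) for uid in user_ids)}
    return requests.get(f"{BASE_URL}/users", params=params).json()


users = fetch_users_batched([1, 2, 3])  # one round trip instead of three
```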
We could even look at the programming language we pick. Five years ago, researchers in Portugal compared the energy consumption of several popular programming languages and showed how their execution time and memory usage influenced energy usage. They found that compiled languages (C, Go, Rust, Java) tend to be more energy efficient than interpreted languages (JavaScript, Python).
Alternatively, we can make our applications more intelligent. Our power grid is supplied with electricity from various sources. When the sun is shining and the wind is blowing, more of the available energy comes from renewable sources. Based on that concept, we can build our applications to be carbon-aware: do more when the energy comes from low-carbon sources and less when it comes from high-carbon sources. Imagine an application that reads task descriptions from Kafka and performs some CPU-intensive computation, adjusting its pace based on how clean the current energy supply is. This pattern is called demand shaping, because we shape the energy demand of our application based on the carbon intensity of the grid.
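Below is a minimal sketch of that idea, assuming a hypothetical carbon-intensity endpoint and an illustrative threshold; the consumer uses the kafka-python client, and `process_task` stands in for your own workload.

```python
import time

import requests
from kafka import KafkaConsumer  # kafka-python

CARBON_API = "https://api.example.com/carbon-intensity"  # hypothetical endpoint
DIRTY_GRID_THRESHOLD = 400  # gCO2/kWh, illustrative


def current_carbon_intensity() -> float:
    """Grams of CO2 per kWh for the local grid, from an illustrative API."""
    return requests.get(CARBON_API, timeout=5).json()["gCO2_per_kWh"]


def process_task(payload: bytes) -> None:
    ...  # the CPU-intensive computation itself (application-specific)


consumer = KafkaConsumer("compute-tasks", bootstrap_servers="localhost:9092")

for message in consumer:
    if current_carbon_intensity() > DIRTY_GRID_THRESHOLD:
        time.sleep(30)  # dirty grid: slow our pace and shape demand downwards
    process_task(message.value)
```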
Instead of changing the pace (demand shaping), we can also shift the work to a different time and/or place. This is called demand shifting. Imagine that your company is in Australia and you want to run computations during the night. You could run the workload in the Netherlands instead, where it's daytime and solar power is more readily available.
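In its simplest form, shifting in place boils down to comparing the carbon intensity of candidate regions and submitting the job to the cleanest one. The sketch below assumes the same hypothetical carbon-intensity API as above; the region names are just examples.

```python
import requests

CARBON_API = "https://api.example.com/carbon-intensity"  # hypothetical per-region endpoint


def carbon_intensity(region: str) -> float:
    """Grams of CO2 per kWh for a given region, from an illustrative API."""
    return requests.get(f"{CARBON_API}/{region}", timeout=5).json()["gCO2_per_kWh"]


def pick_greenest_region(candidates: list[str]) -> str:
    """Demand shifting in space: run the batch job wherever the grid is cleanest."""
    return min(candidates, key=carbon_intensity)


region = pick_greenest_region(["australia-southeast1", "europe-west4"])
print(f"Submitting tonight's batch job to {region}")
```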
2. The Runtime Platform
Our software applications are often operated on runtime platforms such as Kubernetes, which makes those platforms a great place to achieve further sustainability gains. Making improvements at the platform level benefits all applications running on it. The Kubernetes community has some great initiatives to assist you. I'll give three examples here.
Suppose your company has separate Dev, Test, Acceptance and Production environments. In that case, you can already achieve a significant emission reduction by turning off the Dev and Test clusters on weekends. You can automate this through kube-green, which automatically shuts down your deployments and cronjobs when you don't need them. An added benefit: you also promote a healthy working environment where developers take some rest outside business hours.
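As a sketch of what that could look like, applied here with the official Kubernetes Python client: the SleepInfo fields follow kube-green's custom resource, so check the project's documentation for the exact schema of your version.

```python
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in-cluster

# A kube-green SleepInfo resource: sleep every weekday evening and wake the next
# weekday morning, so the Dev namespace also stays down over the weekend.
sleep_info = {
    "apiVersion": "kube-green.com/v1alpha1",
    "kind": "SleepInfo",
    "metadata": {"name": "outside-business-hours"},
    "spec": {
        "weekdays": "1-5",               # Monday to Friday
        "sleepAt": "19:00",
        "wakeUpAt": "08:00",
        "timeZone": "Europe/Amsterdam",
        "suspendCronJobs": True,
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="kube-green.com",
    version="v1alpha1",
    namespace="dev",
    plural="sleepinfos",
    body=sleep_info,
)
```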
By default, Kubernetes scales workloads based on CPU and RAM utilisation. However, we can use components like KEDA to scale based on more relevant demand metrics, such as HTTP requests or queue length. This can help us reduce resource utilisation and, therefore, our carbon emissions. KEDA can even scale deployments down to zero.
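Below is a minimal sketch of such a scaler, again applied with the Kubernetes Python client: a ScaledObject that scales a hypothetical worker deployment on Kafka consumer lag and lets it drop to zero replicas when the topic is empty. The names are illustrative; verify the trigger metadata against the KEDA documentation for your version.

```python
from kubernetes import client, config

config.load_kube_config()

# A KEDA ScaledObject: scale the worker on Kafka consumer lag instead of CPU.
scaled_object = {
    "apiVersion": "keda.sh/v1alpha1",
    "kind": "ScaledObject",
    "metadata": {"name": "compute-tasks-scaler"},
    "spec": {
        "scaleTargetRef": {"name": "compute-tasks-worker"},
        "minReplicaCount": 0,   # scale all the way down when there is no work
        "maxReplicaCount": 10,
        "triggers": [
            {
                "type": "kafka",
                "metadata": {
                    "bootstrapServers": "kafka:9092",
                    "consumerGroup": "compute-tasks",
                    "topic": "compute-tasks",
                    "lagThreshold": "50",
                },
            }
        ],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="keda.sh",
    version="v1alpha1",
    namespace="default",
    plural="scaledobjects",
    body=scaled_object,
)
```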
Finally, we can adjust the Kubernetes scheduler to assign pods to nodes based on the carbon intensity at each machine's location. Bill Johnson, Principal Software Engineering Manager, wrote an article about this topic here.
3. The Hardware
All software we build eventually runs on hardware, from servers in a data centre to the phones and laptops of our end users. These devices need to be manufactured, and eventually they also need to be dismantled. The carbon emitted during the creation and disposal of a device is called embodied carbon. The bar chart below (credits: Green Software Foundation) shows that this embodied carbon is not a trivial factor.
Reducing the embodied carbon of servers requires embracing responsible lifecycle management, which involves implementing proper disposal and recycling methods for old servers. By refurbishing, reselling, or recycling them, we extend their useful life and prevent them from ending up in landfills.
Besides the embodied carbon, we should also consider how servers are used. Simply put: the more we utilise a server, the more efficiently it runs our applications. We call this energy proportionality (more info here). Therefore, we can run our software using less electricity by running it on as few servers as possible.
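To make that concrete, here is a small back-of-the-envelope calculation with made-up power figures: because a server draws a large share of its peak power even when idle, the energy spent per unit of work drops sharply as utilisation rises.

```python
IDLE_WATTS = 100.0  # made-up figure: power drawn while doing nothing
PEAK_WATTS = 300.0  # made-up figure: power drawn at 100% utilisation


def power(utilisation: float) -> float:
    """Simple linear power model between idle and peak."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilisation


def watts_per_unit_of_work(utilisation: float) -> float:
    return power(utilisation) / utilisation


for u in (0.1, 0.5, 0.9):
    print(f"{u:.0%} utilised: {watts_per_unit_of_work(u):.0f} W per unit of work")
# 10% utilised: 1200 W per unit of work
# 50% utilised: 400 W per unit of work
# 90% utilised: 311 W per unit of work
```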
I believe that embodied carbon and energy proportionality are why cloud providers have an amazing opportunity to contribute to a better world. I am happy to see the first steps already being taken. Servers in data centres are often written off after four years; all three big cloud vendors recently announced extending this lifespan in light of sustainability. Also, because they manage the applications of multiple companies, they can intelligently allocate workloads so that no servers are sitting idle, improving utilisation rates.
Besides making hardware more sustainable under the cloud hood, cloud providers should also give their customers tools to make greener choices. For example, AWS Lambda has a SnapStart option to decrease startup times. In the same way, we could have a CarbonAwareStart that applies demand shaping to Lambda executions. Moreover, cloud providers could inform customers about their emissions in the same way they now give insight into costs. What gets measured gets improved.
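CarbonAwareStart does not exist today, but we can approximate the idea inside the function ourselves. The sketch below assumes an SQS-triggered Lambda and the same hypothetical carbon-intensity endpoint as before: when the grid is dirty, the task is pushed back onto the queue with a delay instead of being processed immediately.

```python
import json
import os

import boto3
import requests

CARBON_API = "https://api.example.com/carbon-intensity"  # hypothetical endpoint
QUEUE_URL = os.environ["TASK_QUEUE_URL"]                 # the SQS queue feeding this Lambda
DIRTY_GRID_THRESHOLD = 400                               # gCO2/kWh, illustrative

sqs = boto3.client("sqs")


def do_the_work(task: dict) -> None:
    ...  # the actual computation (application-specific)


def handler(event, context):
    intensity = requests.get(CARBON_API, timeout=5).json()["gCO2_per_kWh"]
    for record in event["Records"]:
        if intensity > DIRTY_GRID_THRESHOLD:
            # Grid is dirty right now: put the task back on the queue and try again later.
            sqs.send_message(
                QueueUrl=QUEUE_URL,
                MessageBody=record["body"],
                DelaySeconds=900,  # 15 minutes, the SQS maximum
            )
        else:
            do_the_work(json.loads(record["body"]))
```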
Conclusion
Green software invites us, software engineers, to make our IT landscape more sustainable. Engineers can play a vital role in protecting the environment by focusing on energy efficiency, reducing waste, and minimising carbon emissions. Joining the green software movement is an opportunity and a responsibility to build a better future together. Using sustainable practices, we can drive innovation, improve performance, and contribute to a resilient digital world across all industries. Let's work together to create a greener, more sustainable planet.
Interested? Check out the Green Software Foundation.
Image attributions
Cover photo by Matt Anderson on Unsplash.
The icons used in my visualisation are found on Flaticon.com.
- Globe icons created by Uniconlabs - Flaticon
- Lifespan icons created by Flowicon - Flaticon
- Server icons created by Freepik - Flaticon
- Development icons created by Design Circle - Flaticon
- Application icons created by Smashicons - Flaticon