The global digital economy has grown significantly in recent years and is expected to keep growing. Worldwide IT spending is projected to total $4.6 trillion in 2023, an increase of 5.1% from 2022, according to the latest forecast by Gartner, Inc. Demand for IT in 2023 is expected to remain strong as enterprises push forward with digital business initiatives in response to economic turmoil.
The increasing demand for software products and services is contributing to a growing carbon footprint in the tech industry. Data centres consume about 2% of global electricity today. By 2030, they could consume as much as 8%.
By 2040, the information and communications technology sector is expected to account for 14% of the world’s carbon footprint (up from about 1.5% in 2007). Training a single neural network model today can emit as much carbon as five cars in their lifetimes. And the amount of computational power required to run large AI training models has been increasing exponentially, with a 3.4-month doubling time. By designing and developing more environmentally friendly software, developers can help mitigate the negative impact on the environment.
Let’s be clear: on its own, software doesn’t consume energy or emit any harmful discharge. The problem lies in how software is developed, and then in how it is used. Software runs on hardware, and as software continues to grow, so does reliance on the machines that run it.
Green software is carbon-efficient software, meaning it emits the least carbon possible. Only three activities reduce the carbon emissions of software: energy efficiency, carbon awareness, and hardware efficiency.
Green software engineering takes into consideration software practices and architecture, hardware and data centre design, electricity markets, and climate change. It aims to generate fewer greenhouse gas emissions and reduce a company's carbon footprint.
Green software engineering is crucial for the tech industry's efforts to reduce its environmental impact.
By incorporating sustainable practices into software development, engineers can create systems that are energy-efficient, resource-optimized, and responsible.
Shifting towards a green software engineering approach is important for a number of reasons:
- Cost savings: Green software engineering practices, such as optimizing resource usage, can help reduce the cost of operation and maintenance of software products.
- Improved user experience: Software that is designed to be efficient and sustainable can provide a better user experience, as it will likely be faster and more reliable.
- Ethical responsibility: Software developers should consider the wider impact of their work and make choices that contribute to a more sustainable future.
- Competitive advantage: Companies that adopt green software engineering practices may have a competitive advantage, as they can demonstrate their commitment to sustainability and differentiate themselves from their competitors.
The principles of green software are a set of concepts that software engineers should follow when they're designing, building and deploying sustainable software applications.
There are eight principles of green software development:
- Carbon. Developers should build software that offers value to users, while producing fewer carbon emissions.
- Electricity. Energy-efficient applications use software components that don't consume much energy.
- Carbon intensity. Carbon intensity is the amount of carbon emitted for every kilowatt-hour (kWh) of electricity used. Companies should consume as much electricity as possible from renewable energy sources, which have lower carbon intensity.
- Embodied, or embedded, carbon. Embodied carbon is the carbon released when electronic devices are manufactured and disposed of. This principle calls for organizations to build and run software in ways that minimize the manufacture and disposal of hardware.
- Energy proportionality. Energy proportionality is the relationship between a device's utilization and the electricity it draws. The higher the utilization, the more efficiently the device converts electricity into useful work. The goal of this principle is to maximize the energy efficiency of hardware by keeping utilization high.
- Networking. Sent and received data travels across multiple devices that are connected in the network, including routers, switches and servers. Each of these devices contains embedded carbon and uses electricity. The objective is to decrease carbon emissions and increase the energy efficiency of the software by reducing the size of the data, as well as how far it must travel across the network.
- Demand shaping. This principle describes moving the demand for computing power to another time or region and shaping it to match the supply that's available.
- Measurement and optimization. Implementation of long-term, consistent optimization strategies can boost the overall carbon efficiency of software.
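The measurement principle has a concrete formulation: the Green Software Foundation's Software Carbon Intensity (SCI) specification scores software as operational emissions (energy E times grid intensity I) plus embodied carbon M, divided by a functional unit R such as API calls served. A minimal sketch, with illustrative numbers rather than real measurements:

```python
def sci(energy_kwh: float, intensity_g_per_kwh: float,
        embodied_g: float, functional_units: float) -> float:
    """Software Carbon Intensity: SCI = (E * I + M) / R.

    energy_kwh          -- E: energy consumed by the software (kWh)
    intensity_g_per_kwh -- I: grid carbon intensity (gCO2e per kWh)
    embodied_g          -- M: embodied carbon amortized to this workload (gCO2e)
    functional_units    -- R: e.g. API calls, users, or jobs served
    """
    return (energy_kwh * intensity_g_per_kwh + embodied_g) / functional_units

# Illustrative numbers: 1.2 kWh at 400 gCO2e/kWh plus 50 g embodied carbon,
# spread over 10,000 API calls.
print(sci(1.2, 400, 50, 10_000))  # gCO2e per API call
```

A score like this is most useful tracked over time: the absolute number matters less than whether each release moves it down.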
We can start by asking: What is the smallest possible environmental footprint we could make with this application? Let’s use that expectation to guide the first stages of the software development cycle.
Then we can develop recommendations on, for example, the algorithms, programming languages, APIs, and libraries you can draw on to minimize carbon emissions. This may require constant assessment of alternatives that could be more efficient.
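As a sketch of the kind of recommendation this produces, consider a simple membership check, where the choice of data structure alone changes how much work, and therefore energy, each request costs. The dataset here is hypothetical:

```python
import timeit

ids_list = list(range(100_000))
ids_set = set(ids_list)

# Both lookups return the same answer, but a list scan is O(n) per
# lookup while a hash-set probe is O(1) on average -- far less CPU
# work for the same result.
list_time = timeit.timeit(lambda: 99_999 in ids_list, number=100)
set_time = timeit.timeit(lambda: 99_999 in ids_set, number=100)
print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
```

Multiplied across millions of requests, choices like this are where algorithmic recommendations translate directly into energy savings.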
At the deployment stage, we can monitor real-time power consumption through techniques such as dynamic code analysis. The data we gather will be critical for understanding the gaps between the design choices and actual energy profiles.
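True power monitoring needs hardware counters or cloud telemetry, but CPU time is a rough, portable proxy for the energy a piece of code consumes. A minimal sketch of such instrumentation (the function name is ours, not a standard API):

```python
import time

def profile_cpu_seconds(fn, *args, **kwargs):
    """Return (result, CPU-seconds consumed) for one call.

    CPU time is only a coarse proxy for energy; production monitoring
    would read hardware counters (e.g. Intel RAPL) or cloud telemetry
    instead.
    """
    start = time.process_time()
    result = fn(*args, **kwargs)
    return result, time.process_time() - start

result, cpu_s = profile_cpu_seconds(sum, range(1_000_000))
print(f"result={result}, cpu={cpu_s:.4f}s")
```

Logging figures like this per request or per job is what exposes the gap between design-time assumptions and the actual energy profile.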
Newer application architectures, such as serverless computing or functions as a service (FaaS), enable even more control over capacity and, by extension, energy consumption. Serverless computing efficiently shares infrastructure resources by executing functions only on demand. And since it bills by execution time, it pushes programmers to improve their code's efficiency. Large serverless platforms such as AWS Lambda and Azure Functions provide continuous scaling with a pay-as-you-use cost model.
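For illustration, a minimal handler in the AWS Lambda style for the Python runtime. The event shape is an assumption for this sketch, and the function can be smoke-tested locally without any cloud account:

```python
# Sketch of an AWS Lambda-style handler. The function runs only when
# invoked, so idle capacity consumes no compute on our behalf, and
# shorter execution time directly lowers both the bill and the energy
# used per invocation.
def lambda_handler(event, context):
    numbers = event.get("numbers", [])
    return {"statusCode": 200, "total": sum(numbers)}

# Local smoke test (no AWS needed):
print(lambda_handler({"numbers": [1, 2, 3]}, None))
```

Because billing and carbon both scale with execution time here, shaving milliseconds off the handler is an optimization the platform rewards automatically.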
A programming language’s efficiency depends on several factors: the energy, memory, and time its programs consume in execution. Efficiency can also be improved through better virtual machines, compilers, optimized libraries, and better source code.
Today, engineers, developers, and researchers are working to make coding as green as possible. To that end, a team of researchers from Portugal studied the 27 most-used programming languages to determine their efficiency and answer the question: is a faster programming language also a greener one?
The researchers tested all these languages by executing or compiling programs using advanced virtual machines, compilers, libraries, and interpreters. Next, they analyzed each programming language’s efficiency with respect to energy consumption, execution time, and memory usage.
As a result, the five most energy-efficient programming languages they found were C, Rust, C++, Ada, and Java (with Kotlin offering almost the same runtime performance as Java).
Furthermore, they found that faster programming languages are not always the most energy-efficient. While Java ranks well in both energy efficiency and speed, Python, Perl, and Ruby are among the least efficient languages.
Cloud computing helps IT companies adopt green computing by addressing issues like resource consumption and energy usage. Approaches such as virtualization, energy-efficient data centres, and multi-tenancy enable cloud computing to reduce energy usage and carbon emissions. Using cloud computing can also eliminate the need to run your own energy-consuming on-premises data centre.
Moreover, edge computing allows you to redistribute computation closer to users, lowering costs. It enables smaller data centres, whose maintenance costs less in both energy and money.
Parallel computing, similarly, helps reduce energy consumption: many small computations or functions can run simultaneously on multiple processors that interact through shared memory, instead of running on separate hardware that consumes more power.
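A hedged sketch of this idea in Python, using a process pool to spread independent computations across the cores of one machine (the function names are ours):

```python
from concurrent.futures import ProcessPoolExecutor
import os

def square(n: int) -> int:
    return n * n

def parallel_sum_of_squares(nums, workers=None):
    """Spread independent computations across CPU cores.

    One busy multi-core host is usually more energy-proportional than
    several lightly loaded machines doing the same work.
    """
    workers = workers or os.cpu_count()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(square, nums))

if __name__ == "__main__":
    print(parallel_sum_of_squares(range(10)))
```

The energy argument is about utilization: keeping a few processors busy in parallel exploits hardware that is already powered on, rather than spinning up additional underused machines.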
- Relocating servers: Relocating your servers to colder climates, where less energy is spent on cooling, can reduce greenhouse gas (GHG) emissions by about 8%.
- Implementing data centre best practices: You can follow some data centre best practices to reduce unwanted emissions:
  - Keep power usage effectiveness (PUE) at 1.2 or lower.
  - Remove or unplug zombie servers that draw power without doing useful computing.
  - Eliminate dark data: deleting duplicate copies of data and compressing data into smaller chunks saves energy.
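Power usage effectiveness, mentioned in the list above, is just the ratio of total facility energy to the energy delivered to IT equipment; a quick sketch with made-up figures:

```python
def power_usage_effectiveness(total_facility_kwh: float,
                              it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (all power reaches the IT gear);
    the best-practice target is 1.2 or lower.
    """
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures: 600 kWh drawn by the facility,
# 500 kWh of it reaching servers, storage, and network gear.
print(power_usage_effectiveness(600, 500))  # 1.2
```

Everything above 1.0 is overhead, mostly cooling and power distribution, which is why relocating to colder climates moves this number directly.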
There are almost always trade-offs between business and environmental goals, and software engineers need to be able to determine where the go/no-go line is. Equally important, the strategy must allow flexibility, giving engineers running room to improvise and to learn through trial and error.
You can become a green software practitioner by following the training and certification offered by the Green Software Foundation (https://greensoftware.foundation/).
In May 2021, Microsoft, Thoughtworks, Accenture and GitHub, together with the Joint Development Foundation Projects LLC and The Linux Foundation, established the Green Software Foundation, a nonprofit organization aimed at developing a network of "people, standards, tooling and best practices for green software."
Organizations are pledging to address the worldwide climate crisis by becoming carbon neutral, if not carbon negative. Optimising existing resources and building green software can help organisations improve their ESG performance. Let’s pledge for a better planet and change the way we develop software.