Big data, low carbon: how data centres innovate for sustainability

Data centres are well known as energy guzzlers, and growing digital demand keeps pushing up their load. Worldwide, they consume an estimated 200 terawatt-hours a year (TWh/yr), or nearly 1 per cent of global electricity demand.

That said, the energy consumption of data centres has not grown at the exponential rate of Internet traffic. This is due to the huge strides made in energy efficiency in data centres. Improvements in the efficiency of servers, storage devices and data centre infrastructure, as well as the move away from small data centres to larger cloud and hyperscale data centres, have all helped to limit the growth of electricity demand.

According to figures from a report by the International Energy Agency (IEA), the number of internet users worldwide doubled between 2010 and 2020 and global internet traffic expanded 15-fold. Yet global data centre energy use has been flat since 2015, at about 200 TWh/yr.
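
A rough sanity check of the "nearly 1 per cent" figure is simple arithmetic. In the sketch below, the global demand figure of roughly 23,000 TWh in 2020 is our own assumption for illustration, not a number from the article:

```python
# Sanity-check the "nearly 1 per cent" claim.
# Assumption: global electricity demand of roughly 23,000 TWh in 2020.
data_centre_use_twh = 200       # estimated annual data centre consumption
global_demand_twh = 23_000      # assumed global electricity demand

share = data_centre_use_twh / global_demand_twh
print(f"Data centres consume about {share:.1%} of global electricity")
# -> Data centres consume about 0.9% of global electricity
```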

Globally, leading data centre operators have committed to carbon neutrality and science-based targets for emissions reduction by 2030. To achieve these goals, they have partnered with technology companies to develop ways of reducing energy consumption at all levels of operation – from direct-to-chip cooling to providing on-site prime power through alternative energy fuel cells.

 

New cooling solutions

One of the main areas of innovation is developing new solutions to cool data centres more efficiently as their capacity grows. Cooling typically accounts for a large share of overall power consumption; estimates from 2021 put the figure at 30 to 37 per cent.

Air cooling has been widely adopted in data centres since their inception. The basic principle of such systems involves circulating cold air around the hardware to dissipate heat.

 

High power-density racks of up to 50 kW are increasingly being deployed in data centres, such as Equinix’s International Business Exchange (IBX) data centres around the world. Source: Equinix.

 

But air cooling systems are struggling to keep up with increases in rack power density. Thanks to new generations of central processing units (CPUs), rack power requirements have climbed from below 20 kilowatts (kW) to as much as 40 or 50 kW today.

Air cooling systems have evolved to handle higher densities, but there is a point at which air simply lacks the thermal transfer properties to do so efficiently. This has led organisations to look to liquid cooling, as water and other fluids can transfer heat up to 3,000 times more efficiently than air.
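
That roughly 3,000-fold advantage is consistent with the volumetric heat capacities of the two fluids. The back-of-the-envelope sketch below uses standard textbook property values (our assumptions, not figures from the article) to compare the flows needed to remove 50 kW of rack heat:

```python
# Compare air and water as coolants using volumetric heat capacity
# (density x specific heat). Property values are textbook approximations.
rho_cp_air = 1.2 * 1005        # J/(m^3*K) for air at room conditions
rho_cp_water = 997 * 4186      # J/(m^3*K) for water

print(f"Water moves ~{rho_cp_water / rho_cp_air:,.0f}x more heat per unit volume")

# Volume flow needed to remove a 50 kW rack load with a 10 K coolant rise:
heat_load_w = 50_000
delta_t_k = 10
air_flow_m3s = heat_load_w / (rho_cp_air * delta_t_k)
water_flow_m3s = heat_load_w / (rho_cp_water * delta_t_k)
print(f"Air:   {air_flow_m3s:.2f} m^3/s")          # ~4.15 m^3/s
print(f"Water: {water_flow_m3s * 1000:.2f} L/s")   # ~1.20 L/s
```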

Liquid cooling is available in a variety of configurations that use different technologies, including rear door heat exchangers and direct-to-chip cooling.

The rear door heat exchanger is the more mature technology: a liquid-filled coil is mounted in place of the rack’s rear door. As server fans move heated air through the rack, the coil absorbs the heat before the air enters the data centre.

Direct-to-chip cooling integrates the cooling system directly into the computer’s chassis. A liquid coolant is brought via tubes directly to the chip, where it absorbs heat and removes it from the data hall. The warm liquid is then circulated to a cooling device or heat exchanger.

One of the world’s largest data centre providers, Equinix, is developing a new direct-to-chip cooling technology at its Co-Innovation Facility (CIF) in the Washington DC area. Developed in collaboration with ZutaCore, the system introduces a cooling fluid to an evaporator sitting atop the CPU; the fluid absorbs heat directly and evaporates, maintaining a constant temperature over the CPU.

 

Hotter temperatures

Some operators are challenging the assumption that data centres should be run at low temperatures of 20 to 22 degrees Celsius. There is evidence that running data centres ‘hot’, i.e. raising their temperature by 1 or 2 degrees Celsius, improves efficiency without significantly sacrificing system reliability.

In Singapore, the Infocomm Media Development Authority has been trialling the world’s first ‘tropical data centre’, to test whether data centres can function optimally at temperatures of up to 38 degrees Celsius and ambient humidity of up to or exceeding 90 per cent.

Running on simulated data, the trial tests how servers react in various situations, such as peak surges or data transfers, and under conditions such as the absence of temperature or humidity controls.

 

Using digital resources and analytics to optimise energy usage

Smart solutions that monitor energy consumption patterns allow data centres to configure the optimal use of their resources, as well as to identify, diagnose and fix equipment problems. Software powered by artificial intelligence (AI) can also help companies better manage their infrastructure and maximise the utilisation of their CPUs.

In an interview with Fortune, Equinix’s chief executive Charles Meyers explained that AI is used in the company’s data centres to “anticipate where power needs to be applied, how cooling… needs to be done to improve the power usage efficiency of the facility overall”.
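
The "power usage efficiency" Meyers mentions is usually formalised as power usage effectiveness (PUE): total facility energy divided by the energy consumed by IT equipment alone. The sketch below uses invented figures to show why cutting cooling load moves the metric:

```python
# PUE = total facility power / IT equipment power; 1.0 is the ideal.
# All figures below are illustrative assumptions, not Equinix data.
def pue(total_facility_kw: float, it_kw: float) -> float:
    return total_facility_kw / it_kw

it_load_kw = 1000          # servers, storage, network gear
overhead_kw = 500          # cooling, lighting, power conversion losses

print(pue(it_load_kw + overhead_kw, it_load_kw))        # 1.5
# A 20% cut in overhead (e.g. via AI-driven cooling control):
print(pue(it_load_kw + overhead_kw * 0.8, it_load_kw))  # 1.4
```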

 

Using on-site lower-carbon energy sources

New cooling solutions and digital resources are curbing the growth in energy consumption as data centre services expand. But that still leaves the question of how the facility itself is powered.

A totally carbon-free solution would involve locating a data centre beside a wind- or solar-generated renewable energy source, or purchasing 100 per cent green energy from the grid. But these may not always be feasible solutions. In Singapore, for instance, space constraints limit the use of solar energy, and wind conditions are not sufficient for wind power.

Alternatives include the use of fuel cells for primary power supply at data centres. Fuel cells generate power through electrochemical reactions using natural gas, biogas or LPG. Testing by Equinix at the CIF indicates they are 20 to 40 per cent cleaner than gas-powered electricity generation.

 

Fuel cells generate power through electrochemical reactions using natural gas, biogas or LPG. Source: Equinix.

 

When fuel cells are set up near data centres, there are even greater efficiencies. The generated electricity has less distance to travel and hence less energy is lost in the transmission process.
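
To put a number on that, the sketch below assumes grid transmission and distribution (T&D) losses of around 5 per cent, a typical order of magnitude rather than a figure from the article:

```python
# Energy that must be generated to deliver a given load, grid vs on-site.
demand_mwh = 1000          # daily energy the data centre needs (invented)
grid_td_loss = 0.05        # assumed grid T&D loss fraction

grid_generation_mwh = demand_mwh / (1 - grid_td_loss)
print(f"Via the grid: generate {grid_generation_mwh:.0f} MWh to deliver {demand_mwh} MWh")
print(f"On site:      generate ~{demand_mwh} MWh (negligible transmission loss)")
```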

Equinix has deployed fuel cells at 15 of its facilities, including the carrier-neutral SV11 opened in San Jose in 2021, which utilises 4 megawatts (MW) of fuel cells for primary power production on site and can scale up to 20 MW of fuel cells.

Equinix is also part of a consortium of seven companies (including InfraPrime, RISE, Snam, SOLIDpower, TEC4FUELS and Vertiv) which launched the Eco Edge Prime Power (E2P2) project. E2P2 is exploring the integration of fuel cells with uninterruptible power supply technology and lithium-ion batteries to provide resilient and low-carbon primary power to data centres.

This work will also pave the way for a transition from natural gas to green hydrogen (hydrogen produced using renewable energy) in fuel cells. Where green hydrogen is available, such advances are a step change towards sustainability.

 

A holistic approach

Energy efficiency is crucial in determining future emissions in an industry that will continue growing in response to digitalisation and data consumption.

Besides energy efficiency, major data centre operators are interested in holistic sustainability gains that minimise carbon emissions. They consider the sustainability of their supply chains, their total resource use and the company’s whole carbon footprint, including the embodied carbon in building materials.

Equinix, for example, has adopted a goal of global climate neutrality by 2030 and has embedded decarbonisation actions across its business and supply chain.

Jason Plamondon, Equinix’s regional manager for sustainability in Asia-Pacific, says that the company is “well on (its) way to meeting (its) climate commitments, with over 95 per cent renewable coverage for (its) portfolio in FY21, maintaining over 90 per cent for the fourth consecutive year”.

He adds: “As the world’s digital infrastructure company, we have the responsibility to harness the power of technology to create a more accessible, equitable and sustainable future. Our Future First sustainability approach includes continuing to innovate and develop new technologies that contribute to protecting our planet.”

Source: Eco-Business

Google to help fashion brands map ESG supply chain risks

Consumers are demanding more transparency about where their clothes are produced and under what conditions. With the average supply chain for a merino sweater spanning 28,000 kilometres, fashion brands face the colossal task of tracing a product’s history from field to shelf in a bid to clean up the sector’s spotty environmental, social and governance (ESG) record.

In partnership with conservation group World Wide Fund for Nature (WWF), fashion label Stella McCartney and the non-profit Textile Exchange, the search giant has developed the Global Fibre Impact Explorer, which it says will enable companies to identify the biggest risks associated with more than 20 fibre types in their supply chains, including synthetics.

Despite sustainability pledges, the fashion industry is failing to tackle its hefty carbon and environmental footprint, and is on a trajectory that will far exceed the emissions pathway required to align with the United Nations’ goal of keeping global temperatures from rising more than 1.5°C above pre-industrial times, according to research by McKinsey, a consultancy.

The fashion industry is one of the largest contributors to the global climate and ecological crisis, accounting for up to 8 per cent of global greenhouse gas emissions.

A large chunk of emissions could be avoided in upstream operations: approximately 70 per cent of the industry’s greenhouse gas emissions stem from energy-intensive raw material production.

 

The Global Fibre Impact Explorer (GFIE) dashboard allows brands to upload their fibre portfolio data and get recommendations to reduce risk across key environmental categories. Image: The Keyword, Google

 

Environmental factors such as air pollution, biodiversity, climate and greenhouse gases, forestry and water use are scored to produce risk ratings. The tool will also provide brands with recommendations for targeted, regionally specific risk reduction activities, including opportunities to work with farmers, producers and communities.
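
To make the idea concrete, here is a minimal, purely hypothetical sketch of how per-category scores might be rolled up into a single fibre risk rating. It does not reflect the GFIE’s actual methodology; the categories are those named above, while the weights and scores are invented:

```python
# Hypothetical weighted roll-up of per-category risk scores (0-100).
CATEGORIES = ["air_pollution", "biodiversity", "climate_ghg", "forestry", "water_use"]

def risk_rating(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of category scores; higher means riskier."""
    total = sum(weights[c] for c in CATEGORIES)
    return sum(scores[c] * weights[c] for c in CATEGORIES) / total

# e.g. cotton from a water-stressed region (all numbers invented):
cotton = {"air_pollution": 30, "biodiversity": 55, "climate_ghg": 40,
          "forestry": 10, "water_use": 85}
equal = {c: 1.0 for c in CATEGORIES}
print(f"Risk rating: {risk_rating(cotton, equal):.0f}/100")   # 44/100
```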

During a pilot phase, British fashion house Stella McCartney was able to identify cotton sources in Turkey that are facing water stress.

Brands such as Chanel, Nike and H&M are among the 130 companies that have pledged to halve their greenhouse gas emissions by 2030 under the renewed United Nations Fashion Charter announced last month during climate talks in Glasgow. Alongside updated commitments to cut emissions, the charter promises to reduce the environmental impact from the use of materials such as cotton, viscose, polyester, wool and leather.

The renewed agreement is more ambitious than the previous 2018 commitment to cut emissions by a third. Nevertheless, the signatories represent a sliver of the vast garment and footwear industry, with fast-fashion brands such as BooHoo, Shein and ASOS notably missing from the list.

Earlier this month, the textiles sector also called for policy change to incentivise the use of “environmentally preferred” materials, such as organic cotton and recycled fibres.

 

Consumers do not want to buy products made with forced labour… Without government regulations, many companies will continue to make choices based on profits, not on rights.

Laura Murphy, professor of human rights and contemporary slavery, Helena Kennedy Centre for International Justice

 

Improved data mapping tools should help to shed light on fashion’s murky supply chains. Many brands do not have reliable information on their upstream suppliers beyond the manufacturers they deal with directly. Data from cotton farms and spinners is rarely available on paper, let alone in digital format. These blind spots perpetuate environmental and social problems that have dogged the industry for decades.

Cotton supply, in particular, has come under the spotlight. China’s northwestern Xinjiang region, which produces a fifth of the world’s cotton, is where the Chinese government has allegedly committed grave human-rights violations against the largely Muslim population of Uyghurs and other minorities.

A new report published on 17 November by Sheffield Hallam University in the United Kingdom analysed supply chain connections identified through shipping records to show how cotton from the Uyghur region circumvents supply standards and import bans to end up in consumer wardrobes around the world.

In the report, Laundering Cotton: How Xinjiang Cotton is Obscured in International Supply Chains, Professor Laura Murphy and co-authors identify more than 50 contract garment suppliers – in Indonesia, Sri Lanka, Bangladesh, Vietnam, India, Pakistan, Kenya, Ethiopia, China and Mexico – that use Xinjiang fabric and yarn in the clothes they make for leading brands, “thus obscuring the provenance of the cotton.”

“The benefits of such an export strategy may be clear: the end buyer is no longer directly involved in buying Xinjiang cotton,” the report said. “International brands and wholesalers can buy from factories in third countries that have few visible ties with Uyghur region-based companies.”

The researchers identified over 100 international retailers downstream of Xinjiang cotton, Murphy told media on a call on Friday. These include Levi Strauss, Lululemon, H&M, Marks & Spencer and Uniqlo, according to the report.

“Consumers do not want to buy products made with forced labour,” Murphy told Eco-Business. “We need our governments to insist that companies trace their supply chains back to the raw materials and make those findings public. Without government regulations, many companies will continue to make choices based on profits not on rights.”

Source: Eco-Business

How big data and open data can advance environmental sustainability

The industrial revolution brought many advances, including improved living standards, for many (but not all) people around the globe. But it has also led to environmental degradation, and is responsible in part for the climate crisis we now find ourselves living in.

One potential contributor to solving this environmental crisis is open environmental data, available to all, that can be analysed and used in ways that maximise sustainability. The only problem is that, at present, there are no global environmental open data resources, although many jurisdictions do have open data projects focused on the natural and built environments.

 

Open data and big data – the opportunities and challenges

Open data is just what it sounds like: data sets collected by agencies and made freely available to anyone who wants to use them. The Australian government runs its own open data program at data.gov.au, which collects open data of all types from all levels of government. From an environmental standpoint, it covers everything from tree planting to garbage bin locations, collection schedules and contents.

It’s true this open data project doesn’t sound particularly sexy. But it’s the possibilities that this vast data resource opens up that are the most interesting aspects of the program. Suddenly, there’s information about what’s happening in the local environment, all available in a format easily digestible by common data analytics programs.
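
data.gov.au is built on the open-source CKAN platform, so its catalogue can also be queried programmatically. The sketch below uses CKAN’s standard action API; the exact endpoint path is an assumption on our part and may differ between deployments:

```python
# Search the data.gov.au catalogue for datasets about tree planting.
import requests

resp = requests.get(
    "https://data.gov.au/api/3/action/package_search",  # assumed CKAN endpoint
    params={"q": "tree planting", "rows": 5},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()["result"]

print(f"{result['count']} matching datasets")
for dataset in result["results"]:
    print("-", dataset["title"])
```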

 

The use cases for open data

Governments, for their part, can use open data to develop policies designed to ensure better environmental regulation.

The potential for data collection is also limitless. It’s not restricted to satellite data: it spans everything from home weather stations and citizen activist activities, like counting bird populations or tracking the growth of bush and forests, through to advanced “internet of things” (IoT) sensors.

These IoT devices can capture just about any sort of data imaginable. Want to know how much sunlight fell on a particular field over a certain period of time? An IoT sensor can tell you.

This latest sensor technology offers real-time reporting of environmental data. And that data can be used to create open databases available for anyone to use.
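
As a toy illustration of that sunlight example, the sketch below aggregates hypothetical sensor readings into a summary fit for an open data set. The payload schema is invented, not a real sensor format:

```python
# Aggregate hypothetical solar irradiance readings from a field sensor.
from dataclasses import dataclass
from statistics import mean

@dataclass
class IrradianceReading:
    field_id: str
    timestamp: str    # ISO 8601
    w_per_m2: float   # measured solar irradiance

readings = [
    IrradianceReading("field-7", "2021-11-20T10:00:00Z", 612.0),
    IrradianceReading("field-7", "2021-11-20T11:00:00Z", 744.5),
    IrradianceReading("field-7", "2021-11-20T12:00:00Z", 801.2),
]

avg = mean(r.w_per_m2 for r in readings)
print(f"{readings[0].field_id}: mean irradiance {avg:.0f} W/m^2 over {len(readings)} readings")
```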

Organisations can also use open data and the IoT to track their own sustainability efforts. Miners, for example, can understand how much CO2 their operations are creating, and then use that data to create carbon offsets in a bid to meet net zero emissions – as many Australian organisations, including mining giants like BHP, have committed to.

A critical part of the open data movement, however, is the analytics associated with finding insights and answers about our environment.

 

The importance of analytics

Analytics works in two ways. First, it can derive insights into what has happened and why. More importantly, it can also provide insights into what will happen, when it will happen, and which factors contribute to that outcome.
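
As a toy example of that second, predictive mode, the sketch below fits a linear trend to invented historical readings and projects the next period; real environmental forecasting is, of course, far more sophisticated:

```python
# Fit a linear trend to (invented) annual emissions data and extrapolate.
import numpy as np

years = np.array([2017, 2018, 2019, 2020, 2021])
emissions_kt = np.array([120.0, 115.0, 112.0, 108.0, 103.0])  # hypothetical

slope, intercept = np.polyfit(years, emissions_kt, 1)  # least-squares line
forecast_2022 = slope * 2022 + intercept

print(f"Trend: {slope:+.1f} kt/year")                       # -4.1 kt/year
print(f"Projected 2022 emissions: {forecast_2022:.0f} kt")  # ~99 kt
```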

Business and government need to use this open data, and analytics, to create new models around sustainability. That’s because until recently, the environment was treated as an externality – that is, something to be used (and abused) but not factored into calculations about the bottom line.

With the shift towards sustainability, more and more companies are taking environmental inputs and outcomes into their ledger books, and calculating profit based on their environmental performance. These calculations are all powered by data, and the insights from advanced analytics.

Without data and analytics, we’re going to repeat the mistakes of the past when it comes to environmental issues. The tragedy of the commons is real, but by using open data sets, we can map a future where business, government and the environment are moving forward for the betterment of the earth – and humanity.

By Paul Leahy, Country Manager, ANZ, Qlik

Source: Eco Voice