Why AI progress doesn’t have to come at an environmental cost
Nutanix | 4:15 pm, 11th July
Huge efficiency gains can be made in underlying infrastructure, freeing up organisations to embrace the future, says Sammy Zoghlami, SVP EMEA at Nutanix
While there is no sugar-coating the pressure on businesses and government departments to meet net-zero carbon targets, most IT leaders also face the added strain of keeping up with demand for new technologies. It is a constant balancing act: enabling people to work and perform better, while addressing ESG compliance and not blowing IT budgets.
Automation now dominates IT buyers' thinking, and new products and tools keep emerging. Only recently, Microsoft founder Bill Gates talked about the huge potential of AI assistants, suggesting the race is on for organisations to develop powerful AI assistants that could reshape the digital landscape and put the likes of Google and Amazon under threat. He suggested these assistants could radically change behaviours in everyday life and work. We have already seen an element of this with ChatGPT, and Microsoft has made its own play in this direction with the announcement of its Copilot AI assistant for 365.
The fact is, automation is attractive to organisations for productivity, efficiency and overcoming skills shortages, but it can come at a cost, both financial and environmental. As Gartner warned in its 10 Strategic Predictions for 2023, AI brings increased sustainability risk. By 2025, it says, “AI will consume more energy than the human workforce, significantly offsetting carbon-zero gains.” With this in mind, something surely has to be done now to enable AI without undermining environmental efforts.
Meeting ESG targets is, according to Deloitte at least, a more prominent boardroom issue this year, so how organisations balance it with their growing automation needs will be key. Cloud computing is, of course, central to enabling AI tools within organisations, and digital transformations to implement platforms that unify organisations, and therefore their data, continue to drive cloud adoption.
As Gartner's recent research shows, worldwide spending on cloud is expected to reach around $600 billion this year, driven primarily by emerging technologies such as generative AI. Sid Nag, vice president analyst at Gartner, says generative AI requires “powerful and highly scalable computing capabilities to process data in real-time,” with cloud offering “the perfect solution and platform.”
Cloud bursting
And yet cloud continues to be dogged by claims that it is bad for the environment and does not help organisations hit their ESG compliance targets. In fact, the cloud industry has been one of the most active in trying to increase efficiency and reduce environmental impact. Such is the demand for cloud services that keeping up is inevitably difficult. Piling more racks into a datacentre is a short-term fix rather than a long-term answer, especially given the leap in power demands needed to support increased automation.
In our Enterprise Cloud Index research, 85% of 1,450 IT decision makers acknowledged that meeting corporate sustainability goals is a challenge for them. While nearly all (92%) said sustainability was a much more important issue than a year ago, there is clearly a disconnect between what organisations want to achieve and how they go about it. What we have seen is that big challenges arise from a mix of complexity and IT budget constraints.
Our research shows that most organisations use more than one type of IT infrastructure, whether a mix of private and public clouds, multiple public clouds, or an on-premises datacentre alongside a hosted one. This mix is only going to grow, yet mixed infrastructures create new management challenges. Given the increased complexity, organisations need a single, unified place to manage applications and data across their diverse environments, both to reduce costs and to measure impacts.
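To make "measuring impacts" across a mixed estate a little more concrete, here is a minimal sketch of the kind of aggregation involved. It is purely illustrative: the environment names, energy figures and carbon-intensity factors are hypothetical placeholders, not Nutanix data or any vendor's API.

```python
# Illustrative only: aggregate estimated energy and emissions across a
# mixed estate (public cloud, private cloud, on-premises datacentre).
# All figures and carbon-intensity factors below are hypothetical.

# Estimated monthly energy draw per environment, in kWh (hypothetical).
energy_kwh = {
    "public_cloud": 12_000,
    "private_cloud": 8_500,
    "on_premises": 20_000,
}

# Grid carbon intensity per environment, in kg CO2e per kWh (hypothetical).
carbon_intensity = {
    "public_cloud": 0.19,
    "private_cloud": 0.23,
    "on_premises": 0.25,
}

def estate_footprint(energy, intensity):
    """Return total energy (kWh) and estimated emissions (kg CO2e)."""
    total_kwh = sum(energy.values())
    total_co2e = sum(energy[env] * intensity[env] for env in energy)
    return total_kwh, total_co2e

if __name__ == "__main__":
    kwh, co2e = estate_footprint(energy_kwh, carbon_intensity)
    print(f"Total energy: {kwh:,} kWh")
    print(f"Estimated emissions: {co2e:,.0f} kg CO2e")
```

The point of a unified view is simply that these per-environment numbers can be collected and compared in one place, rather than buried in separate cloud bills and facility reports.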
Increasing the efficiency of data processes is an important step in reducing the load on IT systems, but it only goes part of the way. The real step change for any organisation operating in the cloud comes from looking at the underlying infrastructure. Measuring and then managing the impact of datacentres will continue to be key to reducing the carbon footprint of organisational computing. As with a car, a smaller yet more powerful and efficient engine not only reduces emissions, it also leaves room for growth and increased performance through tools such as AI.
Re-framing the picture
As Atlantic Ventures suggests in its report Improving sustainability in data centers, the energy demand of datacentres remains very high and results in large amounts of carbon dioxide emissions. Energy consumption is a major factor in measuring the environmental performance of datacentres, but one traditional method of doing so is now being questioned.
As we outline in this paper, power usage effectiveness (PUE) is diminishing in value as a measurement tool. It still has a place as an internal improvement metric, but it is not very helpful for making outside comparisons. PUE is the ratio of a facility's total energy use to the energy used by its IT equipment, so as PUEs fall towards a score of one, the differences between facilities become increasingly marginal. The point is that relying on a PUE score as a measure of efficiency and low carbon emissions does not really work.
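A minimal sketch of the arithmetic shows why near-one comparisons say so little; the facility figures used here are hypothetical, chosen only to illustrate the point.

```python
# Illustrative only: PUE = total facility energy / IT equipment energy.
# The facility figures below are hypothetical, picked to show how small
# the gap becomes once PUE approaches 1.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_equipment_kwh

# Two hypothetical facilities, each with 1,000,000 kWh of IT load.
facility_a = pue(total_facility_kwh=1_200_000, it_equipment_kwh=1_000_000)  # 1.20
facility_b = pue(total_facility_kwh=1_150_000, it_equipment_kwh=1_000_000)  # 1.15

# The PUE gap is just 0.05, and it says nothing about the carbon intensity
# of the electricity or how efficiently the IT load itself is being used,
# which is why PUE alone is a weak basis for outside comparisons.
print(f"Facility A PUE: {facility_a:.2f}")
print(f"Facility B PUE: {facility_b:.2f}")
```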
Fundamentally, changes need to be made at the rack. Infrastructure modernisation starts with hyperconverged infrastructure (HCI), which reduces 'moving parts' and therefore energy needs. It also means less complexity, both in cloud structures and in data management. This is what will achieve the most direct outcomes.
As Atlantic Ventures says, "in the EMEA region HCI architectures have the potential to reduce up to 56.68 TWh from 2022-2025 and save up to €8.22bn in electricity costs in the same period for companies and data center providers undertaking a complete transformation towards HCI." This, combined with next-generation liquid cooling, is a huge step towards creating a low-impact platform for the future.
For any organisation looking to embrace AI and related automation applications, addressing infrastructure complexity now is key. Running datacentres is an increasingly specialist business, especially given ongoing high energy prices, and as more and more data is required in real time, the challenges for organisations only increase. With the right partners and the most efficient infrastructure in place, any organisation can consider itself AI-ready without sacrificing its ESG targets.