<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
<article-title>ESCADE: Energy-efficient artificial intelligence for cost-effective and sustainable data centers</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Sabine Janzen</string-name>
          <email>sabine.janzen@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Hannah Stein</string-name>
          <email>hannah.stein@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Katharina Trinley</string-name>
          <email>katharina.trinley@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Cicy Agnes</string-name>
          <email>cicy.agnes@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vaibhav Jain</string-name>
          <email>vaibhav.jain@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Karan Rajshekar</string-name>
          <email>karan.rajshekar@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nirav Shenoy</string-name>
          <email>nirav.shenoy@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Anika Rusch</string-name>
          <email>anika.rusch@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sujatro Ghosh</string-name>
          <email>sujatro.ghosh@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Wolfgang Maass</string-name>
          <email>wolfgang.maass@dfki.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>German Research Center for Artificial Intelligence</institution>
          ,
          <country country="DE">Germany</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Saarland University</institution>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Data centers play a central role in digital transformation, especially in the field of artificial intelligence (AI). However, their energy consumption is enormous, e.g., 20 billion kWh in Germany in 2024. At the same time, energy costs are rising and climate neutrality requirements are increasing. These factors pose major challenges for the sustainable and cost-effective operation of data centers. This paper introduces the ESCADE project (05/2023 – 04/2026), an ongoing research initiative funded by the German Federal Ministry for Economic Affairs and Climate Action, aiming to optimize the energy efficiency of AI in data centers. AI compression techniques such as knowledge distillation, quantization and neural architecture search result in smaller, more energy-efficient AI models that deliver comparable performance. When combined with neuromorphic hardware, these models can achieve energy savings of up to 80%. The ESCADE consortium, a multidisciplinary collaboration of seven industry and academic partners, explores energy-efficient AI in two use cases: visual computing for scrap sorting in the steel industry and natural language processing for software development. This paper provides a comprehensive overview of the ESCADE project, outlining its objectives, work packages, and anticipated outcomes. A central contribution is the introduction of first results in terms of the information system EAVE: Energy Analytics for Cost-effective and Sustainable Operations. By using AI-based analyses, EAVE optimizes the relationship between AI performance and operating costs of AI applications in data centers. The system measures and predicts the energy consumption, CO2 emissions and operating costs of different AI model configurations, including hardware options. At the same time, it analyzes which factors significantly influence these values. This enables decision-makers to manage the operation of data centers in a data-based and efficient manner while meeting environmental targets.</p>
      </abstract>
      <kwd-group>
        <kwd>Data center optimization</kwd>
        <kwd>Energy efficiency</kwd>
        <kwd>Artificial Intelligence</kwd>
        <kwd>Sustainability in IS</kwd>
        <kwd>Decision Support Systems</kwd>
        <kwd>Socio-technical systems</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        By 2027, the energy consumption of data centers worldwide will reach 500 terawatt hours (TWh)
per year – an increase of 38% since 2023 [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. The growth in data centers and their capacities is
driven by digitalization, Industry 4.0, cloud computing and artificial intelligence (AI). Nearly 70% of
German companies use data centers for the operation and development of business-critical IT applications
(turnover of €12.5 billion in 2021) [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. However, storing, processing, transporting and using data
consumes energy and releases considerable amounts of CO2 [
        <xref ref-type="bibr" rid="ref4">4, 5</xref>
        ]. Training a large AI model can
emit the equivalent of five SUV lifetimes (284 tons of CO2). For companies that develop AI-based
services for innovative business models, there are currently no methods that help them to manage the
implementation of AI models, including training and inference, with respect to cost-effectiveness and
sustainability. This “black box” problem is a key driver behind the fact that, according to the annual
electricity report of the International Energy Agency (IEA), data center energy consumption could
more than double by 2026, exceeding 1,000 TWh in a worst-case scenario [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. By 2030, data centers
and their hosted AI applications will account for 13% of global energy consumption [6]. Together with
rising costs for electricity, gas, mineral oil and coal, the ability of companies to operate data centers
economically is threatened [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], forcing data centers to reduce energy consumption while maintaining
the same computing power. This trade-off and the demand for sustainable operation are currently being
addressed by sourcing energy from renewable sources [7] and using modern, direct
or indirect free cooling systems [8]. However, this does not change the ineffective energy balance of
classic data centers and AI algorithms.
      </p>
      <p>ESCADE is an ongoing research project funded by the German Federal Ministry for Economic
Affairs and Climate Action (05/2023 - 04/2026) that aims to improve the cost-sustainability trade-off of AI
applications in data centers. By means of compression techniques for energy-efficient AI, neuromorphic
chip technologies and energy analytics for decision makers, companies, data center operators and
government institutions can move from a reactive to a proactive stance by forecasting and balancing AI
performance, sustainability and economics. AI compression techniques such as knowledge distillation,
quantization and neural architecture search result in smaller, more energy-efficient AI models that
deliver comparable performance. When combined with neuromorphic hardware, these models can
achieve energy savings of up to 80%. The project consortium consists of seven partners from industry and
academia that take different roles in the project: data center provider (NT.AG), industry end user (SHS),
system provider (SEITEC), and research &amp; development (DFKI, TU Dresden, University Bielefeld,
Salzburg Research). ESCADE investigates the application of energy-efficient AI and neuromorphic
computing in two domains - scrap sorting in the steel industry (visual computing) and natural language
processing for software development - with the objective of demonstrating considerable energy-saving
potential.</p>
      <p>In this paper, we present an overview of ESCADE, including its objectives, work packages and
expected outcomes. We discuss the role of information systems in ESCADE, focusing on the current
state of work and first results in the form of energy analytics for cost-effective and sustainable operations
(EAVE). By using AI-based analyses, EAVE optimizes the relationship between AI performance and
operating costs of AI applications in data centers. The system measures and predicts the energy
consumption, CO2 emissions and operating costs of different AI model configurations, including
hardware options. At the same time, it analyzes factors that significantly influence these values. This
enables decision-makers to manage the operation of data centers in a data-based and efficient manner
while meeting environmental targets.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Project objectives</title>
      <p>The objective of the ESCADE research project is a significant reduction of the energy consumption of
data centers by using cutting-edge hardware and software technologies to improve the ecological
footprint of AI applications. Concerning hardware, the focus is on the use of neuromorphic chip
technologies, as these promise efficiency gains of up to 50% in training and up to 80% in inference of AI
models. Concepts for integrating neuromorphic processing units into classic GPU-based data centers, AI
compression techniques and AI-based energy management services will help plan new data centers
more sustainably, while existing data centers will be supported for hybrid operation. A sustainable,
resource-efficient design of data centers and AI systems contributes directly to achieving the UN
Sustainable Development Goals 9, 12 and 13. The concepts for cost-effective and sustainable data
centers will be demonstrated in two use cases for measuring sustainability and economic efficiency.</p>
      <sec id="sec-2-1">
        <title>Use case 1: Sustainable steel industry through energy-efficient AI</title>
        <p>Problem statement: The energy consumption of large visual computing (VC) models such as ResNet
can exceed 650,000 kWh annually [9], costing approximately €330,000 to train a single model. Such
models are necessary for scrap sorting in the steel industry. In contrast to plastic, steel is an ecologically
sustainable material per se, enabling a completely closed-loop economy in the steel industry [10].
Producing one ton of steel from scrap consumes around 3.5 times less energy than conventional
production [11] (potential savings of 89.3 billion kWh per year for the German steel industry) [12].
However, steel scrap is used for only 40% of steel products [13], as correct and efficient visual
pre-sorting in real time is impossible or very resource-intensive [14].</p>
        <p>Objective: ESCADE enables data centers to process large VC models up to 50% more efficiently in
terms of energy consumption and inference time [15]. For a large VC model, this corresponds to savings in
energy (334,500 kWh) and costs (€167,250). It enables effective and efficient classification of scrap
types during pre-sorting, with a side effect of increasing the use of steel scrap (by at least 20%, i.e. savings of
nearly 30 billion kWh and approximately €15 billion in Germany).</p>
      </sec>
      <sec id="sec-2-2">
        <title>Use case 2: Large, energy-efficient NLP models</title>
        <p>Problem statement: For NLP-based software such as ticket systems, no “one-fits-all” AI models can be
offered, e.g., for topic extraction. The training or fine-tuning of sometimes multiple models requires
approximately 18 hours on an NVIDIA V100 GPU and 2 Intel Xeon 642 CPUs (378 kWh and 540 hours
per year and model). Thus, training is the bottleneck of the entire business model with respect to
latency (e.g., to react quickly to customer requirements), scalability and energy consumption.
Objective: Data centers implementing ESCADE concepts can achieve a reduction in energy
consumption of 50% for training and 80% for inference of NLP models [16], i.e., an energy consumption of 189
kWh/year/model for the aforementioned setting [17]. Based on 1,000 customers of a ticket system, this
means a potential for cost savings of around €95,000 per year for training the models.</p>
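        <p>The Use Case 2 numbers are internally consistent under a similar assumption, namely an electricity price of roughly €0.50/kWh (our inference, not stated in the text):</p>

```python
# Consistency check of the Use Case 2 figures (illustrative; the
# electricity price is an assumption, not stated in the text).
BASELINE_KWH = 378           # kWh/year/model for training, as stated
TRAINING_REDUCTION = 0.50    # targeted reduction for training [16]
CUSTOMERS = 1000
PRICE_EUR_PER_KWH = 0.50     # assumed price

remaining_kwh = BASELINE_KWH * (1 - TRAINING_REDUCTION)
saved_eur = BASELINE_KWH * TRAINING_REDUCTION * CUSTOMERS * PRICE_EUR_PER_KWH

print(remaining_kwh)  # 189.0 kWh/year/model, as stated
print(saved_eur)      # 94500.0, i.e. "around 95,000 EUR per year"
```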
        <p>Project fact sheet – Title: ESCADE: Energy-Efficient Large-Scale Artificial Intelligence for Sustainable
Data Centers. Duration: 01.05.2023 – 30.04.2026 (36 months). Consortium: German Research Center for
Artificial Intelligence GmbH (DFKI) (Coordinator), Technical University of Dresden (TUD), University
Bielefeld (UB), Stahl-Holding-Saar GmbH &amp; Co. KGaA (SHS), NT Neue Technologie AG (NT.AG),
SEITEC GmbH (SEITEC), Salzburg Research Forschungsgesellschaft m.b.H. (Salz). Funding body:
German Federal Ministry for Economic Affairs and Climate Action (BMWK). Funding program:
GreenTech Innovation Competition – Digital Technologies. Budget: €5 million. Websites:
escade-project.de and eave.dfki.de.</p>
        <p>The project is structured into ten work packages (WP) (cf. Fig. 1) that were distributed among the
partners based on their areas of expertise (cf. Tab. 1). WP2 is led by DFKI and develops a framework to
measure the sustainability of data centers and AI algorithms. On this basis, the framework is
implemented in WP3 through open-source software modules that enable energy analytics (SEITEC). In WP4, led
by TU Dresden, energy-efficient AI algorithms for Use Cases 1 and 2 are implemented on conventional
and neuromorphic hardware. Also led by TU Dresden, WP5 pursues the design &amp; implementation of
sustainable AI data centers through the creation of reference architectures with neuromorphic hardware
for "the world’s most sustainable AI data center." WP6 (University Bielefeld) and WP7
(Stahl-Holding-Saar) focus on the implementation and evaluation of Use Case 1 (WP6) and Use Case 2 (WP7). WP8, Evaluation &amp;
Benchmarking (NT.AG), assesses energy efficiency gains of the new data center concepts and performs
utility evaluations. WP9, Founding &amp; utilization initiatives (eco2050), applies economic analyses and
examines utilization concepts (e.g., startup creation) based on the project results. Within WP10,
knowledge transfer and communication of the results through publications, standardization, and community
building are conducted. Within WP1, the project is coordinated by DFKI.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Current status and intermediate results</title>
      <p>The decision support system EAVE – Energy Analytics for Cost-effective and Sustainable
Operations – was developed for data center providers as well as companies training and deploying AI
models. EAVE optimizes energy efficiency and cost-effectiveness of AI applications in data centers. It
measures and predicts the energy consumption, CO2 emissions, and operating costs of different AI
model configurations, including hardware options (cf. Fig. 2). At the same time, it analyzes which
factors significantly influence these values. EAVE consists of three main modules: Measure, Predict
and Optimize.</p>
      <p>The Measure module calculates operational energy costs, energy consumption, CO2 emissions,
as well as the Power Usage Effectiveness (PUE) in terms of an energy analytics summary based on
hardware configuration, AI model, time and location. The PUE value represents an indicator for energy
efficiency by providing the ratio of total facility energy to IT equipment energy [18]. The operational
costs are calculated by using historical energy price data of the chosen location. In addition, EAVE
provides a causal factor analysis [19] to quantify which variables (e.g., accelerator usage, cooling load)
affect the PUE.</p>
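      <p>The core quantities of the Measure module can be sketched in a few lines; the function names and numbers below are illustrative assumptions, not the actual EAVE implementation:</p>

```python
# Minimal sketch of the Measure module's calculations (illustrative).
# PUE is the ratio of total facility energy to IT equipment energy [18];
# operational cost combines consumption with historical price data.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness; values close to 1.0 indicate an
    efficient facility (little overhead for cooling, lighting, etc.)."""
    return total_facility_kwh / it_equipment_kwh

def operational_cost(hourly_kwh, hourly_price_eur):
    """Energy cost from an hourly price series of the chosen location."""
    return sum(e * p for e, p in zip(hourly_kwh, hourly_price_eur))

print(pue(120.0, 80.0))                          # 1.5
print(operational_cost([10, 12], [0.30, 0.25]))  # 6.0 EUR
```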
      <p>Based on the Measure module, the Predict module performs spatiotemporal optimization for the
given data center configuration. It predicts the most efficient combination of location and time of
year to minimize energy consumption, operational costs, and environmental impact, based on the
hardware configuration selected in the Measure module. The module currently supports predictions
across Germany, France, the Netherlands, Italy, Poland, Austria, and the United States. Random
Forest-based machine learning models [20] were trained to estimate key metrics, including energy costs, CO2
emissions, and PUE value. The module displays a comparison between the initial setting (in the Measure
module) and the predicted optimal setting with respect to costs, energy usage, PUE, and CO2 emissions.</p>
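      <p>Conceptually, the spatiotemporal search enumerates location–month combinations and picks the cheapest prediction. The sketch below uses a made-up lookup in place of the trained Random Forest models, purely to illustrate the mechanism:</p>

```python
# Illustrative spatiotemporal optimization: a stand-in cost function
# replaces EAVE's trained Random Forest regressors [20]; all numbers
# below are hypothetical.
from itertools import product

LOCATIONS = ["DE", "FR", "NL", "IT", "PL", "AT", "US"]
MONTHS = range(1, 13)

def predicted_cost(location: str, month: int) -> float:
    """Assumed base price level plus a seasonal cooling surcharge
    peaking in mid-year (hypothetical values)."""
    base = {"DE": 0.40, "FR": 0.25, "NL": 0.35, "IT": 0.45,
            "PL": 0.30, "AT": 0.38, "US": 0.20}[location]
    cooling = 0.05 * max(0, 6 - abs(month - 7))
    return base + cooling

# Exhaustive search over all (location, month) pairs.
best = min(product(LOCATIONS, MONTHS), key=lambda lm: predicted_cost(*lm))
print(best)  # ('US', 1) under these made-up numbers
```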
      <p>The Optimize module (Fig. 3) analyzes and optimizes AI model efficiency. Based on the initial
hardware configuration, different baseline AI models (e.g., Vision Transformer (ViT)) can be selected and
analyzed (costs, energy consumption, accuracy, model size and CO2 emissions). The AI compression
techniques (cf. Fig. 3) lead to more energy- and cost-efficient models. The techniques include Knowledge
Distillation [21, 22], which transfers knowledge from a large model to a smaller one; Quantization
[23], which reduces the precision of model parameters to decrease memory and energy usage; and Neural
Architecture Search [24], which automates the design of more efficient AI architectures. The Optimize
module compares the key metrics of the AI baseline models with those of the compressed AI models. It provides
a visualization of Pareto-optimal models depending on model size, costs, accuracy, and inference time
(cf. Fig. 3).</p>
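      <p>Pareto filtering of the kind the Optimize module visualizes can be expressed compactly; the model names and metrics below are invented for illustration, not measurements from EAVE:</p>

```python
# Illustrative Pareto filtering over (size_mb, energy_kwh, accuracy):
# lower size and energy are better, higher accuracy is better.

def dominates(a, b):
    """True if model a is at least as good as b on every metric and
    strictly better on at least one."""
    no_worse = a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2]
    better = a[0] < b[0] or a[1] < b[1] or a[2] > b[2]
    return no_worse and better

models = {
    "ViT-base":      (330.0, 120.0, 0.85),  # invented metrics
    "ViT-distilled": (86.0, 40.0, 0.83),
    "ViT-quantized": (83.0, 35.0, 0.82),
    "ViT-nas":       (90.0, 45.0, 0.80),    # dominated by ViT-distilled
}

pareto = [name for name, m in models.items()
          if not any(dominates(o, m) for o in models.values() if o != m)]
print(pareto)  # ['ViT-base', 'ViT-distilled', 'ViT-quantized']
```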
      <p>EAVE was implemented in Python, with a FastAPI backend and a React frontend.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Relevance of project for CAiSE</title>
      <p>The research project ESCADE directly contributes to the CAiSE community by addressing a critical
challenge in modern information systems (IS) engineering: the sustainable and cost-efective operation
of AI-driven applications. As digital transformation accelerates, IS increasingly rely on energy-intensive
AI models and infrastructure. ESCADE advances the field through the development of energy analytics
for IS, a novel capability that enables decision-makers to balance performance, sustainability, and
economics both before and during system operation. This aligns with CAiSE topics of interest, including
data-driven decision support, system optimization, and the socio-technical impacts of information
systems. By embedding sustainability metrics such as CO2 emissions and energy consumption into AI
model lifecycle management, ESCADE expands the conceptual and technical foundation for engineering
responsible, energy-aware information systems. Furthermore, the application of AI compression
techniques and neuromorphic hardware introduces new design paradigms for creating energy-eficient
information systems with applications beyond the two domains initially studied. This aligns with recent
directions in the IS community to systematically embed sustainability goals within AI-based systems
[17].</p>
      <p>A central element of this contribution is the EAVE platform, which integrates decision support
capabilities into the system lifecycle and enables organizations to account for cost, performance, and
sustainability factors during both the design and deployment of AI-driven information systems. EAVE
exemplifies the project’s effort to operationalize sustainability within IS engineering processes, for
example through AI compression techniques. By combining expertise from data center operations,
AI optimization, and decision support, ESCADE provides a multi-layered IS architecture that directly
supports the CAiSE 2025 theme, "Bridging Silos." Its emphasis on interpretable analytics, cross-domain
applicability, and socio-technical trade-offs reflects the CAiSE community’s broader vision of designing
systems that are not only intelligent but also responsible and resilient.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusion and future work</title>
      <p>The growing energy demands of data centers, driven by digital transformation and the advancement
of AI, pose serious challenges to sustainability and economic efficiency. ESCADE addresses this
critical issue by developing methods and tools that optimize the energy consumption of AI applications
without performance losses. Through the implementation of AI compression techniques, neuromorphic
hardware, and energy analytics, ESCADE enables stakeholders to shift from reactive to proactive
strategies in managing AI workloads. The initial results of the decision support system EAVE illustrate
the potential of data-driven energy analytics to reduce operational costs and CO2 emissions in data
centers. By quantifying the trade-offs between model performance, hardware choices, and environmental
impact, EAVE empowers decision-makers with actionable insights for sustainable AI deployment.</p>
      <p>Future work will extend the current system with real-time monitoring capabilities, including training and
inference times in data centers, and will integrate additional AI use cases beyond the initial domains of
VC and NLP. Further research is needed to generalize the findings for hybrid and pure neuromorphic
hardware settings and related AI workloads. As energy costs and sustainability targets continue to rise
in importance, ESCADE aims to become a blueprint for energy-conscious AI infrastructure in private
and public sectors.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgments</title>
      <p>This work was partially funded by the German Federal Ministry for Economic Affairs and Climate
Action (BMWK) under the contract 01MN23004A.</p>
    </sec>
    <sec id="sec-7">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the author(s) used Generative AI tools such as GPT-4 and Grammarly
to assist with grammar correction, spelling, and occasional paraphrasing during the writing process
(W). No generative AI tools were used to generate figures, tables, or scientific results. The research also
involved experimentation with pretrained AI models (e.g., Vision Transformers and DeiT), and code
assistance using tools such as ChatGPT or similar models may have been used for non-critical scripting
tasks (C+E). All AI-assisted outputs were critically reviewed, validated, and edited by the author(s),
who take full responsibility for the content of this publication.</p>
      <p>[5] United Nations Environment Programme, How artificial intelligence is helping tackle environmental challenges, 2022. URL: https://www.unep.org/news-and-stories/story/how-artificial-intelligence-helping-tackle-environmental-challenges.
[6] A. Liebl, M. Ballweg, M. Wehinger, T. Rückel, How to leverage AI to support the European Green Deal, 2022. URL: https://aai.frb.io/assets/logos/AppliedAI_SYSTEMIQ_Whitepaper_ClimateAI.pdf.
[7] International Energy Agency, Data centres and data transmission networks, 2022. URL: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks.
[8] U.S. Department of Energy, Best practices guide for energy-efficient data center design, 2011. URL: https://www.energy.gov/femp/articles/best-practices-guide-energy-efficient-data-center-design.
[9] A. Fu, M. S. Hosseini, K. N. Plataniotis, Reconsidering CO2 emissions from computer vision, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021, pp. 2311–2317.
[10] M. Sahoo, S. Sarkar, A. C. Das, G. G. Roy, P. K. Sen, Role of scrap recycling for CO2 emission reduction in steel plant: a model based approach, Steel Research International 90 (2019) 1900034.
[11] EuRIC aisbl, Metal recycling factsheet by EuRIC, 2020. URL: https://circulareconomy.europa.eu/platform/en/knowledge/metal-recycling-factsheet-euric.
[12] C. Broadbent, Steel’s recyclability: demonstrating the benefits of recycling steel to achieve a circular economy, The International Journal of Life Cycle Assessment 21 (2016) 1658–1665.
[13] C. Friedl, Schrott vor steiler Karriere, 2021. URL: https://www.recyclingnews.de/rohstoffe/schrott-vor-steiler-karriere/.
[14] R. J. Compañero, A. Feldmann, A. Tilliander, Circular steel: how information and actor incentives impact the recyclability of scrap, Journal of Sustainable Metallurgy 7 (2021) 1654–1670.
[15] Intel, Intel advances neuromorphic with Loihi 2, new Lava software framework and new partners, 2022. URL: https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html#gs.fmm9op.
[16] D. Schmidt, G. Koppe, M. Beutelspacher, D. Durstewitz, Inferring dynamical systems with long-range dependencies through line attractor regularization, arXiv preprint arXiv:1910.03471 (2020).
[17] T. Schoormann, G. Strobel, F. Möller, D. Petrik, P. Zschech, Artificial intelligence for sustainability—a systematic review of information systems literature, Communications of the Association for Information Systems 52 (2023) 8.
[18] N. Horner, I. Azevedo, Power usage effectiveness in data centers: overloaded and underachieving, The Electricity Journal 29 (2016) 61–69.
[19] A. Sharma, E. Kiciman, DoWhy: An end-to-end library for causal inference, arXiv preprint arXiv:2011.04216 (2020).
[20] L. Breiman, Random forests, Machine Learning 45 (2001) 5–32.
[21] J. Gou, B. Yu, S. J. Maybank, D. Tao, Knowledge distillation: A survey, International Journal of Computer Vision 129 (2021) 1789–1819.
[22] M. Phuong, C. Lampert, Towards understanding knowledge distillation, in: International Conference on Machine Learning, PMLR, 2019, pp. 5142–5151.
[23] A. Gholami, S. Kim, Z. Dong, Z. Yao, M. W. Mahoney, K. Keutzer, A survey of quantization methods for efficient neural network inference, in: Low-Power Computer Vision, Chapman and Hall/CRC, 2022, pp. 291–326.
[24] B. Zoph, Q. V. Le, Neural architecture search with reinforcement learning, arXiv preprint arXiv:1611.01578 (2016).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] Goldman Sachs, AI to drive 165% increase in data center power demand by 2030, 2024. URL: https://www.goldmansachs.com/insights/articles/ai-to-drive-165-increase-in-data-center-power-demand-by-2030.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] ScaleUp Technologies, Bitkom study 2022: Data centers in Germany, 2022. URL: https://www.scaleuptech.com/en/blog/bitkom-study-computing-centers-in-germany-2022/.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] Office of Technology Assessment at the German Bundestag (TAB), Energy consumption of ICT infrastructure in Germany, 2021. URL: https://www.tab-beim-bundestag.de/english/projects_energy-consumption-of-ict-infrastructure.php.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] International Energy Agency, Electricity 2024: Analysis and forecast to 2026, 2024. URL: https://iea.blob.core.windows.net/assets/6b2fd954-2017-408e-bf08-952fdd62118a/Electricity2024-Analysisandforecastto2026.pdf.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>