**1. Introduction**

The iron and steelmaking industry is an energy-intensive sector that accounts for about 18% of the world's total industrial final energy consumption [1]. Steelmaking processes are also carbon intensive, and the sector accounts for 5% of global CO2 emissions [2].

Consequently, the steelmaking industry is currently subject to emission trading schemes (ETSs) in several countries [3,4]. Overall, emission certificate costs have been low in recent years, hardly providing steel plant operators with an economic rationale to reduce their energy demand and emissions. However, progressively more stringent environmental standards and energy policy scenarios increase the likelihood of a rise in primary energy and CO2-emission certificate costs [5]. To avoid a consequent increase in the market price of steel products, it is crucial for steelmaking industries to identify cost-effective solutions for carbon emission reduction.

Worldwide steelmaking industries are also aware of the water–energy nexus [6] implications of their attempts to improve efficiency: a position paper on water saving by the World Steel Association [7] points out that "the additional processes (required to save water) are nearly always in conflict with objectives to reduce energy consumption or CO2 emissions".

Similarly, research in the steel sector also reports some unexpected cases of water consumption increase as an observed outcome of energy-saving measures in real settings [8] or as a potential consequence of suboptimal carbon reduction practices under simulated incentive frameworks [9]. This may even happen in the case of waste-heat recovery [9,10], which is generally considered a synergistic option to decrease water and energy demand, insofar as it reduces the need to discard heat into the environment via cooling towers and cooling fans [11]. Therefore, to highlight synergies and avoid pitfalls, it is important that energy-saving projects in steelmaking and, more generally, in energy- and water-intensive industries, are evaluated with a nexus view [6], considering their impact on primary energy consumption and carbon emissions, as well as on water consumption.

Overviews of heat recovery options in the steelmaking industry have been presented by Moya and Pardo [12], He and Wang [1], as well as Johansson and Söderström [13]. Several waste heat utilization practices have been proposed, including iron-ore pre-heating in basic oxygen furnace (BOF) steelmaking cycles and scrap pre-heating in electric arc furnaces (EAFs), as well as power generation with Rankine cycles [13], which to date mainly occurs in BOF plants [14].

However, all these waste-heat utilization routes allow the exploitation of only a fraction of the sizeable waste heat flows available at steelmaking sites [15]. To improve energy efficiency and decarbonize industries, other forms of waste-heat utilization are sought, particularly direct or upgraded heat utilization [13].

One option for the internal utilization of medium-low-temperature waste heat flows in steelmaking, and more generally for energy-intensive industries, is to identify some process cooling demand that is currently met with vapour compression cooling systems, and substitute these with waste-heat-based absorption cooling systems. In fact, absorption cooling is a mature technology [16] that makes use of low global warming potential and non-ozone-layer-depleting natural materials as working fluid pairs. In particular, H2O–LiBr and NH3–H2O are the best performing and most common working fluid pairs [17]. H2O–LiBr systems are safer and less complex than NH3–H2O systems; the latter are therefore almost exclusively used for applications requiring refrigeration temperatures below 0 °C. Overall, absorption cooling running on solar heat or on waste heat sources can be regarded as a zero-carbon-emission cooling technology [18].

For absorption-cooling-based air conditioning systems, the literature has focused primarily on solar cooling [19]. Most solar cooling applications make use of single effect cycles, which can be regarded as the state-of-the-art commercially mature technology for low-temperature applications running on hot water below 100 °C [16]. With the increasing spread of parabolic concentrators, double, triple, and variable effect cycles have been increasingly investigated [19,20], as they are better suited to exploiting medium-temperature heat sources (up to about 260 °C; see [18]): depending on the heat source temperature, they reach coefficients of performance (COP), or energy efficiency ratios (EER), on the order of 1.25 (double effect [18]) or 2 (triple effect [18]), whereas single effect cycles have EERs on the order of 0.7 (hot water temperature on the order of 90 °C [21]). Readers should bear in mind that in this paper we refer to this parameter as EER, in accordance with the terminology introduced by standard EN 14511, which defines the EER as the ratio of the total cooling capacity of a refrigerator to its effective power input, both expressed in watts.
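The EN 14511 definition above can be sketched as a one-line calculation. The following is an illustrative example (not from the paper); the 70 kW / 100 kW figures are hypothetical values chosen only to match the order of magnitude cited for single effect cycles:

```python
def eer(cooling_capacity_w: float, power_input_w: float) -> float:
    """EER per EN 14511: total cooling capacity divided by effective
    power input, both expressed in watts."""
    if power_input_w <= 0:
        raise ValueError("power input must be positive")
    return cooling_capacity_w / power_input_w

# Hypothetical single effect chiller: 70 kW of cooling from a
# 100 kW driving heat input gives an EER on the order of 0.7.
print(round(eer(70_000, 100_000), 2))  # -> 0.7
```

Note that for absorption chillers the "effective power input" is dominated by the driving heat, whereas for vapour compression systems it is the electric input; the ratio itself is defined identically in both cases.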

Practical industrial waste heat (IWH)-based cooling applications have received relatively less attention than solar cooling in the literature: the technology was proposed in some review papers [22,23], and mathematical models for the optimization of district cooling applications based on industrial waste heat recovery have recently been proposed for illustrative case studies from the chemical industry [24,25]. Some techno-economic feasibility assessments of absorption cooling as a recovery option for industrial low-grade waste heat have been performed by Brückner et al. [26] for general European IWH potentials, and by Cola et al. [27] for a drying process in the textile industry. In both cases, the assessment was performed either on a purely economic [26] basis or on an economic and thermodynamic basis [27]. However, the environmental implications of different choices, particularly with a water–energy-nexus-aware view, have hardly been considered.

An application of H2O–LiBr single effect absorption cooling to the air conditioning of electric transformer, generator, and switch cabins for the steelmaking industry was recently proposed in [9]. In fact, electric cabins must be air conditioned to remove excess heat and thereby avoid damage, abnormal functioning, or breakdowns of electric equipment due to Joule heating. This is especially true for EAF steelmaking sites, where transformers are required to provide electricity to all the electric equipment (e.g., electric motors, control rooms, robots, and the EAF electrodes). However, this application could be of interest for any industrial site housing large transformers in electric cabins.

The authors of [9] demonstrated that at average climate conditions for the EU-15 area, absorption cooling is economically preferable to Organic Rankine Cycle (ORC)-based power generation for exploiting intermittent low-grade waste heat flows available at EAF steelmaking sites. Moreover, they performed an assessment of the carbon emission and water consumption performance of those systems at average EU conditions.

However, they acknowledge that a limitation of their study is that climate differences among countries were not considered, as a constant cooling demand was assumed for the whole year and for the entire area of analysis.

This may be acceptable when considering cabins located within industrial sheds and when performing comparisons for geographically limited areas. However, modern electric and electronic equipment in energy-intensive industries, including electric steelmaking plants, is often housed in outdoor cabins. The water–energy impact of such systems is likely to be affected by climate, particularly depending on the residual waste heat dissipation systems installed, such as forced air coolers (a.k.a. dry coolers, DCs in the following) or cooling towers (CTs). The energy and money savings generated by heat-recovery-based cooling systems might even be negligible in some climates, and other options might be more efficient for cabin air conditioning.

The present study aims to overcome the mentioned limitations, and to investigate the economic and water–energy nexus implications of exploiting low-grade process waste heat in outdoor electric cabins worldwide, based on typical situations at EAF sites.

To the best of the authors' knowledge, this problem has not yet been addressed on this scale. However, some input for research design and methodology selection could be obtained from research on data centres [28–32], which also need intensive and continuous cooling to preserve electric and electronic components. Indeed, for data centres, absorption cooling has been proposed as a means to recover waste heat from the internal electric equipment (e.g., a subset of servers) to meet a part of internal cooling loads [28,29]. However, to the best of the authors' knowledge, the opportunity of exploiting an external waste heat source to feed absorption cooling systems for data centre air conditioning has not been investigated. On the other hand, direct air free cooling technology, which uses the cold outside air to remove the heat generated inside these facilities, has been extensively investigated for data centres [30–32], and could be an interesting low-cost option for electric cabins as well.

From a water–energy nexus perspective, this paper aims to determine whether and where process waste heat recovery for absorption cooling may be a better option than airside free cooling for maintaining acceptable temperatures within electric cabins. Thereby, this research is expected to widen current knowledge of the environmental performance of absorption cooling systems as a recovery option for low-grade industrial waste heat, particularly from a water–energy nexus perspective.
