The use of fossil fuels has grown for two centuries alongside GDP: reversing this trend, underway since the first industrial revolution, is, according to the International Energy Agency (IEA) report Net Zero by 2050, the greatest challenge humanity has ever faced.
The production and use of energy are responsible for more than 75% of greenhouse gas emissions in the EU area. The decarbonisation of the European energy system is essential in the pursuit of the 2030 climate objectives and in the long-term strategy to achieve carbon neutrality by 2050.
According to the IEA’s World Energy Outlook 2022, hot off the press, overall demand for fossil fuels will decline steadily from the mid-2020s through 2050, with annual declines equivalent to the lifetime output of a large oil field.
The EU’s political agenda points to a double transition, green and digital, intertwining the strategy of accelerating decarbonisation with technological innovation. High Performance Computing is at the heart of this new paradigm, and not just in a technical sense, but on many levels. We are talking about the enormous potential of supercomputing: a multiplicity of tools with which scientific communities can access more powerful resources and use them to carry out simulations focused on energy challenges. Simulation makes it possible to plan and develop tomorrow’s clean energy sources in a digital framework, significantly reducing prototyping costs, waste, margins of error and time to market.
Data science, artificial intelligence and High Performance Computing (HPC) are precious resources for the most ambitious goals of the energy sector: from improving the exploitation of sources to developing weather forecasts (crucial for wind and solar power), from the design of new devices and advanced materials of energy interest to questions of energy distribution.
The first supercomputer was developed in the USA in 1954 and performed 67,000 operations per second, at a time when, for the rest of the world, computers were just «machines ready to buzz and flash, full of electronic mystery» (Richard Yates, 1961).
Since then, the performance growth of these machines has been unstoppable; they are now indispensable for interpreting geophysical data, studying renewable sources and producing increasingly accurate weather forecasts: all flagship activities of Eni’s HPC5, in operation since February 2020 in the Ferrera Erbognone Green Data Center.
It is a computing cluster, a set of computers working together to multiply performance. It delivers 52 petaflops which, combined with its still-operational predecessor, the HPC4 system, yields a peak of 70 petaflops: 70 million billion mathematical operations performed in one second. HPC5 ranks twelfth in the world (first among industrial computers, i.e. those not owned by state entities or institutes) in the June 2022 edition of the TOP500 ranking of supercomputers, and also appears in the upper part of the GREEN500 ranking for energy efficiency: a single watt of electricity allows it to perform nearly twenty billion operations per second. Hardware of this type – which holds in memory 70 years of exploration data – combined with proprietary algorithms produces three-dimensional models of the subsoil hundreds of square kilometres wide, 10-15 km deep and with a resolution of a few tens of metres, useful for modelling reservoirs and optimizing their exploitation.
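As a rough illustration of the figures above, the combined peak and the GREEN500-style efficiency metric can be worked out in a few lines. This is a minimal sketch: HPC4's 18-petaflop share is inferred from the 70 − 52 difference quoted in the text, and the implied power draw is derived from the stated efficiency, not an official figure.

```python
# Illustrative arithmetic for the performance figures quoted in the article.
# HPC4's share and the implied power draw are inferred values, not official data.

PFLOPS = 10**15  # one petaflop/s = 10^15 floating-point operations per second

hpc5_peak = 52 * PFLOPS
hpc4_peak = 18 * PFLOPS  # inferred: 70 PFlop/s combined minus HPC5's 52
combined_peak = hpc5_peak + hpc4_peak
print(f"Combined peak: {combined_peak / PFLOPS:.0f} PFlop/s")  # 70 PFlop/s

# The efficiency figure: "a single watt ... nearly twenty billion operations
# per second", i.e. roughly 20 GFlop/s per watt.
ops_per_watt = 20e9
power_needed_mw = hpc5_peak / ops_per_watt / 1e6  # watts -> megawatts
print(f"Power implied for HPC5 at that efficiency: {power_needed_mw:.1f} MW")
```

At 20 billion operations per watt-second, a 52-petaflop machine would draw on the order of a few megawatts at peak, which is why the GREEN500 efficiency metric matters as much as raw speed.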
The two supercomputers also run original programs for magnetic confinement fusion research at the Eni-CNR research center in Gela. Finally, HPC4 and HPC5 support advanced mathematical models that combine marine and meteorological data, and are used to develop Arctic weather and climate models.
By increasing efficiency, HPC5 reduces waste of energy, resources and time, helping Eni progressively improve its environmental performance.
Within the European project EXSCALATE4CoV, HPC5 has worked with institutions and centers of excellence to run billions of molecular dynamics simulations, probing the interaction between the 30 proteins of SARS-CoV-2 and over 70 billion molecules with antiviral properties in order to identify those most effective at blocking Covid-19. Without Eni’s supercomputer, the calculations would have taken years to complete.
Eni is one of the founders of the National Supercomputing Center, which came into operation in September at the initiative of INFN (National Institute of Nuclear Physics) to create the largest Italian system for high-performance computing, big data management and quantum computing.