NETL researchers used the Laboratory’s Joule 2.0 supercomputer to provide a clearer picture of subsurface geological formations that could be used to store captured carbon dioxide (CO2) effectively and to address potential well integrity issues.

Using the supercomputer in combination with advanced machine learning (ML) models, researchers produced detailed three-dimensional reservoir field images of CO2 pressure and saturation fronts at potential subsurface CO2 storage sites, work that is essential to monitoring and evaluation activities related to optimal reservoir management and risk reduction.

Training these ML tools requires generating large training data sets by running many computationally intensive physics-based reservoir models. Once trained, however, the ML tools deliver results in near real time, whereas the physics-based models can take several hours even on a supercomputer.
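The division of labor described above can be sketched in a few lines of Python. In this toy example, `physics_model` is a made-up stand-in for an expensive reservoir simulation (a hypothetical exponential pressure decline with distance from an injection well), and the surrogate is a simple nearest-neighbour lookup rather than the trained ML models the researchers used; the names and formula are assumptions for illustration only.

```python
import math
import random

random.seed(0)

# Toy stand-in for a computationally expensive physics-based reservoir model:
# a hypothetical pressure decline (MPa) with distance from the injection well.
def physics_model(distance_m):
    return 20.0 * math.exp(-distance_m / 500.0) + 5.0

# Step 1: build a training set by running the "expensive" model many times.
distances = [random.uniform(0.0, 2000.0) for _ in range(500)]
pressures = [physics_model(d) for d in distances]

# Step 2: a cheap surrogate -- here a 1-nearest-neighbour lookup that answers
# near-instantly instead of re-running the physics model for each query.
def surrogate(distance_m):
    i = min(range(len(distances)), key=lambda j: abs(distances[j] - distance_m))
    return pressures[i]
```

In practice the training points come from full reservoir simulations and the surrogate would be a trained neural network or tree ensemble, but the pattern is the same: pay the simulation cost once to build the training set, then query the learned model in near real time.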

Supercomputers are computers that deliver far higher performance than general-purpose machines. They aid scientific research, weather forecasting, and advanced simulations, solving the world’s most intricate problems with unparalleled precision. A supercomputer consists of thousands of smaller computers called nodes, each equipped with its own memory and processors. A high-speed communications network connects the nodes so that, instead of working as separate units, they act as one, dividing millions of tasks among themselves to tackle complex problems quickly.
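The "many nodes acting as one" idea can be sketched with an ordinary worker pool. In the minimal example below, Python threads stand in for compute nodes and `simulate_cell` is an invented placeholder task; a real supercomputer would distribute such work across nodes using MPI or a similar interconnect-aware library.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical unit of work: one grid cell of a large simulation.
def simulate_cell(cell_id):
    return cell_id * cell_id  # placeholder computation

# Four workers stand in for compute nodes; the pool divides the 100 cells
# among them and gathers the results, so the workers behave as one machine.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(simulate_cell, range(100)))

total = sum(results)
```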

NETL researchers used 30 years’ worth of CO2 injection and oil and gas extraction data in the project. The well leakage information covered more than 26,000 wells and 106 well attributes from one of the largest oil and gas fields in the United States. The researchers then used the supercomputer to develop an ML model to forecast well integrity.

The supercomputer processed thousands of iterations of decision tree models. The final models provided information on integrity issues, yielding new insights into the drivers of oil and gas well leakage.
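To illustrate how a decision tree can surface the drivers of leakage, here is a minimal sketch that learns a single root split (a decision stump) by minimizing Gini impurity. The data are synthetic, the attribute names (`age`, `depth`) and the toy rule that old wells leak are assumptions for the example, and a real analysis would use thousands of full trees over the 106 well attributes rather than one stump.

```python
import random

random.seed(42)

# Hypothetical synthetic well records. Leakage here is driven entirely by
# well age (a toy assumption), so a good tree should split on age first.
wells = []
for _ in range(200):
    age = random.uniform(0.0, 60.0)       # years since drilling
    depth = random.uniform(500.0, 3000.0)  # casing depth in metres
    wells.append({"age": age, "depth": depth, "leaks": age > 40.0})

def gini(rows):
    """Gini impurity of a binary-labelled set of rows."""
    if not rows:
        return 0.0
    p = sum(r["leaks"] for r in rows) / len(rows)
    return 2.0 * p * (1.0 - p)

def best_split(rows, attribute):
    """Lowest weighted Gini impurity over all candidate thresholds."""
    best = float("inf")
    for threshold in sorted({r[attribute] for r in rows}):
        left = [r for r in rows if r[attribute] <= threshold]
        right = [r for r in rows if r[attribute] > threshold]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
        best = min(best, score)
    return best

# The attribute whose best split leaves the lowest impurity is the strongest
# driver of leakage -- the decision a tree model would learn at its root.
scores = {a: best_split(wells, a) for a in ("age", "depth")}
driver = min(scores, key=scores.get)
```

Because the synthetic leakage rule depends only on age, the age split separates the classes perfectly (impurity 0), so `driver` comes out as `"age"`; ranking attributes this way is how tree models point analysts toward the leading causes of leakage.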

Deep expertise and a complex infrastructure are required to operate supercomputing facilities and make them available to scientists, who use their power to run computational experiments like those in the subsurface research project.

According to the National Academies of Sciences, Engineering, and Medicine, which provides independent objective advice for the benefit of society, “Supercomputer simulations can augment or replace experimentation in cases where experiments are hazardous, expensive, or even impossible to perform or to instrument.”

At NETL, Joule 2.0 has allowed researchers to model energy technologies, simulate challenging phenomena and solve complex problems as they seek to make efficient use of the nation’s energy resources. Joule 2.0 has also supported effective collaboration among researchers at NETL and with external research partners in a virtual environment. Ultimately, Joule’s advanced computational tools save time and money by accelerating energy technology development.

The success of this and other energy-related NETL supercomputing projects sets the stage for the next-generation Joule supercomputer, which will be unveiled later this year. Joule 3.0 will enable NETL researchers to tackle additional and even more complex projects that drive innovation and deliver solutions for an environmentally sustainable and prosperous energy future.

NETL is a DOE national laboratory that drives innovation and delivers technological solutions for an environmentally sustainable and prosperous energy future. By using its world-class talent and research facilities, NETL is ensuring affordable, abundant, and reliable energy that drives a robust economy and national security, while developing technologies to manage carbon across the full life cycle, enabling environmental sustainability for all Americans.

Originally published on netl.doe.gov.