Supercomputer analysis of fracking and perhaps a lot more

Regina – The United States Department of Energy has, for the last several decades, routinely been in competition for operating the most powerful supercomputers in the world. In June 2018, it fired up “Summit,” which, according to NBC News, “has been clocked at handling 200 quadrillion calculations a second (or 200 petaflops). That's more than twice as fast as the previous record-holder, and so fast that it would take every person on Earth doing one calculation a second for 305 days to do what Summit can do in a single second.”
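
As a rough back-of-the-envelope check (not from the article), the 305-day figure follows from simple division, assuming a 2018 world population of roughly 7.6 billion:

```python
# Sanity check of the NBC News comparison: how long would every person on
# Earth, each doing one calculation per second, need to match what Summit
# does in a single second? The population figure is an assumption.
summit_calcs_per_second = 200e15   # 200 petaflops = 2 x 10^17 calculations/s
world_population = 7.6e9           # assumed ~7.6 billion people in 2018

seconds_needed = summit_calcs_per_second / world_population
days_needed = seconds_needed / 86_400   # seconds per day

print(f"{days_needed:.0f} days")   # roughly 305 days
```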

But what does that have to do with oil? Shawn Bennett, Deputy Assistant Secretary, Office of Oil and Natural Gas, U.S. Department of Energy (DOE), spoke in Regina at the Williston Basin Petroleum Conference on May 29. He said the DOE is looking at applying its big data computational abilities to analyzing geology and completions in the oilpatch.

Just how powerful are these supercomputers? An Oct. 3, 2011 Washington Post article noted, “The Obama administration has said that with computing advances, the United States will never need to resume nuclear explosive testing. Undersecretary of State Ellen Tauscher said in May that ‘our current efforts go a step beyond explosive testing by enabling the labs to anticipate problems in advance and reduce their potential impact on our arsenal — something that nuclear testing could not do.’”

The 2018 Summit computer is capable of 10 times the number of calculations of the “Sequoia” supercomputer described in the 2011 article. Sequoia is still in operation and, according to the DOE, is currently ranked 10th in the world; in total, the department operates five of the ten most powerful computers on the planet. That’s the level of computing power the DOE has at its disposal, and it is now looking at applying it to the geology of the American oilpatch.

“When we’re looking at that big data, we’re trying to see how we can use that supercomputing process, to see if there is an opportunity for us to use supercomputing in oil and gas development,” Bennett told Pipeline News at the conclusion of his presentation.

“Not for an individual company’s basis, but to unlock some of these questions that we have. When you look at predictive analytics and you look at big data, you need that very fast supercomputing power to potentially unlock some of these mysteries in the shale. So we are in the early stages of developing a program where we can hopefully utilize the supercomputer capacity to unlock some of these universal mysteries of oil and gas.”

When asked how soon they could do this, he joked, “My boss asked how quickly we can get it done, too.”

“In order to compile the data, work out the data with companies, and have that conversation, we have to gather that data, big data. It means a lot of data has to be acquired. So we’re in the very beginning process of acquiring data and seeing if there’s an opportunity to start looking at different algorithms to go at it.

“It’s not going to be a next year thing. But hopefully, in the next few years, we’ll have some questions answered.”

As an example, he said the DOE could take a subset of data from a basin and look for anomalies and similarities.

“There’s been a lot of data that’s been acquired by these companies over the last decade of field development. Being able to clean up that data, use that data, and to start to see similarities and new predictive analytics through algorithms and physics-based analysis, and hopefully be able to increase the EUR through that big data approach, through these supercomputers,” Bennett said.

“When you look at big data, we know, right now, what works. But ultimately we want to improve resource recovery, the EUR, the estimated ultimate recovery, of these wells. And by doing that, going through these massive amounts, reams and reams of data. The problem with all these reams of data is it takes months, even years, to compile that data and to be able to understand it better. With those supercomputers, if we can do it in a more realtime manner, we could have realtime changes to the drilling program, whether it’s the drilling portion or completions, for each well.”
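
As an illustration only (the article names no specific tools, data sets, or algorithms, and the DOE effort is still in its early stages), the kind of predictive analytics Bennett describes might look like fitting a regression model that maps completion and geology parameters to a well’s EUR, then scoring candidate designs before drilling. Everything below, including the file name, column names, and model choice, is a hypothetical sketch, not the DOE’s method:

```python
# Hypothetical sketch: learn a mapping from completion/geology parameters to
# EUR (estimated ultimate recovery), then score a candidate completion design.
# All names and numbers here are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Assumed input: one row per well, with completion/geology features and a
# known EUR derived from production history.
wells = pd.read_csv("basin_wells.csv")   # hypothetical data set
features = ["lateral_length_m", "stages", "proppant_tonnes",
            "fluid_m3", "porosity", "net_pay_m"]
X, y = wells[features], wells["eur_boe"]

# Hold out some wells to check how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("holdout R^2:", model.score(X_test, y_test))

# Score a candidate completion design before committing to it in the field.
candidate = pd.DataFrame([{
    "lateral_length_m": 2800, "stages": 45, "proppant_tonnes": 9000,
    "fluid_m3": 30000, "porosity": 0.07, "net_pay_m": 12,
}])
print("predicted EUR (boe):", model.predict(candidate[features])[0])
```

In a sketch like this, the “realtime” loop Bennett mentions would amount to rescoring or retraining as new drilling and completion data arrive, which is where the supercomputing capacity would come in.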
