Oil majors are second only to the US Defense Department in terms of the use of supercomputing systems. That’s because supercomputing is the key to determining where to explore next—and to finding sweet spots based on analog geology.
What these supercomputing systems do is analyze vast amounts of seismic imaging data collected by geologists using sound waves. What’s changed most recently is the dimension: when the oil and gas industry first caught on to seismic data collection for exploration, the capabilities were limited to 2-dimensional imaging. Now we have 3-dimensional imaging that tells a much more accurate story. And it doesn’t stop there: there is 4-dimensional imaging as well. What is the 4th dimension, you ask? Time. This 4th dimension adds a variable that allows oil and gas companies not only to determine the geological characteristics of a potential play, but also to watch how a reservoir changes as it is produced. By shooting repeat surveys over the same reservoir and comparing the snapshots, geologists can see how its contents are shifting over time.
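To make that time-lapse idea concrete, here is a minimal sketch of the core of 4D seismic: take two 3D amplitude volumes of the same reservoir shot months apart, subtract one from the other, and whatever stands out marks where fluids have moved. The array sizes, random data and threshold are invented purely for illustration; this is not any operator’s actual workflow.

```python
import numpy as np

# Two repeat 3D seismic amplitude volumes over the same reservoir
# (inline x crossline x time samples). Shapes and values are made up
# for illustration; real volumes run to many terabytes.
rng = np.random.default_rng(0)
baseline = rng.random((200, 200, 500))   # survey shot before production starts
monitor = rng.random((200, 200, 500))    # repeat survey shot months later

# The "4th dimension" is simply the difference between the two snapshots:
# where amplitudes have shifted, fluids (oil, gas, injected water) have moved.
time_lapse = monitor - baseline

# Flag cells whose change exceeds an arbitrary threshold as candidate
# drainage or water-sweep zones worth a closer look.
changed = np.abs(time_lapse) > 0.9
print(f"{changed.sum():,} of {changed.size:,} cells show significant change")
```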
The pioneer of geological supercomputing was MIT, whose post-World War II Whirlwind system was tasked with seismic data processing. Since then, Big Oil has caught on to the potential, and there is no finish line to this race: the technology is constantly metamorphosing. What would have taken decades with the supercomputing technology of the 1990s can now be accomplished in a matter of weeks.

In this continual evolution, what matters is how many calculations a computer can make per second and how much data it can store. The fastest computer gets a company to the next drilling location before its competitors.
We are talking about MASSIVE amounts of data from constant signal loops returning from below the Earth’s surface. Geologists generate sound waves using explosives or other sources; those waves penetrate deep into the Earth, and the reflections that bounce back are recorded by arrays of sensors sampling roughly 500 times per second. Only a supercomputer could possibly process all this complex data and make sense of it.
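A rough back-of-the-envelope calculation shows why the data piles up so quickly. Only the 500-samples-per-second figure comes from the text above; the channel count, recording hours, survey length and sample size below are assumed, ballpark values for a modern offshore survey rather than any company’s real numbers.

```python
# Back-of-the-envelope seismic data volume.
samples_per_second = 500          # from the article
receiver_channels = 50_000        # assumed: sensors recording simultaneously
bytes_per_sample = 4              # assumed: 32-bit floating-point amplitude
recording_hours_per_day = 12      # assumed: active shooting time per day
survey_days = 60                  # assumed: length of one offshore survey

bytes_per_day = (samples_per_second * receiver_channels
                 * bytes_per_sample * recording_hours_per_day * 3600)
total_terabytes = bytes_per_day * survey_days / 1e12

print(f"~{bytes_per_day / 1e12:.1f} TB per day, ~{total_terabytes:.0f} TB per survey")
```

Even with these modest assumptions, the raw take runs to hundreds of terabytes per survey, before any processing begins.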
We’ve moved beyond simple geographic interpretations, such as pursuing exploration based on proximity, the way Tullow’s Ethiopia play is on trend with its massive Kenya finds. That is child’s play. As Steve LeVine points out in Quartz, analog exploration goes well beyond this, and supercomputing tells us that standing in prolific Brazil is, geologically speaking, pretty much the same as standing in Angola, an up-and-coming sweet spot. Or that Ghana is the analog of French Guiana.
In Brazil, the offshore Santos, Campos and Espirito Santo basins have been the site of prolific oil discoveries, “and the application of plate tectonic concepts has enabled explorers to extend that play across the Atlantic to offshore western Africa,” according to an academic study on plate tectonics and exploration.
The Upper Hand in the Supercomputing Game
Total
France’s Total SA says its new supercomputing system, Pangea, is unrivaled, hands down, and will help it make new discoveries 15 times faster than before. The company says it is pushing the envelope in using this geological supercomputing system to determine where to drill next.
In Angola, Total was able to analyze its seismic data from the Kaombo project in only nine days, a job that would have taken over four months with its old system, which is roughly the fifteen-fold speedup the company claims.
Pangea processes geological data to help determine, faster and with much greater accuracy, the best places to explore and drill. It can perform 2.3 quadrillion operations per second, which is as powerful as 27,000 regular computers running simultaneously. Total says it’s among the top 10 most powerful computing systems in the world.
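As a quick sanity check on that comparison, dividing the quoted throughput by the quoted number of machines says what each “regular computer” would have to deliver; both figures come straight from the paragraph above.

```python
# Sanity-check the "27,000 regular computers" comparison using only the
# numbers quoted above.
pangea_ops_per_second = 2.3e15            # 2.3 quadrillion operations/second
equivalent_regular_computers = 27_000

per_machine = pangea_ops_per_second / equivalent_regular_computers
print(f"Each 'regular computer' works out to ~{per_machine / 1e9:.0f} billion ops/second")
# Roughly 85 billion operations per second, which is in line with what an
# ordinary desktop processor of the era could manage, so the figure is plausible.
```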
Repsol

Spain’s Repsol has one of Europe’s most powerful supercomputers, the Kaleidoscope system, which has helped it identify finds in the Gulf of Mexico and in Brazil’s deep waters. The system takes a 3D geological snapshot through water up to 2 kilometers deep and rock up to 15 kilometers thick. The target here is getting under the salt domes to the pre-salt plays for which Brazil is becoming famous.
And because we already know that Brazil is a geological analog of Angola, Repsol is putting its system to work there as well, hoping that Angola’s Kwanza basin will yield a find to rival the massive ones in Brazil’s Santos and Campos basins (35 billion barrels).
(Incidentally, Repsol’s supercomputer is housed in Barcelona’s deconsecrated Torre Girona church.)
Total says its system is unrivaled, but we’re not sure about that: Repsol might be that rival.
BP Plc
British giant BP is also playing around with its own upgraded supercomputing system, housed in a facility it hopes to open in Houston later this year. The new system would be able to process geological data at 2 quadrillion operations per second; its current system handles only 40% of that, or roughly 0.8 quadrillion operations per second. Over a period of five years, BP will invest $100 million in advancing its supercomputing capabilities.
ExxonMobil Corp.
ExxonMobil is a bit vaguer about its supercomputing capabilities, telling reporters that it has “next generation” operations capacity, but leaving out the details.
Further into the Future, Deeper into the Earth
There are still plenty of mysteries under the Earth’s surface, and the sophistication of supercomputers will continue to grow, adding capacity and removing even more of the guesswork from interpreting all of this data.
New fossil fuel discoveries are being announced on a weekly basis, if not more frequently, and this is largely because of supercomputing technology that makes it easier to find sweet spots in deep-water plays that no one knew existed before.
These improvements are a key reason operators have been able to move into increasingly complex deep-water plays. Supercomputing advances also remove a great deal of the risk involved in undertaking expensive drilling when you’re not sure what’s there. In essence, supercomputing puts the idea of peak oil to bed for the foreseeable future.