AsianScientist (Jan. 25, 2018) – From the moment you wake up and brush your teeth with a plastic toothbrush, to the end of the day when you catch an Uber home from work, you are relying on the remains of ‘dead dinosaurs’—fossil fuels. All this adds up to a global consumption of 96 million barrels of oil per day, a figure that is expected to increase by 40 percent by 2035 despite greater awareness of the need for renewable energy sources.
At the same time, oil and gas reserves have become harder to find. The days of ‘easy’ oil are long gone; companies now have to turn to extreme environments such as the deep sea or polar regions in search of this elusive black gold.
Due to the high-risk, high-reward nature of oil exploration, the oil and gas industry seizes any opportunity to reduce the risks and uncertainties it faces. For that reason, energy companies are among the largest commercial users of supercomputers, harnessing their ability to perform the complex simulations required at just about every step of the process, from the discovery of new oil fields to the design of rigs and even the protection of their infrastructure from cyberattacks.
More than 70 percent of the Earth’s surface is covered in water, so quite naturally, oil companies need to search the seas as well as the land. Peering past the watery depths, however, is an extremely complicated affair. To visualize what lies beneath the ocean floor, companies use a technique called seismic imaging, sending sound waves into the ground using a device called an air gun, and collecting the sound waves that bounce back using a device called a geophone.
Each geophone collects data from several thousand sampling points, while each air gun used to generate the sound waves is in turn linked to thousands of geophones. Supercomputers are absolutely essential to help companies make sense of this large amount of raw data—which easily reaches hundreds of terabytes—and turn it into an actionable competitive advantage.
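A back-of-the-envelope calculation shows why raw survey data reaches such volumes. The receiver counts, sample rates and shot counts below are illustrative assumptions, not figures from any particular survey:

```python
# Rough estimate of raw data volume for a marine seismic survey.
# All figures are illustrative assumptions.

num_geophones = 10_000      # receivers recording each shot (assumed)
samples_per_trace = 4_000   # e.g. an 8-second record at a 2 ms sample interval
bytes_per_sample = 4        # 32-bit floating-point amplitude
num_shots = 1_000_000       # air-gun firings over the whole survey (assumed)

bytes_total = num_geophones * samples_per_trace * bytes_per_sample * num_shots
terabytes = bytes_total / 1e12
print(f"Raw survey data: {terabytes:.0f} TB")  # → Raw survey data: 160 TB
```

Even these conservative assumptions land in the hundreds-of-terabytes range before any processing begins, which is why turning the recordings into subsurface images is a supercomputing problem.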
The largest publicly owned oil and gas companies are known as supermajors, or collectively as Big Oil. The Pangea supercomputer owned by French supermajor Total, for example, is used by the company to improve the accuracy of its subsurface imaging and save time by reducing the need for further exploration. Pangea is the world’s largest industrial-use supercomputer, and with a peak performance of 6.7 petaFLOPS it comes in at a respectable 21st position on the TOP500 list.
After seismic imaging helps to locate a potential oil reservoir, the reservoir is extensively mapped using computer simulations designed to help engineers make decisions such as where to place oil wells and how to design facilities.
Reservoir modeling must take into account the complex flows of oil, water and natural gas, as well as the porosity and permeability of the surrounding rock. In the simulation, the entire reservoir is represented as a series of cells, with each cell having a set of values for measurements such as pressure and the relative concentrations of oil, water and gas. The more cells a reservoir model has, the higher its resolution and accuracy.
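The cell-based representation can be sketched in a few lines of code. The grid dimensions and property values below are illustrative assumptions, not those of any real reservoir model:

```python
import numpy as np

# Toy reservoir grid: each cell carries a pressure and the relative
# saturations of oil, water and gas. All values are illustrative.

nx, ny, nz = 100, 100, 20                   # 200,000 cells; real models reach billions
pressure = np.full((nx, ny, nz), 25.0e6)    # Pa (~250 bar, assumed)
sat_oil = np.full((nx, ny, nz), 0.6)
sat_water = np.full((nx, ny, nz), 0.3)
sat_gas = 1.0 - sat_oil - sat_water         # saturations in each cell sum to 1

# Rock properties vary from cell to cell; here they are randomly assigned.
porosity = np.random.default_rng(0).uniform(0.1, 0.3, (nx, ny, nz))

print(f"{pressure.size:,} cells")           # → 200,000 cells
```

Doubling the resolution in every direction multiplies the cell count by eight, which is why billion-cell models of the kind described below demand supercomputers.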
In February 2017, US oil giant ExxonMobil announced that they had successfully run a billion-cell reservoir model on the Blue Waters supercomputer at the National Center for Supercomputing Applications, using over 700,000 processor cores. Just two months later, US technology company IBM announced that they had also run a billion-cell simulation, but this time using only 60 processors and 120 graphics processing units (GPUs), reducing the simulation run time from 20 hours to just one-and-a-half hours.
Supercomputers don’t stop being useful once the oil is found; they remain a vital part of daily operations even after the oil has been brought to the surface. One way they are used in processing is in the design of separators, the equipment that splits the produced fluid into oil, water and gas streams for downstream processing.
In 2017, the Saudi Arabian Oil Company (Aramco) worked with engineering simulation specialist ANSYS to simulate a multi-phase gravity separation vessel. Although more realistic than single-phase computational fluid dynamics analyses, multi-phase models like the one Aramco and ANSYS used are more complex and difficult to scale.
With the help of the Shaheen II supercomputer at the King Abdullah University of Science and Technology, the team nevertheless managed to reduce the time taken for the simulation from several weeks to an overnight run.
Aramco, the largest oil and gas company in the world, said that this kind of simulation helps it reduce development time and allows operators to better predict the performance of equipment under varying conditions. Poor separation can reduce the productivity of an oil well by as much as 50 percent, making separator design critical for improving yields.
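To get a feel for the physics a separator simulation must capture, a first-pass estimate of how fast a dispersed water droplet settles out of oil can be made with the textbook Stokes-law terminal velocity. This is standard fluid mechanics, not the Aramco/ANSYS multi-phase model, and all property values below are assumed:

```python
# Stokes-law terminal settling velocity of a water droplet in oil,
# often used for first-pass gravity-separator sizing. Values are assumed.

g = 9.81            # gravitational acceleration, m/s^2
d = 500e-6          # droplet diameter, m (assumed 500-micron water droplet)
rho_water = 1000.0  # density of the dispersed water phase, kg/m^3
rho_oil = 850.0     # density of the continuous oil phase, kg/m^3 (assumed)
mu_oil = 0.01       # viscosity of the continuous oil phase, Pa*s (assumed)

# Stokes law: v = g * d^2 * (rho_dispersed - rho_continuous) / (18 * mu)
v = g * d**2 * (rho_water - rho_oil) / (18 * mu_oil)
print(f"Settling velocity: {v * 1000:.2f} mm/s")  # → Settling velocity: 2.04 mm/s
```

A settling speed of a few millimetres per second dictates how long the fluid must reside in the vessel; full multi-phase CFD refines this crude estimate by resolving turbulence, droplet coalescence and the vessel’s internal geometry.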
Offshore rigs are significantly more expensive than land-based rigs; a basic offshore rig costs about US$200 million while rigs designed for more challenging environments can go up to almost a billion dollars each. Aside from being eye-wateringly expensive, offshore rigs are an engineering marvel, boasting thousands of tonnes of steel that somehow remain in the right position while floating in the middle of the ocean.
Floating rigs need to withstand extreme weather conditions and remain stable even in the face of strong ocean currents. In particular, vortex-induced motion poses a safety risk, adversely impacting the mooring systems that hold a rig in place. As vortex-induced motion is difficult to reproduce in the lab, it must be studied through simulation instead.
In 2015, a team of researchers from the Los Alamos National Laboratory in the US used supercomputers to perform a comprehensive computational fluid dynamics analysis of vortex-induced motion, testing different turbulence models and confirming their model with experimental data. Their findings could be used to improve rig safety, as well as inform the design of other large floating structures.
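Vortex-induced motion originates from vortices shedding alternately off a rig’s hull or columns, and a first estimate of the shedding frequency comes from the standard Strouhal relation f = St · U / D. The current speed and column diameter below are assumed for illustration and are not taken from the Los Alamos study:

```python
# First estimate of vortex-shedding frequency using the Strouhal relation.
# The flow and geometry values are illustrative assumptions.

St = 0.2    # Strouhal number, ~0.2 for a circular cylinder over a wide Reynolds range
U = 1.0     # ocean current speed, m/s (assumed)
D = 20.0    # column diameter, m (typical floating-platform scale, assumed)

f = St * U / D      # vortex-shedding frequency, Hz
period = 1.0 / f
print(f"Shedding period: {period:.0f} s")  # → Shedding period: 100 s
```

If the shedding period sits close to a natural period of the moored platform, the motion can lock in and grow, which is why full computational fluid dynamics analyses of the kind performed at Los Alamos matter for rig safety.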
For as long as valuable goods have been transported over the sea, piracy has been a problem. According to a report by UK-based think tank Chatham House, Nigeria loses about US$1.5 billion a month to piracy and theft targeting the oil industry. These days, however, oil companies have to contend with much more than physical attacks by machine gun-toting pirates; they also have to worry about cybersecurity.
Three quarters of energy companies surveyed by the US-based privacy and data protection think tank Ponemon Institute experienced at least one cyberattack in 2016, and the energy industry is second only to the financial industry in how frequently it is targeted.
Alongside more traditional security measures, supercomputers can also play a role. For example, US-headquartered HPC provider Cray offers an analytics platform that combines supercomputing with enterprise-standard security frameworks to accelerate the detection of threats in cyberspace.
This article was first published in the print version of Supercomputing Asia, January 2018. Click here to subscribe to Supercomputing Asia in print.
Copyright: Asian Scientist Magazine.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.