AsianScientist (Jan. 25, 2018) – For centuries, man has built all manner of ships, submersibles and structures to explore the breadth and depth of the ocean and harvest its bounty. Yet—as captured in Herman Melville’s description of the Pacific Ocean in the novel Moby-Dick—the wild blue yonder continues to both fascinate and confound us with its vast, mercurial nature:
“There is, one knows not what sweet mystery about this sea, whose gently awful stirrings seem to speak of some hidden soul beneath…”
Whalers and sailors may be physically buffeted about by the ocean, but physics, too, has a hard time dealing with its unpredictability. The field of fluid dynamics, which describes the flow of fluids—in this case, the motion of water within the ocean—is governed by mathematical equations that are notoriously difficult to solve. Researchers have thus resorted to numerical simulations to predict the behavior of currents, waves and eddies.
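The equations in question are the Navier-Stokes equations. For an incompressible fluid such as seawater, they can be written as

$$\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u} = -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u} + \mathbf{f}, \qquad \nabla \cdot \mathbf{u} = 0,$$

where $\mathbf{u}$ is the fluid velocity, $p$ the pressure, $\rho$ the density, $\nu$ the viscosity and $\mathbf{f}$ external forces such as gravity. The nonlinear term $(\mathbf{u} \cdot \nabla)\mathbf{u}$ couples the flow at every point to the flow everywhere else; no general closed-form solution is known, which is why the equations must be marched forward in time numerically.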
Complex behavior begets complex simulations; these are so computationally intensive that it is only recently, with the advent of powerful supercomputers, that researchers have been able to model fluid dynamics faster and more accurately.
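To give a flavor of what such a simulation involves, here is a deliberately minimal sketch: a one-dimensional advection-diffusion solver that transports a bump of heat along a line of grid points. It is purely illustrative (no production ocean model looks like this, and all values are made up), but the basic structure, discretizing space onto a grid and then stepping forward in time, is the same one real models apply to the full three-dimensional equations.

```python
import numpy as np

# Toy 1-D advection-diffusion solver (illustrative only; real ocean models
# solve the full 3-D equations on vastly finer grids).
nx, dx, dt = 200, 1.0, 0.1   # grid points, spacing, time step (made-up values)
u, nu = 1.0, 0.5             # current speed, diffusivity (made-up values)

# Initial condition: a smooth bump of heat in the middle of the domain
x = np.arange(nx) * dx
c = np.exp(-((x - 100.0) ** 2) / 50.0)

for step in range(500):
    # Upwind difference for advection, centered difference for diffusion,
    # periodic boundaries via np.roll
    adv = -u * (c - np.roll(c, 1)) / dx
    diff = nu * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (adv + diff)

print(f"Peak tracer value after 500 steps: {c.max():.3f}")
```

The cost scales brutally: halving the grid spacing in three dimensions multiplies the number of points by eight, and numerical stability usually demands a smaller time step as well. That scaling, repeated across decades of simulated ocean time, is what makes petaFLOPS-class machines necessary.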
With their newfound ability to read the seas, supercomputers are now being put to work across a wide range of marine-related research efforts—modeling global ocean climate change, understanding the hydrodynamics of coral reef ecosystems, and even sculpting perfect waves for professional surfers, just to name a few.
Weathering a downturn
One group with enormous commercial interests in these recent advances is the offshore and marine industry—a broad sector that includes oil and gas companies, shipyards, and firms that engineer and build oil rigs, platforms and other offshore structures. The industry has been in the doldrums since oil prices crashed in mid-2014; since then, nearly US$400 billion worth of proposed energy projects worldwide have been put on hold, estimates UK-based energy research and consulting firm Wood Mackenzie.
“At the moment it is still not very clear when oil prices are going to pick up significantly, so the [offshore and marine] market as a whole has slowed down in many ways,” said Professor Chan Eng Soon, CEO of the Technology Centre for Offshore and Marine, Singapore (TCOMS), a research and development centre co-founded by the Agency for Science, Technology and Research and the National University of Singapore in 2016 to help the industry take advantage of emerging digital technologies.
“On the other hand, because of this, there is a push for technology and solutions to disrupt current practices where possible, and in more cost-effective ways,” he said.
Singapore has in many ways always been a maritime nation, added Chan. The country has one of the busiest ports in the world, and commands about 70 percent of the global market for both jack-up rigs and floating production storage and offloading (FPSO) platforms. In 2014, annual output from its offshore and marine sector was nearly S$25 billion (~US$18.6 billion), making up nearly two percent of the country’s gross domestic product.
But depressed oil prices have since taken their toll: output from the sector in October 2017 was less than half what it was in the same month three years ago, and Singapore companies Keppel Corp and Sembcorp Marine, the world’s largest builders of oil rigs, have seen share prices and earnings fall.
A digital revival
The future looks less bleak—with oil prices inching up towards US$60 a barrel as of December 2017, the industry is showing signs of rallying. Still, weathering the downturn has made companies more eager than ever to make the most of digital technologies. Supercomputing is a large part of this push, particularly for projects involving computational fluid dynamics, such as rig and vessel design.
“Conventional design approaches require significant resources and testing of physical models, and are thus time-consuming and expensive. Critically, knowledge and skills are developed in isolation and are non-transferable,” said Mr. Aziz Merchant, executive director of the Keppel Offshore & Marine Technology Centre (KOMtech), Keppel O&M’s research and development arm, in an interview with Supercomputing Asia.
Keppel O&M is drawing on the National Supercomputing Centre Singapore’s (NSCC) one-petaFLOPS ASPIRE 1 supercomputer to run advanced computational fluid dynamics simulations, which numerically capture the environmental loads on vessels and rigs, as well as how these structures respond.
This data has allowed the company’s engineers to optimize their designs and develop innovative new ones, said Merchant. Keppel O&M has, for example, developed technologies that reduce motion and improve safety on semisubmersibles and accommodation rigs; unique ‘ice-class’ vessels that can access frozen areas of the ocean; and optimized, low-resistance hull forms that make supply vessels more fuel-efficient.
Simulating the sea
TCOMS is also in the business of enhancing simulation capabilities. A key feature of the Centre is its deepwater ocean basin facility, slated for completion in 2019. Using wave generators and supercomputer-powered computational fluid dynamics, the facility’s 50-meter-deep central pit can simulate marine environments down to 3,000 meters, said Chan. For comparison, the world’s deepest oil and gas project, Shell’s Stones, operates in some 2,900 meters of water in the Gulf of Mexico.
Despite its name, the basin can also be used to model shallower waters, added Chan. Researchers can thus study the hydrodynamics that surface vessels—those that lay pipes or supply the rigs and platforms, for example—are subject to, and use the information to develop improved or autonomous versions.
But in order to build truly detailed simulations—digital twins—of the rigs or vessels being studied, other types of input are also needed. In addition to parameters from the basin, real-world data on waves and currents, collected by sensors mounted on vessels and rigs, can also be woven into the simulacra.
“The oil and gas industry has been pushing the use of big data for a number of years, with many groups pursuing sensing, data analytics and deep learning, all geared towards the overall concept of digitalization,” said Chan, adding that TCOMS is keen to partner with industry players to gather data and develop solutions with real-world applications.
Another goal is to be able to understand the behaviour of more complex, non-linear systems—an ecosystem of rig, platform, vessels and submersibles, for example.
“What has evolved out of digitalization is the ability to address complexities at the system-of-systems level,” Chan explained. “The behaviour of complex systems tends to be very non-linear and to some extent still unpredictable, so there are technical challenges in that sense. We have embarked on the journey to push boundaries and advance the state-of-the-art.”
When it comes to supercomputing, the marine and offshore industry in Singapore is a late bloomer compared to early adopters in finance, aerospace, healthcare and other sectors, said Merchant.
“The marine and offshore industry recognizes that it is lagging behind adjacent industries and needs to ramp up quickly,” he noted.
Chan is optimistic that supercomputing initiatives will help move the industry forward.
“We are focused on technologies that are geared towards future smart systems. We are not doing more of the same; instead, we are looking at how we can deepen our fundamental knowledge and derive innovative solutions in partnership with the industry,” he said.
The state of the ocean
Supercomputers can do more than help mine the ocean for resources; they also help oceanographers and climatologists understand its workings on a grander scale. One of these researchers is Dr. Shuhei Masuda, a group leader and senior scientist who studies ocean circulation at the Japan Agency for Marine-Earth Science and Technology’s (JAMSTEC) Research and Development Center for Global Change.
By storing heat from solar radiation and helping to distribute it around the globe, the ocean plays a crucial role in regulating the planet’s climate. To better understand these global phenomena, Masuda builds computational models of the ocean state—the overall circulation and climate patterns of huge bodies of water.
“Our main goal is to comprehend past and current ocean states, and to clarify the mechanism of ocean climate changes. This leads to a better understanding of the systems which gently regulate the Earth’s climate,” said Masuda. “Such knowledge is definitely required when making a future projection on the state of the global climate, for instance, in conjunction with global warming.”
Masuda draws on a wide variety of ocean observations made by ships, moored buoys and floats; one of his resources is the Argo array, a fleet of nearly 4,000 floats that gather data on temperature, salinity and ocean velocity. Argo is run by an international consortium of climate research institutions, including JAMSTEC.
A sea of code
On the aptly named Earth Simulator, JAMSTEC’s terascale supercomputer, Masuda synthesizes real-world data points into computational models to make estimates of the ocean state. These can then be used to study how ocean climate has changed over time, and to make projections of what it will be like in the future, he explained.
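This blending of sparse, noisy observations into a running model is known as data assimilation. Below is a minimal sketch of one of the simplest schemes, ‘nudging’, which relaxes the model state toward observations; it is meant only to convey the core idea. Operational ocean-state estimation systems such as Masuda’s use far more sophisticated methods, and every name and value here is a made-up illustration rather than anything from JAMSTEC’s code.

```python
import numpy as np

# Toy data assimilation by "nudging": relax a model field toward sparse,
# noisy observations. Purely illustrative; real ocean-state estimation
# uses far more sophisticated variational or ensemble methods.
rng = np.random.default_rng(0)

n = 100                                                    # model grid points
truth = 15.0 + 5.0 * np.sin(np.linspace(0, 2 * np.pi, n)) # hypothetical "real" ocean temperature
model = np.full(n, 15.0)                                   # first-guess model state

obs_idx = rng.choice(n, size=10, replace=False)            # locations of float observations
obs = truth[obs_idx] + rng.normal(0.0, 0.2, size=10)       # measurements with noise

gain = 0.3                                                 # nudging strength per cycle
for cycle in range(50):
    # Pull the model toward each observation at the observed locations
    model[obs_idx] += gain * (obs - model[obs_idx])
    # Let neighboring points share information (stand-in for model dynamics)
    model = 0.5 * model + 0.25 * (np.roll(model, 1) + np.roll(model, -1))

print(f"RMS error vs truth: {np.sqrt(np.mean((model - truth) ** 2)):.2f} deg C")
```

Even in this toy version, the two essential ingredients are visible: pull the model toward the data where observations exist, and let the model’s own dynamics spread that information to unobserved locations.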
Masuda has used computational models to suggest why a major, widely predicted El Niño event failed to materialize in 2014. Existing models did not take into account longer-term, five-to-ten-year variations in tropical seasonality, which affect the amount of heat energy transferred to the oceans, he and his collaborators found. By incorporating these variations into ocean state estimates, the researchers were able to accurately ‘hindcast’ past El Niño events, and also—hopefully—better predict future ones.
JAMSTEC now houses a new supercomputer at its Yokohama Research Institute location. At 19 petaFLOPS and 20 million cores, the Gyoukou supercomputer—co-developed by ExaScaler and PEZY Computing, and unveiled in November 2017—now sits in fourth position on the TOP500 ranking of the world’s fastest supercomputers.
In Asia, Singapore and Japan are far from alone in using supercomputers to describe and predict the behavior of the ocean. Twin petascale supercomputers Miri and Nuri power the Korea Meteorological Administration’s climate research; meanwhile, in China, an exascale supercomputer dedicated to analyzing ocean data and expanding the country’s maritime presence may be ready as early as 2019.
These new levels of computing power will enable researchers to develop higher-resolution models that can project patterns out over longer time frames, said JAMSTEC’s Masuda. Instead of Titanic-sized ocean liners, the volatility of the ocean, it seems, may be better tamed by lines of code.
This article was first published in the print version of Supercomputing Asia, January 2018.
———
Copyright: Asian Scientist Magazine.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.