High Performance Healthcare

Thanks to all the data that is already available, the healthcare sector is ripe for a data-driven transformation. Here’s how supercomputers can make a difference.

Of racecars and readmissions

Gene sequencing and analysis is just one type of data useful for improving healthcare. Even more mundane measures like blood pressure and oxygen levels can tell us a great deal about patients. In fact, the supercomputer at the Beth Israel Deaconess Medical Center can use those parameters to predict the probability of death with 96 percent accuracy.

Similarly, researchers in Singapore are embarking on a project to use supercomputers to predict which patients will be readmitted to the hospital and when, Tai shared. According to a 2015 study by the Center for Health Information and Analysis in Massachusetts, hospital readmissions cost the US government US$26 billion yearly, of which US$17 billion could have been saved by preventing avoidable readmissions.

Predicting which patients are likely to be readmitted requires processing large amounts of time-series data and producing results quickly, precisely what supercomputers are good at.

“You can’t put the data on a cluster and wait a week for the results; you need to know the answer while the patient is still in hospital so that something can be done about it,” Tai continued. “Being able to get computationally-intensive analyses done in an efficient way is going to make a big difference.”
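To make the idea concrete, here is a minimal sketch of how vital-sign time series might be condensed into features for a readmission-risk model. All names, thresholds and weights below are illustrative assumptions, not details from the Singapore project.

```python
# Hypothetical sketch: turning a patient's vital-sign time series into
# summary features that a readmission-risk model could consume.
# Feature names and weights are illustrative, not from the article.

def summarize_series(values):
    """Collapse a time series into simple summary features."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    trend = values[-1] - values[0]  # crude direction of change
    return {"mean": mean, "variance": variance, "trend": trend}

def risk_score(features, weights):
    """Weighted sum of features -- a stand-in for a trained model."""
    return sum(weights[k] * features[k] for k in weights)

# Example: a week of daily systolic blood-pressure readings
bp = [138, 141, 145, 150, 149, 155, 160]
feats = summarize_series(bp)
score = risk_score(feats, {"mean": 0.01, "variance": 0.005, "trend": 0.1})
print(feats["trend"])  # 22
```

In a real system these features would be computed for thousands of patients in parallel, which is where the supercomputer's speed matters: the answer has to arrive while the patient is still in hospital.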

The healthcare sector is getting help from an unexpected source: motorsports. McLaren Applied Technologies, the data science team that helps McLaren race cars perform at their peak, has partnered with the National Neuroscience Institute in Singapore to use predictive analytics to improve patient care.

“Over the course of a two-hour race, the McLaren team captures more than 12 billion real-time data points from over 300 sensors embedded on their race cars, running over a hundred simulations with that data,” Tai said. “All this is time-series data; the same analytical methods can be applied to healthcare settings.”



Starting conversation, building collaborations

As transformative as the use of data can be, not every problem requires a supercomputer, nor is every problem best solved by one, Tai cautioned.

“The question is: which problems are the most data intensive and will require high performance computing?” he asked.

“If the code isn’t designed for high performance computing, it doesn’t run any faster on the supercomputer than it does on a cluster in your own backyard,” he shared with a laugh. “I think there’s a lot of work that needs to be done; in particular, we need to understand how to optimally write our code to take advantage of parallelization.”
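Tai's point can be illustrated with a small sketch: the same computation written serially and in a form a parallel runtime can distribute across cores. The workload and function names are hypothetical, chosen only to show the pattern.

```python
# Illustrative sketch (not from the article): code only benefits from
# many cores when its work is split into independent tasks.
from multiprocessing import Pool

def simulate(seed):
    """Stand-in for one independent, compute-heavy simulation."""
    total = 0
    for i in range(10_000):
        total += (seed * i) % 97
    return total

def run_serial(seeds):
    # One core, one task at a time -- extra processors sit idle.
    return [simulate(s) for s in seeds]

def run_parallel(seeds, workers=4):
    # Independent tasks distribute cleanly across worker processes;
    # tightly coupled, sequential code would gain nothing here.
    with Pool(workers) as pool:
        return pool.map(simulate, seeds)

if __name__ == "__main__":
    seeds = list(range(8))
    assert run_serial(seeds) == run_parallel(seeds)
```

Restructuring code this way, so that work units are independent and communication is minimal, is the kind of optimization Tai argues is needed before a supercomputer pays off.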

While clinicians have a good understanding of the problems in healthcare that need to be addressed, they may not always understand what supercomputing is about and what the opportunities are. What is required, Tai said, is an ongoing conversation between clinicians and programmers.

“Making healthcare better takes a partnership,” he said. “We need programmers who are not only capable of writing code that takes advantage of parallelization, but also capable of sitting down with either clinicians or investigators to design solutions collaboratively.”

Once the problems have been identified, the next challenge is sharing the data for analysis. Genomic and healthcare data are typically stored at hospitals or medical centers, whereas supercomputing resources tend to be located offsite, necessitating high-speed and secure transfers. One approach has been to build networks like Singapore’s InfiniCortex, which allow users to store data locally while analyzing it in a distributed manner.


Another strategy would be to de-identify and aggregate the data before making it available online. This is what Global Gene Corp has done with its beacon network, which allows individual researchers, clinicians or country-level users to query whether gene variants are present in its database.

“It allows researchers to find each other and build collaborations,” Jamuar said. “Improving healthcare can’t be an individual effort, or the effort of a single company. It has to be a community effort, there has to be buy-in from four groups of people: the clinicians and researchers; the regulators, i.e., the government and related regulatory agencies; the payors, typically the insurance companies; and the patients themselves who have to be willing to participate.”
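The beacon model described above can be sketched as a simple presence query over de-identified, aggregated data. The variants and function below are illustrative assumptions, not Global Gene Corp's actual API.

```python
# Minimal sketch of a beacon-style query: the service answers only
# "is this variant present?", never returning individual-level data.
# Variant keys and names here are illustrative examples.

# De-identified aggregate: a set of (chromosome, position, ref, alt) keys
variant_index = {
    ("17", 41276045, "T", "C"),
    ("7", 117559590, "ATCT", "A"),
}

def beacon_query(chrom, pos, ref, alt):
    """Return True if the variant exists in the aggregated database."""
    return (chrom, pos, ref, alt) in variant_index

print(beacon_query("17", 41276045, "T", "C"))  # True
print(beacon_query("1", 12345, "A", "G"))      # False
```

Because the answer is only yes or no, a researcher who gets a hit knows where to seek a collaboration without any patient-level data ever leaving the source institution.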


Unlocking the value of data

When it comes to healthcare, Tai noted, massive amounts of data are already there, collected over the years not for supercomputing purposes but simply for inventory management and billing.

“We’ve only begun to scratch the surface,” said Tai. “We already have very detailed information on what services people consume, including 20 years of drug data for every Singaporean who has used the public health system. Exploiting all this data we’re sitting on requires us to think a little bit about what the problems are, how we can solve them, and which of these are most effectively dealt with using HPC,” he added.

The future of healthcare looks set to be exciting, Jamuar stated, given the rapid advances in both technology and policy.

“The way I see it is that today’s healthcare allows you to live long, but with diseases and disabilities. Data-driven healthcare not only allows you to live long, but also allows you to live healthy, and that is the ultimate goal of using supercomputers in healthcare,” he concluded.



This article was first published in the print version of Supercomputing Asia, July 2017.

———

Copyright: Asian Scientist Magazine.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.

Rebecca did her PhD at the National University of Singapore where she studied how macrophages integrate multiple signals from the toll-like receptor system. She was formerly the editor-in-chief of Asian Scientist Magazine.
