Raising A City’s IQ

Smart cities represent a tremendous opportunity to improve the lives of millions, but many technical challenges stand in the way. Here’s how high performance computing is being used to address several of the most difficult ones.

AsianScientist (Aug. 15, 2018) – Since the dawn of human civilization, cities have been synonymous with opportunity. By bringing both people and resources together, cities enjoy economies of scale and accelerated economic growth. Drawn by these opportunities, an estimated three million people move into cities every week. By 2050, cities are expected to house an additional 2.5 billion people, roughly equivalent to the entire population of both India and China, the two most populous countries in the world.

The unprecedented speed and scale of urbanization poses an urgent and complex challenge to governments and city planners. In the absence of adequate action, sprawling slums and deteriorating infrastructure can transform once pleasant cities into unlivable ghettos, with knock-on effects on health, security and other measures of quality of life.

In an attempt to forestall or circumvent these consequences, leaders around the world have turned to technology. Equipped with an array of Internet of Things (IoT) sensors and increasingly powerful analytic capabilities, the smart cities movement promises to make city life safer, smoother and more sustainable. If executed well, smart cities have the potential to impact millions, if not billions, of lives.

“The timely analysis of sensor and IoT data would allow city managers to solve problems ranging from traffic prediction to flood monitoring and even food safety,” said Professor Wang Lizhe of the Institute of Remote Sensing and Digital Earth at the Chinese Academy of Sciences. “With its ability to analyze data at lightning-fast speeds, high performance computing (HPC) makes smart cities possible.”

The power of knowing

The first step toward building smarter cities is collecting information about what happens within them, thereby reducing the number of unknowns faced by decision makers. Currently, many decisions, even in well-developed cities, are made in the absence of accurate information—sometimes with tragic results.

In 2015, a hundred-meter-high construction landfill in Shenzhen, China, collapsed, causing a landslide that claimed the lives of 74 people and destroyed 33 buildings.

“If officials had enough information about such a situation before it happened, for example if they had models that could warn them about the possibility of such an occurrence, they could have taken action and prevented such man-made disasters,” Wang said.

“On the other hand, if you have no geological and geographical knowledge about the city, you will not be able to manage the city well.”

To remedy this situation, Wang and his team have been working on developing a decision support system for the management of smart cities. The objective of the project, supported by the National Natural Science Foundation of China, is to help administrators control and maintain their cities, as well as plan for the future, he said.

Wang is now building up a comprehensive understanding of a city from four different sources of data: satellite data, ground-level data, underground data and crowd-sourced data. This data is then combined with simulations to calculate warning levels and projections to enable city planners to make informed decisions.
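The details of Wang's system are not public, but the idea of fusing several data layers into a single warning level can be sketched in a few lines. In this hypothetical example, each of the four data sources contributes a risk score between 0 and 1, and the weights and thresholds are illustrative inventions, not values from the actual system.

```python
# Hypothetical sketch: fuse risk scores from the four data sources described
# in the article into a single warning level. Weights and thresholds are
# illustrative assumptions, not taken from Wang's decision support system.

def warning_level(satellite, ground, underground, crowd,
                  weights=(0.3, 0.3, 0.3, 0.1)):
    """Each input is a risk score in [0, 1]; returns a warning label."""
    sources = (satellite, ground, underground, crowd)
    score = sum(w * s for w, s in zip(weights, sources))
    if score >= 0.7:
        return "red"
    elif score >= 0.4:
        return "amber"
    return "green"

# High risk reported by the physical layers triggers the top warning level:
print(warning_level(0.9, 0.8, 0.7, 0.5))  # prints "red"
```

A real system would of course replace the fixed weights with calibrated physical models, but the structure—many heterogeneous inputs reduced to an actionable signal—is the same.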

“The model can also help balance between all kinds of conflicts by calculating what would happen if we removed any of them,” Wang explained. “Decision makers can use our simulations to test what can be done before committing resources to a particular course of action.”

Last but not least, the decision support system can also help with resource allocation and planning, he said.

“If limited resources such as space are wasted, there will not be enough space to build critical infrastructure such as subways or water pipes in the future.”

Data from the ground up

Building this smart city decision support system was no mean feat; one of the main challenges was handling the vast amount of data needed to understand something as complex as an entire city.

“A single satellite can generate several terabytes of high-resolution data within a day,” Wang explained.

He himself collects not only optical data based on visible wavelengths, but also synthetic aperture radar, light detection and ranging (LIDAR) and global positioning system data.

Furthermore, Wang and his team do not stop at a detailed understanding of the surface of a city’s terrain but literally go deeper, incorporating underground data from subway systems and modern buildings and even crowd-sourced data such as pictures of subway incidents or building collapses uploaded by internet users.

“Well-developed cities—whether in Shenzhen, Shanghai or Singapore—all face the problem of limited space. Although we may not have paid as much attention to it in the past 20 years when there was still room to grow, underground data is increasingly important for large cities,” Wang said.

To add to the challenge of dealing with large amounts of heterogeneous data, the data collected must be coupled with complex physical models like those used in climate prediction, he continued. Supercomputers are required at both steps: processing large amounts of data and combining the data with physical models.

“Sometimes the model is not correct as it is only an approximation of reality. Other times, the observations are not correct as they contain errors. If we simply put the model and data together we will definitely not get a good result,” Wang said. “We first need to use a lot of computing resources to remove the errors, a process that we call data assimilation.”
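The core of data assimilation can be illustrated with the simplest possible case: a scalar Kalman-style update that blends an imperfect model forecast with a noisy observation, weighting each by its uncertainty. The numbers below are made up for illustration; operational systems like those Wang describes apply the same principle to millions of variables at once, which is what demands supercomputing resources.

```python
# Minimal illustration of data assimilation: combine a model forecast and a
# noisy observation, each weighted by its error variance (a scalar
# Kalman-filter update). All values are illustrative.

def assimilate(forecast, forecast_var, observation, obs_var):
    """Return the variance-weighted estimate and its reduced variance."""
    gain = forecast_var / (forecast_var + obs_var)  # how much to trust the observation
    estimate = forecast + gain * (observation - forecast)
    variance = (1 - gain) * forecast_var            # uncertainty shrinks after the update
    return estimate, variance

# Model predicts a water level of 2.0 m (uncertain); a sensor reads 2.6 m (noisier):
est, var = assimilate(2.0, 0.4, 2.6, 0.2)
print(round(est, 2), round(var, 2))  # prints 2.4 0.13
```

Note that the assimilated estimate lands between the model and the observation, and its variance is smaller than either input's—exactly the "error removal" Wang describes, done one variable at a time.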

Transforming transportation

Another area where supercomputing is poised to make a positive impact on smart cities is the realm of transportation. For example, a study by market intelligence firm Juniper Research found that smart mobility technology like intelligent traffic systems and route optimization based on open data would save nearly 60 hours per person each year.

Yet the reality for many cities—particularly in Asia—is daily gridlock. Bangkok, which has consistently been ranked as one of the most congested cities in Asia, saw its residents spend more than 64 hours stuck in traffic in 2017, with Jakarta not far behind at 63 hours. In addition to the short-term loss of productivity, bad traffic also has long-term consequences on health and the environment.

Even in Singapore, where people spend just ten hours stuck in traffic a year, incidents can still happen. In 2015, the country’s Mass Rapid Transit system experienced the worst breakdown in its 30-year history, when both the North-South and East-West lines broke down and left over 250,000 commuters stranded at rush hour. Although shuttle buses were deployed, movements of the unprecedented crowd were unpredictable, resulting in chaos and congestion.

“The Land Transport Authority (LTA) found that they did not have enough visibility into the crowd levels, especially when an incident happens. Because the crowd was not very well understood, the response could not happen as effectively as LTA wanted it to,” said Dr. Laura Wynter, head of AI at the IBM Research Singapore Lab.

“To know how to deal with crowds when an incident happens, you will need to understand what the crowd levels and characteristics are, what the crowd is like usually, and how it is changing,” she continued. “The problem was that there was no obvious data source.”

Unlike the trains themselves, which are scrupulously monitored by an array of devices, people don’t come equipped with sensors. Until now, there has not been a feasible way for LTA—or any metro system anywhere in the world—to know the exact number of people at each location in the transport network, Wynter added.

A faster feel of the ground

To get a better sense of the size of crowds at each station and the movement of people through the network, LTA teamed up with IBM Research on a project known as Fusion AnalyticS for public Transport Emergency Response, or FASTER. FASTER uses WiFi and other sources of data to provide rail operators with both situational awareness and decision support.

Before the FASTER project, the only sources of information on the crowd levels were farecard data and video feeds. Farecard data only provides information on entry and exit points, while video processing is very computationally intensive.

“Since these sources of information are either not very timely or incomplete, FASTER has simulation capabilities to ingest these partial pieces of information and complete the picture,” Wynter explained.

At the moment, FASTER relies to a large degree on anonymized WiFi data, where commuters’ mobile phones provide a rich source of geospatial information on the whereabouts of each anonymized device. Data from the free Wireless@SG broadband network supplies the researchers on the FASTER project with broad coverage.
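FASTER's actual pipeline is proprietary, but the basic step of turning anonymized WiFi records into station-level crowd counts can be sketched simply: count the distinct hashed device identifiers seen near each station within a time window. The record layout and names below are assumptions for illustration, not FASTER's data model.

```python
# Hedged sketch of WiFi-based crowd counting: count distinct anonymized
# device IDs observed at each station within a time window. Field names and
# the record format are illustrative assumptions, not FASTER's actual schema.

from collections import defaultdict

def crowd_counts(records, window_start, window_end):
    """records: iterable of (hashed_device_id, station, timestamp) tuples."""
    seen = defaultdict(set)
    for device, station, ts in records:
        if window_start <= ts < window_end:
            seen[station].add(device)       # a set deduplicates repeat sightings
    return {station: len(devices) for station, devices in seen.items()}

records = [
    ("a1", "CityHall", 100), ("b2", "CityHall", 110),
    ("a1", "CityHall", 120),                 # same device again: counted once
    ("c3", "RafflesPlace", 105),
    ("d4", "RafflesPlace", 300),             # outside the window: ignored
]
print(crowd_counts(records, 0, 200))  # prints {'CityHall': 2, 'RafflesPlace': 1}
```

Deduplicating by device rather than by sighting is what makes the count a proxy for people present, rather than for WiFi chatter.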

“Right now, WiFi data has the lowest latency. But as other data sources such as farecard and video streams become available with low latencies, they will also play an important role in real-time analyses,” Wynter said.

Beyond giving transport operators real-time insights into the levels of crowding, FASTER is also able to simulate ‘what if’ scenarios and suggest the best course of action in the face of service disruptions. For example, FASTER is able to predict the expected delays that would occur if a section of track becomes unusable, as well as suggest optimal shuttle service deployment routes.

These capabilities rest on agent-based simulations of about 400,000 agents at any point in time, a computationally intensive task that requires parallelization. The challenge, Wynter said, is developing a simulation that is faithful to how the transport system actually works but remains manageable and requires only minutes to run.
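A toy version of such an agent-based simulation—vastly simpler than FASTER's 400,000-agent model, and purely illustrative—shows why a closed link causes crowding to build up at the blocked station:

```python
# Toy agent-based transit sketch: each agent follows a planned route, moving
# one station per tick; a closed link forces agents to wait, so crowding
# accumulates at the blocked station. Purely illustrative of the technique.

def simulate(agents, closed_links, ticks):
    """agents: list of routes, e.g. ["A", "B", "C"]; closed_links: set of (from, to)."""
    positions = [0] * len(agents)  # index into each agent's route
    for _ in range(ticks):
        for i, route in enumerate(agents):
            pos = positions[i]
            at_destination = pos + 1 >= len(route)
            if not at_destination and (route[pos], route[pos + 1]) not in closed_links:
                positions[i] += 1  # advance one station along the route
    # Count agents at each station after the simulation ends
    crowd = {}
    for i, route in enumerate(agents):
        station = route[positions[i]]
        crowd[station] = crowd.get(station, 0) + 1
    return crowd

agents = [["A", "B", "C"], ["A", "B", "C"], ["B", "C"]]
print(simulate(agents, closed_links={("B", "C")}, ticks=5))  # prints {'B': 3}
```

In a real system, each agent carries richer behavior (rerouting, boarding shuttle buses, giving up and leaving), and the per-agent update loop is what gets parallelized across cores to keep run times down to minutes.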

“While situational awareness works at specific locations, simulations for planning purposes work at a network-wide level. Both these applications have their own computing challenges; situational awareness needs to be quick, while long-term simulations need to be more accurate,” said Mr. Hassan Poonawala, a data scientist at IBM Singapore and one of the project leads on FASTER.

Making data-driven decisions

Whether the decisions are about when and where to deploy shuttle buses in the event of a train breakdown or how to plan cities in general, HPC can help city planners make sense of a large amount of information and make better decisions as a result.

“In smart city settings, a decision maker isn’t just a chief executive or operations manager; everybody is a decision maker at their own level. We hope to help people leverage data and give them tools for effective data-driven decision support,” said Wynter.

To get there, however, will require a commitment to developing not just big data, cloud computing and IoT capabilities, but also supercomputing core facilities capable of turning data into insight.

“Ever-larger HPC engines are necessary to meet the tremendous computational needs introduced by the huge scale of data involved in complex, long-term modeling,” said Wang. “Supercomputers also accelerate the integration of multidisciplinary sources of information and nurture emerging disciplines, directly addressing the technical challenges faced in implementing smart cities.”

This article was first published in the print version of Supercomputing Asia, July 2018.


Copyright: Asian Scientist Magazine; Photo: Shutterstock.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.

Rebecca did her PhD at the National University of Singapore where she studied how macrophages integrate multiple signals from the toll-like receptor system. She was formerly the editor-in-chief of Asian Scientist Magazine.
