AsianScientist (May 21, 2019) – By Sim Shuzhen – For decades, human drivers have gotten by scanning the roads with their eyes and ears, aided only by the most rudimentary of technologies—rear- and side-view mirrors—to augment their senses.
In contrast, autonomous vehicles (AVs) are stuffed to the gills with sensors. To do what millions of humans do every day, AVs must integrate information from a range of sensing technologies, including LiDAR, radar, cameras and ultrasound.
Of course, the chief promise of AVs is that they will make our roads much safer. To that end, McKinsey & Company predicts that autonomous driving could reduce road accidents by a staggering 90 percent by the middle of this century. Based on the current global estimate of 1.2 million traffic fatalities every year, that translates to some 10 million lives saved per decade—a public health triumph on par with the advent of modern vaccines.
But to deliver on this life-saving promise, the industry needs better sensing and decision-making technologies that will not only help AVs ‘see’ further and more clearly, but also overcome blind spots and deal with ‘edge cases’—extreme or unusual circumstances which may trip up autonomous systems.
“You don’t want your vehicle to be like a human. You want it to be something more—like a superhuman,” says Mr Ram Vignesh Palaniswamy, co-founder of Singapore-based startup Hertzwell.
Formed in 2017 through the Entrepreneur First (EF) Singapore company builder programme, Hertzwell is putting aerospace-grade radar systems to work in AVs so that they can navigate in all weather conditions.
From airplane to automobile
Most AVs use a combination of cameras, LiDAR and radar to navigate and detect obstacles, says Mr Palaniswamy. But each of these comes with technical limitations. LiDAR, which uses lasers to put together a high-resolution map of the AV’s surroundings (and which you might recognise as the spinning, bucket-shaped piece of equipment mounted on top of driverless cars), is handicapped by rain, dust, fog or even very bright light. Radar, which relies on radio waves, is unperturbed by bad weather; however, radar systems currently used in cars lack the resolution needed to distinguish between obstacles.
To address this flaw, Hertzwell is adapting high-resolution radar technology from the aerospace industry for the automotive industry.
“Radar is a very good technology, and it can do everything that LiDAR does. The only problem is resolution, and that is the area we’re focusing on,” says Mr Palaniswamy.
Since radar system components used by the aerospace industry are much too big to be installed in ground vehicles, Hertzwell has used miniaturisation techniques to develop a prototype radar that is only about the size of an iPhone. But the company is not simply in the business of shrinking an airplane radar to fit in a car; it also employs various other techniques to improve range and resolution, says Mr Palaniswamy. For example, Hertzwell uses deep learning software to help clients distinguish between obstacles such as pedestrians, bicycles and trucks.
Hertzwell’s prototype is now being put through its paces in AV testbeds in Singapore, and the company is also in talks with top original equipment manufacturers (OEMs) around the world, says Mr Palaniswamy. He hopes that Hertzwell’s high-resolution radar will eventually be good enough to supplant LiDAR in AVs.
“At the start, we want to complement LiDAR, but our ultimate vision is that a camera and our radar should be enough to work in all AVs.”
Mr Palaniswamy, who felt a strong calling to start his own company, credits SGInnovate and EF with connecting him to many other like-minded people, including his co-founder Mr Bhaskar Jyoti Dutta.
“I was very clear in my mind that I wanted to do a startup. The only thing I didn’t have was the right infrastructure or network to meet the right people. EF gave me that exact thing I needed,” says Mr Dutta.
Learning from the problem
In March 2018, public confidence in the safety of AVs was shaken after the death of a pedestrian hit by an Uber self-driving test car.
“The AV industry faces many challenges, such as safety, security, regulations, infrastructure and ethical dilemmas. Among these, safety is the biggest challenge and also the key to address the other challenges,” says Dr Marcus Chen, co-founder of VEBITS, another startup from the 2017 EF Singapore cohort.
VEBITS is tackling one of the biggest safety issues facing the industry—the need for AVs to cope with unpredictable behaviour as they share the roads with human drivers, who do not always follow traffic regulations.
The company’s strategy is to learn from the problem itself—that is, the human drivers. Retrofitted on human-driven vehicles (a fleet of trucks or buses, for instance), its system collects real data on driving contexts and drivers’ corresponding actions on a massive scale; machine learning algorithms then go over the information to understand how drivers react under various conditions.
The resulting insights are not only useful for teaching AVs to drive alongside humans, but also help humans drive more safely by providing them with real-time feedback and better situational awareness.
VEBITS’ unique strategy has several advantages, says Dr Chen.
“First, human drivers collectively drive trillions of miles every year compared to millions of miles driven by a typical AV fleet, and so they encounter far more unpredictable contexts. Second, by providing real-time feedback to drivers, VEBITS validates and improves technologies without the risk of accidents. Third, instead of paying millions of dollars to collect data with drivers behind the wheel in AVs, VEBITS is paid to collect massive amounts of data.”
While AV technology attracts huge investments in the US, China and Europe, VEBITS’ co-founders decided to base the company in Singapore.
“As a regional hub, Singapore allows our business to readily expand to Southeast Asia, which has business opportunities of more than S$1 billion for us,” he says, adding that strong interest from local venture capital firms and top-quality talent were also part of the draw to stay in Singapore.
The road from idea to reality
Any AV technology must be thoroughly tested before it finds its way into a vehicle, says Mr Niels de Boer, programme director of the Centre of Excellence for Testing & Research of Autonomous Vehicles—Nanyang Technological University (CETRAN), an AV testbed in Singapore.
“CETRAN is developing technical standards and test methods for AVs to ensure that they can be deployed safely on Singapore roads,” he explains.
The testbed’s assessment pipeline combines functional safety desk reviews, circuit-based testing and road tests, which then qualify AVs for open-road testing, adds Mr de Boer.
“We then actively monitor the open-road testing to increase confidence levels.”
AV companies would do well to factor this rigorous process into their business plans, as VEBITS and Hertzwell have done.
“We want to mature our technologies and build a sustainable business by helping large commercial vehicle companies improve their profits and safety. In the meantime, we will clock billions of miles to understand local driver behavior in different situations,” says Dr Chen.
“Our long-term goal is to monetise this driving data for intelligent human-centric transportation, and eventually put AVs on the road faster,” he adds.
By the time Hertzwell completes industrial validation, Mr Palaniswamy predicts that the company will hit the AV market at a sweet spot.
“Right now, it’s at a nascent stage, but in 2020, we’re expecting the market to pick up and the number of cars being manufactured for autonomous driving to increase. We’re hoping that we will be getting our product to market at the right time.”
SGInnovate is excited about what the future of mobility could be like. Autonomous technology is one of the areas SGInnovate has invested in. Both Hertzwell and VEBITS are portfolio companies of SGInnovate.
Asian Scientist Magazine is a content partner of SGInnovate.
Copyright: SGInnovate.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.