AUTONOMOUS CARS

What Makes the Weather Such a Big Problem for Autonomous Driving?

Difficult weather conditions, such as rain and wet or frozen road surfaces, are a well-known factor in car accidents. Weather is a problem for both human drivers and autonomous vehicles because of its unpredictability.

Almost every autonomous driving effort wrestles with one unpredictable, random, and complex element that continues to challenge even the most experienced engineers: the weather.

Harri Santamala, the CEO of Sensible 4

“Why would we develop a system that only works in optimal conditions, in warm, clear, and sunny weather, when that’s not the reality for most people?” asks Harri Santamala, the CEO of Sensible 4.

Darkness, rain, and wind are everyday conditions for many. Moreover, burning heat or freezing cold are precisely the conditions in which these vehicles are most needed.

“Our basic concept is that the vehicle must work in everyday conditions without purpose-built infrastructure. We believe this is the way autonomous driving becomes mainstream”, says Santamala. 

Different Sensors Perform in Different Ways

An autonomous vehicle sees the world through various sensors, typically cameras, laser scanners (i.e. LiDAR sensors), and radars. Harsh weather can be an issue, especially for LiDAR and cameras: particles and objects in the air, such as raindrops, fog, and falling leaves, scatter or block the light and laser beams before they reach their targets.

“Cameras are good at detecting and classifying objects in the vicinity of the vehicle, but they see poorly in the dark or through rain and fog, and they also have difficulty in accurately assessing distances to objects. LiDAR scanners work regardless of lighting conditions and can measure object distance and size down to millimetres, but their performance also suffers from rain and fog”, Santamala explains.

In the dark, cameras mostly have to rely on the vehicle’s lights or streetlights. In daylight, if the sun is low on the horizon or the terrain is covered by snow, cameras may be blinded by the brightness or by the lack of any contrast or shape in the view. Objects can also look quite different under varying lighting conditions, which makes it even harder for machine vision to operate.

The third sensor type is radar, which works best when measuring large, moving metallic objects, i.e. determining the position and direction of other vehicles. Radars work even in the worst weather, and their ability to measure through water and fog is a significant benefit. Interpreting the radar image remains a challenge: an aluminium can on a sidewalk may return an entirely different and stronger signal than a person standing next to it.

Sensor Fusion Combines the Power of Different Sensors

As different sensors have different strengths, the software’s job is to combine the sensor data and use it for safe autonomous driving.

“Weather-related challenges can be tackled with sensor fusion and proper placement of sensors”, Santamala says and continues: “Artificial intelligence has advanced rapidly, and like interpreting a visual image, AI can also be taught to interpret radar data. I assume radars will play an even larger role in the future of autonomous driving.”
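As a rough illustration of the principle (not Sensible 4’s actual implementation), the Python sketch below fuses range estimates from a camera, a LiDAR, and a radar by weighting each reading with the inverse of its current uncertainty, so a fog-blinded camera automatically counts for less than a radar. All sensor names, distances, and variances are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class SensorReading:
    """One range estimate to an obstacle. The variance is meant to grow
    as conditions degrade the sensor (rain, fog, darkness)."""
    source: str        # e.g. "camera", "lidar", "radar"
    distance_m: float  # estimated distance to the obstacle
    variance: float    # larger variance = less trustworthy right now


def fuse(readings: list[SensorReading]) -> float:
    """Inverse-variance weighted average of the range estimates."""
    weights = [1.0 / r.variance for r in readings]
    fused = sum(w * r.distance_m for w, r in zip(weights, readings))
    return fused / sum(weights)


if __name__ == "__main__":
    # Hypothetical foggy night: the camera is badly degraded, the radar barely at all.
    readings = [
        SensorReading("camera", distance_m=14.0, variance=9.0),
        SensorReading("lidar",  distance_m=12.2, variance=1.5),
        SensorReading("radar",  distance_m=12.0, variance=0.5),
    ]
    print(f"fused distance: {fuse(readings):.1f} m")
```

The same weighting idea carries over to full state estimation, for example in a Kalman filter, where each sensor’s covariance plays the role of the variance above.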

Even though Sensible 4’s positioning is based on LiDAR, it is very robust against the elements.

The Autonomous Toyota Proace

“Our solution to this problem is an advanced probabilistic model of positioning, which measures the shape of terrain and buildings surrounding the vehicle, and effectively filters out noise and outliers. It works even when the weather distorts over 50% of the measured information”, Santamala mentions.
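As a toy example of the underlying idea, robust estimation that shrugs off outliers, consider the Python sketch below: LiDAR ranges are compared against the ranges a map predicts, and the vehicle’s positional offset is taken as the median of the residuals, so spurious returns from snow or spray barely move the estimate. The map values, noise levels, and 40% corruption rate are invented for illustration; this is not Sensible 4’s probabilistic model, which is said to tolerate more than half of the measurements being distorted.

```python
import random
import statistics


def estimate_offset(measured: list[float], expected: list[float]) -> float:
    """Estimate how far the vehicle has drifted towards a mapped wall.

    Each residual (expected - measured) votes for an offset; the median
    picks the consensus and ignores wildly wrong ranges caused by
    snowfall, spray, or fog."""
    residuals = [e - m for e, m in zip(expected, measured)]
    return statistics.median(residuals)


if __name__ == "__main__":
    random.seed(0)
    true_offset = 0.8                              # metres of drift towards the wall
    expected = [5.0 + 0.1 * i for i in range(40)]  # ranges the map predicts
    measured = []
    for e in expected:
        if random.random() < 0.4:
            # A spurious return from a snowflake: much too short a range.
            measured.append(e * random.uniform(0.1, 0.5))
        else:
            measured.append(e - true_offset + random.gauss(0.0, 0.05))
    print(f"robust offset estimate: {estimate_offset(measured, expected):.2f} m "
          f"(true offset: {true_offset} m)")
```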

Faster Growth and Software Development With Investments

Dawn, the world’s first commercial all-weather Level 4 self-driving software, developed by Sensible 4, will be released in 2022. Currently, an early version of the software is being tested in Norway, in a one-year trial near Oslo and in Gjesdal, near Stavanger. Sensible 4 also just completed a two-week test in Finnish Lapland in freezing, snowy conditions. The second closing of the A-round funding, which is ending soon, enables further growth of Sensible 4.

“We’re preparing our software product for the emerging market of autonomous transportation”, Santamala says.

