Why might autonomous cars get stuck in wet cement?

Ray Hu
3 min read · Sep 14, 2023


On August 17, 2023, a Cruise autonomous taxi got stuck in wet cement on a San Francisco street, raising a new round of questions about how much confidence we should place in truly autonomous vehicles. Any toddler can handle this perception task in mere seconds, yet it is remarkably hard for AI to do the same job.

Autonomous vehicles are a testament to the power of high-level logical programs that orchestrate complex driving decisions. These self-driving cars rely on multiple deep learning models, each with a specific role: one to detect safe, drivable paths, another to identify areas to avoid. Underpinning these algorithms is a myriad of sensors, including LiDAR, cameras, radar, and infrared. While these sensors can detect various aspects of the road environment, their ability to discern wet surfaces is not always straightforward.
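To picture how two such models might be combined, here is a minimal Python sketch. The function names, grid shapes, and thresholds are all hypothetical, not Cruise's actual stack; the point is only that the final "drive here" decision is a composition of separate model outputs.

```python
import numpy as np

# Minimal sketch (hypothetical names and shapes): two independent perception
# models vote on each grid cell of the road ahead of the vehicle.
def fuse_drivability(path_scores: np.ndarray,
                     hazard_scores: np.ndarray,
                     path_thresh: float = 0.7,
                     hazard_thresh: float = 0.3) -> np.ndarray:
    """Return a boolean grid of cells considered safe to drive.

    path_scores   -- per-cell probability the surface is a drivable path
    hazard_scores -- per-cell probability the cell should be avoided
    """
    drivable = path_scores >= path_thresh
    hazardous = hazard_scores >= hazard_thresh
    return drivable & ~hazardous

# Example: a 2x3 grid of road cells ahead of the car.
path = np.array([[0.9, 0.8, 0.2],
                 [0.9, 0.7, 0.1]])
hazard = np.array([[0.1, 0.6, 0.9],
                   [0.0, 0.1, 0.8]])
print(fuse_drivability(path, hazard))
# [[ True False False]
#  [ True  True False]]
```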

LiDAR sensors are primarily designed to measure distances: they emit laser pulses and measure the time it takes for those pulses to bounce back, thus creating a point cloud of data. While LiDAR sensors excel at providing accurate depth and distance information and can detect certain characteristics of objects and surfaces, such as shape and geometry, they are not typically used to directly measure wetness or moisture content. They have no notion of a surface's color or texture.
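The time-of-flight arithmetic itself is simple. Below is a small sketch of the range calculation and of turning one return into an (x, y, z) point; the function names and the beam angles in the example are illustrative only.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface from the pulse's round-trip time."""
    return C * round_trip_time_s / 2.0

def to_point(distance_m: float, azimuth_rad: float, elevation_rad: float) -> np.ndarray:
    """Convert one range measurement into an (x, y, z) point in the sensor frame."""
    return distance_m * np.array([
        np.cos(elevation_rad) * np.cos(azimuth_rad),
        np.cos(elevation_rad) * np.sin(azimuth_rad),
        np.sin(elevation_rad),
    ])

# A pulse returning after ~66.7 nanoseconds hit something roughly 10 m away.
d = lidar_range(66.7e-9)
print(round(d, 2), to_point(d, azimuth_rad=0.1, elevation_rad=-0.05))
```

Note what is missing: the point carries position and (in real sensors) return intensity, but nothing that says "this patch is wet" or "this patch is soft".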

Radar sensors are generally better than LiDAR at detecting hard, reflective surfaces, and they are likewise designed to pick out objects and obstacles on the road. Returns from wet, soft, or non-reflective materials are weaker and less reliable, although that very change in reflectivity can hint that a surface is wet.

Infrared sensors can detect variations in temperature. Wet or moist road surfaces often sit at a different temperature than dry ones: water has a high heat capacity, so it absorbs and releases heat more slowly than road pavement. As a result, a wet surface may show a slightly different temperature profile, and IR sensors can pick up these variations.
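A toy example of that idea, assuming hypothetical per-patch IR temperature readings and an arbitrary deviation threshold:

```python
import numpy as np

# Minimal sketch (hypothetical threshold): flag road patches whose infrared
# temperature deviates from the surrounding pavement, a possible sign of moisture.
def flag_thermal_anomalies(ir_temps_c: np.ndarray, deviation_c: float = 1.5) -> np.ndarray:
    """Mask of patches differing from the median pavement temperature by more
    than `deviation_c` degrees Celsius."""
    baseline = np.median(ir_temps_c)
    return np.abs(ir_temps_c - baseline) > deviation_c

# On a sunny afternoon dry asphalt might read ~35 C, while a freshly poured,
# still-wet patch stays noticeably cooler.
temps = np.array([35.2, 34.8, 35.0, 31.9, 35.1])
print(flag_thermal_anomalies(temps))  # [False False False  True False]
```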

Cameras capture visual information, and they can identify differences in color, texture, and reflectivity that indicate the presence of wet or damp areas, such as wet cement. But a wet surface can be perfectly fine to drive on: on rainy days, autonomous cars still cruise the streets. Cameras cannot tell whether the surface is soft or hard.
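To make the camera cue concrete, here is a heavily simplified sketch that flags image patches that look darker and smoother than the surrounding pavement. Real perception stacks use learned segmentation models; the hand-tuned statistics and synthetic image below are purely illustrative.

```python
import numpy as np

# Heavily simplified sketch: flag patches that are both darker and less
# textured (lower variance) than the typical pavement patch in the image.
def suspicious_patches(gray: np.ndarray, patch: int = 8) -> list[tuple[int, int]]:
    h, w = gray.shape
    coords, means, variances = [], [], []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            p = gray[y:y + patch, x:x + patch]
            coords.append((y, x))
            means.append(p.mean())
            variances.append(p.var())
    means, variances = np.array(means), np.array(variances)
    darker = means < np.median(means) - 10            # noticeably darker
    smoother = variances < np.median(variances) / 2   # noticeably less textured
    return [c for c, d, s in zip(coords, darker, smoother) if d and s]

# Synthetic 16x16 "road" image: noisy bright asphalt with one dark, smooth patch.
rng = np.random.default_rng(0)
road = rng.integers(150, 200, size=(16, 16)).astype(float)
road[8:16, 0:8] = 100.0  # the suspiciously uniform dark patch
print(suspicious_patches(road))  # [(8, 0)]
```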

Yet the true magic happens in the fusion of data and algorithms. It is not a single sensor but a combination of these sensors, working in harmony, that makes inferences about road conditions. This fusion extends beyond wetness to aspects like softness or the presence of ice, and it draws upon a wealth of contextual knowledge. When, on a sunny day, the infrared sensors find a patch of road with an abnormal temperature, the cameras see a texture unlike the rest of the surface, traffic cones surround it, and construction workers in uniform stand nearby, the autonomous system may wisely infer that the section is not meant for driving. In the intricate symphony of autonomous driving, each sensor plays a unique role, but it is their collective harmony, and the wisdom of the algorithm that fuses them, that remains the holy grail the industry's top talent pursues.
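One way to picture that fusion step is a simple rule layer over per-sensor cues. Everything below, the cue names and the rule itself, is a hypothetical sketch rather than how any production system actually decides; real systems learn such relationships or encode them far more carefully.

```python
from dataclasses import dataclass

# Hypothetical sketch: combine per-sensor cues with scene context
# instead of trusting any single sensor.
@dataclass
class RoadPatchCues:
    thermal_anomaly: bool    # from the infrared sensor
    texture_mismatch: bool   # from the camera
    weak_radar_return: bool  # from radar
    cones_nearby: bool       # from object detection
    workers_nearby: bool     # from object detection

def should_avoid(cues: RoadPatchCues) -> bool:
    """Decide whether a road patch should be treated as not meant for driving."""
    surface_suspicious = cues.thermal_anomaly or cues.texture_mismatch or cues.weak_radar_return
    construction_context = cues.cones_nearby or cues.workers_nearby
    # No single cue is decisive; a suspicious surface inside a construction
    # context is what tips the decision.
    return surface_suspicious and construction_context

print(should_avoid(RoadPatchCues(True, True, False, True, True)))    # True
print(should_avoid(RoadPatchCues(False, False, False, True, True)))  # False
```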


Written by Ray Hu

nobody satirist with abnormal knowledge of current affairs
