We are in the midst of a digital revolution in the automotive industry. Over the past decade, more and more sensors have been placed in vehicles, providing rich data on vehicle state and behavior. In recent years, new sensors have been added that point outward, toward the vehicle’s environment, offering support to drivers and providing data that can empower the entire mobility economy.
Understanding the surroundings with vehicle data
These so-called ADAS (Advanced Driver Assistance Systems) rely on cameras, radar, LiDAR, ultrasonic, and infrared (IR) sensors to build a clear picture of the vehicle’s surroundings. They are the precursors of fully autonomous cars (so-called level 4 and 5 vehicles), but they already provide value to drivers, and their data can power valuable insights and use cases – including real-time accident detection, road-sign information, and more.
One key sensor set to rise in popularity soon is the imaging radar. High-resolution radars use a variety of technologies to increase the typical 6 channels of current automotive radars to more than 192 – providing a finer, more detailed image and greater range.
High-resolution imaging radars are coupled with laser-based LiDAR (Light Detection and Ranging) sensors, which operate at much shorter wavelengths and provide better accuracy and precision. The “regular” cameras that vehicles are equipped with are also improving: there is a move toward 8 MP cameras to replace the 1.3 MP units common in current models. The higher-resolution cameras also boast a wider dynamic range, enabling better target detection at much longer distances and under tougher lighting conditions.
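To get a feel for what the jump from 1.3 MP to 8 MP means for the onboard compute and networking that must process these feeds, here is a back-of-the-envelope sketch. The frame rate and bit depth are illustrative assumptions, not figures from any specific vehicle platform:

```python
# Rough comparison of raw camera data rates at the two resolutions
# mentioned above. Frame rate (30 fps) and bit depth (12 bits/pixel)
# are assumed values for illustration only.

def camera_data_rate_mbps(megapixels: float, fps: int = 30, bits_per_pixel: int = 12) -> float:
    """Raw (uncompressed) data rate in megabits per second."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e6

legacy = camera_data_rate_mbps(1.3)  # common 1.3 MP sensor
modern = camera_data_rate_mbps(8.0)  # emerging 8 MP sensor

print(f"1.3 MP camera: ~{legacy:,.0f} Mbps raw")
print(f"8 MP camera:   ~{modern:,.0f} Mbps raw")
print(f"Increase:       {modern / legacy:.1f}x")
```

Under these assumptions, the raw stream grows roughly sixfold per camera, which is part of why higher-resolution sensors drive demand for more in-vehicle processing power.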
This combination gives vehicle AI the spatial awareness behind the emergence of Super Cruise Control – hands-free, partially automated highway driving assist systems (providing level 2 – or what some have dubbed level 2+ – autonomous capabilities).
Overcoming weather and other car sensor disruptions
One major challenge that still plagues these sensors is the weather. Driving under less-than-ideal conditions severely degrades the effectiveness of many of them. Radar sensors are the least susceptible and handle even harsh conditions well, but even the newer imaging radars do not provide enough information to operate without the assistance of other sensor systems. LiDAR’s weaknesses in harsh weather can be partially compensated for, while cameras face the greatest challenges: significant amounts of dirt and mud can cake and block apertures without the driver being aware that the sensors have been compromised.
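One common way to handle this is to weight each sensor’s input by how trustworthy it is under the current conditions, so that a degraded camera drags the overall estimate down less than a still-reliable radar. The sketch below is a minimal, illustrative version of that idea; the sensor names, reliability weights, and weather categories are all assumed values, not parameters from any production ADAS stack:

```python
# Minimal sketch of confidence-weighted sensor fusion: each sensor's
# detection confidence is weighted by an assumed reliability score for
# the current weather. All numbers here are illustrative assumptions.

# Per-sensor reliability under different conditions (assumed values).
RELIABILITY = {
    "clear": {"radar": 0.9, "lidar": 0.9, "camera": 0.95},
    "heavy_rain": {"radar": 0.85, "lidar": 0.5, "camera": 0.3},
}

def fuse_confidence(detections: dict, weather: str) -> float:
    """Combine per-sensor detection confidences into a single score,
    weighting each sensor by its assumed reliability for the weather."""
    weights = RELIABILITY[weather]
    total_weight = sum(weights[s] for s in detections)
    return sum(conf * weights[s] for s, conf in detections.items()) / total_weight

# The same raw detections fuse differently depending on the weather:
# in heavy rain the low-confidence camera reading is down-weighted,
# so the fused score leans more heavily on radar.
detections = {"radar": 0.8, "lidar": 0.6, "camera": 0.4}
print(fuse_confidence(detections, "clear"))
print(fuse_confidence(detections, "heavy_rain"))
```

Real systems fuse full object tracks rather than single scores, but the underlying principle – re-weighting sensors as conditions change – is the same.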
Better safety with driver-centric detectors
Looking further into the future, new pressure-detection sensors embedded in the driver and passenger seats will be able to monitor occupants’ condition – alerting drivers when they become drowsy or unwell, and taking proactive action in cases such as heart attacks, seizures, and more.