The Many Advantages of Radar

The main sensors in use on vehicles today are radar and cameras, with ultrasonic sensors handling short-range, low-speed tasks and lidar appearing in autonomous driving. Radar is widely used because it has some unique strengths.

Radar can reliably indicate how far away an object is. Typical long-range automotive radars can provide range measurements on objects as far as 300 to 500 meters away.
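
To put that in concrete terms, range follows directly from the round-trip travel time of the radar signal. The sketch below uses a simple pulsed time-of-flight picture (production automotive radars typically use FMCW waveforms, but the range relationship is equivalent); the delay value is purely illustrative.

    # Hypothetical sketch: range from the round-trip time of a radar signal.
    C = 299_792_458.0  # speed of light, m/s

    def range_from_round_trip(delay_s: float) -> float:
        # The signal travels out and back, so divide the path length by two.
        return C * delay_s / 2.0

    # An illustrative ~2 microsecond round trip corresponds to roughly 300 m.
    print(round(range_from_round_trip(2.0e-6)))  # -> 300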

Cameras, by contrast, have to estimate how far away an object is from its apparent size in the image and other cues. Even with a stereoscopic approach, that estimation is challenging. Resolution also becomes an issue: at long range a single pixel covers a wide swath of the scene, making distant objects harder to discern. Focusing optics can help, but they narrow the field of view, a compromise typical of camera-based perception systems.
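
As a rough illustration of why apparent size is a fragile cue, the standard pinhole-camera relation below shows how much a one-pixel error matters at long range; the focal length and object height are assumed values, not figures from this article.

    # Hypothetical sketch: monocular range from apparent size (pinhole model).
    def estimate_distance_m(focal_length_px: float,
                            real_height_m: float,
                            apparent_height_px: float) -> float:
        # distance = focal length (in pixels) * true height / height in the image
        return focal_length_px * real_height_m / apparent_height_px

    F_PX, HEIGHT_M = 1200.0, 1.5  # assumed focal length and object height
    print(estimate_distance_m(F_PX, HEIGHT_M, 12.0))  # ~150 m when the object spans 12 pixels
    print(estimate_distance_m(F_PX, HEIGHT_M, 6.0))   # ~300 m when it spans only 6 pixels
    print(estimate_distance_m(F_PX, HEIGHT_M, 5.0))   # ~360 m: a single-pixel error shifts the estimate by 60 m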

Radar also measures relative speed inherently, through the Doppler shift of the returned signal, so while it is reporting an object's range it can simultaneously tell how quickly that object is moving toward or away from the vehicle. Cameras and lidars typically need multiple frames over time to estimate relative speed.
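
For context, that direct speed measurement comes from the Doppler shift of the return. The sketch below assumes a 77 GHz carrier, which is common for automotive radar; the shift value is illustrative.

    # Hypothetical sketch: relative speed from the Doppler shift of a radar return.
    C = 299_792_458.0              # speed of light, m/s
    CARRIER_HZ = 77e9              # assumed 77 GHz automotive radar carrier
    WAVELENGTH_M = C / CARRIER_HZ  # roughly 3.9 mm

    def relative_speed_mps(doppler_shift_hz: float) -> float:
        # Factor of two because the wave travels to the target and back.
        return doppler_shift_hz * WAVELENGTH_M / 2.0

    # An illustrative ~5.1 kHz shift corresponds to about 10 m/s (36 km/h) of closing speed.
    print(round(relative_speed_mps(5.14e3), 1))  # -> 10.0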

Because radar uses radio waves instead of light to detect objects, it works well in rain, fog, snow and smoke. This stands in contrast to optical technologies such as cameras – or in the future, lidar – which are generally susceptible to the same challenges as the human eye. Consider the last time you were blinded by direct sunlight while driving, or tried to see clearly through a windshield covered with dirt and grime. Optical sensors have the same challenges, but radars can still see well in those cases. And unlike cameras, radar does not need a high-contrast scene or illumination to sense well at night.

Radar also provides an OEM significant packaging flexibility, thanks to its ability to work when placed behind opaque surfaces. Optical technologies need to be able to “see” the road, which requires them to be visible from the outside of a vehicle – preferably at a high point so they can have good line of sight and stay clear of road dirt and grime. Radar, by contrast, can be placed behind vehicle grilles, in bumpers, or otherwise hidden away, giving designers significant flexibility to focus on vehicle aesthetics.

Where to use optical sensors

Cameras are well suited for object classification. Only a camera can read street signs, and a camera is best at telling if an object is another vehicle, a pedestrian, a bicycle or even a dog. Each of those objects is going to behave differently, so the vehicle’s system will be better able to anticipate movements if it knows exactly what it is looking at.

Lidar has drawn attention because it offers some unique strengths. It can take direct range measurements at high resolution and form a grid, where each grid cell has a particular distance associated with it. Because lidar operates at a much higher frequency, it has a much shorter wavelength than traditional radar – and that means it can provide higher angle resolution than radar, allowing lidar to identify the edges of objects more precisely.
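
As a back-of-the-envelope illustration of that wavelength argument, diffraction limits angular resolution to roughly the wavelength divided by the aperture size. The aperture values below are assumptions chosen only to show the orders of magnitude involved, not specifications of any real sensor.

    # Hypothetical sketch: diffraction-limited angular resolution ~ wavelength / aperture.
    import math

    C = 299_792_458.0

    def angular_resolution_deg(wavelength_m: float, aperture_m: float) -> float:
        return math.degrees(wavelength_m / aperture_m)

    radar_wavelength_m = C / 77e9  # ~3.9 mm at an assumed 77 GHz carrier
    lidar_wavelength_m = 905e-9    # a common near-infrared lidar wavelength

    # Assumed apertures: ~10 cm radar antenna, ~2.5 cm lidar optic.
    print(angular_resolution_deg(radar_wavelength_m, 0.10))   # on the order of a couple of degrees
    print(angular_resolution_deg(lidar_wavelength_m, 0.025))  # on the order of millidegrees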

One downside of lidar is that it needs to have a clean and clear surface in front of it to be effective, which of course can be especially problematic on a moving vehicle. One unfortunate yet well-placed beetle could render a vehicle sightless.

An equally significant issue is that lidar is a less mature technology than radar, which means it’s much more expensive. The expense limits how widely lidar can be used in today’s high-volume automotive marketplace.

To ensure a reliable and safe solution, a vehicle should have access to a combination of different sensing technologies and then use sensor fusion to bring those inputs together to gain the best possible understanding of the environment. But even if that isn’t possible – if the cameras are smudged and the lidar is having bug-splatter issues – the radars in the vehicle can deliver excellent information, especially when paired with the right machine learning algorithms.
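
As a minimal sketch of the fusion idea, the snippet below combines two independent range estimates with inverse-variance weighting, so the more certain sensor dominates. Real automotive fusion stacks involve tracking, association and machine learning; the measurements and uncertainties here are assumed values for illustration only.

    # Hypothetical sketch: inverse-variance fusion of two independent range estimates.
    def fuse(estimate_a: float, var_a: float,
             estimate_b: float, var_b: float) -> tuple:
        # Weight each measurement by the inverse of its variance, so the
        # more certain sensor contributes more to the fused estimate.
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
        fused_var = 1.0 / (w_a + w_b)
        return fused, fused_var

    # Assumed: radar reports 101.0 m with ~0.5 m sigma, camera reports 95.0 m with ~5 m sigma.
    print(fuse(101.0, 0.5 ** 2, 95.0, 5.0 ** 2))  # fused value stays close to the radar estimate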

LEARN MORE ABOUT MACHINE LEARNING IN OUR WHITE PAPER

Authors
Rick Searcy
Advanced Radar Systems Manager
