Let's talk about sensors

Juan Carlos Aguilera

As introduced in the post "What technologies are behind the autonomous vehicles?", many sensors are required for the operation of an autonomous vehicle.

Here we are going to give a brief introduction to the different sensors that are commonly used.

 

Let's start with a list of the (arguably) most relevant sensors:

  • Inertial Measurement Unit (IMU)

  • Camera

  • Radar 

  • Ultrasonic

  • Lidar

  • Global Navigation Satellite System (GNSS)

IMU: A group of sensors that measures the forces acting on the device at any given time. It is usually made up of:

  • Accelerometer: Measures acceleration (the change of velocity).

    • An IMU for an autonomous vehicle usually has three accelerometers, one for each axis: x, y, and z.

  • Gyroscope: Measures angular velocity (Figure 4).

    • An IMU for an autonomous vehicle usually has three gyroscopes, one for each axis: x, y, and z. The corresponding rotations are called roll, pitch, and yaw (Figure 5).

    • In an autonomous vehicle, the most relevant of these three measurements is usually the yaw.

  • Magnetometer (compass): Measures the direction, strength, or change of a magnetic field.

The inertial unit provides the change of velocity, the change of direction, and the current orientation of the vehicle, relative to a reference frame that is usually defined by where the IMU is mounted in the vehicle.
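
As a rough illustration, here is a minimal Python sketch (with made-up sample values, not a real IMU driver) of how gyroscope yaw-rate readings can be integrated over time into a heading estimate:

```python
def integrate_yaw(yaw_rates_dps, dt, initial_yaw_deg=0.0):
    """Integrate gyroscope yaw-rate samples (deg/s) into a heading estimate (deg).

    yaw_rates_dps: angular-velocity samples around the z axis.
    dt: sampling period in seconds.
    """
    yaw = initial_yaw_deg
    headings = []
    for rate in yaw_rates_dps:
        yaw += rate * dt                      # simple Euler integration
        yaw = (yaw + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
        headings.append(yaw)
    return headings

# Example: a constant 10 deg/s turn sampled at 100 Hz for 1 s ends near 10 deg of yaw.
print(integrate_yaw([10.0] * 100, dt=0.01)[-1])
```

In practice this simple integration drifts over time, which is one reason IMU data is usually combined with other sensors (see the Sensor Fusion section below).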

Cameras

A camera is a well-known device that provides optical information about the world around us. Mounted on a vehicle, a pair of cameras in a stereo configuration can provide 3D information about the world around the vehicle. It also provides enough information to detect and classify nearby objects, which is critical at the moment of making decisions.

 

An autonomous vehicle might have more than two cameras, but it will always have at least two.

Computer vision is a complete field of study on its own, which we are not going to detail right now, but it is important to know that there are many different algorithms that allow the relevant information to be extracted.


Figure 6: Stereo Vision Setup
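
To make the stereo idea more concrete, here is a minimal sketch using OpenCV's block-matching stereo. It assumes an already rectified grayscale image pair (the file names are placeholders) and made-up calibration values for focal length and baseline; a real system would use values from a proper calibration:

```python
import cv2
import numpy as np

# Hypothetical calibration values for a stereo rig; real ones come from calibration.
FOCAL_LENGTH_PX = 700.0   # focal length in pixels
BASELINE_M = 0.12         # distance between the two cameras in meters

def depth_from_stereo(left_gray, right_gray):
    """Return a per-pixel depth map (meters) from a rectified grayscale stereo pair."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan               # no match -> unknown depth
    return FOCAL_LENGTH_PX * BASELINE_M / disparity  # depth = f * B / disparity

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
depth = depth_from_stereo(left, right)
```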

Ultrasonic

Ultrasonic sensors are cheap and reliable and can achieve high accuracy in most scenarios. They are used for close object detection and usually have a sound emitter next to a sound receiver. The emitter sends a sound pulse, and the receiver measures the time it takes to come back after hitting an object; with that time it is possible to calculate the distance between the vehicle and the object. Some important limitations of ultrasonic sensors come from how sound waves propagate through the environment: sound is easily affected by weather conditions such as rain, fog, or snow, which can change the travel time of the sound wave, and therefore the distance will be miscalculated.
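
The distance calculation itself is simple: the sound travels to the object and back, so the range is half of the round-trip time multiplied by the speed of sound. A small sketch (the echo time in the example is made up):

```python
SPEED_OF_SOUND_M_S = 343.0  # at ~20 degrees C; varies with temperature and humidity

def ultrasonic_distance_m(echo_time_s):
    """Distance to the obstacle from the round-trip time of the sound pulse."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0  # divide by 2: out and back

# Example: an echo that returns after 5.8 ms corresponds to roughly 1 m.
print(ultrasonic_distance_m(0.0058))
```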

RADAR

Radio Detection and Ranging (RADAR): A RADAR is an electromagnetic sensor used for object detection; a radar usually has an emitter and a receiver. The emitter sends an electromagnetic signal, which travels through the environment and is reflected by a nearby object if one exists. Just like the ultrasonic sensor, the radar measures the travel time of that signal and calculates the distance to the object. In contrast with the ultrasonic sensor, the radar is not affected by weather changes, but it can be affected by electromagnetic fields; this is not common, but things like engines or other electrical devices can cause electromagnetic disturbances that may affect the RADAR measurements.

An autonomous vehicle typically has around eight different radars with different ranges for different purposes:

  • Ultra-short range (20 m): parking assist

  • Short range (80 m): blind spot detection, cross-traffic alert

  • Mid range (120 m): automatic emergency braking

  • Long range (250 m): adaptive cruise control

The RADAR system usually has a symmetrical configuration from left to right and from front to back; the main exception is the adaptive cruise control radar, which is usually only at the front.
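
The range calculation is the same idea as the ultrasonic one, only with the speed of light instead of the speed of sound. A small sketch that also labels a detection with the range tiers listed above (the echo time in the example is made up):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def radar_range_m(round_trip_time_s):
    """Target range from the round-trip time of the radio pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# The range tiers quoted above, used here only to label a detection.
RADAR_TIERS = [(20, "ultra-short range"), (80, "short range"),
               (120, "mid range"), (250, "long range")]

def tier_for(range_m):
    for max_range, name in RADAR_TIERS:
        if range_m <= max_range:
            return name
    return "out of range"

# Example: an echo after ~1 microsecond puts the target at ~150 m (long-range tier).
r = radar_range_m(1e-6)
print(r, tier_for(r))
```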


Figure 7: Car RADAR System

Why use both ultrasonic sensors and RADARs? On a safety-critical system such as an autonomous vehicle, it is always important to have redundancy, so multiple sensors that can detect the same thing under different circumstances are required. For roughly 90% of cases both sensors will return the same result, but in the remaining 10% they can return different information, and it is important to be able to get the right information in 99.9999% of the cases.

LIDAR

Light Detection and Ranging (LIDAR): The concept of LIDAR is the same as RADAR, but instead of radio waves it uses a laser (light) beam, which is sent by the emitter; the sensor then measures the time the light takes to come back.

 

One of the biggest benefits of LIDAR is that it can produce millions of samples per second, and those samples can be steered in two dimensions, up and down and left and right, by a mirror close to the emitter that changes the direction of the beam. This provides a high-resolution 3D point cloud of the objects near the vehicle.
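
Each LIDAR return is essentially a range plus the direction the beam was pointing. A minimal sketch of turning returns (range, azimuth, elevation) into x, y, z points of the point cloud, in a hypothetical sensor frame with x forward, y left, and z up:

```python
import numpy as np

def lidar_returns_to_points(ranges_m, azimuths_rad, elevations_rad):
    """Convert raw LIDAR returns (range, azimuth, elevation) to an (N, 3) point cloud."""
    r = np.asarray(ranges_m)
    az = np.asarray(azimuths_rad)
    el = np.asarray(elevations_rad)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=1)

# Example: one return 10 m away, 30 degrees to the left, level with the sensor.
print(lidar_returns_to_points([10.0], [np.radians(30)], [0.0]))
```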

LIDAR is a technology that is currently very costly, but it is highly effective and provides a lot of information that is hard to obtain in any other way. Tesla currently claims that it is possible to have a fully autonomous vehicle by replacing the LIDAR with cameras to reduce the total price of the vehicle, while almost all other companies say that this is not possible and that a LIDAR is required.

 

This is an ongoing debate in the autonomous vehicle community. Nevertheless, I believe that achieving the level of confidence that LIDAR provides using only cameras could be very costly and take a long time, especially because of particular corner cases that might not be easy to replicate, understand, or see; those specific corner cases will probably be required to comply with safety standards.

 

Even if it is possible to replace the LIDAR with cameras, I would ask: how much time and how many resources are going to be needed to make that happen? Is it really worth it?

A little bit more information about LiDAR can be found here.

 

Sensor Fusion

Sensor fusion is the term used for the different techniques that take the information from the different sensors (RADARs, LIDARs, cameras, IMU, etc.) and put it all together to make that information coherent and understandable.

 

Imagine that there is a car in front of the vehicle. That car may be detected by the RADAR, the LIDAR, and the cameras, but each one might report slightly different information at slightly different times, and you can draw a different conclusion from each of them. Maybe the RADAR can determine that something is 20 m in front of the vehicle but cannot tell what it is. The LIDAR may detect the same thing at 21 m and also provide its volume, but it could be a wall or a car; you don't know exactly. Finally, the cameras recognize that it is a car. If all of that information is put together, the system can classify the object and make a decision based on which information is most reliable from each sensor: use the classification from the camera images and the volume from the LIDAR, but take the distance reported by the RADAR, and with that combined information make a better decision.

That is why sensor fusion is a critical part of an autonomous vehicle's algorithms. Some of the most common sensor fusion algorithms are the Kalman Filter and the Unscented Kalman Filter; the information needs to be preprocessed before being sent to these algorithms.
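
To give a flavor of how a Kalman Filter smooths noisy measurements, here is a minimal 1-D constant-velocity Kalman filter sketch; the noise values, time step, and range readings are illustrative only, not taken from a real sensor:

```python
import numpy as np

# State = [distance, velocity] of the object in front of the vehicle.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (constant velocity)
H = np.array([[1.0, 0.0]])              # we only measure distance
Q = np.eye(2) * 0.01                    # process noise (illustrative)
R = np.array([[0.5]])                   # measurement noise, e.g. RADAR range

x = np.array([[20.0], [0.0]])           # initial guess: 20 m ahead, not moving
P = np.eye(2)                           # initial state uncertainty

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Fuse a few noisy range readings into a smoother distance/velocity estimate.
for z in [19.8, 20.1, 19.7, 19.5]:
    x, P = kalman_step(x, P, np.array([[z]]))
print(x.ravel())  # filtered [distance, velocity]
```

The Unscented Kalman Filter follows the same predict/update pattern but handles nonlinear motion and measurement models better.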

Global Navigation Satellite System (GNSS)

GNSS is a group of sensors that makes it possible to communicate with multiple satellite systems such as GPS, GLONASS, Galileo, BeiDou, and others. This allows the vehicle to get its position in real time, no matter where it is, and with high precision.

GPS is the U.S. satellite system and is probably the best known around the world, but it is not the only one; there are other systems owned by other countries or groups of countries, like GLONASS, the Russian satellite system, or Galileo, the European satellite system. This redundancy of information drastically improves the performance of the GNSS.
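
Once the receiver gives latitude and longitude, a common small computation is the distance between two fixes. A minimal sketch using the haversine formula (the coordinates in the example are made up):

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GNSS fixes given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Example: two fixes a fraction of a degree apart, roughly 70-80 m from each other.
print(haversine_m(48.8566, 2.3522, 48.8570, 2.3530))
```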


Figure 11: Tesla GPS Map.

Source: Unsplash

Author: Brecht Denil

Keep an eye on the upcoming posts, as we are going to keep diving deeper into the different procedures and technologies of autonomous vehicles.

Please feel free to leave any comments with questions or suggestions in the section below.