Introduction to three sensor technologies in self-driving cars
Autonomous driving is a rapidly evolving and controversial technology. At one extreme, it is believed that autonomous vehicles will ensure a better future by increasing road safety, reducing infrastructure costs, and improving independent mobility for children, the elderly, and people with disabilities. At the other extreme, many people fear car hacking, the risk of fatal accidents, and the loss of driving-related jobs. A survey by the Pew Research Center found that 54% of adults are worried about the development of autonomous vehicles, while only 40% of respondents are optimistic about the potential of automotive automation. Research has also shown that people hold very different perceptions of, and attitudes toward, autonomous vehicles.
There is no doubt that autonomous driving is a complex and controversial technology. To understand the safety of self-driving cars, it's important to figure out how they work and what types of autonomous vehicle sensors can help them drive and identify objects on the road to prevent accidents. But first, let's look at the different levels of autonomous vehicles.
5 levels of driving automation
From driver assistance to full automation, there are five recognized levels of autonomous vehicles. They were developed by the Society of Automotive Engineers (SAE) and vary according to the level of human involvement in driving. Strictly speaking, the SAE classification has six levels, but Level 0 means there is no automation and the vehicle is completely controlled by a human.
L1: Driver assistance
The human driver is responsible for all driving tasks, including acceleration, steering, braking, and monitoring the surrounding environment. A driver-assistance system in the car can help with either steering or acceleration, but not both at the same time; common examples are cruise control and automatic parking.
L2: Partial automation
At this level, the car can assist both steering and acceleration while the driver remains responsible for most safety-critical functions and environmental monitoring. Currently, Level 2 autonomous vehicles are most common on the road.
L3: Conditional Automation
Starting at Level 3, the car itself uses autonomous vehicle sensors to monitor the environment and perform other dynamic driving tasks such as braking. In the event of a system failure or other unexpected situation, the driver must be prepared to intervene manually.
L4: Highly automated
Level 4 means a high degree of automation: even in demanding situations, the car can complete the entire journey without driver intervention. However, there are restrictions: the driver can switch the vehicle into this mode only when the system detects that traffic conditions are safe and there are no traffic jams.
L5: Fully automated
Fully automated cars do not yet exist, but automakers are striving to achieve Level 5 autonomous driving, where the driver only needs to specify a destination and the vehicle takes full responsibility for all driving. Accordingly, Level 5 cars have no manual controls such as a steering wheel or pedals.
It's all about sensors
Without IoT sensors, self-driving cars would be impossible: sensors enable cars to see and sense everything on the road and to gather the information they need to drive safely. This information is then processed and analyzed to construct a path from point A to point B and to send the appropriate instructions to the vehicle's controls, such as steering, acceleration, and braking. Information collected by IoT sensors, including actual paths, traffic jams, and obstacles on the road, can also be shared among connected cars. This is called vehicle-to-vehicle communication and helps improve driving automation.
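The idea of sharing sensor observations between vehicles can be sketched in a few lines. This is a minimal illustration, not any real V2V protocol: the message fields and the `merge_obstacle_reports` helper are assumptions chosen for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class V2VMessage:
    """One illustrative status broadcast in a vehicle-to-vehicle exchange."""
    vehicle_id: str
    lat: float             # GPS latitude, degrees
    lon: float             # GPS longitude, degrees
    speed_mps: float       # current speed, metres per second
    obstacles: list = field(default_factory=list)  # free-text hazard reports

def merge_obstacle_reports(messages):
    """Combine obstacle reports shared by nearby vehicles into one deduplicated set."""
    seen = set()
    for msg in messages:
        seen.update(msg.obstacles)
    return seen

# Two nearby cars report partially overlapping hazards:
reports = [
    V2VMessage("car-A", 52.52, 13.40, 13.9, ["pothole at km 12"]),
    V2VMessage("car-B", 52.53, 13.41, 8.3, ["traffic jam", "pothole at km 12"]),
]
print(merge_obstacle_reports(reports))  # both hazards, duplicates removed
```

Real V2V stacks (e.g. DSRC or C-V2X) add signing, timing, and standardized message formats; the point here is only that shared observations extend what any single car's sensors can see.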
Most automakers today typically use three types of self-driving car sensors: cameras, radar, and lidar.
▲ Camera sensor
How they work
Just like the eyes of a human driver, self-driving cars use cameras to observe and interpret objects on the road. By equipping cars with cameras at various angles, these vehicles can maintain a 360° view of the external environment and get a wider picture of the surrounding traffic conditions. Today, 3D cameras can display very detailed, lifelike images. The image sensor automatically detects objects, classifies them, and determines the distance to them. For example, cameras can identify other cars, pedestrians, cyclists, traffic signs and signals, road markings, bridges, and guardrails.
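One common way a camera system can estimate distance to a classified object is the pinhole-camera relation: an object's apparent size in pixels shrinks in proportion to its distance. The sketch below is a simplified illustration; the focal length and pedestrian height are made-up example values, and production systems combine this with stereo vision or learned depth estimation.

```python
def distance_from_camera(real_height_m, focal_length_px, image_height_px):
    """Pinhole-camera range estimate:
    distance = (real object height * focal length) / height in pixels."""
    return real_height_m * focal_length_px / image_height_px

# Illustrative numbers: a pedestrian assumed to be 1.7 m tall appears
# 85 px tall through a lens with a 1000 px focal length -> about 20 m away.
print(distance_from_camera(1.7, 1000, 85))  # ~20 m
```

This also hints at the weakness noted below: the estimate depends on correctly detecting and classifying the object first, which is exactly what fails in low contrast or bad weather.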
Areas for improvement
Unfortunately, camera sensors are not perfect. Adverse weather conditions such as rain, fog, or snow prevent a camera from seeing things on the road clearly, increasing the chance of an accident. In addition, camera images alone are often not enough for the computer to make the right decision; for example, the driving algorithm may fail when an object's color is similar to the background or the contrast is low.
▲ Radar sensor
How they work
Radar (radio detection and ranging) sensors make a vital contribution to the overall functionality of autonomous driving: they emit radio waves to detect objects and measure their distance and speed in real time. Short-range and long-range radar sensors are usually deployed throughout the car and serve different functions. Short-range (24 GHz) radar enables blind-spot monitoring, lane-keeping assistance, and parking assistance, while long-range (77 GHz) radar enables automatic distance control and brake assistance. Unlike cameras, radar systems usually have no problem identifying objects in foggy or rainy weather.
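The two core radar measurements follow from simple physics: distance comes from the round-trip time of the echo, and relative speed from the Doppler shift of the reflected wave. A minimal sketch (the example echo time and Doppler shift are illustrative numbers, not values from any specific sensor):

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range_m(round_trip_s):
    """Distance from echo round-trip time: the wave travels out and back,
    so the one-way distance is half of (speed * time)."""
    return C * round_trip_s / 2

def radial_speed_mps(doppler_shift_hz, carrier_hz):
    """Relative (closing) speed from the Doppler shift of the reflection;
    the factor of 2 accounts for the shift occurring on both legs."""
    return C * doppler_shift_hz / (2 * carrier_hz)

# An echo returning after 1 microsecond puts the object about 150 m away.
print(radar_range_m(1e-6))
# A ~5.13 kHz shift on a 77 GHz long-range radar is roughly 10 m/s closing speed.
print(radial_speed_mps(5.13e3, 77e9))
```

These relations also explain the 24 GHz vs. 77 GHz split mentioned above: a higher carrier frequency produces a larger Doppler shift for the same speed, giving long-range radar finer velocity resolution.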
Areas for improvement
Current automotive radar sensors can correctly recognize only about 95% of pedestrians, which is not enough to ensure safety, so pedestrian-recognition algorithms need further improvement. In addition, the widely used 2D radar can only scan horizontally and therefore cannot determine the height of an object, which may cause problems when driving under a bridge. The 3D radar currently under development is expected to solve this problem.
▲ Lidar sensor
How they work
Lidar (light detection and ranging) sensors work like radar systems; the only difference is that they use laser light instead of radio waves. In addition to measuring the distance to various objects on the road, lidar can create 3D images of the detected objects and map the surroundings. Lidar can also be configured to build a complete 360-degree map around the vehicle instead of relying on a narrow field of view. These advantages have led self-driving car developers such as Google, Uber, and Toyota to choose lidar systems.
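The 3D mapping works because each lidar return carries a range plus the known direction of the laser beam, which converts directly into a 3D point; sweeping the beam 360 degrees yields a point cloud of the surroundings. A minimal sketch of that conversion (the axis convention is an assumption: x forward, y left, z up):

```python
import math

def lidar_point_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (measured range + known beam angles)
    into a 3D point in the vehicle frame: x forward, y left, z up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A return at 10 m, straight ahead, on the horizontal beam:
print(lidar_point_to_xyz(10.0, 0.0, 0.0))  # (10.0, 0.0, 0.0)
```

A spinning lidar repeats this for millions of returns per second across many beam angles, which is what produces the detailed 360-degree map described above.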
Areas for improvement
Since the production of lidar sensors requires rare earth metals, they are much more expensive than radar sensors. The cost of the systems required for autonomous driving can well exceed $10,000, while the top IoT sensors used by Google and Uber can cost as much as $80,000. Another problem is that snow or fog can block lidar sensors and negatively affect their ability to detect objects.
The future of self-driving cars
Self-driving car sensors play a vital role in autonomous driving: they enable cars to monitor the surrounding environment, detect obstacles, and plan routes. Combined with automotive software and computers, they will allow the system to take full control of the vehicle, freeing people to spend that time on more productive tasks. Considering that the average driver spends about 50 minutes a day in a car, imagine how valuable an autonomous car is in the fast-paced world we live in.
Despite the rapid development of autonomous driving technology, no commercially available vehicle has yet met the Level 4 standard required for autonomous driving. To ensure road safety, manufacturers still need to take major areas of technological improvement seriously.