Airline pilot is still one of the careers elementary school kids tell their teachers they want when they grow up, alongside firefighter and President of the United States. Pilots command a great deal of public respect for their skills, but what most people do not realize is that the majority of a flight is controlled by computers rather than by the pilot. Once the cockpit doors close, the pilots could take a nap and the passengers would not know the difference.
However, when it comes to driving, handing over control to autonomous vehicles will be a much more difficult transition: most of us drive regularly and are used to having cars respond to our every command. Statistics show that machine-driven vehicles boast better safety records than human-driven ones, but those figures have had little impact on public perception. Some of the companies developing autonomous vehicle technology have driven their cars millions of miles with fewer at-fault accidents than you can count on one hand. Human drivers have a hard time meeting that standard.
The reason we trust airplanes is that we see a human in command and assume a human mind is at work keeping us safe. Humans like to be in control, as evidenced by the fact that manual-transmission cars remain common despite advances in automatic-transmission technology.
To trust computers enough to let them drive us around, we need to abandon the assumption that we will make an instant switch from driver to driverless and accept that there will be a transitional period, starting with basic functions like automatic braking, lane-keeping assistance, and hands-off parking. Even airplanes took many years to transition functions away from pilots, who controlled takeoffs and landings manually until very recently. To earn the trust of passengers, the underlying vehicle technology must work to a high standard of excellence.
Just as the DMV makes every prospective driver take an eye exam, automated-car developers need to make sure their vehicles are not operating blindly. Several different technologies are being leveraged to help cars “see” the world around them and react to a changing environment. Each has its own limitations and reliability requirements.
- Cameras: The simplest of the current sensing methods, cameras use visible light to perceive the world. They must be supported by advanced machine-learning techniques for image recognition, but they are the most effective at recognizing shapes, such as obstacles in the road or letters on a sign. However, their effectiveness can be limited by conditions such as glare, heavy rain, or darkness, which affect a vehicle’s ability to accurately detect and respond to changes in its environment.
- Radar: Based on the same technology used for airplane navigation, these systems use electromagnetic waves to provide the most accurate measurements of the distance to physical objects. The more sensors on a vehicle, the more accurate the picture, but more sensors also increase the likelihood of mutual interference. Different frequencies can be used to develop different views of the world, from general long-range impressions to detailed close-range pictures.
- LIDAR: These systems use pulsed laser light and its reflections to build a precise picture of the world. They are best used for high-definition, close-range imaging. Like cameras, they are susceptible to environmental conditions, especially since changing light levels require different calibrations.
- V2X: Vehicle-to-vehicle, vehicle-to-road-signs, vehicle-to-cloud: in short, vehicle-to-everything communications. Connecting vehicles to the world around them over wireless links holds great potential for safer driving. With these connections, cars can know the intentions of other cars around them and adjust more efficiently than human reactions allow. They can also report changes in traffic flow beyond the line of sight of the vehicle’s other sensors, such as stopped or slowed traffic around a corner. Implementing this requires heavy investment in infrastructure and standardized communication across all vehicles. It will also place stringent requirements on the reliability of wireless connections, especially at highway speeds.
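To make the V2X idea concrete, here is a minimal, purely hypothetical sketch of the kind of status message one car might broadcast to its neighbors. Real standards (such as SAE J2735’s Basic Safety Message) define far richer payloads and binary encodings; every field name below is illustrative, not taken from any actual specification.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class V2VStatusMessage:
    """Hypothetical V2V broadcast. All fields are illustrative only."""
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float    # speed in meters per second
    heading_deg: float  # 0 = north, measured clockwise
    braking: bool       # True if the brakes are currently applied

    def to_json(self) -> str:
        # Serialize to a simple JSON payload for broadcast
        return json.dumps(asdict(self))

# A car ahead broadcasts that it is braking; a following car decodes it
msg = V2VStatusMessage("car-42", 51.5074, -0.1278, 13.4, 90.0, True)
decoded = json.loads(msg.to_json())
```

A receiving vehicle that sees `decoded["braking"]` set to true can begin slowing before its own sensors ever detect the closing gap, which is the core safety argument for V2X.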
While some companies invest more in their preferred sensor approach, the truth is that before cars can be fully autonomous they need all the information they can gather. Just as we use our sight and hearing to understand what is happening on the road ahead, supplemented with traffic updates from the radio, automated cars will need multiple senses to operate effectively. Each sense must meet the highest standards of accuracy and reliability. The road can be unpredictable even when all vehicles are moving in sync with each other, and tiny differences in how a signal is received can make a significant difference in how the car responds.
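One common way to combine those multiple senses is inverse-variance weighting: each sensor reports a measurement and an uncertainty, and the less uncertain sensor gets more say. The sketch below is a deliberately simplified illustration of that idea (production systems typically use full Kalman-style filters); the example sensor readings and variances are made up.

```python
def fuse_estimates(measurements):
    """Fuse (value, variance) pairs with inverse-variance weighting.

    Sensors with smaller variance (more confidence) receive more weight.
    Returns the fused value and the variance of the fused estimate.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * value for w, (value, _) in zip(weights, measurements)) / total
    fused_var = 1.0 / total  # fused estimate is more certain than any input
    return fused, fused_var

# Hypothetical readings of the distance to the car ahead:
# radar says 25.0 m (variance 0.5), camera says 27.0 m (variance 2.0)
fused, var = fuse_estimates([(25.0, 0.5), (27.0, 2.0)])
# fused lands nearer the radar's 25.0 m, since radar is the more confident sensor
```

Note that the fused variance is smaller than either input variance: combining senses does not just average opinions, it genuinely reduces uncertainty, which is why multi-sensor cars can outperform any single-sensor design.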
Current versions of these sensing systems carry as many as 24 distinct radar sensors. Each must be carefully tuned so it does not interfere with its neighbors while still producing the highest-fidelity signal it can. To do this, auto manufacturers will need to test rigorously against a range of environments: the sensors and their signals can be degraded by the chaos of surrounding signals, by weather such as snow or rain, and by hardware limitations. Despite these challenges, they must produce consistent and accurate results every time before humans will fully trust these systems to take the wheel.
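One simple way to keep two dozen radars from talking over each other is frequency separation: give each sensor its own slice of the available band. The sketch below assumes the 76–81 GHz band commonly allocated to automotive radar, but the per-sensor bandwidth figure is illustrative, and real designs also use techniques (like time slicing and waveform coding) that this toy model ignores.

```python
def assign_channels(n_sensors, band_start_ghz, band_end_ghz, channel_bw_ghz):
    """Evenly space sensor center frequencies so channels never overlap.

    Raises ValueError if the band is too narrow to fit n_sensors
    non-overlapping channels of the requested bandwidth.
    """
    band_width = band_end_ghz - band_start_ghz
    if n_sensors * channel_bw_ghz > band_width:
        raise ValueError("band too narrow for non-overlapping channels")
    step = band_width / n_sensors
    # Place each center in the middle of its slot
    return [band_start_ghz + step * (i + 0.5) for i in range(n_sensors)]

# 24 sensors sharing the 76-81 GHz band, each assumed to need 0.2 GHz
centres = assign_channels(24, 76.0, 81.0, 0.2)
```

Even this toy version shows why interference management gets harder as sensor counts climb: the guard space between channels shrinks as sensors are added, so each unit's frequency stability must be held to tighter tolerances.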
Car manufacturers now face growing demand for innovation and dependability in the electronic components of their vehicles, even more than in the mechanical ones. Getting these sensors right is key to the future of autonomous driving: it will secure the operational efficiency of driverless vehicles while earning human trust.