Giving self-driving cars the ability to see in rain, snow and fog
CAMBRIDGE, MA—One of the biggest challenges to developing reliable self-driving cars is equipping them with technology so that they can “see” in rain, snow or fog. When a self-driving car relies on optical imaging technologies to distinguish between common objects like vehicles and pedestrians, major problems can arise when the car encounters misty conditions on the road.
Sabrina Mansur, program manager in autonomous systems at Draper, says imaging systems for self-driving cars are improving, but more needs to be done. “LiDAR is one of the more critical technologies for automated driving, and automakers are asking for performance improvements in two areas: range and signal processing. In other words, currently available LiDAR sensors typically can’t see far enough down the road or differentiate what they see well enough in bad weather to be used reliably in self-driving cars.”
LiDAR works by emitting pulsed laser light and measuring the time it takes for the light to return after being reflected from objects in the light’s path. A LiDAR system’s sensor creates 3D images of the objects and their locations, such as cars on a highway. However, rain, snow and fog scatter the laser light and create false signals, effectively blinding the LiDAR sensor. Until now, LiDAR sensors could be used reliably only in clear weather, because no one had solved the scattering problem.
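The time-of-flight principle described above can be sketched in a few lines. This is a textbook illustration of how any pulsed LiDAR converts a return time into a distance, not Draper's implementation; the 200-nanosecond example value is chosen only because it works out to roughly the 30-meter visibility distance mentioned later in this article.

```python
# Illustrative time-of-flight range calculation for a pulsed LiDAR.
# A textbook sketch of the principle, not any vendor's implementation.

C = 299_792_458.0  # speed of light in meters per second

def range_from_return_time(round_trip_seconds: float) -> float:
    """Distance to a reflector, given the pulse's round-trip time.

    The pulse travels out to the object and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return C * round_trip_seconds / 2.0

# A return arriving 200 nanoseconds after the pulse was emitted
# corresponds to an object roughly 30 meters away:
distance = range_from_return_time(200e-9)
print(round(distance, 1))  # ~30.0 meters
```

In clear air there is one strong return per pulse and this calculation is all that is needed; the difficulty described in this article is that fog droplets each produce their own weak reflections, burying the true return among false ones.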
Recently, a team of Draper engineers tackled this problem and successfully demonstrated a LiDAR system that can see through dense fog. In one test, the team filled a hockey rink with fog so dense that human vision could barely make out an object 30 meters away. Draper’s LiDAR system detected objects 54 meters away, almost twice that distance, on the other side of the rink.
“LiDAR needs to record not just the first thing it hits, but subsequent things, so that it can reconstruct a whole field of vision accurately,” said Eric Balles, director of transport and energy at Draper. “Our system can detect and analyze the entire range of signals, which is something most LiDARs can’t do right now.”
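The idea Balles describes, analyzing the entire return signal rather than stopping at the first echo, can be illustrated with a simple peak-finding sketch. This is a hypothetical toy example, not Draper's Hemera algorithm: the sample waveform, threshold, and bin values are all invented to show why a first-return-only sensor fails in fog while a full-waveform analysis does not.

```python
# Hypothetical sketch of full-waveform return analysis.
# Instead of keeping only the first echo, scan the entire digitized
# return for every peak above a noise threshold.

def find_echoes(waveform, threshold):
    """Return (bin_index, amplitude) for every local peak above threshold."""
    echoes = []
    for i in range(1, len(waveform) - 1):
        if (waveform[i] > threshold
                and waveform[i] >= waveform[i - 1]
                and waveform[i] > waveform[i + 1]):
            echoes.append((i, waveform[i]))
    return echoes

# Simulated return (invented values): a broad early bump from fog
# backscatter near the sensor, then a sharper, stronger echo from a
# solid object farther down the range.
waveform = [0, 1, 3, 5, 4, 2, 1, 0, 0, 2, 7, 9, 6, 1, 0]

echoes = find_echoes(waveform, threshold=4)
print(echoes)  # both peaks are reported; a first-return-only sensor
               # would stop at the early fog peak and miss the object
```

The design point is the one Balles makes: recovering the later, object peak requires recording and processing the full waveform, which most current LiDARs do not do.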
Draper’s all-weather LiDAR technology, named Hemera, has been developed for compatibility with most existing LiDAR platforms, allowing vendors to build on their existing technology investments. Draper is actively engaging with carmakers and suppliers developing systems for self-driving cars and helping them add Hemera to existing LiDAR systems. Hemera received investment from Draper as an internal research and development program. The project is led by Joseph Hollmann, Ph.D., senior scientist for computational imaging systems development at Draper.
Draper will introduce the new LiDAR technology at the Automated Vehicles Symposium 2018, July 9 to 12, in San Francisco. The new offering, which is available to license, adds to Draper’s growing portfolio of autonomous system and self-driving car capabilities. The portfolio includes the Draper APEX Gyroscope—a MEMS gyroscope that provides centimeter-level localization accuracy. It also includes LiDAR-on-a-Chip—a chip-scale MEMS-based LiDAR that is an affordable and scalable solution with the performance to enable a driverless car to travel safely at highway speeds.