Monday, July 9, 2018

Draper Debuts All-Weather Detection for LiDAR

Giving self-driving cars the ability to see in rain, snow and fog

CAMBRIDGE, MA – One of the biggest challenges to developing reliable self-driving cars is equipping them with technology so that they can “see” in rain, snow or fog. A self-driving car that relies on optical imaging technologies to distinguish between common objects like vehicles and pedestrians can run into major problems when it encounters misty conditions on the road.

Sabrina Mansur, program manager in autonomous systems at Draper, says imaging systems for self-driving cars are improving, but more needs to be done. “LiDAR is one of the more critical technologies for automated driving, and automakers are asking for performance improvements in two areas: range and signal processing. In other words, currently available LiDAR sensors typically can’t see far enough down the road or differentiate what they see well enough in bad weather to be used reliably in self-driving cars.”

LiDAR works by emitting pulsed laser light and measuring the time it takes for the light to return after reflecting off objects in its path. A LiDAR system’s sensor creates 3D images of the objects and their locations, such as cars on a highway. However, rain, snow and fog scatter the laser light and create false signals, effectively blinding the LiDAR sensor. Until now, LiDAR sensors could be used only during clear weather because no one had been able to solve the problem.
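For readers unfamiliar with the principle, the short Python sketch below illustrates the basic time-of-flight calculation described above. The numbers are hypothetical and this is not Draper’s implementation.

```python
# Minimal time-of-flight sketch: distance from a laser pulse's round-trip time.
# Illustrative only -- not Draper's implementation; the example value is made up.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a pulse's round-trip time."""
    # The pulse travels out and back, so the object's range is half the total path.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# Example: a return arriving roughly 363 nanoseconds after emission
# corresponds to an object about 54 meters away.
print(round(range_from_round_trip(363e-9), 1))  # ~54.4
```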

Recently, a team of Draper engineers tackled this problem and successfully demonstrated a LiDAR system that can see through dense fog. In one test, the team filled a hockey rink with fog so dense that human vision could barely make out an object 30 meters away. Draper’s LiDAR system could see objects on the far side of the rink, 54 meters away, almost twice that distance.

“LiDAR needs to record not just the first thing it hits, but subsequent things, so that it can reconstruct a whole field of vision accurately,” said Eric Balles, director of transport and energy at Draper. “Our system can detect and analyze the entire range of signals, which is something most LiDARs can’t do right now.”
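As a simplified illustration of why recording the full return matters, the following Python sketch contrasts a naive first-return detector with one that examines the entire waveform. The fog-clutter assumption and all values are hypothetical; this is not Draper’s Hemera algorithm.

```python
# Simplified illustration (not Draper's Hemera algorithm): fog scatters the pulse
# and produces early, weak returns; keeping the full waveform lets a detector
# pick the stronger target echo instead of the first thing it hits.

from typing import List, Optional

def first_return(waveform: List[float], threshold: float) -> Optional[int]:
    """Naive detector: index of the first sample above threshold."""
    for i, intensity in enumerate(waveform):
        if intensity > threshold:
            return i
    return None

def strongest_return(waveform: List[float], threshold: float) -> Optional[int]:
    """Full-waveform detector: index of the strongest sample above threshold."""
    peak = max(range(len(waveform)), key=lambda i: waveform[i])
    return peak if waveform[peak] > threshold else None

# Synthetic waveform: early fog backscatter (samples 2-5), target echo at sample 40.
waveform = [0.0] * 60
for i, v in [(2, 0.3), (3, 0.5), (4, 0.4), (5, 0.2), (40, 0.9)]:
    waveform[i] = v

print(first_return(waveform, 0.25))      # 2  -> fog clutter, wrong range
print(strongest_return(waveform, 0.25))  # 40 -> target echo
```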

Draper’s all-weather LiDAR technology, named Hemera, has been developed for compatibility with most existing LiDAR platforms, enabling vendors to extend their technology development investments. Draper is actively engaging with carmakers and suppliers to develop systems for self-driving cars and is helping them add Hemera to existing LiDAR systems. Hemera received investment from Draper as an internal research and development program. The project is led by Joseph Hollmann, Ph.D., senior scientist for computational imaging systems development at Draper.

Draper will introduce the new LiDAR technology at the Automated Vehicles Symposium 2018, July 9 to 12, in San Francisco. The new offering, which is available to license, adds to Draper’s growing portfolio of autonomous system and self-driving car capabilities. The portfolio includes the Draper APEX Gyroscope—a MEMS gyroscope that provides centimeter-level localization accuracy. It also includes LiDAR-on-a-Chip—a chip-scale MEMS-based LiDAR that is an affordable and scalable solution with the performance to enable a driverless car to travel safely at highway speeds.

Draper engineers have developed a LiDAR imaging system called Hemera that can gauge the distance of objects shrouded by fog up to 54 meters away. Engineers placed a target on a vehicle grille 54.5 meters from Draper’s new LiDAR system, and then filled the room with obscuring fog. (Left) Ideal LiDAR image of target, (center) LiDAR image prior to processing with Draper’s Hemera LiDAR technology, (right) LiDAR image with Hemera processing reveals target and additional detail of vehicle’s grille in background. The new Hemera LiDAR sensor adds to Draper’s growing portfolio of autonomous system and self-driving car capabilities that includes the Draper APEX Gyroscope and Draper’s LiDAR-on-a-Chip.
Capabilities Used
Autonomous Systems

Draper combines mission planning, PN&T, situational awareness, and novel GN&C designs to develop and deploy autonomous platforms for ground, air, sea and undersea needs. These systems range in complexity from human-in-the-loop to systems that operate without any human intervention. Designing these systems generally involves decomposing mission needs into sets of scenarios, conducting trade studies, and arriving at an optimized solution with key performance requirements. Draper continues to advance the field of autonomy through research in mission planning, sensing and perception, mobility, learning, real-time performance evaluation and human trust in autonomous systems.

Microsystems

Draper has designed and developed microelectronic components and systems going back to the mid-1980s. Our integrated, ultra-high density (iUHD) modules of heterogeneous components deliver system functionality in the smallest form factor possible through integration of commercial-off-the-shelf (COTS) technology with Draper-developed custom packaging and interconnect technology. Draper continues to pioneer custom Microelectromechanical Systems (MEMS), Application-Specific Integrated Circuits (ASICs) and custom radio frequency components for both commercial (microfluidic platforms, organ assist, drug development, etc.) and government (miniaturized data collection, new sensors, micro-sats, etc.) applications. Draper features a complete in-house iUHD and MEMS fabrication capability and has existing relationships with many other MEMS and microelectronics fabrication facilities.

Image & Data Analytics

Draper combines specific domain expertise with knowledge of how to apply the latest analytics techniques to extract meaningful information from raw data and better understand complex, dynamic processes. Our system design approach encompasses effective organization and processing of large data sets, automated analysis using algorithms and exploitation of results. To facilitate user interaction with these processed data sets, Draper applies advanced techniques to automate understanding and correlation of patterns in the data. Draper’s expertise encompasses machine learning (including deep learning), information fusion from diverse and heterogeneous data sources, optimized coupling of data acquisition and analysis, and novel methods for analysis of imagery and video data.

Media Contact

Media Relations, Strategic Communication
P: 617-258-2464
C: 617-429-2883