A Robot to Manipulate Deformable Materials
CAMBRIDGE, MA—Robots have been a major focus in the technology world for decades, but they are just beginning to penetrate our everyday world. Before we see widespread adoption of robots for common tasks, they must be able to work in dynamic, changing environments.
Robots first made a dramatic impact in factory settings, where they automate fixed processes in controlled environments. Next, automation came to aircraft navigation and control, where the environment is relatively simple and there are few obstacles to avoid.
Today, we see mobile delivery robots and self-driving cars beginning to navigate crowded streets and sidewalks. However, even in these complex navigation tasks, autonomous robots have very limited interactions with their environment—they largely avoid obstacles. The next major challenge in robotics is to manipulate objects in these unstructured environments, effectively bringing the tremendous power of factory automation to everyday life.
“Equipping a robot to make decisions and accomplish tasks in everyday environments, which have not been designed around the robot as they have in factories, has been an open challenge in the robotics community for quite some time,” said David M.S. Johnson, a robotic systems engineer who led development of the system at Draper. “Over the last few years, the field has made dramatic progress, and we have deployed autonomous systems in complex, real-world environments.”
Robot density worldwide is indeed on the rise. On the job, 74 robots are currently at work for every 10,000 employees, a 20 percent jump since 2015, reports the International Federation of Robotics. In the coming years, one of the top growth sectors for robots, according to McKinsey, will be food service, a sector whose tasks have so far been too complex and too decision-intensive for robots.
To demonstrate real-world applicability, Draper adapted techniques from an ongoing research project in object manipulation to food service. Once built, Draper’s robot was put to work demonstrating its ability to manipulate objects in dynamic environments by performing a classic summer job: scooping ice cream. The robot, named Alfred, performed the task hundreds of times at the annual Hannover Messe show, where it was a finalist for the KUKA Innovation Award, which honors outstanding concepts in human-machine collaboration outside the industrial environment.
“People lined up just to be served ice cream by a robot,” Johnson said. “It’s a challenge for a robot to plan a task in a dynamic environment, let alone do it hundreds of times while customers are waiting for ice cream.”
Draper’s approach to autonomous operation in unstructured environments depends on the robot’s ability to map its own location with respect to objects of interest, terrain and obstacles. This sensing and perception system uses a learned algorithm to recognize and classify each element in the environment according to how it relates to the current task.
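The article does not include Draper’s code; purely as a rough sketch of the idea described above, the Python below shows one way a perception stage could sort a learned detector’s output into task-relevant objects and obstacles. Every name here (Detection, TASK_RELEVANT, classify_for_task) is a hypothetical stand-in, not Draper’s actual API.

```python
from dataclasses import dataclass

# Hypothetical output of a learned object detector: a class label,
# a confidence score, and a 3D position in the robot's frame.
@dataclass
class Detection:
    label: str
    confidence: float
    position: tuple  # (x, y, z) in meters

# Illustrative task-relevance table: which object classes matter
# for the current task (e.g., scooping ice cream).
TASK_RELEVANT = {
    "scoop_ice_cream": {"tub", "scoop", "cup", "counter"},
}

def classify_for_task(detections, task, min_confidence=0.5):
    """Split detections into task-relevant objects and obstacles."""
    relevant, obstacles = [], []
    for det in detections:
        if det.confidence < min_confidence:
            continue  # discard low-confidence detections
        if det.label in TASK_RELEVANT.get(task, set()):
            relevant.append(det)
        else:
            obstacles.append(det)  # everything else is something to avoid
    return relevant, obstacles

# Example: two detections, one relevant to scooping, one an obstacle.
dets = [Detection("tub", 0.92, (0.4, 0.1, 0.0)),
        Detection("napkin_holder", 0.81, (0.2, -0.3, 0.0))]
relevant, obstacles = classify_for_task(dets, "scoop_ice_cream")
print(relevant, obstacles)
```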
The system allows a robot to recognize and locate a large library of objects relevant to an everyday task—for example, scooping and serving ice cream or preparing a salad. Based on the objects’ locations and its knowledge of how to perform the task, the robot creates a detailed motion plan to accomplish each step of the task—its own custom set of instructions. These plans are adaptable to the environment and change in response to external events.
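Likewise, as a minimal illustration of the plan-and-adapt loop described above, and not Draper’s actual planner: the sketch below regenerates a motion plan whenever perception reports that tracked objects have moved. make_plan, scene_changed and the one-dimensional scene are invented for brevity; a real system would plan over full 6-DOF poses.

```python
def make_plan(task, scene):
    """Hypothetical planner: return an ordered list of motion steps
    computed from object locations in the current scene."""
    return [f"{step} at {scene[obj]}" for step, obj in [
        ("move_above", "tub"), ("scoop", "tub"), ("deposit", "cup")]]

def execute(step):
    print("executing:", step)

def scene_changed(old_scene, new_scene, tol=0.05):
    """Replan if any tracked object moved more than tol meters
    (positions are 1D here for brevity)."""
    return any(abs(new_scene[k] - old_scene[k]) > tol for k in old_scene)

# Plan-execute-monitor loop: replan whenever perception reports
# that the environment has changed since the plan was made.
scene = {"tub": 0.40, "cup": 0.10}
plan = make_plan("scoop_ice_cream", scene)
while plan:
    new_scene = {"tub": 0.40, "cup": 0.10}  # stand-in for a perception update
    if scene_changed(scene, new_scene):
        scene = new_scene
        plan = make_plan("scoop_ice_cream", scene)  # adapt to the change
    execute(plan.pop(0))
```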
At Hannover Messe, KUKA described Draper’s new system as “the first ergonomic and dynamically reconfigurable human-robot collaboration framework.” Alfred is an outgrowth of an ongoing research collaboration among Draper, Prof. Russ Tedrake’s Robot Locomotion Group at MIT, Prof. Leslie Kaelbling and Prof. Tomas Lozano-Perez’s Learning and Intelligent Systems Group at MIT, Prof. Antonio Torralba’s Vision Research Group at MIT and Prof. Scott Kuindersma at the Agile Robotics Lab at the Harvard School of Engineering and Applied Sciences.
Released July 17, 2018