Dexterous Decision-Making for Real-World Robotic Manipulation

Location

Maeder Hall Auditorium

For a robot to prepare a meal or clean a room, it must make a large array of decisions, such as which objects to clean first, where to grasp each ingredient and tool, how to open a heavy, overstuffed cabinet, and so on. To enable robots to tackle these tasks, I decompose the problem into two interdependent layers: generating a series of subgoals (i.e., a strategy) and solving for the robot behavior that achieves each of these subgoals. Critically, to accomplish a rich set of manipulation tasks, these subgoal solvers must account for force, motion, deformation, contact, uncertainty, and partial observability.

My research contributes models and algorithms that enable robots to reason over both the geometry and physics of the world in order to solve long-horizon manipulation tasks. In this talk, I will first discuss how this approach has enabled robots to perform tasks that require reasoning over and exerting force, such as opening a childproof medicine bottle with a single arm. Next, I will present an abstraction for the complex physics of frictional pushing and demonstrate its application within the context of in-hand manipulation. Finally, I will illustrate how robots can make robust choices in the face of uncertainty, which, for example, empowers robots to reliably chop up fruit of unknown ripeness!

Speaker Bio

Rachel Holladay is a Ph.D. student in the Electrical Engineering and Computer Science Department at the Massachusetts Institute of Technology. Her research focuses on developing algorithms and models that enable robots to robustly perform long-horizon, contact-rich manipulation tasks in everyday environments. She received her B.S. in Computer Science and Robotics from Carnegie Mellon University.
Rachel Holladay, Massachusetts Institute of Technology

Hosting Group

MAE
