Tool use has long been a hallmark of human intelligence, as well as a practical problem to solve for a vast array of robotic applications. But machines still struggle to exert just the right amount of force to control tools that aren't rigidly attached to their hands.
To manipulate such tools more robustly, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), have designed a system that can grasp tools and apply the appropriate amount of force for a given task, like squeegeeing up liquid or writing out a word with a pen.
The system, dubbed Series Elastic End Effectors, or SEED, uses soft bubble grippers and embedded cameras to map how the grippers deform over a six-dimensional space (think of an airbag inflating and deflating) as they apply force to a tool. With six degrees of freedom, a grasped object can be moved left and right, up and down, and back and forth, and rotated in roll, pitch, and yaw. The closed-loop controller (a self-regulating system that maintains a desired state without human intervention) uses SEED and visuotactile feedback to adjust the position of the robot arm so that it applies the desired force.
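In effect, the soft bubble gripper behaves like a six-dimensional spring between the hand and the tool. Here is a minimal sketch of that series-elastic idea, not the authors' implementation: the stiffness matrix below is hypothetical and hand-picked, whereas the real one is identified from data, as described later in this article.

```python
import numpy as np

# Hypothetical 6x6 stiffness matrix for the soft gripper, mapping a
# 6D deformation (x, y, z, roll, pitch, yaw) to a 6D wrench (forces
# in N, torques in N*m). In practice this is identified from data,
# not hand-written.
K = np.diag([300.0, 300.0, 500.0,   # translational stiffness (N/m)
             4.0, 4.0, 2.0])        # rotational stiffness (N*m/rad)

def estimate_wrench(deformation: np.ndarray) -> np.ndarray:
    """Series-elastic model: wrench = K @ deformation.

    `deformation` is the tool's 6D displacement relative to its rest
    pose in the gripper, recovered from the bubble camera images.
    """
    return K @ deformation

# Example: the tool has been pushed 5 mm into the gripper along z
# and tilted 0.02 rad in pitch.
deformation = np.array([0.0, 0.0, 0.005, 0.0, 0.02, 0.0])
print(estimate_wrench(deformation))  # -> [0, 0, 2.5, 0, 0.08, 0]
```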
This could be useful, for example, for someone using tools when there's uncertainty in the height of a table, since a pre-programmed trajectory might miss the table completely. "We've been heavily relying on the work of Mason, Raibert, and Craig on what we call a hybrid force-position controller," says Hyung Ju Suh, a PhD student in electrical engineering and computer science at MIT, CSAIL affiliate, and lead author on a new paper about SEED. "That's the idea that if you actually had three dimensions to move in when you're writing on a chalkboard, you want to be able to control position on some of the axes, while controlling force on the other axis."
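A minimal sketch of that hybrid idea, under assumed names and gains (none of which come from the paper): a selection matrix flags each axis as position-controlled or force-controlled, and the two command streams are blended accordingly.

```python
import numpy as np

# Chalkboard example: control position along x and y (where on the
# board to write) and force along z (how hard to press). 1 selects
# position control on that axis, 0 selects force control.
S = np.diag([1.0, 1.0, 0.0])

def hybrid_command(x_des, x_meas, f_des, f_meas,
                   kp_pos=1.0, kp_force=0.001):
    """Return a small motion command that tracks the desired position
    on the selected axes and the desired force on the rest.
    Gains are illustrative, not tuned values from the paper.
    """
    position_term = kp_pos * (x_des - x_meas)   # position error (m)
    force_term = kp_force * (f_des - f_meas)    # force error -> motion (m)
    return S @ position_term + (np.eye(3) - S) @ force_term

# Usage: hold the pen's x, y position on the board; press with 2 N
# along z while only 1.2 N is currently sensed.
delta = hybrid_command(x_des=np.array([0.3, 0.1, 0.0]),
                       x_meas=np.array([0.3, 0.1, 0.0]),
                       f_des=np.array([0.0, 0.0, 2.0]),
                       f_meas=np.array([0.0, 0.0, 1.2]))
```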
Rigid-bodied robots can only take us so far; softness and compliance afford the ability to deform, and with it the ability to sense the interaction between the tool and the hand.
With SEED, at every execution the robot senses a recent 3D image from the grippers, tracking in real time how the grippers change shape around an object. These images are used to reconstruct the position of the tool, and the robot uses a learned model to map that position to the measured force. The learned model comes from the robot's previous experience, where it presses against a force-torque sensor to figure out how stiff the bubble grippers are. Once the robot has sensed the force, it compares it with the force the user commands, and may say to itself, "it turns out the force that I'm sensing right now is not quite there. I need to press harder." It then moves in the direction that increases the force, all over 6D space.
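Putting those pieces together, the loop described above might look like the following sketch. The interfaces (`robot`, `gripper`, and their methods) and the gains are hypothetical placeholders, not the authors' API: read the bubble images, estimate the tool's deformation, map it through the learned stiffness model to a sensed wrench, compare against the commanded wrench, and nudge the arm to shrink the error.

```python
import numpy as np

def regulate_wrench(robot, gripper, wrench_desired, K,
                    gain=0.002, tol=0.05):
    """Admittance-style force regulation over 6D space.

    `robot` and `gripper` are hypothetical hardware interfaces; `K`
    is the learned 6x6 stiffness matrix, identified beforehand by
    pressing the grippers against a force-torque sensor.
    """
    while True:
        # 1. Reconstruct the tool's 6D deformation from bubble images.
        deformation = gripper.estimate_tool_deformation()

        # 2. Learned model: deformation -> sensed wrench.
        wrench_sensed = K @ deformation

        # 3. "The force I'm sensing is not quite there": compute the
        #    error and stop once it is small enough.
        error = wrench_desired - wrench_sensed
        if np.linalg.norm(error) < tol:
            break

        # 4. Move the arm a small step in the direction that brings
        #    the sensed force toward the target.
        robot.move_relative(gain * error)
```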
During the "squeegee task," SEED applied the right amount of force to wipe up liquid on a plane, where baseline methods struggled to get the right sweep. When asked to put pen to paper, the bot effectively wrote out "MIT," and it was also able to apply the right amount of force to drive in a screw.
While SEED can command the force or torque needed for a given task, there is an upper limit on how hard it can squeeze: grasp a tool too firmly and it inevitably slips. There is also an asymmetry in compliance: a stiff robot can simulate systems softer than its natural mechanical stiffness, but not the other way around.
Currently, the system assumes a very specific tool geometry: the tool has to be cylindrical, and there are still many limitations on how well it generalizes to new shapes. Future work might involve generalizing the framework to different shapes so it can handle arbitrary tools in the wild.
"Nobody will be surprised that compliance can help with tools, or that force sensing is a good idea; the question here is where on the robot the compliance should go and how soft it should be," says paper co-author Russ Tedrake, the Toyota Professor of Electrical Engineering and Computer Science, Aeronautics and Astronautics, and Mechanical Engineering at MIT and a principal investigator at CSAIL. "Here we explore regulating a quite-soft six degree-of-freedom stiffness directly at the hand/tool interface, and show that there are some nice advantages to do that."
Suh wrote the paper alongside Naveen Kuppuswamy, a senior research scientist at Toyota Research Institute; Tao Pang, a mechanical engineering PhD student at MIT and CSAIL affiliate; Paul Mitiguy and Alex Alspach of TRI; and Tedrake. They will present the work at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in October.
Toyota Research Institute provided funds to support this work.