Cambridge researchers gain upper hand in robotics design
Researchers from the University of Cambridge have designed what they claim to be a low-cost, energy-efficient robotic hand that can grasp a range of objects using just the movement in its wrist and the feeling in its ‘skin’.
As experts working in the field acknowledge, grasping objects of different sizes, shapes and textures may be easy for a human hand, but it is a major challenge for a robot.
While humans instinctively know how to gently handle an egg without shattering it, robots require training to recognise the right amount of force required. Typically, research in this area has involved hands with motorised fingers.
Researchers at Professor Fumiya Iida’s Bio-Inspired Robotics Laboratory – which looks at how robotics can be improved by taking inspiration from nature – have designed a soft, 3D-printed robotic hand trained to grasp different objects.
While the hand cannot move its fingers independently, it was still able to carry out a range of complex movements, and it could predict whether it was about to drop an object using the information provided by tactile sensors placed on its ‘skin’.
The scientists argued in the journal Advanced Intelligent Systems that this type of passive movement makes the robot easier to control and far more energy efficient than expensive robots with fully motorised fingers.
The lab’s research team, which is part of Cambridge’s engineering department, carried out more than 1,200 tests with the robotic hand, observing its ability to grasp small objects without dropping them.
The robot was initially trained using small 3D-printed plastic balls, and grasped them using a pre-defined action obtained through human demonstrations.
“This kind of hand has a bit of springiness to it: it can pick things up by itself without any actuation of the fingers,” said one of the report’s authors, Dr Kieran Gilday.
“The tactile sensors give the robot a sense of how well the grip is going, so it knows when it’s starting to slip. This helps it to predict when things will fail,” he added.
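The paper does not publish its detection code, but the idea Gilday describes – spotting the onset of a slip from falling grip pressure – can be sketched roughly as follows. The function name, window size, and threshold are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of slip detection from tactile pressure readings.
# Thresholds and window sizes are illustrative, not from the paper.

def detect_slip(pressure_history, window=5, drop_threshold=0.15):
    """Flag a likely slip when recent grip pressure falls sharply.

    pressure_history: normalised pressure readings (0..1), newest last.
    Returns True if the mean of the last `window` samples has dropped
    by more than `drop_threshold` relative to the preceding window.
    """
    if len(pressure_history) < 2 * window:
        return False  # not enough history to compare yet
    prev = sum(pressure_history[-2 * window:-window]) / window
    recent = sum(pressure_history[-window:]) / window
    return (prev - recent) > drop_threshold
```

A controller polling such a check each cycle could trigger a corrective wrist motion, or abort the grasp, before the object is fully dropped.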
The robot used trial and error to learn what kind of grip would be successful. After finishing the training with the balls, it then attempted to grasp different objects including a peach, a computer mouse and a roll of bubble wrap.
In these tests, the hand was able to successfully grasp 11 out of 14 objects.
“The sensors, which are sort of like the robot’s ‘skin’, measure the pressure being applied to the object,” explained report co-author George Thuruthel.
“We can’t say exactly what information the robot is getting, but it can theoretically estimate where the object has been grasped and with how much force.”
“The robot learns that a combination of a particular motion and a particular set of sensor data will lead to failure, which makes it a customisable solution. The hand is very simple, but it can pick up a lot of objects with the same strategy,” Gilday added.
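One simple way to picture the learning Gilday describes – associating a motion and a set of sensor readings with a past outcome – is a nearest-neighbour lookup over remembered attempts. This is a hypothetical sketch for illustration only; the class, feature encoding, and similarity measure are assumptions, not the authors' method:

```python
# Illustrative trial-and-error learner: stores (features, failed) pairs
# from past grasp attempts and predicts failure for a new attempt by
# finding the most similar remembered one. Hypothetical, not the paper's code.
import math


class GraspOutcomePredictor:
    def __init__(self):
        self.memory = []  # list of (feature_vector, failed) pairs

    def record(self, features, failed):
        """Store one attempt: its sensor/motion features and whether it failed."""
        self.memory.append((list(features), failed))

    def predict_failure(self, features):
        """Return the outcome of the most similar past attempt."""
        if not self.memory:
            return False  # no experience yet: optimistically assume success
        nearest = min(self.memory, key=lambda m: math.dist(m[0], features))
        return nearest[1]
```

Under this framing, the same strategy transfers across objects because only the feature/outcome pairs change, not the control logic, which matches the quote's point about a simple hand picking up many objects with one strategy.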
The researchers added that, in the future, the system could be expanded in several ways, adding computer vision capabilities, for instance, or teaching the robot to exploit its environment, which would enable it to grasp a wider range of objects.
The work was funded by the national funding agency UK Research and Innovation and the Cambridge-based chip designer Arm Ltd.