Abstract:
Robots need to use their end-effectors not only to grasp and manipulate objects but also to understand the environment around them. Object identification is of paramount importance in robotics applications, as it facilitates autonomous object handling, sorting, and quality inspection. In this paper, we present a new hyper-adaptive robot hand that can discriminate, with a single grasp, between different everyday objects, as well as between `model' objects that share the same external geometry but vary in material, density, or volume. This work combines the benefits of simple, adaptive grasping mechanisms (robustness, simplicity, low weight, adaptability) with a Random Forests classifier, tactile modules based on barometric sensors, and the radar technology of the Google Soli sensor. Unlike prior work, the method does not rely on object exploration, object release, or re-grasping, and it works for a wide variety of everyday objects. The feature space consists of the Google Soli readings, the motor positions, and the contact forces measured at different instants of the grasping process. The whole approach is model-free, and the hand is controlled in an open-loop fashion, achieving stable grasps with minimal complexity. The efficiency of the proposed designs, sensors, and methods has been experimentally validated in paradigms involving both model and everyday objects.
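To make the classification pipeline concrete, the following is a minimal sketch (not the authors' code) of single-grasp object identification with a Random Forests classifier. All dimensions and data are hypothetical stand-ins: each grasp is represented by a feature vector concatenating assumed Soli radar readings, motor positions, and contact forces sampled at several instants of the grasping process.

    # Minimal sketch: single-grasp object classification with Random Forests.
    # All sensor dimensions, counts, and data below are hypothetical stand-ins,
    # not values from the paper.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    T = 5            # assumed number of sampled instants during the grasp
    SOLI_DIM = 8     # assumed Soli radar channels per instant
    MOTORS = 2       # assumed motor position encoders
    FORCES = 4       # assumed barometric tactile (contact force) modules
    N_OBJECTS = 10   # assumed number of object classes
    GRASPS = 200     # assumed number of recorded grasps

    # Synthetic stand-in data: one row per grasp, per-instant features
    # (Soli readings, motor positions, contact forces) flattened over time.
    X = rng.normal(size=(GRASPS, T * (SOLI_DIM + MOTORS + FORCES)))
    y = rng.integers(0, N_OBJECTS, size=GRASPS)

    # Train a Random Forests classifier and report held-out accuracy.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

With real grasp recordings in place of the synthetic matrix, the same flattened feature layout would let a single grasp yield one feature vector and one predicted object label.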