Single-grasp, model-free object classification using a hyper-adaptive hand, Google Soli, and tactile sensors

ResearchSpace/Manakin Repository


dc.contributor.author Flintoff, Z en
dc.contributor.author Johnston, B en
dc.contributor.author Liarokapis, Minas en
dc.coverage.spatial Madrid, Spain en
dc.date.accessioned 2019-05-28T01:55:55Z en
dc.date.issued 2018 en
dc.identifier.citation 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018), Madrid, Spain, 01 Oct 2018 - 05 Oct 2018. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE. Piscataway, NJ. 1943-1950. 2018 en
dc.identifier.isbn 978-1-5386-8094-0 en
dc.identifier.issn 2153-0858 en
dc.identifier.uri http://hdl.handle.net/2292/46647 en
dc.description.abstract Robots need to use their end-effectors not only to grasp and manipulate objects but also to understand the environment surrounding them. Object identification is of paramount importance in robotics applications, as it facilitates autonomous object handling, sorting, and quality inspection. In this paper, we present a new hyper-adaptive robot hand that is capable of discriminating, with a single grasp, between different everyday objects, as well as 'model' objects with the same external geometry but varying material, density, or volume. This work leverages all the benefits of simple, adaptive grasping mechanisms (robustness, simplicity, low weight, adaptability), a Random Forests classifier, tactile modules based on barometric sensors, and the radar technology offered by the Google Soli sensor. Unlike prior work, the method does not rely on object exploration, object release, or re-grasping, and it works for a wide variety of everyday objects. The feature space consists of the Google Soli readings, the motor positions, and the contact forces measured at different time instances of the grasping process. The whole approach is model-free, and the hand is controlled in an open-loop fashion, achieving stable grasps with minimal complexity. The efficiency of the designs, sensors, and methods has been validated through experimental paradigms involving model and everyday objects. en
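The abstract describes a feature space built by concatenating the Google Soli readings, motor positions, and contact forces sampled at several time instances of a single grasp, which is then fed to a Random Forests classifier. A minimal sketch of how such a single-grasp feature vector could be assembled (sensor names, dimensions, and the `grasp_feature_vector` helper are illustrative assumptions, not the authors' implementation):

```python
# Hypothetical sketch: build one flat feature vector per grasp by
# concatenating per-instant sensor snapshots. All names and
# dimensions here are assumptions for illustration only.

def grasp_feature_vector(snapshots):
    """Each snapshot is a dict holding readings for one time instance:
    'soli' (radar features), 'motors' (motor positions), and
    'forces' (tactile contact forces). Returns one flat feature list."""
    features = []
    for snap in snapshots:
        features.extend(snap["soli"])    # Google Soli radar readings
        features.extend(snap["motors"])  # actuator positions
        features.extend(snap["forces"])  # barometric tactile forces
    return features

# Example: three time instances sampled during the grasping process.
snaps = [
    {"soli": [0.12, 0.55], "motors": [0.3], "forces": [1.1, 0.9]},
    {"soli": [0.20, 0.61], "motors": [0.7], "forces": [2.4, 2.1]},
    {"soli": [0.25, 0.64], "motors": [0.9], "forces": [2.9, 2.6]},
]
vec = grasp_feature_vector(snaps)
# len(vec) == 3 instances * (2 + 1 + 2) readings = 15 features

# Such vectors (one per grasp) would then be used to train a
# Random Forests classifier, e.g. scikit-learn's
# RandomForestClassifier, with object labels as targets.
```

The per-instant concatenation matters because an adaptive, open-loop hand conveys object identity through how motor positions and contact forces evolve over the grasp, not through any single snapshot.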
dc.description.uri https://www.iros2018.org/ en
dc.publisher IEEE en
dc.relation.ispartof 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018) en
dc.relation.ispartofseries IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) en
dc.rights Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated. Previously published items are made available in accordance with the copyright policy of the publisher. en
dc.rights.uri https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm en
dc.rights.uri https://www.ieee.org/publications/rights/author-posting-policy.html en
dc.title Single-grasp, model-free object classification using a hyper-adaptive hand, Google Soli, and tactile sensors en
dc.type Conference Item en
dc.identifier.doi 10.1109/IROS.2018.8594166 en
pubs.begin-page 1943 en
dc.rights.holder Copyright: IEEE en
pubs.author-url https://ieeexplore.ieee.org/document/8594166 en
pubs.end-page 1950 en
pubs.finish-date 2018-10-05 en
pubs.start-date 2018-10-01 en
dc.rights.accessrights http://purl.org/eprint/accessRights/OpenAccess en
pubs.subtype Proceedings en
pubs.elements-id 765849 en
pubs.org-id Engineering en
pubs.org-id Mechanical Engineering en
dc.identifier.eissn 2153-0866 en
pubs.record-created-at-source-date 2019-08-12 en

