Efficient Visuo-Haptic Object Shape Completion for Robot Manipulation
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F23%3A00370922" target="_blank" >RIV/68407700:21230/23:00370922 - isvavai.cz</a>
Result on the web
<a href="https://doi.org/10.1109/IROS55552.2023.10342200" target="_blank" >https://doi.org/10.1109/IROS55552.2023.10342200</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/IROS55552.2023.10342200" target="_blank" >10.1109/IROS55552.2023.10342200</a>
Alternative languages
Result language
English
Original language name
Efficient Visuo-Haptic Object Shape Completion for Robot Manipulation
Original language description
For robot manipulation, a complete and accurate object shape is desirable. Here, we present a method that combines visual and haptic reconstruction in a closed-loop pipeline. From an initial viewpoint, the object shape is reconstructed using an implicit surface deep neural network. The location with the highest uncertainty is selected for haptic exploration, the object is touched, the new information from touch and a new point cloud from the camera are added, the object position is re-estimated, and the cycle is repeated. We extend Rustler et al. (2022) with a new, theoretically grounded method to determine the points with the highest uncertainty, and we increase the yield of every haptic exploration by adding not only the contact points to the point cloud but also incorporating the empty space established through the robot's movement toward the object. Additionally, the solution is compact in that the jaws of a closed two-finger gripper are used directly for exploration. The object position is re-estimated after every robot action, and multiple objects can be present on the table simultaneously. We achieve a steady improvement with every touch on three different metrics and demonstrate the utility of the better shape reconstruction in grasping experiments on a real robot. On average, the grasp success rate increases from 63.3% to 70.4% after a single exploratory touch and to 82.7% after five touches. The collected data and code are publicly available (https://osf.io/j6rkd/, https://github.com/ctu-vras/vishac).
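The abstract describes the closed-loop pipeline only at a high level. As an illustration, the following self-contained Python sketch mimics the loop in a toy simulation: a unit sphere stands in for the object, the network's uncertainty estimate is replaced by a simple nearest-observed-point distance heuristic, and a touch simply returns the nearest true surface point. All names (`uncertainty`, `haptic_touch`, etc.) are hypothetical and do not correspond to the released vishac API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ground-truth object: points sampled on a unit sphere (assumption).
pts = rng.normal(size=(2000, 3))
surface = pts / np.linalg.norm(pts, axis=1, keepdims=True)

def uncertainty(cloud, candidates):
    """Hypothetical proxy for per-point uncertainty: distance of each
    candidate to the nearest observed point. The paper derives uncertainty
    from the implicit-surface network; this heuristic only illustrates
    the selection step."""
    d = np.linalg.norm(candidates[:, None, :] - cloud[None, :, :], axis=-1)
    return d.min(axis=1)

def haptic_touch(target):
    """Simulated touch: the closed gripper contacts the true surface at
    the point nearest to the commanded target."""
    i = np.argmin(np.linalg.norm(surface - target, axis=1))
    return surface[i]

# Initial visual point cloud: only the camera-facing side is observed.
cloud = surface[surface[:, 2] > 0.2]

for t in range(5):  # five exploratory touches, as in the experiments
    u = uncertainty(cloud, surface)
    target = surface[np.argmax(u)]       # location with highest uncertainty
    contact = haptic_touch(target)       # haptic exploration
    cloud = np.vstack([cloud, contact])  # fuse the new contact point
    # The full pipeline also adds the swept empty space and a fresh camera
    # cloud at this point, and re-estimates the object pose.
    print(f"touch {t + 1}: max residual uncertainty {u.max():.3f}")
```

In the actual method, each iteration would additionally carve out the free space swept by the gripper, merge a new camera point cloud, and re-estimate the object position before the next touch, as the abstract states.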
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
20204 - Robotics and automatic control
Result continuities
Project
<a href="/en/project/EF16_019%2F0000765" target="_blank" >EF16_019/0000765: Research Center for Informatics</a>
Continuities
P - Research and development project financed from public sources (with a link to CEP)<br>S - Specific university research
Others
Publication year
2023
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
ISBN
978-1-6654-9190-7
ISSN
2153-0858
e-ISSN
2153-0866
Number of pages
8
Pages from-to
3121-3128
Publisher name
IEEE
Place of publication
Piscataway
Event location
Detroit, MI
Event date
Oct 1, 2023
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
001133658802045