Multisensorial robot calibration framework and toolbox
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F21%3A00352664" target="_blank" >RIV/68407700:21230/21:00352664 - isvavai.cz</a>
Alternative codes found
RIV/68407700:21730/21:00352664
Result on the web
<a href="https://doi.org/10.1109/HUMANOIDS47582.2021.9555803" target="_blank" >https://doi.org/10.1109/HUMANOIDS47582.2021.9555803</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/HUMANOIDS47582.2021.9555803" target="_blank" >10.1109/HUMANOIDS47582.2021.9555803</a>
Alternative languages
Result language
English
Original language name
Multisensorial robot calibration framework and toolbox
Original language description
The accuracy of robot models critically impacts their performance. With the advent of collaborative, social, or soft robots, the stiffness of the materials and the precision of the manufactured parts drop and CAD models provide a less accurate basis for the models. On the other hand, the machines often come with a rich set of powerful yet inexpensive sensors, which opens up the possibility for self-contained calibration approaches that can be performed autonomously and repeatedly by the robot. In this work, we first extend the theory dealing with robot kinematic calibration by incorporating new sensory modalities (e.g., cameras on the robot, whole-body tactile sensors), calibration types, and their combinations. We provide a unified formulation that makes it possible to combine traditional approaches (external laser tracker, constraints from contact with the external environment) with self-contained calibration available to humanoid robots (self-observation, self-contact) in a single framework and single cost function. Second, we present an open-source toolbox for Matlab that provides this functionality, along with additional tools for preprocessing (e.g., dataset visualization) and evaluation (e.g., observability/identifiability). We illustrate some of the possibilities of this tool through calibration of two humanoid robots (iCub, Nao) and one industrial manipulator (dual-arm setup with Yaskawa-Motoman MA1400).
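To make the idea of a "single cost function" combining several calibration chains concrete, here is a minimal Python sketch (not the paper's Matlab toolbox or its API): a planar 2-link arm whose joint offsets are estimated by stacking residuals from an external end-effector measurement chain and a contact-with-plane constraint into one nonlinear least-squares problem. The arm geometry, modality weights, and all names are illustrative assumptions only.

```python
# Minimal sketch of multi-chain kinematic calibration with one stacked cost
# function (illustrative only; not the paper's Matlab toolbox).
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.30, 0.25                     # nominal link lengths [m] (assumed)
TRUE_OFF = np.array([0.03, -0.02])      # simulated joint-offset errors [rad]
rng = np.random.default_rng(0)

def fk(q, off):
    """Planar 2-link forward kinematics with joint angle offsets."""
    a1 = q[0] + off[0]
    a12 = a1 + q[1] + off[1]
    return np.array([L1 * np.cos(a1) + L2 * np.cos(a12),
                     L1 * np.sin(a1) + L2 * np.sin(a12)])

# Chain 1: external 2-D measurements of the end effector (e.g. a tracker).
q_ext = rng.uniform(-1.0, 1.0, size=(20, 2))
p_ext = np.array([fk(q, TRUE_OFF) for q in q_ext]) + rng.normal(0, 1e-3, (20, 2))

# Chain 2: contact with the plane y = 0; only joint readings are recorded,
# the "measurement" is the constraint y(q) = 0.
q_con = []
for q1 in rng.uniform(-0.8, 0.8, size=15):
    a1 = q1 + TRUE_OFF[0]
    a12 = np.arcsin(np.clip(-L1 * np.sin(a1) / L2, -1, 1))  # solve y = 0
    q_con.append([q1, a12 - a1 - TRUE_OFF[1]])
q_con = np.array(q_con)

def residuals(off, w_ext=1 / 1e-3, w_con=1 / 5e-4):
    """Single cost: residuals from both chains, weighted per modality."""
    r_ext = [(fk(q, off) - p) * w_ext for q, p in zip(q_ext, p_ext)]
    r_con = [fk(q, off)[1] * w_con for q in q_con]
    return np.concatenate([np.ravel(r_ext), np.array(r_con)])

sol = least_squares(residuals, x0=np.zeros(2))
print("estimated offsets [rad]:", sol.x)  # should be close to TRUE_OFF
```

In this toy setup, adding or removing a calibration chain only changes which residual blocks are stacked, which is the same design idea the abstract describes for combining external, self-observation, and self-contact measurements.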
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
20204 - Robotics and automatic control
Result continuities
Project
<a href="/en/project/GX20-24186X" target="_blank" >GX20-24186X: Whole-body awareness for safe and natural interaction: from brains to collaborative robots</a><br>
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Others
Publication year
2021
Confidentiality
S - Complete and true project data are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
2020 IEEE-RAS 20th International Conference on Humanoid Robots (Humanoids)
ISBN
978-1-7281-9372-4
ISSN
2164-0572
e-ISSN
2164-0580
Number of pages
8
Pages from-to
459-466
Publisher name
IEEE
Place of publication
Piscataway
Event location
Munich
Event date
Jul 19, 2021
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
000728400200054