Online planning for multi-robot active perception with self-organising maps

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F18%3A00316691" target="_blank" >RIV/68407700:21230/18:00316691 - isvavai.cz</a>

  • Result on the web

    <a href="https://link.springer.com/article/10.1007/s10514-017-9691-4" target="_blank" >https://link.springer.com/article/10.1007/s10514-017-9691-4</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1007/s10514-017-9691-4" target="_blank" >10.1007/s10514-017-9691-4</a>

Alternative languages

  • Result language

    English

  • Original language name

    Online planning for multi-robot active perception with self-organising maps

  • Original language description

    We propose a self-organising map (SOM) algorithm as a solution to a new multi-goal path planning problem for active perception and data collection tasks. We optimise paths for a multi-robot team that aims to maximally observe a set of nodes in the environment. The selected nodes are observed by visiting associated viewpoint regions defined by a sensor model. The key problem characteristics are that the viewpoint regions are overlapping polygonal continuous regions, each node has an observation reward, and the robots are constrained by travel budgets. The SOM algorithm jointly selects and allocates nodes to the robots and finds favourable sequences of sensing locations. The algorithm has a runtime complexity that is polynomial in the number of nodes to be observed and the magnitude of the relative weighting of rewards. We show empirically the runtime is sublinear in the number of robots. We demonstrate feasibility for the active perception task of observing a set of 3D objects. The viewpoint regions consider sensing ranges and self-occlusions, and the rewards are measured as discriminability in the ensemble of shape functions feature space. Exploration objectives for online tasks where the environment is only partially known in advance are modelled by introducing goal regions in unexplored space. Online replanning is performed efficiently by adapting previous solutions as new information becomes available. Simulations were performed using a 3D point-cloud dataset from a real robot in a large outdoor environment. Our results show the proposed methods enable multi-robot planning for online active perception tasks with continuous sets of candidate viewpoints and long planning horizons.

  • Czech name

  • Czech description
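
To make the SOM approach in the description above concrete, the following is a minimal single-robot sketch of the classic self-organising-map idea for multi-goal path planning: a ring of neurons is repeatedly pulled toward goal points, and the final ring order gives a visiting sequence. This illustrates the underlying principle only, not the paper's algorithm, which additionally handles polygonal viewpoint regions, observation rewards, travel budgets, multiple robots, and online replanning; all names and parameter values below are assumptions.

    import numpy as np

    def som_tour(goals, n_neurons=None, iters=150, lr=0.8, rng_seed=0):
        """Order 2D goal points into a tour with a self-organising map.

        A ring of neurons is adapted toward randomly presented goals;
        neighbours on the ring follow each winner, so the ring unfolds
        into a short closed path through the goals.
        """
        rng = np.random.default_rng(rng_seed)
        goals = np.asarray(goals, dtype=float)
        n = n_neurons or 3 * len(goals)
        centre = goals.mean(axis=0)
        angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        ring = centre + 0.1 * np.column_stack((np.cos(angles), np.sin(angles)))
        sigma = n / 8.0  # neighbourhood width along the ring
        for _ in range(iters):
            for g in rng.permutation(len(goals)):
                # Winner: the neuron currently closest to this goal.
                w = int(np.argmin(np.linalg.norm(ring - goals[g], axis=1)))
                idx = np.arange(n)
                ring_dist = np.minimum(np.abs(idx - w), n - np.abs(idx - w))
                h = np.exp(-(ring_dist ** 2) / (2.0 * sigma ** 2))
                ring += lr * h[:, None] * (goals[g] - ring)  # pull winner and neighbours
            lr *= 0.99                       # cool the learning rate
            sigma = max(0.5, sigma * 0.97)   # shrink the neighbourhood
        # Visiting order: sort goals by the ring index of their winning neuron.
        winners = [int(np.argmin(np.linalg.norm(ring - g, axis=1))) for g in goals]
        return sorted(range(len(goals)), key=lambda i: winners[i])

    # Hypothetical usage: ten random sensing locations in a 100 m x 100 m area.
    pts = np.random.default_rng(1).uniform(0.0, 100.0, size=(10, 2))
    print(som_tour(pts))

In the paper's setting, winner selection would additionally be constrained to each node's viewpoint region and traded off against observation rewards and the robots' travel budgets.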

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    <a href="/en/project/GJ15-09600Y" target="_blank" >GJ15-09600Y: Adaptive Informative Path Planning in Autonomous Data Collection in Dynamic Unstructured Environments</a><br>

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)

Others

  • Publication year

    2018

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Autonomous Robots

  • ISSN

    0929-5593

  • e-ISSN

    1573-7527

  • Volume of the periodical

    42

  • Issue of the periodical within the volume

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    24

  • Pages from-to

    715-738

  • UT code for WoS article

    000427378300003

  • EID of the result in the Scopus database

    2-s2.0-85038028527