
Augmented Reality Spatial Programming Paradigm Applied to End-User Robot Programming

Result identifiers

  • Result code in IS VaVaI

RIV/00216305:26230/24:PU151173 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F24%3APU151173)

  • Result on the web

https://www.fit.vut.cz/research/publication/12818/

  • DOI - Digital Object Identifier

10.1016/j.rcim.2024.102770 (http://dx.doi.org/10.1016/j.rcim.2024.102770)

Alternative languages

  • Result language

    English

  • Title in the original language

    Augmented Reality Spatial Programming Paradigm Applied to End-User Robot Programming

  • Description in the original language

    The market of collaborative robots is thriving due to their increasing affordability. The ability to program a collaborative robot without requiring a highly skilled specialist would increase their spread even more. Visual programming is a prevalent contemporary approach for end-users on desktops or handheld devices, allowing them to define program logic quickly and easily. However, separating the interface from the robot's task space makes defining spatial features difficult. At the same time, augmented reality can provide spatially situated interaction, which would solve this issue and allow end-users to intuitively program, adapt, and comprehend robotic programs that are inherently and highly spatially linked to the real environment. Therefore, we have proposed Spatially Anchored Actions, a form of visual programming in augmented reality, to address the problem of comprehension, programming, and adaptation of robotic programs by end-users. It uses semantic annotation of the environment and robot hand teaching to define spatially important points precisely. Individual program steps are created by attaching parametrizable, high-level actions to the points. Program flow is then defined by visually connecting individual actions. The interface is specifically designed for tablets, which provide a more immersive experience than phones and are more affordable and better known to users than head-mounted displays. The realized prototype of a handheld AR user interface was compared against a commercially available desktop-based visual programming solution in a user study with 12 participants. According to the results, the novel interface significantly improves comprehension of pick-and-place-like programs, improves the setting of spatial information, and is preferred by users over the existing tool.

  • Title in English

    Augmented Reality Spatial Programming Paradigm Applied to End-User Robot Programming

  • Description in English

    The market of collaborative robots is thriving due to their increasing affordability. The ability to program a collaborative robot without requiring a highly skilled specialist would increase their spread even more. Visual programming is a prevalent contemporary approach for end-users on desktops or handheld devices, allowing them to define program logic quickly and easily. However, separating the interface from the robot's task space makes defining spatial features difficult. At the same time, augmented reality can provide spatially situated interaction, which would solve this issue and allow end-users to intuitively program, adapt, and comprehend robotic programs that are inherently and highly spatially linked to the real environment. Therefore, we have proposed Spatially Anchored Actions, a form of visual programming in augmented reality, to address the problem of comprehension, programming, and adaptation of robotic programs by end-users. It uses semantic annotation of the environment and robot hand teaching to define spatially important points precisely. Individual program steps are created by attaching parametrizable, high-level actions to the points. Program flow is then defined by visually connecting individual actions. The interface is specifically designed for tablets, which provide a more immersive experience than phones and are more affordable and better known to users than head-mounted displays. The realized prototype of a handheld AR user interface was compared against a commercially available desktop-based visual programming solution in a user study with 12 participants. According to the results, the novel interface significantly improves comprehension of pick-and-place-like programs, improves the setting of spatial information, and is preferred by users over the existing tool.

Classification

  • Type

    Jimp - Article in a journal indexed in the Web of Science database

  • CEP field

  • OECD FORD field

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result linkages

  • Project

  • Linkages

    R - EC Framework Programme project

Other

  • Year of application

    2024

  • Data confidentiality code

    S - Complete and accurate project data not subject to protection under special legal regulations

Data specific to the result type

  • Journal name

    Robotics and Computer-Integrated Manufacturing

  • ISSN

    0736-5845

  • e-ISSN

    1879-2537

  • Journal volume

    89

  • Issue within the volume

    89

  • Publisher's country

    GB - United Kingdom of Great Britain and Northern Ireland

  • Number of pages

    13

  • Pages from-to

    1-13

  • UT WoS code of the article

    001229669000001

  • Scopus EID of the result

    2-s2.0-85190260946