Self-touch and other spontaneous behavior patterns in early infancy
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F22%3A00362311" target="_blank" >RIV/68407700:21230/22:00362311 - isvavai.cz</a>
Result on the web
<a href="https://doi.org/10.1109/ICDL53763.2022.9962203" target="_blank" >https://doi.org/10.1109/ICDL53763.2022.9962203</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/ICDL53763.2022.9962203" target="_blank" >10.1109/ICDL53763.2022.9962203</a>
Alternative languages
Result language
English
Original language name
Self-touch and other spontaneous behavior patterns in early infancy
Original language description
Children are not born tabula rasa. Nevertheless, interacting with the environment through their body movements in the first months after birth is critical to building the models or representations that are the foundation for everything that follows. We present longitudinal data on the spontaneous behavior of three infants observed between about 8 and 25 weeks of age in the supine position. We combined manual scoring of video recordings with automatic extraction of motion data to study infants' behavioral patterns and developmental progression, including: (i) spatial distribution of self-touches on the body, (ii) spatial patterns and regularities of hand movements, (iii) midline crossing, (iv) preferential use of one arm, and (v) dynamic patterns of movements indicative of goal-directedness. From the patterns observed in this pilot data set, we can speculate on the development of the first body and peripersonal space representations. Several methods of extracting 3D kinematics from videos have recently been made available by the computer vision community. We applied one of these methods to infant videos and provide guidelines on its possibilities and limitations, a methodological contribution to automating the analysis of infant videos. In the future, we plan to use the patterns extracted from the recordings as inputs to embodied computational models of the learning of body representations in infancy.
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
50103 - Cognitive sciences
Result continuities
Project
<a href="/en/project/GX20-24186X" target="_blank" >GX20-24186X: Whole-body awareness for safe and natural interaction: from brains to collaborative robots</a><br>
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Others
Publication year
2022
Confidentiality
S - Complete and accurate data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
2022 IEEE International Conference on Development and Learning (ICDL)
ISBN
978-1-6654-1311-4
ISSN
—
e-ISSN
—
Number of pages
8
Pages from-to
148-155
Publisher name
IEEE
Place of publication
Piscataway
Event location
London
Event date
Sep 12, 2022
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—