Leaf area index estimation of pergola-trained vineyards in arid regions using classical and deep learning methods based on UAV-based RGB images
Result identifiers
Result code in IS VaVaI
RIV/60460709:41330/23:97560 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F60460709%3A41330%2F23%3A97560)
Result on the web
http://dx.doi.org/10.1016/j.compag.2023.107723
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1016/j.compag.2023.107723" target="_blank" >10.1016/j.compag.2023.107723</a>
Alternative languages
Result language
English
Title in original language
Leaf area index estimation of pergola-trained vineyards in arid regions using classical and deep learning methods based on UAV-based RGB images
Result description in original language
Timely and accurate mapping of leaf area index (LAI) in vineyards plays an important role in management decisions for precision agriculture. However, little work has been done to extract the LAI of pergola-trained vineyards from high-spatial-resolution remote sensing data. The main objective of this study was to evaluate the ability of unmanned aerial vehicle (UAV) imagery to estimate the LAI of pergola-trained vineyards using shallow and deep machine learning (ML) methods. Field trials were conducted during different growth seasons in 2021, collecting 465 LAI samples. First, this study trained five classical shallow ML models and an ensemble learning model on various spectral and textural indices calculated from the UAV imagery, and the features most correlated with, or most useful for, LAI estimation at different growth stages were identified. Then, because the classical ML approaches require the arduous computation of multiple indices and feature selection procedures, a ResNet-based convolutional neural network (CNN) model was constructed that can be fed directly with cropped images. Furthermore, this study introduced a new image data augmentation method applicable to regression problems. Results indicated that the textural indices performed better than the spectral indices, that combining them further improved the estimates, and that the ensemble learning method performed best among the classical ML models. With the optimal input image size, the CNN model estimated the LAI most effectively without manual feature extraction and selection. The proposed image data augmentation method generates new training images with new labels by mosaicking the original ones, and the CNN model trained with it outperformed models trained on the original images alone or on images augmented only by rotation and flipping. This data augmentation method can be applied to other regression models for extracting crop growth parameters from remote sensing data, and we conclude that UAV imagery and deep learning methods are promising for LAI estimation of pergola-trained vineyards.
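The ResNet-based CNN mentioned above regresses a single LAI value directly from cropped RGB images. As a rough illustration only, the following minimal PyTorch sketch assumes a ResNet-18 backbone whose classification head is replaced by a one-neuron regression output; the backbone depth, the input size, and the class name LAIRegressor are assumptions for illustration, not details given in this record.

```python
# Minimal sketch (assumed architecture): ResNet-18 backbone with a single
# regression output for LAI; the record does not specify the exact network.
import torch
import torch.nn as nn
from torchvision import models

class LAIRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet18(weights=None)  # RGB crops in, features out
        # Replace the 1000-class head with a single LAI output neuron.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        # x: (batch, 3, H, W) cropped UAV RGB images
        return self.backbone(x).squeeze(1)

model = LAIRegressor()
dummy = torch.randn(2, 3, 224, 224)  # hypothetical crop size; the paper tunes this
print(model(dummy).shape)            # torch.Size([2]) -> one LAI value per crop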
Title in English
Leaf area index estimation of pergola-trained vineyards in arid regions using classical and deep learning methods based on UAV-based RGB images
Result description in English
Timely and accurate mapping of leaf area index (LAI) in vineyards plays an important role in management decisions for precision agriculture. However, little work has been done to extract the LAI of pergola-trained vineyards from high-spatial-resolution remote sensing data. The main objective of this study was to evaluate the ability of unmanned aerial vehicle (UAV) imagery to estimate the LAI of pergola-trained vineyards using shallow and deep machine learning (ML) methods. Field trials were conducted during different growth seasons in 2021, collecting 465 LAI samples. First, this study trained five classical shallow ML models and an ensemble learning model on various spectral and textural indices calculated from the UAV imagery, and the features most correlated with, or most useful for, LAI estimation at different growth stages were identified. Then, because the classical ML approaches require the arduous computation of multiple indices and feature selection procedures, a ResNet-based convolutional neural network (CNN) model was constructed that can be fed directly with cropped images. Furthermore, this study introduced a new image data augmentation method applicable to regression problems. Results indicated that the textural indices performed better than the spectral indices, that combining them further improved the estimates, and that the ensemble learning method performed best among the classical ML models. With the optimal input image size, the CNN model estimated the LAI most effectively without manual feature extraction and selection. The proposed image data augmentation method generates new training images with new labels by mosaicking the original ones, and the CNN model trained with it outperformed models trained on the original images alone or on images augmented only by rotation and flipping. This data augmentation method can be applied to other regression models for extracting crop growth parameters from remote sensing data, and we conclude that UAV imagery and deep learning methods are promising for LAI estimation of pergola-trained vineyards.
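The mosaicking-based augmentation described above builds new labelled samples for a regression task from existing crops. The sketch below is only one plausible reading of that idea: it tiles four randomly chosen crops into a 2×2 mosaic and, as an assumption not stated in this record, assigns the new image the mean LAI of its four sources; the function name mosaic_augment and the label rule are hypothetical.

```python
# Illustrative sketch only: mosaic four equally sized (H, W, 3) crops into one
# (2H, 2W, 3) training image. The new label is ASSUMED to be the mean LAI of
# the four source crops; the record does not give the exact label rule.
import random
import numpy as np

def mosaic_augment(images, labels, n_new, rng=None):
    """Return n_new mosaicked images and their assumed LAI labels."""
    rng = rng or random.Random(0)
    new_images, new_labels = [], []
    for _ in range(n_new):
        idx = [rng.randrange(len(images)) for _ in range(4)]
        tl, tr, bl, br = (images[i] for i in idx)
        top = np.concatenate([tl, tr], axis=1)      # stitch top row
        bottom = np.concatenate([bl, br], axis=1)   # stitch bottom row
        new_images.append(np.concatenate([top, bottom], axis=0))
        new_labels.append(float(np.mean([labels[i] for i in idx])))
    return new_images, new_labels
```

A usage sketch under the same assumptions: new_imgs, new_lais = mosaic_augment(train_imgs, train_lais, n_new=1000), with the mosaicked samples then appended to the original training set.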
Classification
Type
Jimp - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10511 - Environmental sciences (social aspects to be 5.7)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Year of implementation
2023
Data confidentiality code
S - Complete and true data about the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the periodical
Computers and Electronics in Agriculture
ISSN
0168-1699
e-ISSN
1872-7107
Volume of the periodical
207
Issue of the periodical within the volume
107723
Country of the publisher of the periodical
GB - United Kingdom of Great Britain and Northern Ireland
Number of pages of the result
15
Pages from-to
1-15
UT WoS code of the article
000991765800001
EID of the result in the Scopus database
2-s2.0-85149177353