Elevating Crop Classification Performance Through CNN-GRU Feature Fusion
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F61989100%3A27240%2F24%3A10256796" target="_blank" >RIV/61989100:27240/24:10256796 - isvavai.cz</a>
Result on the web
<a href="https://ieeexplore.ieee.org/document/10689581" target="_blank" >https://ieeexplore.ieee.org/document/10689581</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/ACCESS.2024.3467193" target="_blank" >10.1109/ACCESS.2024.3467193</a>
Alternative languages
Result language
English
Original language name
Elevating Crop Classification Performance Through CNN-GRU Feature Fusion
Original language description
Crop classification and crop-area estimation hold significant importance in agricultural production and management. They serve as crucial tools for efficiently identifying various crops and contribute to the formulation of tailored strategies for crop-specific production and estimation purposes. Numerous deep learning techniques, with a particular emphasis on Convolutional Neural Networks (CNNs), have found application in the analysis of satellite data for crop classification. Nevertheless, this approach has its limitations, notably the loss of sequential information, which hinders its ability to accurately extract features that depend on temporal changes over time. On the other hand, another deep learning technique, the Gated Recurrent Unit (GRU), excels at handling time-series data but falls short in extracting intricate patterns within the data, a strength of CNNs. This study explores the capabilities of these methodologies for crop classification in the Charsadda district of Pakistan, a region renowned for its diverse agricultural landscape dedicated to the cultivation of economically valuable crops. We introduce a dual-channel CNN-GRU feature fusion architecture, a method that merges features extracted individually by CNNs and GRUs. The proposed framework is specifically designed for data obtained from Sentinel-2 and PlanetScope imagery, complemented by NDVI time-series data collected from the study site. The model's performance is assessed against that of the individual CNN and GRU models. The models were evaluated using a variety of metrics, including precision, recall, F1 score, and overall accuracy. The outcomes reveal that the proposed model surpasses the performance of the individual models. The feature fusion-based model emerged as the top performer, achieving an overall accuracy of 96.47 percent, an F1 score of 93.16 percent, a precision of 93.93 percent, and a recall of 92.69 percent.
This research offers substantial evidence supporting the potential of the CNN-GRU feature fusion model for crop identification. By harnessing the strengths of both architectures, the model proves to be an effective tool for crop classification.
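The dual-channel idea described above can be sketched in code: one branch applies a CNN to the NDVI sequence, a second branch feeds the same sequence through a GRU, and the two feature vectors are concatenated before a classification layer. The sketch below is a minimal PyTorch illustration under assumed shapes and sizes (sequence length, channel widths, and class count are hypothetical, not taken from the paper):

```python
import torch
import torch.nn as nn

class CNNGRUFusion(nn.Module):
    """Hypothetical sketch of a dual-channel CNN-GRU feature-fusion
    classifier for univariate NDVI time series."""
    def __init__(self, n_classes=5, cnn_ch=16, gru_hidden=32):
        super().__init__()
        # Channel 1: 1-D CNN extracts local temporal patterns.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, cnn_ch, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch, cnn_ch, 1)
        )
        # Channel 2: GRU captures sequential dependencies over time.
        self.gru = nn.GRU(input_size=1, hidden_size=gru_hidden,
                          batch_first=True)
        # Fusion head: concatenate both feature vectors, then classify.
        self.head = nn.Linear(cnn_ch + gru_hidden, n_classes)

    def forward(self, x):                     # x: (batch, seq_len)
        c = self.cnn(x.unsqueeze(1)).squeeze(-1)   # (batch, cnn_ch)
        _, h = self.gru(x.unsqueeze(-1))           # h: (1, batch, gru_hidden)
        fused = torch.cat([c, h.squeeze(0)], dim=1)
        return self.head(fused)                    # (batch, n_classes)

model = CNNGRUFusion()
logits = model(torch.randn(4, 24))  # batch of 4 sequences, 24 time steps
print(logits.shape)
```

Because each branch is trained end to end with the fusion head, the network can learn which channel's features matter for each crop class, which is the intuition behind the fusion model's improvement over the standalone CNN and GRU reported above.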
Czech name
—
Czech description
—
Classification
Type
J<sub>imp</sub> - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
20200 - Electrical engineering, Electronic engineering, Information engineering
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Publication year
2024
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
IEEE Access
ISSN
2169-3536
e-ISSN
2169-3536
Volume of the periodical
12
Issue of the periodical within the volume
1
Country of publishing house
US - UNITED STATES
Number of pages
13
Pages from-to
141013-141025
UT code for WoS article
001329032300001
EID of the result in the Scopus database
—