Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.13091/5006
Title: Sugar beet farming goes high-tech: a method for automated weed detection using machine learning and deep learning in precision agriculture
Authors: Ortataş, Fatma Nur
Özkaya, Umut
Şahin, Muhammet Emin
Ulutaş, Hasan
Keywords: Weed detection
Ensemble model
Feature extraction
Image processing
Machine learning
Neural-Networks
Identification
Issue Date: 2023
Publisher: Springer London Ltd
Abstract: The main objective of this study is to develop a method for the automated detection and classification of weeds and sugar beets. Precision agriculture is an essential area of research that aims to optimize farming practices and reduce the use of harmful chemicals. For this purpose, Faster RCNN and Federated Learning (FL)-based ensemble models were utilized to classify a specific dataset. In the first stage of the study, features are extracted from the images in the dataset and classified by machine learning algorithms. Then, classification is carried out with the help of FL-based deep learning ensemble models. Within the scope of the study, grid search is used for hyperparameter optimization and the results are obtained with a tenfold cross-validation method. Among all tested algorithms, the FL-based ensemble model constructed using the ResNet50 model exhibited the highest accuracy rate of 99%. This system has the potential to significantly reduce the use of herbicides and other chemicals in agricultural practices, promoting a more sustainable form of agriculture.
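The evaluation protocol described in the abstract (grid search for hyperparameter optimization, scored by tenfold cross-validation) can be sketched as follows. This is an illustrative example only, not the authors' code: the classifier, hyperparameter grid, and synthetic feature data are assumptions standing in for the paper's extracted image features and actual search space.

```python
# Sketch of grid search with tenfold cross-validation (scikit-learn).
# The SVC classifier, the parameter grid, and the synthetic data below
# are placeholders; the paper's real features and grids are not shown here.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for features extracted from the sugar beet / weed images.
X, y = make_classification(n_samples=200, n_features=20, random_state=42)

# Hypothetical hyperparameter grid for demonstration.
param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}

# cv=10 gives the tenfold cross-validation described in the abstract.
search = GridSearchCV(SVC(), param_grid, cv=10, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Each candidate in the grid is scored as the mean accuracy over the ten folds, and the best-scoring combination is refit on the full training data.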
URI: https://doi.org/10.1007/s00521-023-09320-3
https://hdl.handle.net/20.500.13091/5006
ISSN: 0941-0643
1433-3058
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collections
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collections

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.