Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.13091/3158
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Aslan, Muhammet Fatih
dc.contributor.author: Durdu, Akif
dc.contributor.author: Yusefi, Abdullah
dc.contributor.author: Yılmaz, Alper
dc.date.accessioned: 2022-11-28T16:54:43Z
dc.date.available: 2022-11-28T16:54:43Z
dc.date.issued: 2022
dc.identifier.issn: 0893-6080
dc.identifier.issn: 1879-2782
dc.identifier.uri: https://doi.org/10.1016/j.neunet.2022.09.001
dc.identifier.uri: https://hdl.handle.net/20.500.13091/3158
dc.description.abstract: Sensor fusion is used to solve the localization problem in autonomous mobile robotics applications by integrating complementary data acquired from various sensors. In this study, we adopt Visual-Inertial Odometry (VIO), a low-cost sensor fusion method that integrates inertial data with images using a Deep Learning (DL) framework to predict the position of an Unmanned Aerial System (UAS). The developed system has three steps. The first step extracts features from images acquired from a platform camera and uses a Convolutional Neural Network (CNN) to project them to a visual feature manifold. Next, temporal features are extracted from the Inertial Measurement Unit (IMU) data on the platform using a Bidirectional Long Short-Term Memory (BiLSTM) network and are projected to an inertial feature manifold. The final step estimates the UAS position by fusing the visual and inertial feature manifolds via a BiLSTM-based architecture. The proposed approach is tested with the public EuRoC (European Robotics Challenge) dataset and simulation environment data generated within the Robot Operating System (ROS). The results on the EuRoC dataset show that the proposed approach achieves position estimates comparable to previous popular VIO methods. In addition, in the experiment with the simulation dataset, the UAS position is successfully estimated with a Root Mean Square Error (RMSE) of 0.167. The obtained results demonstrate that the proposed deep architecture is useful for UAS position estimation. (c) 2022 Elsevier Ltd. All rights reserved. [en_US]
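The abstract describes a three-stage pipeline: a CNN maps camera frames to a visual feature manifold, a BiLSTM maps IMU sequences to an inertial feature manifold, and a second BiLSTM fuses both to regress the UAS position. The PyTorch sketch below is a minimal, hypothetical instance of that CNN + BiLSTM + fusion-BiLSTM pattern; the class name, feature dimensions, and the IMU-to-frame alignment are illustrative assumptions, not the authors' HVIOnet implementation.

```python
# Minimal sketch of an HVIOnet-style visual-inertial fusion network.
# Illustrative only: layer sizes, names, and the IMU/frame alignment
# are assumptions based on the abstract, not the published architecture.
import torch
import torch.nn as nn

class VisualInertialOdometryNet(nn.Module):
    def __init__(self, imu_dim=6, visual_feat=256, inertial_feat=128, hidden=128):
        super().__init__()
        # Step 1: CNN projects each image to a visual feature vector.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, visual_feat),
        )
        # Step 2: BiLSTM extracts temporal features from raw IMU samples.
        self.imu_bilstm = nn.LSTM(imu_dim, inertial_feat,
                                  batch_first=True, bidirectional=True)
        # Step 3: a second BiLSTM fuses the two feature manifolds; a linear
        # head regresses the 3-D position for each frame.
        self.fusion_bilstm = nn.LSTM(visual_feat + 2 * inertial_feat, hidden,
                                     batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 3)

    def forward(self, images, imu):
        # images: (B, T, 3, H, W); imu: (B, T*k, 6), k IMU samples per frame.
        b, t = images.shape[:2]
        vis = self.cnn(images.flatten(0, 1)).view(b, t, -1)       # (B, T, Fv)
        inr, _ = self.imu_bilstm(imu)                             # (B, T*k, 2*Fi)
        inr = inr[:, :: imu.shape[1] // t, :][:, :t, :]           # align to frames
        fused, _ = self.fusion_bilstm(torch.cat([vis, inr], -1))  # (B, T, 2*hidden)
        return self.head(fused)                                   # (B, T, 3)

net = VisualInertialOdometryNet()
images = torch.randn(2, 8, 3, 64, 64)   # 2 sequences of 8 frames
imu = torch.randn(2, 80, 6)             # 10 IMU samples per frame
pred = net(images, imu)                 # (2, 8, 3) predicted positions
target = torch.zeros_like(pred)         # stand-in ground truth
rmse = torch.sqrt(((pred - target) ** 2).mean())  # metric quoted in the abstract
```

A real setup would use a deeper visual backbone and ground-truth trajectories from EuRoC or the ROS simulation; the 0.167 RMSE quoted above refers to the authors' simulation experiment, not this sketch.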
dc.language.iso: en [en_US]
dc.publisher: Pergamon-Elsevier Science Ltd [en_US]
dc.relation.ispartof: Neural Networks [en_US]
dc.rights: info:eu-repo/semantics/closedAccess [en_US]
dc.subject: BiLSTM [en_US]
dc.subject: IMU [en_US]
dc.subject: RNN [en_US]
dc.subject: ROS [en_US]
dc.subject: UAS [en_US]
dc.subject: VIO [en_US]
dc.subject: Unknown Environments [en_US]
dc.subject: Bidirectional LSTM [en_US]
dc.subject: SLAM [en_US]
dc.subject: Vision [en_US]
dc.subject: Stereo [en_US]
dc.subject: Robust [en_US]
dc.subject: Networks [en_US]
dc.subject: Fusion [en_US]
dc.subject: Filter [en_US]
dc.title: HVIOnet: A deep learning based hybrid visual-inertial odometry approach for unmanned aerial system position estimation [en_US]
dc.type: Article [en_US]
dc.identifier.doi: 10.1016/j.neunet.2022.09.001
dc.identifier.pmid: 36152378 [en_US]
dc.identifier.scopus: 2-s2.0-85138452774 [en_US]
dc.department: Faculties, Faculty of Engineering and Natural Sciences, Department of Electrical and Electronics Engineering [en_US]
dc.authorid: Durdu, Akif/0000-0002-5611-2322
dc.authorid: Yusefi, Abdullah/0000-0001-7557-8526
dc.authorid: Aslan, Muhammet Fatih/0000-0001-7549-0137
dc.authorwosid: Durdu, Akif/AAQ-4344-2020
dc.authorwosid: Yusefi, Abdullah/GVT-0630-2022
dc.identifier.volume: 155 [en_US]
dc.identifier.startpage: 461 [en_US]
dc.identifier.endpage: 474 [en_US]
dc.identifier.wos: WOS:000867366000002 [en_US]
dc.institutionauthor: Durdu, Akif
dc.relation.publicationcategory: Article - International Refereed Journal - Institutional Faculty Member [en_US]
dc.authorscopusid: 57205362915
dc.authorscopusid: 55364612200
dc.authorscopusid: 57221601191
dc.authorscopusid: 57067174700
dc.identifier.scopusquality: Q1
item.cerifentitytype: Publications
item.grantfulltext: embargo_20300101
item.languageiso639-1: en
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.openairetype: Article
item.fulltext: With Fulltext
crisitem.author.dept: 02.04. Department of Electrical and Electronics Engineering
Appears in Collections:
Faculty of Engineering and Natural Sciences Collection
PubMed Indexed Publications Collection
Scopus Indexed Publications Collection
WoS Indexed Publications Collection
Files in This Item:
File: 1-s2.0-S0893608022003355-main.pdf
Size: 1.76 MB
Format: Adobe PDF
Access: Embargoed until 2030-01-01 (Request a copy)

Scopus™ citations: 1 (checked on Apr 13, 2024)

Web of Science™ citations: 15 (checked on Apr 13, 2024)

Page view(s): 118 (checked on Apr 15, 2024)

Download(s): 6 (checked on Apr 15, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.