Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.13091/684
Full metadata record
DC Field | Value | Language
dc.contributor.author | Gürtürk, Mert | -
dc.contributor.author | Yusefi, Abdullah | -
dc.contributor.author | Aslan, Muhammet Fatih | -
dc.contributor.author | Soycan, Metin | -
dc.contributor.author | Durdu, Akif | -
dc.contributor.author | Masiero, Andrea | -
dc.date.accessioned | 2021-12-13T10:29:48Z | -
dc.date.available | 2021-12-13T10:29:48Z | -
dc.date.issued | 2021 | -
dc.identifier.issn | 0263-2241 | -
dc.identifier.issn | 1873-412X | -
dc.identifier.uri | https://doi.org/10.1016/j.measurement.2021.109878 | -
dc.identifier.uri | https://hdl.handle.net/20.500.13091/684 | -
dc.description.abstract | Visual Simultaneous Localization and Mapping (VSLAM) and Visual Odometry (VO) are fundamental problems to be properly tackled for enabling autonomous and effective movement of vehicles/robots supported by vision-based positioning systems. This study presents a publicly shared dataset for SLAM investigations: a dataset collected at Yildiz Technical University (YTU) in an outdoor area by an acquisition system mounted on a terrestrial vehicle. The acquisition system includes two cameras, an inertial measurement unit, and two GPS receivers. All sensors have been calibrated and synchronized. To prove the effectiveness of the introduced dataset, this study also applies Visual-Inertial Odometry (VIO) on the KITTI dataset. Beyond introducing a new dataset, this study also proposes a new recurrent neural network-based VIO method. In addition, the effectiveness of the proposed method is demonstrated by comparing it with the state-of-the-art ORB-SLAM2 and OKVIS methods. The experimental results show that the YTU dataset is robust enough to be used for benchmarking studies and that the proposed deep learning-based VIO is more successful than the other two traditional methods. | en_US
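The abstract describes a recurrent neural network-based VIO: visual features and inertial measurements are fused and fed through a recurrent model that regresses relative 6-DoF pose. The paper's actual architecture is not given in this record, so the following is only an illustrative NumPy sketch of the general recurrent-VIO idea; all dimensions, weights, and inputs are hypothetical stand-ins, not the authors' model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step over a fused visual-inertial feature vector x."""
    z = W @ x + U @ h + b                     # stacked gate pre-activations, shape (4H,)
    H = h.size
    i, f, g, o = z[:H], z[H:2*H], z[2*H:3*H], z[3*H:]
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # cell state update
    h_new = sigmoid(o) * np.tanh(c_new)                # hidden state output
    return h_new, c_new

rng = np.random.default_rng(0)
F_VIS, F_IMU, H = 64, 6, 32                   # hypothetical feature sizes
W = rng.standard_normal((4 * H, F_VIS + F_IMU)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
W_pose = rng.standard_normal((6, H)) * 0.1    # regression head: 6-DoF relative pose

h, c = np.zeros(H), np.zeros(H)
poses = []
for t in range(10):                           # 10 synthetic time steps
    vis = rng.standard_normal(F_VIS)          # stand-in for CNN image features
    imu = rng.standard_normal(F_IMU)          # stand-in for accelerometer + gyroscope reading
    h, c = lstm_step(np.concatenate([vis, imu]), h, c, W, U, b)
    poses.append(W_pose @ h)                  # relative translation (3) + rotation (3)

# Chain the relative translations into a rough trajectory estimate.
trajectory = np.cumsum([p[:3] for p in poses], axis=0)
print(trajectory.shape)                       # → (10, 3)
```

The key design point the abstract implies is that the recurrent state carries temporal context across frames, which is what lets a learned VIO smooth over individually ambiguous visual or inertial measurements.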
dc.description.sponsorship | Scientific and Technological Research Council of Turkey (TUBITAK) [FDK-2019-3593] | en_US
dc.description.sponsorship | This research was supported by project number FDK-2019-3593, which was accepted by the Yildiz Technical University Scientific Research Projects Commission. The authors are grateful to RAC-LAB (www.rac-lab.com) for training and support. The first author of this paper was also awarded the '2214-A Abroad Research Scholarship' by the Scientific and Technological Research Council of Turkey (TUBITAK) and concluded her research at the University of Florence. | en_US
dc.language.iso | en | en_US
dc.publisher | ELSEVIER SCI LTD | en_US
dc.relation.ispartof | MEASUREMENT | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Deep Learning | en_US
dc.subject | VIO | en_US
dc.subject | SLAM | en_US
dc.subject | YTU Dataset | en_US
dc.subject | Versatile | en_US
dc.title | The YTU dataset and recurrent neural network based visual-inertial odometry | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.measurement.2021.109878 | -
dc.identifier.scopus | 2-s2.0-85111871132 | en_US
dc.department | Faculties, Faculty of Engineering and Natural Sciences, Department of Geomatics Engineering | en_US
dc.identifier.volume | 184 | en_US
dc.identifier.wos | WOS:000704866300003 | en_US
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US
dc.authorscopusid | 57226360527 | -
dc.authorscopusid | 57221601191 | -
dc.authorscopusid | 57205362915 | -
dc.authorscopusid | 10339673300 | -
dc.authorscopusid | 55364612200 | -
dc.authorscopusid | 57198996466 | -
dc.identifier.scopusquality | Q2 | -
item.cerifentitytype | Publications | -
item.grantfulltext | embargo_20300101 | -
item.languageiso639-1 | en | -
item.openairetype | Article | -
item.fulltext | With Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
crisitem.author.dept | 02.04. Department of Electrical and Electronics Engineering | -
Appears in Collections:
Faculty of Engineering and Natural Sciences Collection (Mühendislik ve Doğa Bilimleri Fakültesi Koleksiyonu)
Scopus Indexed Publications Collection (Scopus İndeksli Yayınlar Koleksiyonu)
WoS Indexed Publications Collection (WoS İndeksli Yayınlar Koleksiyonu)
Files in This Item:
1-s2.0-S0263224121008198-main.pdf | 6.59 MB | Adobe PDF | Embargoed until 2030-01-01 (request a copy)
Scopus Citations: 5 (checked on Oct 12, 2024)
Web of Science Citations: 17 (checked on Oct 12, 2024)
Page views: 256 (checked on Oct 14, 2024)
Downloads: 8 (checked on Oct 14, 2024)
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.