Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.13091/5616
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Yusefi, A.
dc.contributor.author: Durdu, A.
dc.contributor.author: Toy, I.
dc.date.accessioned: 2024-06-01T08:58:13Z
dc.date.available: 2024-06-01T08:58:13Z
dc.date.issued: 2024
dc.identifier.isbn: 9798350329940
dc.identifier.uri: https://doi.org/10.1109/INFOTEH60418.2024.10495974
dc.identifier.uri: https://hdl.handle.net/20.500.13091/5616
dc.description: Digitalni ozon Banja Luka; DWELT Software Banja Luka; et al.; MTEL Banja Luka; Municipality of East Ilidza; Municipality of East Stari Grad [en_US]
dc.description: 23rd International Symposium INFOTEH-JAHORINA, INFOTEH 2024 -- 20 March 2024 through 22 March 2024 -- 199053 [en_US]
dc.description.abstract: This research presents a novel approach for autonomous navigation of Unmanned Ground Vehicles (UGVs) using a camera and LiDAR sensor fusion system. The proposed method is designed to achieve a high rate of obstacle detection, distance estimation, and obstacle avoidance. To capture object shape thoroughly and mitigate the object-occlusion problem that frequently affects camera-based object recognition, the 3D point cloud obtained from the LiDAR depth sensor is used. The proposed camera/LiDAR sensor fusion design balances the strengths and weaknesses of the two sensors to produce a more reliable detection system. The resulting region proposals are then passed to the UGV's autonomous navigation system, which re-plans its route and navigates accordingly. Experiments were conducted on a UGV with full obstacle avoidance and autonomous navigation capabilities. The results demonstrate that the proposed technique can successfully maneuver the UGV and detect obstacles in real-world scenarios. © 2024 IEEE. [en_US]
dc.description.sponsorship: Konya Teknik Üniversitesi, KTÜN [en_US]
dc.language.iso: en [en_US]
dc.publisher: Institute of Electrical and Electronics Engineers Inc. [en_US]
dc.relation.ispartof: 2024 23rd International Symposium INFOTEH-JAHORINA, INFOTEH 2024 - Proceedings [en_US]
dc.rights: info:eu-repo/semantics/closedAccess [en_US]
dc.subject: Autonomous Navigation [en_US]
dc.subject: Camera/LiDAR Sensor Fusion [en_US]
dc.subject: Deep Learning [en_US]
dc.subject: Obstacle Avoidance [en_US]
dc.subject: YOLOv7 [en_US]
dc.subject: Air navigation [en_US]
dc.subject: Collision avoidance [en_US]
dc.subject: Deep learning [en_US]
dc.subject: Ground vehicles [en_US]
dc.subject: Intelligent vehicle highway systems [en_US]
dc.subject: Navigation systems [en_US]
dc.subject: Object recognition [en_US]
dc.subject: Obstacle detectors [en_US]
dc.subject: Optical radar [en_US]
dc.subject: Autonomous navigation [en_US]
dc.subject: Camera/LiDAR sensor fusion [en_US]
dc.subject: Deep learning [en_US]
dc.subject: Distance estimation [en_US]
dc.subject: High rate [en_US]
dc.subject: Obstacles avoidance [en_US]
dc.subject: Obstacles detection [en_US]
dc.subject: Sensor fusion [en_US]
dc.subject: Sensor fusion systems [en_US]
dc.subject: YOLOv7 [en_US]
dc.subject: Cameras [en_US]
dc.title: Camera/LiDAR Sensor Fusion-based Autonomous Navigation [en_US]
dc.type: Conference Object [en_US]
dc.identifier.doi: 10.1109/INFOTEH60418.2024.10495974
dc.identifier.scopus: 2-s2.0-85192144342 [en_US]
dc.department: KTÜN [en_US]
dc.identifier.wos: WOS:001215550500054 [en_US]
dc.institutionauthor: Yusefi, A.
dc.relation.publicationcategory: Conference Item - International - Institutional Academic Staff [en_US]
dc.authorscopusid: 57221601191
dc.authorscopusid: 55364612200
dc.authorscopusid: 57222083572
item.grantfulltext: none
item.languageiso639-1: en
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.fulltext: No Fulltext
item.openairetype: Conference Object
item.cerifentitytype: Publications
crisitem.author.dept: 02.04. Department of Electrical and Electronics Engineering
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collections
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collections
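
As a concrete illustration of the camera/LiDAR fusion step described in the abstract above, the sketch below projects LiDAR points into the camera image and takes the median depth of the points that fall inside a detection bounding box (such as one produced by YOLOv7). This is a minimal sketch under assumed inputs, not the paper's implementation: the intrinsic matrix K, the LiDAR-to-camera extrinsics (R, t), the function names, and the example values are all illustrative assumptions.

```python
# Illustrative camera/LiDAR fusion: project LiDAR points into the image and
# estimate the range of an obstacle detected by a camera-based detector.
# K, R, t, and the example inputs are assumed values, not the paper's.
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project Nx3 LiDAR points into pixel coordinates.

    K: 3x3 camera intrinsic matrix.
    R, t: LiDAR-to-camera rotation (3x3) and translation (3,).
    Returns (pixels Nx2, depths N) for points in front of the camera.
    """
    pts_cam = points_lidar @ R.T + t      # transform into the camera frame
    in_front = pts_cam[:, 2] > 0.0        # keep points with positive depth
    pts_cam = pts_cam[in_front]
    proj = pts_cam @ K.T                  # pinhole projection
    pixels = proj[:, :2] / proj[:, 2:3]   # perspective divide
    return pixels, pts_cam[:, 2]

def estimate_obstacle_distance(points_lidar, bbox, K, R, t):
    """Median depth of LiDAR points that project inside a detection box.

    bbox: (x_min, y_min, x_max, y_max) in pixels, e.g. a YOLOv7 detection.
    Returns the distance in meters, or None if no LiDAR point hits the box.
    """
    pixels, depths = project_lidar_to_image(points_lidar, K, R, t)
    x_min, y_min, x_max, y_max = bbox
    inside = ((pixels[:, 0] >= x_min) & (pixels[:, 0] <= x_max) &
              (pixels[:, 1] >= y_min) & (pixels[:, 1] <= y_max))
    if not np.any(inside):
        return None
    # Median is robust to stray background points leaking into the box.
    return float(np.median(depths[inside]))

if __name__ == "__main__":
    # Toy example: identity extrinsics and a synthetic obstacle 5 m ahead.
    K = np.array([[600.0, 0.0, 320.0],
                  [0.0, 600.0, 240.0],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.zeros(3)
    rng = np.random.default_rng(0)
    obstacle = rng.normal(loc=[0.0, 0.0, 5.0], scale=0.05, size=(200, 3))
    print(estimate_obstacle_distance(obstacle, (300, 220, 340, 260), K, R, t))
```

In a pipeline like the one the abstract describes, the returned range for each detected obstacle would feed the navigation stack, which re-plans the UGV's route around it.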