HVIOnet: A Deep Learning Based Hybrid Visual-Inertial Odometry Approach for Unmanned Aerial System Position Estimation

Date

2022

Publisher

Pergamon-Elsevier Science Ltd

Open Access Color

Green Open Access

No

Publicly Funded

No
Impulse
Top 1%
Influence
Top 10%
Popularity
Top 10%

Abstract

Sensor fusion is used to solve the localization problem in autonomous mobile robotics applications by integrating complementary data acquired from various sensors. In this study, we adopt Visual-Inertial Odometry (VIO), a low-cost sensor fusion method that integrates inertial data with images using a Deep Learning (DL) framework to predict the position of an Unmanned Aerial System (UAS). The developed system has three steps. The first step extracts features from images acquired from a platform camera and uses a Convolutional Neural Network (CNN) to project them to a visual feature manifold. Next, temporal features are extracted from the Inertial Measurement Unit (IMU) data on the platform using a Bidirectional Long Short-Term Memory (BiLSTM) network and are projected to an inertial feature manifold. The final step estimates the UAS position by fusing the visual and inertial feature manifolds via a BiLSTM-based architecture. The proposed approach is tested with the public EuRoC (European Robotics Challenge) dataset and simulation environment data generated within the Robot Operating System (ROS). The results on the EuRoC dataset show that the proposed approach achieves position estimates comparable to previous popular VIO methods. In addition, in the experiment with the simulation dataset, the UAS position is successfully estimated with a 0.167 Root Mean Square Error (RMSE). The obtained results show that the proposed deep architecture is useful for UAS position estimation. (c) 2022 Elsevier Ltd. All rights reserved.
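The three-step pipeline described in the abstract (CNN visual encoder, BiLSTM inertial encoder, BiLSTM fusion regressor) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the class name, layer sizes, channel counts, and the frame/IMU alignment strategy are all assumptions.

```python
import torch
import torch.nn as nn

class HVIONetSketch(nn.Module):
    """Hedged sketch of the three-step architecture from the abstract.
    All hyperparameters here are illustrative assumptions."""

    def __init__(self, feat_dim=128, hidden=64):
        super().__init__()
        # Step 1: CNN projects camera frames onto a visual feature manifold.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Step 2: BiLSTM extracts temporal features from raw IMU streams
        # (6 channels: 3-axis accelerometer + 3-axis gyroscope).
        self.imu_bilstm = nn.LSTM(6, hidden, batch_first=True,
                                  bidirectional=True)
        self.imu_proj = nn.Linear(2 * hidden, feat_dim)
        # Step 3: BiLSTM fuses both manifolds and regresses 3-D position.
        self.fusion = nn.LSTM(2 * feat_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.head = nn.Linear(2 * hidden, 3)

    def forward(self, images, imu):
        # images: (B, T, 1, H, W); imu: (B, T, 6), one IMU step per frame
        # (real VIO systems have more IMU samples than frames; aligning
        # them is a design choice this sketch simplifies away).
        B, T = images.shape[:2]
        vis = self.cnn(images.flatten(0, 1)).view(B, T, -1)
        inr, _ = self.imu_bilstm(imu)
        inr = self.imu_proj(inr)
        fused, _ = self.fusion(torch.cat([vis, inr], dim=-1))
        return self.head(fused)  # (B, T, 3) per-frame position estimates

model = HVIONetSketch()
pos = model(torch.zeros(2, 5, 1, 32, 32), torch.zeros(2, 5, 6))
```

Training such a model against ground-truth trajectories (e.g. EuRoC's motion-capture poses) would typically minimize an MSE/RMSE loss over the predicted positions, which matches the RMSE figure reported in the abstract.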

Keywords

BiLSTM, IMU, RNN, ROS, UAS, VIO, Unknown Environments, Bidirectional LSTM, SLAM, Vision, Stereo, Robust, Networks, Fusion, Filter, Deep Learning, Neural Networks, Computer, Robotics, Memory, Long-Term

Fields of Science

0209 industrial biotechnology, 02 engineering and technology

WoS Q

Q1

Scopus Q

Q1
OpenCitations Citation Count
30

Source

Neural Networks

Volume

155

Start Page

461

End Page

474
PlumX Metrics
Citations

CrossRef : 20

Scopus : 43

PubMed : 3

Captures

Mendeley Readers : 31

SCOPUS™ Citations

42

checked on Feb 03, 2026

Web of Science™ Citations

36

checked on Feb 03, 2026

OpenAlex FWCI
15.89449577
