Presentation Title

Autonomous Landmark-Based Visual Navigation For Unmanned Aerial Vehicles

Faculty Mentor

Alec Sim, Nick Farrell

Start Date

18-11-2017 10:00 AM

End Date

18-11-2017 11:00 AM

Location

BSC-Ursa Minor 99

Session

Poster 1

Type of Presentation

Poster

Subject Area

Engineering and Computer Science

Abstract

The objective of this research is to investigate the feasibility of autonomous navigation for an unmanned aerial vehicle (UAV) using only visual information. This approach to navigation would be especially useful in environments where traditional localization systems such as GPS are not available. Using a downward-facing camera, the craft extracts relevant landmarks from the visible terrain and compares them to a global map using geometric hashing. After the system has acquired its initial position, it uses a combination of optical flow, object tracking, and the DroneKit interface package to update the UAV’s position in real time. Testing showed that, while the craft’s position can be continuously updated using optical flow and object tracking, comparing feature patterns to the global hash maps becomes extremely computationally expensive as the global map grows. The current implementation also places no limits on the search area and is therefore prone to large jumps in the predicted position when similar features exist at multiple locations on the map. Future plans include constraining the hash table search to reduce the number of comparisons that must be made with each query. Additionally, a probabilistic map could be used to track the probability of the craft being at any point on the map, rather than tracking only the point with the highest probability.
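
As a rough illustration of the landmark-matching step described above, the following is a minimal sketch of generic geometric hashing for 2-D point sets. It is not the project's implementation: the function names (build_hash_table, match_landmarks), the bin size, and the two-point basis convention are illustrative assumptions.

```python
# A minimal sketch of geometric hashing for matching observed landmarks against
# a global map of 2-D points. Function names, the bin size, and the two-point
# basis convention here are illustrative assumptions, not the project's code.
from collections import defaultdict
from itertools import permutations

import numpy as np


def basis_transform(points, p1, p2):
    """Express `points` in the basis defined by the ordered pair (p1, p2):
    origin at p1, x-axis along p2 - p1, unit length = |p2 - p1|."""
    d = p2 - p1
    scale = np.linalg.norm(d)
    angle = np.arctan2(d[1], d[0])
    c, s = np.cos(-angle), np.sin(-angle)
    rot = np.array([[c, -s], [s, c]])
    return (points - p1) @ rot.T / scale


def build_hash_table(map_landmarks, bin_size=0.25):
    """Preprocessing: for every ordered pair of map landmarks used as a basis,
    hash the quantized basis coordinates of all remaining landmarks."""
    table = defaultdict(list)
    pts = np.asarray(map_landmarks, dtype=float)
    for i, j in permutations(range(len(pts)), 2):
        coords = basis_transform(pts, pts[i], pts[j])
        for k, (u, v) in enumerate(coords):
            if k in (i, j):
                continue
            key = (int(np.floor(u / bin_size)), int(np.floor(v / bin_size)))
            table[key].append((i, j))  # vote target: the map basis pair
    return table


def match_landmarks(table, observed, bin_size=0.25):
    """Recognition: transform the observed landmarks into each of their own
    bases, look the quantized coordinates up in the table, and return the map
    basis with the most votes (a candidate correspondence for localization)."""
    obs = np.asarray(observed, dtype=float)
    best_basis, best_votes = None, 0
    for a, b in permutations(range(len(obs)), 2):
        votes = defaultdict(int)
        coords = basis_transform(obs, obs[a], obs[b])
        for k, (u, v) in enumerate(coords):
            if k in (a, b):
                continue
            key = (int(np.floor(u / bin_size)), int(np.floor(v / bin_size)))
            for basis in table.get(key, ()):
                votes[basis] += 1
        if votes:
            basis, n = max(votes.items(), key=lambda kv: kv[1])
            if n > best_votes:
                best_basis, best_votes = basis, n
    return best_basis, best_votes
```

Because every ordered pair of landmarks can serve as a basis, preprocessing cost grows roughly cubically with the number of map landmarks, which is consistent with the scaling problem noted in the abstract.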

Summary of research results to be presented

In initial testing, the proposed method for localizing the craft using geometric hashing proved relatively accurate at determining the craft’s correct location on the global map; however, it was very computationally demanding as the map size increased, due to the large number of hash maps created. The most successful portion of the navigation system was tracking with optical flow, since it operated without needing to search the global map. This research has shown that it is possible to continue tracking a craft with a known starting position using terrain tracking; however, acquiring an initial position without any additional information remains very computationally demanding.
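
To make the optical-flow tracking step concrete, the following is a minimal sketch that dead-reckons a ground offset from the median Lucas-Kanade feature flow using OpenCV. The camera source, altitude, and focal length are placeholder assumptions rather than the project's parameters, and the sketch omits the object-tracking and DroneKit components mentioned in the abstract.

```python
# A minimal sketch of frame-to-frame position tracking with Lucas-Kanade
# optical flow in OpenCV. The camera source, altitude, and focal length are
# placeholder assumptions, not the project's parameters.
import cv2
import numpy as np


def track_offset(video_source=0, altitude_m=10.0, focal_px=800.0):
    """Accumulate the median pixel displacement of tracked terrain features
    and convert it to an approximate ground offset in metres (nadir camera,
    pinhole model)."""
    cap = cv2.VideoCapture(video_source)
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("could not read from camera")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=10)
    if prev_pts is None:
        raise RuntimeError("no trackable features in the first frame")
    offset_px = np.zeros(2)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                       prev_pts, None)
        good_new = next_pts[status.flatten() == 1]
        good_old = prev_pts[status.flatten() == 1]
        if len(good_new) > 0:
            # Median feature flow; the craft's own motion is the negative of
            # the apparent ground motion in the image.
            offset_px -= np.median(good_new - good_old, axis=0).flatten()
        # Metres per pixel under a pinhole model at the current altitude.
        offset_m = offset_px * altitude_m / focal_px
        print("estimated ground offset (m):", offset_m)

        # Re-detect features each frame so the track does not starve.
        prev_gray = gray
        prev_pts = cv2.goodFeaturesToTrack(gray, maxCorners=200,
                                           qualityLevel=0.01, minDistance=10)
        if prev_pts is None:
            break
    cap.release()
```

Using the median of the per-feature displacements keeps the estimate robust to a few mismatched or independently moving features, which is one reason frame-to-frame tracking can run without touching the global map.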
