Waypoint planning for mono visual drone using the minimal bounded space method

Document Type

Article

Publication Date

1-1-2021

Abstract

Waypoint planning is a challenging task for a robot. Various depth sensors, such as LIDAR, sonar, stereo cameras, and infrared, have been used for accurate obstacle detection and avoidance. These rangefinders are costly in terms of payload and battery capacity and are therefore seldom found on small drones, which are usually equipped with a mono camera. Without a rangefinder, however, depth information is not available. Waypoint planning for mono visual drones relies heavily on object-based approaches, such as accurate feature extraction. Learning corner, edge, SIFT, and SURF features is challenging because object tracking requires heavy computation, which prevents a mono visual drone from performing wayfinding in a real-time environment. A space-based approach, such as the minimal bounded space (MBS) method, shows promising waypoint planning when tested on real-time mobile robots. Notably, the MBS method works with or without depth information. This work contributes waypoint planning for a mono visual drone using the MBS method. The outcome of this work is twofold: we contribute a new segmentation process for a mono visual drone without depth information, and we show how the MBS method performs waypoint planning so that drones can fly autonomously in an outdoor environment. © 2021, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
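The abstract does not specify the MBS algorithm or the segmentation process in detail. The following is a minimal, purely illustrative sketch of the general space-based idea described above (segment a single mono camera frame into free and occupied space, then steer toward the largest free region). The brightness-threshold segmentation and the centroid-based waypoint rule are assumptions for demonstration only, not the paper's method.

```python
# Illustrative sketch of space-based waypoint selection from one mono frame.
# Assumptions (not from the paper): bright, low-texture pixels approximate
# traversable space; the waypoint is the centroid of the largest free region.
import cv2
import numpy as np

def select_waypoint(frame_bgr, free_thresh=120):
    """Return (x, y) pixel coordinates of a candidate waypoint, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Hypothetical stand-in for the paper's segmentation: threshold brightness
    # to obtain a binary free-space mask, then clean it with a morphological open.
    _, free_mask = cv2.threshold(gray, free_thresh, 255, cv2.THRESH_BINARY)
    free_mask = cv2.morphologyEx(free_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Label connected free regions and pick the largest one as the space to enter.
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(free_mask)
    if num <= 1:
        return None  # no free region detected in this frame
    areas = stats[1:, cv2.CC_STAT_AREA]      # skip background label 0
    best = 1 + int(np.argmax(areas))
    cx, cy = centroids[best]
    return int(cx), int(cy)                  # steer the drone toward this pixel
```

In a real pipeline this selection would run per frame and feed a flight controller that converts the pixel offset from image center into yaw and pitch commands; those control details, like the segmentation itself, are defined in the full paper rather than here.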

Keywords

Cognitive robotics, Waypoint planning, Space-based approach

Divisions

fsktm

Publication Title

Lecture Notes in Mechanical Engineering

Publisher

Springer
