Results 1 - 10 of 15
Autonomous Flight in GPS-Denied Environments Using Monocular Vision and Inertial Sensors
- in Infotech@Aerospace
, 2010
"... A vision-aided inertial navigation system that enables autonomous flight of an aerial vehicle in GPS-denied environments is presented. Particularly, feature point information from a monocular vision sensor are used to bound the drift resulting from integrating accelerations and angular rate measurem ..."
Abstract
-
Cited by 6 (1 self)
- Add to MetaCart
A vision-aided inertial navigation system that enables autonomous flight of an aerial vehicle in GPS-denied environments is presented. In particular, feature point information from a monocular vision sensor is used to bound the drift that results from integrating accelerations and angular rate measurements from an Inertial Measurement Unit (IMU) forward in time. An Extended Kalman filter framework is proposed that performs the tasks of vision-based mapping and navigation separately. When GPS is available, multiple observations of a single landmark point from the vision sensor are used to estimate the point's location in inertial space. When GPS is not available, points that have been sufficiently mapped out can be used for estimating vehicle position and attitude. Simulation and flight test results of a vehicle operating autonomously in a simplified loss-of-GPS scenario verify the presented method.
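The dual-mode scheme in this abstract (map landmarks while GPS is available, then use the mapped landmarks as position fixes once GPS is lost) can be illustrated with a small sketch. The following Python fragment is not the authors' implementation; the state layout, noise values, and the vehicle-relative measurement model are assumptions, and attitude and bias estimation are omitted for brevity.

import numpy as np

class VisionAidedEKF:
    def __init__(self):
        self.x = np.zeros(6)              # [position (3), velocity (3)] in the inertial frame
        self.P = np.eye(6)                # state covariance
        self.landmarks = {}               # id -> (mean position (3,), covariance (3,3))

    def propagate(self, accel, dt):
        # Integrate IMU acceleration forward in time (attitude handling omitted).
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        self.x[:3] += self.x[3:] * dt + 0.5 * accel * dt**2
        self.x[3:] += accel * dt
        self.P = F @ self.P @ F.T + np.eye(6) * 0.05 * dt   # assumed process noise

    def map_landmark(self, lid, rel_meas, R=np.eye(3) * 0.2):
        # GPS available: fuse a vehicle-relative landmark observation into the map.
        z = self.x[:3] + rel_meas         # landmark position implied by this observation
        if lid not in self.landmarks:
            self.landmarks[lid] = (z, np.eye(3) * 5.0)
            return
        m, S = self.landmarks[lid]
        K = S @ np.linalg.inv(S + R)      # Kalman gain for a direct position observation
        self.landmarks[lid] = (m + K @ (z - m), (np.eye(3) - K) @ S)

    def update_from_landmark(self, lid, rel_meas, R=np.eye(3) * 0.2):
        # GPS denied: use a sufficiently mapped landmark to correct the vehicle state.
        m, _ = self.landmarks[lid]
        H = np.hstack([-np.eye(3), np.zeros((3, 3))])   # measurement model: z = m - position
        y = rel_meas - (m - self.x[:3])                  # innovation
        K = self.P @ H.T @ np.linalg.inv(H @ self.P @ H.T + R)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P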
GPS-denied indoor and outdoor monocular vision aided navigation and control of unmanned aircraft
- Journal of Field Robotics
, 2013
"... GPS-denied closed-loop autonomous control of unstable Unmanned Aerial Vehicles (UAVs) such as rotorcraft using information from a monocular camera has been an open problem. Most proposed Vision aided Inertial Navigation Systems (V-INS) have been too compu-tationally intensive or do not have sufficie ..."
Abstract
-
Cited by 6 (0 self)
- Add to MetaCart
(Show Context)
GPS-denied closed-loop autonomous control of unstable Unmanned Aerial Vehicles (UAVs) such as rotorcraft using information from a monocular camera has been an open problem. Most proposed Vision-aided Inertial Navigation Systems (V-INS) have been too computationally intensive or do not have sufficient integrity for closed-loop flight. We provide an affirmative answer to the question of whether V-INS can be used to sustain prolonged real-world GPS-denied flight by presenting a V-INS that is validated through autonomous flight tests over prolonged closed-loop dynamic operation in both indoor and outdoor GPS-denied environments with two rotorcraft UAS. The architecture efficiently combines visual feature information from a monocular camera with measurements from inertial sensors. Inertial measurements are used to predict the frame-to-frame transition of online-selected feature locations, and the difference between predicted and observed feature locations is used to bound the inertial measurement unit drift in real time, estimate its bias, and account for initial misalignment errors. A novel algorithm to manage a library of features online is presented that can add or remove features based on a measure of relative confidence in each feature location. The resulting V-INS is sufficiently efficient and reliable to enable real-time implementation on resource-constrained aerial vehicles. The presented algorithms are validated on multiple platforms in real-world conditions: through a 16-minute flight test, including an autonomous landing, of a 66 kg rotorcraft UAV operating in an uncontrolled outdoor environment without using GPS, and through a micro-UAV operating in a cluttered, unmapped, …
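The feature-library idea described above can be sketched as follows: features whose IMU-predicted image locations keep matching the observations gain confidence, others lose it and are eventually dropped, and only confident features feed the navigation update. Everything here (thresholds, the confidence update rule, the pixel gate) is an assumption for illustration, not the paper's algorithm.

import numpy as np

class FeatureLibrary:
    def __init__(self, capacity=40, drop_below=0.2):
        self.capacity = capacity
        self.drop_below = drop_below
        self.features = {}   # id -> {"pos": estimated 3D location, "conf": confidence in [0, 1]}

    def add(self, fid, initial_pos):
        # Add a newly detected feature if the library has room for it.
        if len(self.features) < self.capacity and fid not in self.features:
            self.features[fid] = {"pos": np.asarray(initial_pos, float), "conf": 0.5}

    def update(self, fid, predicted_px, observed_px, gate_px=8.0):
        # Raise confidence when the IMU-predicted pixel location matches the observation;
        # lower it (and eventually drop the feature) when it does not.
        err = np.linalg.norm(np.asarray(predicted_px) - np.asarray(observed_px))
        f = self.features[fid]
        f["conf"] = 0.9 * f["conf"] + 0.1 * (1.0 if err < gate_px else 0.0)
        if f["conf"] < self.drop_below:
            del self.features[fid]
        return err   # the innovation magnitude that would feed the V-INS update

    def reliable(self):
        # Features confident enough to be used in the navigation update.
        return [fid for fid, f in self.features.items() if f["conf"] > 0.6]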
Quadrocopter Hovering Using Position-estimation Information from Inertial Sensors and a High-delay Video System
"... Abstract The requirement that mobile robots be-come independent of external sensors, such as GPS, and are able to navigate in an environ-ment by themselves, means that designers have few alternative techniques available. An increas-ingly popular approach is to use computer vision as a source of info ..."
Abstract
-
Cited by 4 (0 self)
- Add to MetaCart
(Show Context)
The requirement that mobile robots become independent of external sensors, such as GPS, and are able to navigate in an environment by themselves means that designers have few alternative techniques available. An increasingly popular approach is to use computer vision as a source of information about the surroundings. This paper presents an implementation of computer vision to hold a quadrocopter aircraft in a stable hovering position using a low-cost, consumer-grade video system. However, such a system is not able to stabilize the aircraft on its own and must rely on a data-fusion algorithm that uses additional measurements from on-board inertial sensors. Special techniques had to be implemented to compensate for the increased delay that the computer vision system introduces into the closed loop: video timestamping to determine the exact delay of the vision system, and a slight modification of the Kalman filter to account for this delay. Finally, the validation results of the proposed filtering technique are presented along with the results of an autonomous flight as a proof of the proposed concept.
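One common way to handle a high-delay, timestamped vision measurement in a Kalman filter is to store a short state history, roll back to the measurement's timestamp, apply the update there, and replay the prediction steps since then. The sketch below shows that pattern; it is an assumption about the flavour of "slight modification" the abstract mentions, not the authors' exact filter, and it uses a plain linear model with no control input.

from collections import deque
import numpy as np

class DelayAwareKF:
    def __init__(self, t0, x0, P0, F, Q, H, R, history_len=200):
        self.F, self.Q, self.H, self.R = F, Q, H, R
        self.x, self.P = x0.copy(), P0.copy()
        self.history = deque(maxlen=history_len)    # (timestamp, state, covariance)
        self.history.append((t0, self.x.copy(), self.P.copy()))

    def predict(self, t):
        # Regular high-rate prediction step (e.g. driven by the inertial sensors).
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        self.history.append((t, self.x.copy(), self.P.copy()))

    def update_delayed(self, t_meas, z):
        # Roll back to the most recent stored state at or before the frame's timestamp.
        popped_times = []
        while len(self.history) > 1 and self.history[-1][0] > t_meas:
            popped_times.append(self.history.pop()[0])
        _, x, P = self.history[-1]
        self.x, self.P = x.copy(), P.copy()
        # Standard Kalman update, applied where the image was actually taken.
        y = z - self.H @ self.x
        K = self.P @ self.H.T @ np.linalg.inv(self.H @ self.P @ self.H.T + self.R)
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ self.H) @ self.P
        # Replay the predictions that ran while the frame was being processed.
        for t in reversed(popped_times):
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            self.history.append((t, self.x.copy(), self.P.copy()))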
Predictive potential field-based collision avoidance for multicopters
- In: Proc. of the 2nd Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g)
, 2013
"... Reliable obstacle avoidance is a key to navigating with UAVs in the close vicinity of static and dynamic obstacles. Wheel-based mobile robots are often equipped with 2D or 3D laser range finders that cover the 2D workspace sufficiently accurate and at a high rate. Micro UAV platforms operate in a 3D ..."
Abstract
-
Cited by 3 (3 self)
- Add to MetaCart
(Show Context)
Reliable obstacle avoidance is key to navigating with UAVs in the close vicinity of static and dynamic obstacles. Wheel-based mobile robots are often equipped with 2D or 3D laser range finders that cover the 2D workspace sufficiently accurately and at a high rate. Micro UAV platforms operate in a 3D environment, but their restricted payload prohibits the use of fast state-of-the-art 3D sensors. Thus, perception of small obstacles is often only possible in the vicinity of the UAV, and a fast collision avoidance system is necessary. We propose a reactive collision avoidance system based on artificial potential fields that takes the special dynamics of UAVs into account by predicting the influence of obstacles on the estimated trajectory in the near future using a learned motion model. Experimental evaluation shows that the prediction leads to smoother trajectories and allows collision-free navigation through passageways.
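As a rough illustration of the predictive potential-field idea (not the authors' implementation), the sketch below rolls the current velocity command forward with an assumed first-order motion model, sums the repulsive forces encountered along the predicted trajectory, and adds the result to the command. Gains, horizon, and the motion model are illustrative assumptions; the paper learns its motion model from data.

import numpy as np

def repulsive_force(pos, obstacles, influence=2.0, gain=1.0):
    # Classic potential-field repulsion from point obstacles within `influence` metres.
    force = np.zeros(3)
    for obs in obstacles:
        d_vec = pos - obs
        d = np.linalg.norm(d_vec)
        if 1e-6 < d < influence:
            force += gain * (1.0 / d - 1.0 / influence) / d**2 * (d_vec / d)
    return force

def predictive_avoidance(pos, vel, cmd_vel, obstacles, dt=0.1, horizon=10, tau=0.5):
    # Predict the trajectory under a first-order velocity-response model and accumulate
    # the repulsion encountered along it, so the correction acts before a collision.
    p, v = pos.copy(), vel.copy()
    correction = np.zeros(3)
    for k in range(horizon):
        v += (cmd_vel - v) * (dt / tau)       # assumed first-order response to velocity commands
        p += v * dt
        correction += repulsive_force(p, obstacles) * (1.0 - k / horizon)  # weight near-term steps
    return cmd_vel + correction

# Example: a forward velocity command is bent away from an obstacle 1.5 m ahead.
pos, vel = np.zeros(3), np.array([1.0, 0.0, 0.0])
safe_cmd = predictive_avoidance(pos, vel, np.array([1.0, 0.0, 0.0]),
                                obstacles=[np.array([1.5, 0.0, 0.0])])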
Cooperative Flight Guidance of Autonomous Unmanned Aerial Vehicles
"... Cooperative Flight Guidance of Autonomous Unmanned Aerial Vehicles As robotic platforms and unmanned aerial vehicles (UAVs) increase in sophistication and complexity, the ability to determine the spatial orientation and placement of the platform in real time (localization) becomes an important issue ..."
Abstract
-
Cited by 3 (0 self)
- Add to MetaCart
(Show Context)
As robotic platforms and unmanned aerial vehicles (UAVs) increase in sophistication and complexity, the ability to determine the spatial orientation and placement of the platform in real time (localization) becomes an important issue. Detecting and extracting the locations of objects, barriers, and openings is required to ensure the overall effectiveness of the device. Current methods to achieve localization for UAVs require expensive external equipment and limit the overall applicable range of the platform. The system described herein incorporates leader-follower unmanned aerial vehicles using vision processing, radio-frequency data transmission, and additional sensors to achieve flocking behavior. This system targets search and rescue environments, employing controls, vision processing, and embedded systems to allow for easy deployment of multiple quadrotor UAVs while requiring the control of only one. The system demonstrates a relative localization scheme for UAVs in a leader-follower configuration, allowing for predictive maneuvers including …
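The leader-follower behaviour described above can be caricatured with a very small controller: the follower tracks a desired offset using a vision-derived estimate of the leader's relative position and velocity. This is only a toy sketch under assumed gains and measurement conventions, not the system from the thesis.

import numpy as np

def follower_velocity_cmd(rel_pos, rel_vel,
                          desired_offset=np.array([-2.0, 0.0, 0.0]),
                          kp=0.8, kd=0.4, v_max=2.0):
    # rel_pos / rel_vel: the leader's position and velocity relative to the follower,
    # e.g. obtained by visually tracking a marker on the leader. desired_offset places
    # the follower 2 m behind the leader (an assumed formation slot).
    error = rel_pos + desired_offset        # zero when the follower sits in its slot
    cmd = kp * error + kd * rel_vel         # close the gap and match the leader's motion
    speed = np.linalg.norm(cmd)
    return cmd if speed <= v_max else cmd * (v_max / speed)

# Example: leader 3 m ahead and slowly pulling away -> the follower speeds up toward its slot.
cmd = follower_velocity_cmd(np.array([3.0, 0.0, 0.0]), np.array([0.3, 0.0, 0.0]))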
Towards Model-Free SLAM Using a Single Laser Range Scanner for Helicopter MAV
"... A new solution for the SLAM problem is presented which makes use of a scan matching algorithm, and does not rely on bayesian filters. The virtual map is represented in the form of an occupancy grid, which stores laser scans based on the estimated position. The occupancy grid is scanned by means of r ..."
Abstract
-
Cited by 1 (1 self)
- Add to MetaCart
A new solution for the SLAM problem is presented which makes use of a scan matching algorithm and does not rely on Bayesian filters. The virtual map is represented in the form of an occupancy grid, which stores laser scans based on the estimated position. The occupancy grid is scanned by means of ray casting to obtain a scan of the virtual world, called a "virtual scan". The virtual scan therefore contains data from all previously acquired laser measurements and hence serves as the best representation of the surroundings. New laser scans are matched against the virtual scan to obtain an estimate of the new position. The scan matching cost function is minimized via an adaptive direct search with boundary updating until convergence. The resulting method is model-free and can be applied to various platforms, including micro aerial vehicles that lack dynamic models. Experimental validation of the SLAM method is presented by mapping a typical office hallway environment with a closed loop, using a manually driven platform and a laser range scanner. The mapping results are highly accurate, and the loop closure area appears to be seamless even though no loop-closure algorithm or post-mapping correction process is used.
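A compact illustration of the matching loop described above, assuming a 2D boolean occupancy grid: ray-cast a "virtual scan" from the current pose estimate and refine the pose by direct search over small perturbations in x, y, and heading, halving the step sizes when no neighbour improves the cost. Grid resolution, the cost function, and the search schedule are assumptions, not the paper's exact method.

import numpy as np

def raycast(grid, pose, angles, max_range=10.0, res=0.05):
    # Return the range along each bearing until an occupied cell of the grid is hit.
    x, y, th = pose
    ranges = np.full(len(angles), max_range)
    for i, a in enumerate(angles):
        for r in np.arange(res, max_range, res):
            cx = int((x + r * np.cos(th + a)) / res)
            cy = int((y + r * np.sin(th + a)) / res)
            if 0 <= cx < grid.shape[0] and 0 <= cy < grid.shape[1] and grid[cx, cy]:
                ranges[i] = r
                break
    return ranges

def match_scan(grid, scan, angles, pose0, step=(0.1, 0.1, 0.05), iters=20):
    # Adaptive direct search: try +/- steps in x, y, theta; shrink the steps when no
    # neighbouring pose improves the cost, and stop after a fixed number of iterations.
    pose, step = np.array(pose0, float), np.array(step, float)
    cost = lambda p: np.mean(np.abs(raycast(grid, p, angles) - scan))
    best = cost(pose)
    for _ in range(iters):
        improved = False
        for axis in range(3):
            for sign in (+1, -1):
                cand = pose.copy()
                cand[axis] += sign * step[axis]
                c = cost(cand)
                if c < best:
                    pose, best, improved = cand, c, True
        if not improved:
            step *= 0.5      # step/boundary update when the search stalls
    return pose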
A vision-based relative navigation approach for autonomous multirotor aircraft
, 2013
"... ..."
(Show Context)
Efficient Estimation for Autonomous Multi-Rotor Helicopters Operating in Unknown, Indoor Environments
, 2012
"... This Dissertation is brought to you for free and open access by BYU ScholarsArchive. It has been accepted for inclusion in All Theses and Dissertations ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
This Dissertation is brought to you for free and open access by BYU ScholarsArchive. It has been accepted for inclusion in All Theses and Dissertations
Omnidirectional Obstacle Perception and Collision Avoidance for Micro Aerial Vehicles
"... Abstract — In this paper, we propose a complete micro aerial vehicle platform—including hardware setup and processing pipeline—that is able to perceive obstacles in (almost) all directions in its surrounding. In order to compensate for deficiencies of individual obstacle sensors, we make use of diff ..."
Abstract
- Add to MetaCart
(Show Context)
In this paper, we propose a complete micro aerial vehicle platform, including hardware setup and processing pipeline, that is able to perceive obstacles in (almost) all directions in its surroundings. In order to compensate for deficiencies of individual obstacle sensors, we make use of different sensor modalities. Detected obstacles are fused and accumulated in a three-dimensional egocentric obstacle map. For avoiding collisions with detected obstacles, we employ a predictive potential field-based approach to relax the assumption of classic approaches that the vehicle can change its dynamic state instantaneously. We present results in simulation and with the integrated robot.
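A rough sketch of fusing detections from several sensor modalities into a 3D egocentric obstacle map, in the spirit of the pipeline described above: each sensor writes occupancy evidence into a vehicle-centred voxel grid, old evidence decays, and the grid is shifted as the vehicle moves so that it stays egocentric. Voxel size, decay, and evidence weights are assumed values, the wrap-around on shifting is ignored for brevity, and this is not the authors' implementation. The resulting obstacle list is the kind of input a predictive potential-field avoidance step would consume.

import numpy as np

class EgocentricObstacleMap:
    def __init__(self, size=64, res=0.25, decay=0.98):
        self.size, self.res, self.decay = size, res, decay
        self.grid = np.zeros((size, size, size))   # occupancy evidence, vehicle at the centre

    def _index(self, p):
        idx = np.round(p / self.res).astype(int) + self.size // 2
        return idx if np.all((idx >= 0) & (idx < self.size)) else None

    def insert(self, points, weight=1.0):
        # Accumulate obstacle points (in the vehicle frame) from one sensor modality.
        for p in points:
            idx = self._index(np.asarray(p, float))
            if idx is not None:
                self.grid[tuple(idx)] = min(self.grid[tuple(idx)] + weight, 10.0)

    def step(self, delta_pos):
        # Decay old evidence and shift the grid opposite to the vehicle's motion.
        self.grid *= self.decay
        shift = np.round(np.asarray(delta_pos) / self.res).astype(int)
        self.grid = np.roll(self.grid, tuple(-shift), axis=(0, 1, 2))

    def obstacles(self, threshold=1.5):
        # Occupied voxel centres, e.g. as input to a potential-field avoidance step.
        idx = np.argwhere(self.grid > threshold)
        return (idx - self.size // 2) * self.res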