Visual odometry and mapping for autonomous flight using an RGB-D camera
In Proc. of the International Symposium on Robotics Research (ISRR), 2011.
Abstract: RGB-D cameras provide both a color image and per-pixel depth estimates. The richness of their data and the recent development of low-cost sensors have combined to present an attractive opportunity for mobile robotics research. In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. By leveraging results from recent state-of-the-art algorithms and hardware, our system enables 3D flight in cluttered environments using only onboard sensor data. All computation and sensing required for local position control are performed onboard the vehicle, reducing the dependence on unreliable wireless links. We evaluate the effectiveness of our system for stabilizing and controlling a quadrotor micro air vehicle, demonstrate its use for constructing detailed 3D maps of an indoor environment, and discuss its limitations.
Autonomous indoor helicopter flight using a single onboard camera
In International Conference on Intelligent Robots and Systems (IROS), 2009.
Abstract: We consider the problem of autonomously flying a helicopter in indoor environments. Navigation in indoor settings poses two major challenges. First, real-time perception and response is crucial because of the high density of obstacles. Second, the limited free space in such a setting places severe restrictions on the size of the aerial vehicle, resulting in a frugal payload budget. We autonomously fly a miniature RC helicopter in small known environments using an on-board light-weight camera as the only sensor. We use an algorithm that combines data-driven image classification with optical flow techniques on the images captured by the camera to achieve real-time 3D localization and navigation. We perform successful autonomous test flights along trajectories in two different indoor settings. Our results demonstrate that our method is capable of autonomous flight even in narrow indoor spaces with sharp corners.
Integrating sensor and motion models to localize an autonomous AR.Drone
International Journal of Micro Air Vehicles, 2011.
An elevation map from a micro aerial vehicle for urban search and rescue
In: RoboCup 2012 CD, 2012.
Abstract: The developments in unmanned aerial vehicles make it possible to use this platform on a much larger scale. The current challenge is to use a team of flying robots to explore a city block, place lookouts at strategic points and, if possible, to enter some of the buildings in the block, to search for hazards and/or people. This challenge is still quite ambitious, but allows researchers to explore some of the aspects in simulation. This paper describes how to build a visual map of the environment including height information. This is an essential step towards more extensive applications, both in simulation and for a real platform.
Amsterdam Oxford Joint Rescue Forces Team Description Paper, Virtual Robot competition, Rescue Simulation League
Abstract: With the progress made in active exploration, the robots of the Joint Rescue Forces are capable of making deliberative decisions about the distribution of exploration locations over the team. Experiments have been done which include information exchange between team-members at rendezvous points and dynamic role switching between relays and explorers. In the previous competition, exploration was demonstrated with large robots with advanced mobility, such as the Kenaf and the AirRobot. This year our mapping algorithms are extended to be able to explore with the smaller AR.Drone, a flying robot used in the International Micro Air Vehicle competition. Further, progress will be demonstrated in automatic victim detection.
Saliency detection and model-based tracking: a two part vision system for small robot navigation in forested environment
In Proceedings of SPIE, 2012.
Abstract: Towards the goal of fast, vision-based autonomous flight, localization, and map building to support local planning and control in unstructured outdoor environments, we present a method for incrementally building a map of salient tree trunks while simultaneously estimating the trajectory of a quadrotor flying through a forest. We make significant progress in a class of visual perception methods that produce low-dimensional, geometric information that is ideal for planning and navigation on aerial robots, while directing computational resources using motion saliency, which selects objects that are important to navigation and planning. By low-dimensional geometric information, we mean coarse geometric primitives, which for the purposes of motion planning and navigation are suitable proxies for real-world objects. Additionally, we develop a method for summarizing past image measurements that avoids expensive computations on a history of images while maintaining the key non-linearities that make full map and trajectory smoothing possible. We demonstrate results with data from a small, commercially-available quadrotor flying in a challenging, forested environment.
Virtual Robot competition
2011.