Real-time video stabilization without phantom movements for micro aerial vehicles
RESEARCH (Open Access)
Wilbert G Aguilar* and Cecilio Angulo
Abstract

In recent times, micro aerial vehicles (MAVs) have become popular for several applications such as rescue, surveillance, and mapping. Undesired motion between consecutive frames is a problem in video recorded by MAVs. Different approaches, applied in video post-processing, address this issue; however, only a few algorithms can run in real time. An additional, critical problem is the presence of false movements in the stabilized video. In this paper, we present a new video stabilization approach that can be used in real time without generating false movements. Our proposal combines a low-pass filter with control action information to estimate the motion intention.

Keywords: Video stabilization; Micro aerial vehicles; Real time; Filter; Motion intention
Introduction

The growing interest in developing unmanned aerial vehicles (UAVs) stems from their versatility in applications such as rescue, transport, and surveillance. A particular type of UAV that has become popular is the micro aerial vehicle (MAV), thanks to its ability to fly in closed and confined spaces. Robust guidance, navigation, and control systems for MAVs [1] depend on input information obtained from on-board sensors such as cameras. Undesired movements are usually generated during flight as a result of the complex aerodynamic characteristics of the UAV. Unnecessary image rotations and translations appear in the video sequence, increasing the difficulty of controlling the vehicle. Multiple techniques in the literature [2-5] are designed to compensate for the effects of undesired camera movements. Recently, the video stabilization algorithm 'L1 Optimal', provided by the YouTube editor, was introduced in [6]. Another interesting proposal is Parrot's Director Mode, implemented as an iOS (iPhone operating system) application for post-processing videos captured with Parrot's AR.Drones.
*Correspondence: [email protected] Automatic Control Department, UPC-BarcelonaTech, Pau Gargallo Street 5, 08028 Barcelona, Spain
Usually, offline video stabilization techniques are divided into three stages:

• Local motion estimation
• Motion intention estimation
• Motion compensation

Local motion estimation
In this phase, the parameters relating the uncompensated image to the reference image are determined frame by frame. Optical flow [7,8] and geometric transformation models [9-11] are two common approaches for local motion estimation; our algorithm uses the latter.

Geometric transformation models are based on estimating the motion parameters. For this estimation, interest points must be detected and described. A list of techniques performing this task can be found in the literature [12-14], but Binary Robust Invariant Scalable Keypoints (BRISK) [15], Fast Retina Keypoint (FREAK) [16], Oriented FAST and Rotated BRIEF (ORB) [1
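As an illustration of the geometric transformation model, consider the 2D similarity case (rotation, scale, and translation), which can be fit to matched keypoint pairs by linear least squares. This is only a minimal sketch, not the paper's implementation: it assumes a keypoint detector/matcher (e.g. BRISK or ORB) has already produced the correspondences, and the function name `estimate_similarity` is our own.

```python
import numpy as np

def estimate_similarity(src, dst):
    """Least-squares fit of a 2D similarity transform
    [[a, -b, tx], [b, a, ty]] mapping src points onto dst points.

    src, dst: (N, 2) arrays of matched keypoint coordinates.
    Returns (a, b, tx, ty); rotation = atan2(b, a), scale = hypot(a, b).
    """
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    y = dst.reshape(-1)  # interleaved [x0', y0', x1', y1', ...]
    # Row for x': a*x - b*y + tx;  row for y': b*x + a*y + ty.
    A[0::2] = np.column_stack([src[:, 0], -src[:, 1], np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([src[:, 1],  src[:, 0], np.zeros(n), np.ones(n)])
    params, *_ = np.linalg.lstsq(A, y, rcond=None)
    return params  # a, b, tx, ty

# Synthetic check: points rotated 10 degrees and translated by (5, -3).
rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(20, 2))
th = np.deg2rad(10.0)
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
moved = pts @ R.T + np.array([5.0, -3.0])
a, b, tx, ty = estimate_similarity(pts, moved)
```

In practice, a robust estimator such as RANSAC would wrap this least-squares step to reject mismatched keypoints before fitting.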