A key requirement for Unmanned Aerial Vehicle
(UAV) applications is the attitude stabilization of the aircraft,
which requires knowledge of its orientation. It is now
well established that traditional navigation equipment, such as
GPS or INS, suffers from several disadvantages. Several
works have therefore proposed vision-based approaches to the
problem. In particular, catadioptric vision is increasingly used
since it gathers much more information from the
environment than traditional perspective cameras, thereby
improving the robustness of UAV attitude estimation.
Rotation estimation from conventional and catadioptric
images has been extensively studied. While interesting results
can be obtained, existing methods suffer from non-negligible
limitations, such as difficult feature matching (e.g., repeated
textures, blurring, or illumination changes) or high computational
cost (e.g., vanishing point extraction or frequency-domain
analysis). To overcome these limitations, this paper
presents a top-down approach for estimating the rotation and
extracting the vanishing points in catadioptric images. This new
framework is accurate and runs in real time. To obtain the
ground-truth data, we also calibrate our catadioptric camera
against a gyroscope. Finally, experimental results on a real video
sequence are presented and compared to the gyroscope
ground truth.