Brain-computer interfaces (BCIs) have been shown to be a powerful means of providing communication without motor actions. However, they are currently limited by their methodology and by user aptitude. Motor imagery (MI) based BCIs require lengthy training sessions before they can be used effectively, while steady-state visual evoked potential (SSVEP) based BCIs are limited in the number of classes they can offer by the refresh rate of the display monitor and the available visual space. By employing MI- and SSVEP-based BCIs in tandem with an augmented reality (AR) device, we designed a new asynchronous hybrid AR-BCI system for navigating quadcopter flight. The hybrid BCI allowed us to reduce the training time required for the MI component and to increase the total number of classes without adding visual clutter. Using the AR device to display the SSVEP stimuli lets users look around freely while maintaining mental control over the quadcopter.