Object recognition for aerial vehicles has grown in importance in recent years, and many studies have been conducted on it. For Urban Air Mobility (UAM), it is essential to recognize other airborne objects, such as drones and birds, and to avoid collisions in flight. In this paper, two sensors are fused to detect objects: a camera, which is lightweight and low-power, and a lidar, which offers high near-field reliability and provides the position of an object. Radar is typically used for object recognition on aircraft, but it is replaced with lidar here to conduct research on a drone platform. By exploiting the complementary characteristics of the two sensors, the recognition rate for objects at both short and long range is increased, and reliability is improved through sensor redundancy. In addition, system optimization enables real-time operation on an embedded board. Existing aircraft recognize other vehicles using radar and communication; the sensor fusion presented in this paper, by contrast, increases the object recognition rate in stand-alone situations.