Vision-based UAV detection and tracking using motion signatures

2018 
This paper proposes a method for UAV detection and position extraction in video sequences obtained from a camera facing upwards towards the sky. The goal is for the presented model to act as groundwork for the development of a cooperative UAV autonomous landing system. It seeks to overcome most of the shortcomings of pattern-based approaches by instead using the UAV's innate motion behaviour to perform detection. Since the sky is a fairly static environment, objects are detected with a background subtraction algorithm. Because clouds generally exhibit slow, progressive movement, they are treated as part of the background, while all other objects (e.g., planes, birds, UAVs) are treated as foreground. The irregular motion patterns of the UAV, especially of its propellers, are used to build a movement signature that distinguishes the UAV from other objects. The signature is computed as an entropy metric derived from the optical flow over a number of past frames. To further improve the detection rate, a tracking algorithm based on a Kalman filter was developed. Experimental results on a dataset of 12 diverse videos show that the computer vision algorithm can track the UAV with an average performance of 93.4%.
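The core idea of the motion signature (an entropy metric over past optical-flow estimates) could be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the abstract does not specify the entropy formulation, so the choice of flow-direction histograms, the bin count, and the magnitude threshold below are all assumptions. Erratic motion such as spinning propellers spreads flow directions across many bins (high entropy), while smoothly drifting objects such as clouds concentrate in few bins (low entropy).

```python
import numpy as np

def motion_entropy(flow_history, bins=16, min_magnitude=0.5):
    """Entropy of the optical-flow direction distribution over past frames.

    flow_history: list of (H, W, 2) arrays holding per-pixel flow (dx, dy)
    for the last N frames (e.g., from a dense optical-flow estimator).
    Returns a scalar: high entropy suggests erratic motion (UAV propellers),
    low entropy suggests smooth, coherent motion (clouds, planes).

    NOTE: the histogram-based formulation, bin count, and magnitude
    threshold are illustrative assumptions, not the paper's exact method.
    """
    angles = []
    for flow in flow_history:
        dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
        mag = np.hypot(dx, dy)
        moving = mag > min_magnitude          # ignore near-static pixels
        angles.append(np.arctan2(dy[moving], dx[moving]))
    angles = np.concatenate(angles)
    if angles.size == 0:
        return 0.0                            # no motion observed
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]                              # drop empty bins (0*log 0 = 0)
    return float(-(p * np.log2(p)).sum())     # Shannon entropy in bits
```

A detector along these lines would threshold this score for each foreground blob produced by background subtraction, keeping only blobs whose recent flow history is sufficiently chaotic.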