A YOLO-based Separation of Touching-Pigs for Smart Pig Farm Applications

2019 
For specific livestock such as pigs in a pigsty, many surveillance applications have been reported that consider their health for efficient livestock management. For pig surveillance applications, separating touching-pigs in real time is an important issue for the final goal of 24-hour tracking of individual pigs. Although convolutional neural network (CNN)-based instance segmentation techniques can be applied to this separation problem, their collective accuracy-time performance may not satisfy the required performance. In this study, we improve the collective accuracy-time performance by combining the fastest CNN-based object detection technique (i.e., You Only Look Once, YOLO) with image processing techniques. We first apply image processing techniques to detect touching-pigs by using both infrared and depth information acquired from an Intel RealSense camera, then apply YOLO to separate the touching-pigs. In particular, when using YOLO as an object detector, we consider the target object to be the boundary region of the touching-pigs, rather than the individual pigs within the touching-pigs. Finally, we apply image processing techniques to determine the final boundary line from the YOLO output. Our experimental results show that this method is effective in separating touching-pigs in terms of the collective accuracy-time performance, compared to a recently reported CNN-based instance segmentation technique.
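The first pipeline stage described above (image processing on the depth data to flag touching-pigs before YOLO is applied) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the depth threshold, the single-pig area threshold, and the function name are all assumptions introduced here, and a real system would calibrate them from the camera geometry.

```python
import numpy as np
from scipy import ndimage

def find_touching_pigs(depth, floor_depth=200, single_pig_area=50):
    """Flag foreground blobs too large to be a single pig.

    depth: 2-D array of depth values from a ceiling-mounted camera,
    so pigs appear closer (smaller depth) than the floor plane.
    All thresholds here are illustrative, not taken from the paper.
    """
    foreground = depth < floor_depth           # pigs stand above the floor plane
    labels, n = ndimage.label(foreground)      # connected-component labeling
    touching = []
    for i in range(1, n + 1):
        area = int((labels == i).sum())
        if area > single_pig_area:             # larger than one pig => touching group
            touching.append(i)
    return labels, touching

# Toy depth map: one large blob (a touching group) and one single pig.
depth = np.full((20, 20), 255)
depth[2:12, 2:14] = 100    # large blob: 10 x 12 = 120 pixels
depth[15:18, 15:18] = 100  # small blob: 3 x 3 = 9 pixels
labels, touching = find_touching_pigs(depth)
```

In the paper's pipeline, only the regions flagged this way would then be cropped and passed to YOLO, which detects the boundary region between the pigs rather than the pigs themselves.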