Vision-based monitoring and measurement of bottlenose dolphins' daily habitat use and kinematics

2021 
This research presents a framework to enable computer-automated observation and monitoring of bottlenose dolphins (Tursiops truncatus) in a professionally managed environment. Results from this work provide insight into the dolphins' movement patterns, kinematic diversity, and how changes in the environment affect their dynamics. Fixed overhead cameras were used to collect ~100 hours of observations, recorded over multiple days including time both during and outside of formal training sessions. Animal locations were estimated using convolutional neural network (CNN) object detectors and Kalman filter post-processing. The resulting animal tracks were used to quantify habitat use and animal dynamics. Additionally, Kolmogorov-Smirnov analyses of the swimming kinematics were used for high-level behavioral mode classification. The detectors achieved a minimum Average Precision of 0.76. Performing detections and post-processing yielded 1.24 × 10^7 estimated dolphin locations. Animal kinematic diversity was found to be lowest in the morning and peaked immediately before noon. Regions of the habitat displaying the highest activity levels correlated to locations associated with animal care specialists, conspecifics, or enrichment. The work presented here demonstrates that CNN object detection is not only viable for large-scale marine mammal tracking, but also enables automated analyses of dynamics that provide new insight into animal movement and behavior.
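To make the pipeline concrete, the sketch below illustrates two of the steps described in the abstract: smoothing per-frame detection centroids with a constant-velocity Kalman filter and comparing speed distributions between two time windows with a two-sample Kolmogorov-Smirnov test. This is a minimal illustration under assumed noise parameters and synthetic data, not the authors' implementation; all function names and values here are hypothetical.

```python
# Minimal sketch (not the paper's code): Kalman-smooth detection centroids
# from a CNN object detector, derive speeds, and compare two track segments
# with a two-sample Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp

def kalman_smooth(centroids, dt=1/30, q=1e-2, r=1.0):
    """Constant-velocity Kalman filter over (x, y) detection centroids.
    dt, q, r are assumed frame interval and noise magnitudes."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]])          # state transition (x, y, vx, vy)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]])           # we only observe position
    Q = q * np.eye(4)                      # process noise (assumed)
    R = r * np.eye(2)                      # measurement noise (assumed)
    x = np.array([*centroids[0], 0.0, 0.0])
    P = np.eye(4)
    states = []
    for z in centroids:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the detected centroid
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        states.append(x.copy())
    return np.array(states)                # columns: x, y, vx, vy

# Toy usage: two synthetic track segments standing in for two time windows.
rng = np.random.default_rng(0)
seg_a = np.cumsum(rng.normal(0.0, 0.5, size=(300, 2)), axis=0)
seg_b = np.cumsum(rng.normal(0.0, 1.5, size=(300, 2)), axis=0)
speeds_a = np.linalg.norm(kalman_smooth(seg_a)[:, 2:], axis=1)
speeds_b = np.linalg.norm(kalman_smooth(seg_b)[:, 2:], axis=1)

# KS test: do the two windows share the same speed distribution?
res = ks_2samp(speeds_a, speeds_b)
print(f"KS statistic = {res.statistic:.3f}, p = {res.pvalue:.3g}")
```

In this framing, a large KS statistic (small p-value) between time windows would indicate a shift in the swimming-speed distribution, which is the kind of signal the abstract describes using for high-level behavioral mode classification.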