Energy Efficient Object Detection in Camera Sensor Networks

2017 
A wireless camera network can provide situation awareness information (e.g., humans in distress) in scenarios such as disaster recovery. If such camera sensors are battery operated, sending raw video feeds back to a central controller can be expensive in terms of energy consumption. Further, if all cameras were to use the optimal processing algorithm for object detection, they may expend unnecessary energy. Stated otherwise, cameras that capture the same objects do not all have to use the optimal algorithm to achieve a desired accuracy, and this can save processing energy costs. In this paper, our objective is to design and implement a framework that supports coordination among cameras to deliver highly accurate detection of objects in an energy efficient way. The framework, which we call EECS (for energy efficient camera sensors), estimates the detection accuracy and energy costs (both processing and communication) incurred with each detection algorithm for each camera, and selects a set of cameras to send information pertaining to the object of interest. This set of cameras, and the video processing algorithms they must use, are chosen so as to minimize the energy expenditure given a desired detection accuracy. We implement EECS on a camera network built with smartphones, and demonstrate that it reduces the energy consumption by up to 40% while ensuring an object detection accuracy of over 86%.
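
To make the selection problem concrete, the following is a minimal Python sketch, not the paper's actual formulation: each camera is assigned either "skip" or one detection algorithm, each option carries an assumed energy cost and standalone accuracy, and an exhaustive search picks the cheapest assignment whose fused accuracy meets the target. The camera names, energy figures, accuracies, fusion rule, and brute-force search are all illustrative assumptions.

# Illustrative sketch of the camera/algorithm selection that EECS performs.
# All values and the fusion rule below are assumptions, not from the paper.
from itertools import product

# Per-camera options: (algorithm, energy cost in joules, standalone accuracy).
# "skip" means the camera sends nothing and contributes no accuracy.
CAMERA_OPTIONS = {
    "cam1": [("skip", 0.0, 0.0), ("fast", 1.2, 0.70), ("optimal", 3.5, 0.92)],
    "cam2": [("skip", 0.0, 0.0), ("fast", 1.0, 0.65), ("optimal", 3.0, 0.90)],
    "cam3": [("skip", 0.0, 0.0), ("fast", 1.5, 0.75), ("optimal", 4.0, 0.95)],
}

def fused_accuracy(accuracies):
    """Assumed fusion rule: detection succeeds if any selected camera detects."""
    miss = 1.0
    for a in accuracies:
        miss *= (1.0 - a)
    return 1.0 - miss

def select_cameras(target_accuracy):
    """Exhaustively search assignments; return the cheapest one meeting the target."""
    cams = list(CAMERA_OPTIONS)
    best = None
    for choice in product(*(CAMERA_OPTIONS[c] for c in cams)):
        acc = fused_accuracy(a for _, _, a in choice)
        energy = sum(e for _, e, _ in choice)
        if acc >= target_accuracy and (best is None or energy < best[0]):
            best = (energy, dict(zip(cams, (alg for alg, _, _ in choice))))
    return best

if __name__ == "__main__":
    # e.g., require the 86% accuracy level reported in the abstract
    print(select_cameras(target_accuracy=0.86))

In practice the paper's framework would replace the exhaustive search and the toy cost/accuracy numbers with its own estimates, but the structure of the decision (per-camera algorithm choice, energy minimization under an accuracy constraint) is the same.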