Ember - energy management of batteryless event detection sensors with deep reinforcement learning: demo abstract.

2020 
Batteryless sensors avoid battery replacement at the cost of slowing down or stopping their operation when there is not enough energy to harvest from the environment. While this strategy can work for some applications, event-based applications remain a challenge because events arrive sporadically and energy availability is uncertain. One solution is to turn a sensor on only right before an event occurs, both detecting the event and saving as much energy as possible. The system therefore has to correctly predict events while managing limited energy resources. In this demo, we present Ember, an energy management system based on deep reinforcement learning that duty-cycles event-driven sensors in low-energy conditions. We show how our system learns environmental patterns over time and makes decisions that maximize the event detection rate of batteryless energy-harvesting sensor nodes subject to low energy availability. Furthermore, we show a novel self-supervised data collection algorithm that helps Ember discover new environmental patterns over time. For more details, we refer readers to the full Ember paper [2].
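
The sketch below illustrates the kind of duty-cycling decision loop the abstract describes: an agent observes stored energy and time of day, chooses whether to sleep or wake the sensor for the next slot, and is rewarded for detected events and penalized for misses and wasted wake-ups. It is a minimal illustration only, simplified to tabular Q-learning rather than a deep network; the state features, reward shaping, and synthetic event/harvesting model are assumptions for this example, not Ember's actual design.

import random
from collections import defaultdict

# Hypothetical, simplified duty-cycling agent. Tabular Q-learning stands in
# for the deep RL policy; state = (discretized stored energy, hour of day),
# actions = {sleep, wake}. All constants below are illustrative assumptions.

ACTIONS = ("sleep", "wake")            # duty-cycle decision for the next slot
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1  # learning rate, discount, exploration

Q = defaultdict(float)                  # Q[(state, action)] -> value estimate

def choose_action(state):
    """Epsilon-greedy selection over the two duty-cycle actions."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """One Q-learning backup after observing the reward for a slot."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

def reward_fn(action, event_occurred):
    """Reward detections; penalize missed events and wasted wake-ups."""
    if action == "wake":
        return 1.0 if event_occurred else -0.1  # energy spent with no event
    return -1.0 if event_occurred else 0.0      # missed an event while asleep

def simulate(num_days=200, slots_per_day=24):
    """Synthetic environment: events cluster around hours 8-10,
    energy harvests sporadically, and the buffer holds at most 10 units."""
    energy = 5
    for _ in range(num_days):
        for hour in range(slots_per_day):
            state = (min(energy, 10), hour)
            action = choose_action(state) if energy > 0 else "sleep"
            event = 8 <= hour <= 10 and random.random() < 0.7
            if action == "wake":
                energy -= 1                      # cost of turning the sensor on
            if random.random() < 0.3:
                energy = min(energy + 1, 10)     # sporadic energy harvest
            next_state = (min(energy, 10), (hour + 1) % slots_per_day)
            update(state, action, reward_fn(action, event), next_state)
    return Q

if __name__ == "__main__":
    simulate()
    # After training, the learned policy should prefer "wake" around hours 8-10.
    for hour in range(24):
        best = max(ACTIONS, key=lambda a: Q[((10, hour), a)])
        print(hour, best)

In this toy setup the agent learns to concentrate wake-ups in the hours where events are likely, which mirrors the goal stated in the abstract of maximizing detection rate under scarce harvested energy.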