TinySurveillance: A Low-power Event-based Surveillance Method for Unmanned Aerial Vehicles
Unmanned Aerial Vehicles (UAVs) have long faced power management challenges. Managing power consumption is especially critical in surveillance applications, where longer flight time yields wider coverage and a cheaper solution. While most current studies focus on new models that improve event detection without considering power constraints, our design's first priority is the platform's power efficiency. Implementing an algorithm on a portable device with minimal access to power sources requires special hardware and software considerations: an improved algorithm may demand more powerful hardware, which can surge power consumption. We therefore propose a method suitable for devices with such power constraints. In this work, we present an event-driven surveillance method with an efficient video transmission algorithm that reduces power consumption while preserving image quality. Surveillance starts automatically once the low-power AI-based onboard processor detects the desired event. The drone repeatedly solves a classification problem using a lightweight deep learning algorithm. When the UAV detects the defined event, a sample image is sent to the server for validation. If the server validates the drone's decision, the drone starts sending one color image accompanied by a group of N grayscale images. On the server, the grayscale images are then colorized by a convolutional neural network trained on the color images. With this method, the transmitted data rate decreases while the server's computational load increases; the former yields a drop in the UAV's power consumption, which is our aim. As a proof of concept of the TinySurveillance method, we implement a wildfire detection and surveillance application.
Using four videos of scenarios a UAV may face, with various spatial and temporal characteristics, we show the effectiveness of our method. Our results show that the power consumption of the onboard processing unit in detection mode is reduced by a factor of at least four while reaching a detection accuracy of 85%. In surveillance mode, we decrease the data transmission rate by almost 66% while maintaining competent image quality, with a PSNR_Avg of 41.35 dB, a PSNR of 30.94 dB, and an output frame rate of 5.2 fps. The reproduced images further demonstrate the algorithm's strong performance, with colorized images closely matching the original scenes. The main factors affecting the method's power consumption and output quality, including the number of grayscale images, the transmitted video bitrate, the learning rate, and the video characteristics, are discussed comprehensively.
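The roughly 66% transmission reduction is consistent with a simple channel-count argument, sketched below under the assumption of uncompressed frames where a color frame carries three channels and a grayscale frame one; the function name and the idea of reasoning in raw channel counts are illustrative, not part of the thesis implementation.

```python
def payload_ratio(n_gray: int) -> float:
    """Raw-pixel payload of (1 color + n_gray grayscale) frames,
    relative to sending all n_gray + 1 frames in color.

    Assumes a color frame = 3 channels, a grayscale frame = 1 channel,
    and identical resolution for every frame (illustrative model only).
    """
    channels_sent = 3 + n_gray * 1          # one color frame + N grayscale frames
    channels_full = (n_gray + 1) * 3        # baseline: every frame in color
    return channels_sent / channels_full


# As N grows, the ratio approaches 1/3, i.e. a reduction approaching ~67%,
# matching the order of magnitude of the reported ~66% saving.
for n in (0, 4, 9, 29):
    print(n, round(1 - payload_ratio(n), 3))
```

This back-of-the-envelope model ignores video compression, which in practice shifts the exact saving, but it shows why mostly-grayscale transmission bounds the reduction near two-thirds.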
Surveillance, Power Consumption, TinyML, Image Colorization, Wildfire
Master of Science (M.Sc.)
Electrical and Computer Engineering