Knowledge Agora




Title Learning-Based Energy-Efficient Data Collection by Unmanned Vehicles in Smart Cities
ID_Doc 42990
Authors Zhang, B; Liu, CH; Tang, J; Xu, ZY; Ma, J; Wang, WD
Year 2018
Published IEEE Transactions on Industrial Informatics, 14(4)
Abstract Mobile crowdsourcing (MCS) is now an important source of information for smart cities, especially with the help of unmanned aerial vehicles (UAVs) and driverless cars. These vehicles are equipped with various kinds of high-precision sensors and can be fully scheduled and controlled during data collection, which makes the MCS system more robust. However, they are energy-constrained, especially for long-term, long-distance sensing tasks, and cities are often too crowded to site stationary charging stations. To this end, in this paper we propose to leverage emerging deep reinforcement learning (DRL) techniques to enable model-free unmanned-vehicle control, and present a novel and highly effective control framework called "DRL-RVC." It utilizes a powerful convolutional neural network for feature extraction of the necessary information (including sample distribution, traffic flow, etc.), and then makes decisions under the guidance of a deep Q network. That is, UAVs cruise the city without manual control and collect most of the required data in the sensing region, while a mobile unmanned charging station reaches the charging point in the shortest possible time. Finally, we validate and evaluate the proposed framework via extensive simulations based on a real dataset from Rome. Extensive simulation results justify the effectiveness and robustness of our approach.
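The decision loop the abstract describes (a Q function scoring actions, epsilon-greedy exploration, value updates from collection rewards) can be illustrated with a minimal sketch. Note the paper's DRL-RVC uses a CNN feature extractor feeding a deep Q network over real city data; here a tabular Q function on a toy 5x5 grid stands in for both, and all names, rewards, and data-cell positions are illustrative assumptions, not taken from the paper.

```python
import random

# Minimal tabular Q-learning sketch of a UAV data-collection decision loop.
# Assumed toy setup: a 5x5 grid, a few cells holding sensing data, +1 reward
# for entering a data cell, small step cost otherwise.

ACTIONS = ["up", "down", "left", "right"]
GRID = 5
DATA_CELLS = {(1, 3), (4, 4), (2, 0)}  # cells with required data (assumed)

def step(state, action):
    """Move the vehicle one cell; reward +1 on a data cell, -0.01 otherwise."""
    r, c = state
    if action == "up":
        r = max(r - 1, 0)
    elif action == "down":
        r = min(r + 1, GRID - 1)
    elif action == "left":
        c = max(c - 1, 0)
    elif action == "right":
        c = min(c + 1, GRID - 1)
    nxt = (r, c)
    return nxt, (1.0 if nxt in DATA_CELLS else -0.01)

def train(episodes=2000, horizon=20, alpha=0.5, gamma=0.9, eps=0.1):
    """Learn Q-values with epsilon-greedy exploration and the Q-learning update."""
    random.seed(0)
    q = {}  # (state, action) -> estimated value
    for _ in range(episodes):
        state = (0, 0)
        for _ in range(horizon):
            if random.random() < eps:
                action = random.choice(ACTIONS)      # explore
            else:                                     # exploit current estimate
                action = max(ACTIONS, key=lambda a: q.get((state, a), 0.0))
            nxt, reward = step(state, action)
            best_next = max(q.get((nxt, a), 0.0) for a in ACTIONS)
            old = q.get((state, action), 0.0)
            q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
            state = nxt
    return q
```

After training, following the greedy policy from the start cell steers the vehicle toward a data cell; DRL-RVC replaces the table with a CNN-fed deep Q network so the same loop scales to raw city-state inputs.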

Similar Articles

ID Score Article
41601 Yun, WJ; Ha, YJ; Jung, S; Kim, J. Autonomous Aerial Mobility Learning for Drone-Taxi Flight Control (2021)