Residential College | false |
Status | Published |
Energy-Efficient UAV-Enabled Data Collection via Wireless Charging: A Reinforcement Learning Approach |
Fu Shu1; Tang Yujie2; Wu Yuan3 |
2021-01 | |
Source Publication | IEEE Internet of Things Journal
ISSN | 2327-4662 |
Volume | 8 | Issue | 12 | Pages | 10209-10219 |
Abstract | In this article, we study the application of an unmanned aerial vehicle (UAV) for data collection with wireless charging, which is crucial for providing seamless coverage and improving system performance in next-generation wireless networks. To this end, we propose a reinforcement learning-based approach to plan the route of the UAV as it collects data from sensor devices scattered in the physical environment. Specifically, the environment is divided into multiple grids; at the center of each grid there is a spot where the UAV can hover, together with a wireless charger that can recharge the UAV while it hovers there, so that the UAV can be charged at the spot whenever it lacks energy. By taking into account both the amount of collected data and the energy consumption, we formulate the problem of data collection with the UAV as a Markov decision problem and exploit Q-learning to find the optimal policy. In particular, we design the reward function considering the energy efficiency of UAV flight and data collection, based on which the Q-table is updated to guide the route of the UAV. Through extensive simulation results, we verify that our proposed reward function achieves better performance in terms of average throughput, data-collection delay, and UAV energy efficiency, in comparison with the conventional capacity-based reward function. |
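Note | As a rough illustration of the grid-based Q-learning formulation summarized in the abstract, the sketch below shows tabular Q-learning over a hypothetical grid world with a "collected data minus energy cost" style reward. The grid size, action set, reward terms, and episode settings are illustrative assumptions only, not the paper's actual model or parameters.

```python
import numpy as np

# Illustrative tabular Q-learning sketch for grid-based UAV route planning.
# All constants below are hypothetical placeholders, not the paper's values.
GRID = 5                                               # assumed grid dimension
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]   # move in 4 directions, or hover to collect/charge
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1                  # learning rate, discount, exploration rate

q_table = np.zeros((GRID * GRID, len(ACTIONS)))

def step(state, action):
    """Hypothetical environment: returns the next grid cell and a reward that
    trades off collected data against energy consumption (placeholder model)."""
    x, y = divmod(state, GRID)
    dx, dy = ACTIONS[action]
    nx = min(max(x + dx, 0), GRID - 1)
    ny = min(max(y + dy, 0), GRID - 1)
    data_collected = np.random.rand() if (dx, dy) == (0, 0) else 0.0  # collect only while hovering
    energy_cost = 0.05 if (dx, dy) == (0, 0) else 0.2                 # flying costs more than hovering
    reward = data_collected - energy_cost                             # energy-efficiency-flavoured reward
    return nx * GRID + ny, reward

for episode in range(500):
    state = 0
    for _ in range(100):
        # epsilon-greedy action selection over the Q-table
        if np.random.rand() < EPSILON:
            action = np.random.randint(len(ACTIONS))
        else:
            action = int(np.argmax(q_table[state]))
        next_state, reward = step(state, action)
        # standard Q-learning update
        q_table[state, action] += ALPHA * (
            reward + GAMMA * np.max(q_table[next_state]) - q_table[state, action]
        )
        state = next_state
```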
Keyword | Data Collection; Design of Reward Function; Energy Efficiency; Q-learning; Reinforcement Learning; Unmanned Aerial Vehicle (UAV) |
DOI | 10.1109/JIOT.2021.3051370 |
URL | View the original |
Indexed By | SCIE |
Language | English |
WOS Research Area | Computer Science ; Engineering ; Telecommunications |
WOS Subject | Computer Science, Information Systems ; Engineering, Electrical & Electronic ; Telecommunications |
WOS ID | WOS:000658354800065 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Scopus ID | 2-s2.0-85099538795 |
Document Type | Journal article |
Collection | THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU) |
Corresponding Author | Liu Min |
Affiliation | 1. School of Microelectronics and Communication Engineering, Chongqing University, Chongqing, China
2. Department of Computer Science and Mathematics, Algoma University, Sault Ste. Marie, Canada
3. State Key Laboratory of Internet of Things for Smart City, Department of Computer Information Science, University of Macau, Macao
4. Department of Electrical and Computer Engineering, University of Windsor, Windsor, Canada
5. State Key Laboratory of Integrated Services Networks, Xidian University, Xi'an, China |
Recommended Citation GB/T 7714 | Fu Shu, Tang Yujie, Wu Yuan, et al. Energy-Efficient UAV-Enabled Data Collection via Wireless Charging: A Reinforcement Learning Approach[J]. IEEE Internet of Things Journal, 2021, 8(12): 10209-10219. |
APA | Fu Shu, Tang Yujie, Wu Yuan, Zhang Ning, Gu Huaxi, Chen Chen, & Liu Min (2021). Energy-Efficient UAV-Enabled Data Collection via Wireless Charging: A Reinforcement Learning Approach. IEEE Internet of Things Journal, 8(12), 10209-10219. |
MLA | Fu Shu, et al. "Energy-Efficient UAV-Enabled Data Collection via Wireless Charging: A Reinforcement Learning Approach." IEEE Internet of Things Journal 8.12 (2021): 10209-10219. |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.