Resource Awareness in Unmanned Aerial Vehicle-Assisted Mobile-Edge Computing Systems

This paper investigates an unmanned aerial vehicle (UAV)-assisted mobile-edge computing (MEC) system, in which the UAV provides complementary computation resources to the terrestrial MEC system. The UAV processes the computation tasks received from the mobile users (MUs) by creating the corresponding virtual machines. Because the UAV's shared I/O resources are finite, the MUs compete to schedule local as well as remote task computations across the decision epochs, each aiming to maximize its expected long-term computation performance. The non-cooperative interactions among the MUs are modeled as a stochastic game, in which each MU's decisions depend on the global state statistics and the task scheduling policies of all MUs are coupled. To approximate the Nash equilibrium solutions, we propose a proactive scheme based on long short-term memory (LSTM) and deep reinforcement learning (DRL) techniques. A digital twin of the MEC system is established to train the proactive DRL scheme offline. Under the proposed scheme, each MU makes task scheduling decisions using only its own local information. Numerical experiments show a significant performance gain from the scheme in terms of average utility per MU across the decision epochs.
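The core decision each MU faces, choosing between local computation and offloading to the UAV under congestion on the UAV's shared I/O resources, can be illustrated with a deliberately simplified sketch. This is not the paper's algorithm (which uses an LSTM-based DRL scheme trained offline in a digital twin): it is a toy tabular Q-learning stand-in with hypothetical states, utilities, and congestion dynamics, meant only to show the local-vs-remote scheduling trade-off.

```python
import random

# Toy sketch (NOT the paper's LSTM/DRL scheme): one mobile user (MU)
# learns via tabular Q-learning whether to compute a task locally or
# offload it to the UAV. States and reward values are hypothetical.

STATES = ["light_load", "heavy_load"]   # assumed UAV I/O congestion levels
ACTIONS = ["local", "offload"]

def reward(state, action):
    # Hypothetical utilities: offloading pays off only when the
    # UAV's shared I/O resources are lightly loaded.
    if action == "offload":
        return 1.0 if state == "light_load" else -0.5
    return 0.2  # local computation: modest but reliable utility

def train(episodes=5000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    state = rng.choice(STATES)
    for _ in range(episodes):
        # epsilon-greedy action selection
        if rng.random() < eps:
            action = rng.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        r = reward(state, action)
        next_state = rng.choice(STATES)  # i.i.d. congestion, for simplicity
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        # standard Q-learning update toward the long-term return
        q[(state, action)] += alpha * (r + gamma * best_next - q[(state, action)])
        state = next_state
    return q

if __name__ == "__main__":
    q = train()
    policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
    print(policy)  # offload when the UAV is lightly loaded, else compute locally
```

In the paper's full setting the congestion state is neither observable nor i.i.d., which is why the proposed scheme uses an LSTM to predict it proactively from each MU's own history rather than a lookup table.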

Xianfu Chen, Tao Chen, Zhifeng Zhao, Honggang Zhang, Mehdi Bennis, Yusheng Ji

Publication type:
A4 Article in conference proceedings

Place of publication:
91st IEEE Vehicular Technology Conference, VTC Spring 2020

deep reinforcement learning, digital twin, long short-term memory, mobile-edge computing, resource awareness, unmanned aerial vehicle


Full citation:
X. Chen, T. Chen, Z. Zhao, H. Zhang, M. Bennis and Y. Ji, “Resource Awareness in Unmanned Aerial Vehicle-Assisted Mobile-Edge Computing Systems,” 2020 IEEE 91st Vehicular Technology Conference (VTC2020-Spring), Antwerp, Belgium, 2020, pp. 1-6, doi: 10.1109/VTC2020-Spring48590.2020.9128981.
