Open access paper

Service Migration in Mobile Edge Computing Based on Reinforcement Learning

Chen Fan and Li Li

Published under licence by IOP Publishing Ltd
Citation: Chen Fan and Li Li 2020 J. Phys.: Conf. Ser. 1584 012058. DOI 10.1088/1742-6596/1584/1/012058


Abstract

Mobile edge computing (MEC) provides users with cloud computing capabilities at the edge of the mobile network, which effectively reduces network latency and improves the experience of end-users. User mobility in MEC is a factor that cannot be ignored, and mobility management is an urgent problem to be solved. Service migration is an effective way to manage user mobility. However, performing migration too frequently is not appropriate because of the expensive migration overhead. In this paper, we propose a service migration decision algorithm that decides whether or not to migrate when a user moves out of the coverage of the offloaded MEC server. A Markov decision process (MDP) is used to model the service migration decision problem. We comprehensively consider the distance between users and services, the resource requirements of the services, and the resource availability of the MEC servers. Taking both migration costs and resource conditions into account, and with the aim of maximizing quality of service (QoS), we define the reward function of the MDP and solve the migration decision strategy with a Q-learning algorithm. Finally, our proposed migration decision algorithm is validated by simulation.
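The abstract describes the approach only at a high level. As a rough illustration of the kind of decision process involved, the following is a minimal tabular Q-learning sketch for a binary stay/migrate decision. The state (user-service distance, server load), the reward shape, the toy transition model, and all parameter values are illustrative assumptions, not the paper's actual MDP formulation or reward function.

```python
import random
from collections import defaultdict

# Illustrative parameters (not from the paper): distance is discretized into
# hop counts 0..MAX_DIST, server load into LOAD_LEVELS discrete levels.
MAX_DIST = 5
LOAD_LEVELS = 3
ACTIONS = (0, 1)            # 0 = keep service on current MEC server, 1 = migrate
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
MIGRATION_COST = 2.0        # fixed penalty charged whenever the service migrates


def reward(distance, load, action):
    """Toy reward: QoS degrades with user-service distance and target-server load;
    migrating resets the distance to 0 but pays a fixed migration cost."""
    if action == 1:
        return -MIGRATION_COST - load
    return -float(distance) - load


def step(distance, load, action):
    """Toy transition: the user drifts one hop farther per time slot,
    and the server load changes randomly."""
    if action == 1:
        distance = 0
    distance = min(distance + 1, MAX_DIST)
    load = random.randrange(LOAD_LEVELS)
    return distance, load


def train(episodes=2000, horizon=50):
    q = defaultdict(float)  # Q[(state, action)] -> estimated value
    for _ in range(episodes):
        state = (random.randrange(MAX_DIST + 1), random.randrange(LOAD_LEVELS))
        for _ in range(horizon):
            # epsilon-greedy action selection
            if random.random() < EPSILON:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            r = reward(state[0], state[1], action)
            next_state = step(state[0], state[1], action)
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            # standard Q-learning update rule
            q[(state, action)] += ALPHA * (r + GAMMA * best_next - q[(state, action)])
            state = next_state
    return q


if __name__ == "__main__":
    q_table = train()
    # Inspect the learned policy: migration should become preferable
    # once the user has drifted far enough from the serving MEC node.
    for d in range(MAX_DIST + 1):
        s = (d, 1)
        decision = "migrate" if q_table[(s, 1)] > q_table[(s, 0)] else "stay"
        print(f"distance={d}, load=1 -> {decision}")
```

In the paper's formulation the reward additionally reflects the service's resource requirements and the candidate servers' resource availability; the sketch above collapses those factors into a single load term purely for brevity.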


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
