Paper
7 August 2024
Double deep Q-network-based task offloading strategy in mobile edge computing
Mingchu Li, Zhi Yang
Proceedings Volume 13224, 4th International Conference on Internet of Things and Smart City (IoTSC 2024); 1322418 (2024) https://doi.org/10.1117/12.3034986
Event: 4th International Conference on Internet of Things and Smart City, 2024, Hangzhou, China
Abstract
With the popularization of smartphones, mobile applications, and the mobile Internet, mobile devices (MDs) face growing demand for real-time, low-latency processing. However, MDs are constrained in computational power and resources and cannot rely entirely on cloud computing for their processing needs. To reduce network latency and improve user experience, mobile edge computing (MEC) has emerged, and research on computation offloading lays the foundation for realizing MEC. In this work, for scenarios involving multiple users and multiple edge servers, we adopt a double deep Q-network (DDQN) strategy to address the task offloading problem. Our primary objective is to reduce total system latency while accounting for device mobility, task urgency, and task heterogeneity. We extend the DDQN algorithm with a prioritized experience replay mechanism. Experimental results indicate that the improved DDQN method converges faster and effectively reduces task latency relative to baseline algorithms.
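The paper's implementation is not reproduced on this page; as an illustration of the general mechanism the abstract names, the following is a minimal PyTorch sketch of a DDQN update combined with prioritized experience replay. The list-based proportional buffer (rather than a sum-tree), the hyperparameters (alpha, beta, gamma, batch size), and the state/action encoding of the offloading problem are assumptions for illustration only, not details taken from the paper.

```python
import numpy as np
import torch
from collections import namedtuple

Transition = namedtuple("Transition", "state action reward next_state done")

class PrioritizedReplayBuffer:
    """Proportional prioritized experience replay (simplified; no sum-tree)."""
    def __init__(self, capacity, alpha=0.6):
        self.capacity, self.alpha = capacity, alpha
        self.buffer, self.priorities = [], []

    def push(self, transition):
        # New transitions get the current maximum priority so they are sampled at least once.
        max_p = max(self.priorities, default=1.0)
        if len(self.buffer) >= self.capacity:
            self.buffer.pop(0)
            self.priorities.pop(0)
        self.buffer.append(transition)
        self.priorities.append(max_p)

    def sample(self, batch_size, beta=0.4):
        probs = np.array(self.priorities) ** self.alpha
        probs /= probs.sum()
        idx = np.random.choice(len(self.buffer), batch_size, p=probs)
        # Importance-sampling weights correct the bias from non-uniform sampling.
        weights = (len(self.buffer) * probs[idx]) ** (-beta)
        weights /= weights.max()
        batch = [self.buffer[i] for i in idx]
        return batch, idx, torch.tensor(weights, dtype=torch.float32)

    def update_priorities(self, idx, td_errors, eps=1e-5):
        for i, err in zip(idx, td_errors):
            self.priorities[i] = abs(float(err)) + eps


def ddqn_update(online_net, target_net, optimizer, buffer, batch_size=32, gamma=0.99):
    batch, idx, weights = buffer.sample(batch_size)
    states = torch.stack([torch.as_tensor(t.state, dtype=torch.float32) for t in batch])
    next_states = torch.stack([torch.as_tensor(t.next_state, dtype=torch.float32) for t in batch])
    actions = torch.tensor([t.action for t in batch]).unsqueeze(1)
    rewards = torch.tensor([t.reward for t in batch], dtype=torch.float32)
    dones = torch.tensor([t.done for t in batch], dtype=torch.float32)

    q = online_net(states).gather(1, actions).squeeze(1)
    with torch.no_grad():
        # Double DQN: the online network selects the next action,
        # the target network evaluates it, reducing overestimation bias.
        next_actions = online_net(next_states).argmax(1, keepdim=True)
        next_q = target_net(next_states).gather(1, next_actions).squeeze(1)
        target = rewards + gamma * (1.0 - dones) * next_q

    td_errors = target - q
    loss = (weights * td_errors.pow(2)).mean()  # importance-weighted TD loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    buffer.update_priorities(idx, td_errors.detach())
    return loss.item()
```

In an offloading setting, the state would typically encode task sizes, deadlines, device positions, and server loads, while each discrete action selects a local or edge-server execution target; those modeling choices are specific to the paper and not shown here.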
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Mingchu Li and Zhi Yang "Double deep Q-network-based task offloading strategy in mobile edge computing", Proc. SPIE 13224, 4th International Conference on Internet of Things and Smart City (IoTSC 2024), 1322418 (7 August 2024); https://doi.org/10.1117/12.3034986
KEYWORDS
Mobile devices
Mathematical optimization
Computing systems
Education and training
Performance modeling
Instrument modeling