Paper Title

Learning Based Task Offloading in Digital Twin Empowered Internet of Vehicles

Authors

Jinkai Zheng, Tom H. Luan, Longxiang Gao, Yao Zhang, Yuan Wu

Abstract

Mobile edge computing has become an effective and fundamental paradigm for futuristic autonomous vehicles to offload computing tasks. However, due to the high mobility of vehicles, the dynamics of wireless conditions, and the uncertainty of arriving computing tasks, it is difficult for a single vehicle to determine the optimal offloading strategy. In this paper, we propose a Digital Twin (DT) empowered task offloading framework for the Internet of Vehicles. As a software agent residing in the cloud, a DT can obtain both global network information through communications among DTs and the historical information of a vehicle through communications within the twin. The global network information and historical vehicular information can significantly facilitate offloading. Specifically, to preserve the precious computing resources at different levels for the most appropriate computing tasks, we integrate a learning scheme based on the prediction of future computing tasks into the DT. Accordingly, we model the offloading scheduling process as a Markov Decision Process (MDP) to minimize the long-term cost, defined as a trade-off among task latency, energy consumption, and the renting cost of clouds. Simulation results demonstrate that our algorithm can effectively find the optimal offloading strategy and achieves fast convergence and high performance compared with other existing approaches.
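To make the MDP formulation in the abstract concrete, below is a minimal sketch (not the authors' algorithm or their DT framework): tabular Q-learning over a toy state of (task level, channel level) with offloading actions {local, edge, cloud}, where the per-step cost is an assumed weighted sum of task latency, energy consumption, and cloud renting cost. The state discretization, cost model, and all weights and constants are illustrative assumptions introduced only for this sketch.

```python
# Minimal sketch (illustrative, not the paper's method): tabular Q-learning
# for a toy offloading MDP. States discretize (task size level, channel level);
# actions are {0: local, 1: edge, 2: cloud}. The per-step cost is an assumed
# weighted sum of latency, energy, and cloud renting cost.
import numpy as np

rng = np.random.default_rng(0)

N_TASK_LEVELS, N_CHANNEL_LEVELS, N_ACTIONS = 4, 3, 3
W_LATENCY, W_ENERGY, W_RENT = 1.0, 0.5, 0.3      # assumed trade-off weights
ALPHA, GAMMA, EPSILON, N_STEPS = 0.1, 0.9, 0.1, 10_000

Q = np.zeros((N_TASK_LEVELS, N_CHANNEL_LEVELS, N_ACTIONS))

def step_cost(task, channel, action):
    """Illustrative cost model: latency/energy/rent depend on where the task runs."""
    size = (task + 1) * 2.0                       # pseudo task size (Mbits)
    rate = (channel + 1) * 1.5                    # pseudo uplink rate (Mbit/s)
    if action == 0:                               # local execution
        latency, energy, rent = size / 1.0, 0.8 * size, 0.0
    elif action == 1:                             # edge server
        latency, energy, rent = size / rate + size / 4.0, 0.2 * size, 0.1 * size
    else:                                         # remote cloud
        latency, energy, rent = size / rate + size / 8.0 + 1.0, 0.2 * size, 0.3 * size
    return W_LATENCY * latency + W_ENERGY * energy + W_RENT * rent

def next_state():
    """Task arrivals and channel dynamics are drawn i.i.d. here for simplicity."""
    return rng.integers(N_TASK_LEVELS), rng.integers(N_CHANNEL_LEVELS)

task, channel = next_state()
for _ in range(N_STEPS):
    # epsilon-greedy selection over offloading choices (argmin: we minimize cost)
    if rng.random() < EPSILON:
        action = int(rng.integers(N_ACTIONS))
    else:
        action = int(np.argmin(Q[task, channel]))
    cost = step_cost(task, channel, action)
    n_task, n_channel = next_state()
    # Q-learning update toward the minimum expected long-term cost
    td_target = cost + GAMMA * Q[n_task, n_channel].min()
    Q[task, channel, action] += ALPHA * (td_target - Q[task, channel, action])
    task, channel = n_task, n_channel

print("Greedy offloading policy (rows: task level, cols: channel level):")
print(np.argmin(Q, axis=2))
```

The learned policy table illustrates the qualitative behavior the abstract describes: small tasks or poor channels favor local execution, while larger tasks with good channels favor edge or cloud offloading, with the balance set by the assumed latency/energy/renting weights.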
