Paper Title

Performance Analysis and Optimization of Cache-Assisted CoMP for Clustered D2D Networks

Paper Authors

Amer, Ramy; ElSawy, Hesham; Kibiłda, Jacek; Butt, M. Majid; Marchetti, Nicola

Paper Abstract

Caching at mobile devices and leveraging cooperative device-to-device (D2D) communications are two promising approaches to support massive content delivery over wireless networks while mitigating the effects of interference. To show the impact of cooperative communication on the performance of cache-enabled D2D networks, the notion of device clustering must be factored in to convey a realistic description of the network performance. In this regard, this paper develops a novel mathematical model, based on stochastic geometry and an optimization framework for cache-assisted coordinated multi-point (CoMP) transmissions with clustered devices. Devices are spatially distributed into disjoint clusters and are assumed to have a surplus memory to cache files from a known library, following a random probabilistic caching scheme. Desired contents that are not self-cached can be obtained via D2D CoMP transmissions from neighboring devices or, as a last resort, from the network. For this model, we analytically characterize the offloading gain and rate coverage probability as functions of the system parameters. An optimal caching strategy is then defined as the content placement scheme that maximizes the offloading gain. For a tractable optimization framework, we pursue two separate approaches to obtain a lower bound and a provably accurate approximation of the offloading gain, which allows us to obtain optimized caching strategies.
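To make the random probabilistic caching scheme mentioned in the abstract concrete, the snippet below gives a minimal sketch of a probabilistic content placement: each device caches file i independently with probability b_i under an expected cache-size budget M. The Zipf popularity profile, the library size N, the cache size M, and the specific placement rule are illustrative assumptions only; the paper itself optimizes the placement to maximize the offloading gain.

```python
import numpy as np

# Minimal sketch (not the paper's code): random probabilistic caching over a
# library of N files with per-device cache size M. Each device caches file i
# with probability b_i, subject to sum_i b_i <= M. The Zipf popularity model
# and all numeric values are illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)

N, M = 20, 4                               # library size, cache size (illustrative)
gamma = 0.8                                # assumed Zipf exponent
popularity = np.arange(1, N + 1) ** -gamma
popularity /= popularity.sum()             # request distribution over the library

# One feasible placement: b_i proportional to popularity, clipped to [0, 1].
# Since sum(M * popularity) == M, clipping only lowers the sum, so the expected
# cache occupancy stays within M.
b = np.minimum(1.0, M * popularity)

# Sample one device's cache with independent Bernoulli draws; the cache-size
# constraint then holds in expectation (an exact-size placement would need a
# more careful sampling scheme than this sketch shows).
cache = rng.random(N) < b

# Probability that a request drawn from the popularity law is self-cached,
# i.e., served with no D2D or network transmission at all.
p_self_hit = float(popularity @ b)
print(f"E[# cached files] = {b.sum():.2f}, self-hit probability = {p_self_hit:.3f}")
```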
