The rapid development and widespread use of large-scale AI models by mobile users is expected to dominate the traffic load in future communication networks. Advances in AI technology also enable a decentralized AI ecosystem in which small organizations, or even individuals, can host AI services. In such scenarios, AI service (model) placement, selection, and request routing decisions are tightly coupled, posing a challenging yet fundamental trade-off between service quality and service latency, especially under user mobility. Existing solutions for related problems in mobile edge computing (MEC) and data-intensive networks fall short due to restrictive assumptions about network structure or user mobility. To bridge this gap, we propose a decentralized framework that jointly optimizes AI service placement, selection, and request routing. The proposed framework uses traffic tunneling to support user mobility without costly AI service migrations. To account for nonlinear queuing delays, we formulate a nonconvex problem that optimizes the trade-off between service quality and end-to-end latency. We derive node-level KKT conditions and develop a decentralized Frank-Wolfe algorithm with a novel messaging protocol. Numerical evaluations validate the proposed approach and show substantial performance improvements over existing methods.
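The abstract's quality-versus-latency trade-off and Frank-Wolfe solver can be illustrated with a toy computation. The Python sketch below is not the paper's decentralized algorithm or messaging protocol; it only shows a centralized Frank-Wolfe iteration for splitting one request stream across candidate (node, model) options, with assumed capacities `C`, model quality scores `q`, arrival rate `r`, and trade-off weight `alpha`, and an M/M/1-style delay term standing in for the nonlinear queuing delay.

```python
# Minimal sketch (assumed parameters, not the paper's method): Frank-Wolfe
# iterations over routing fractions x on the probability simplex, minimizing
# total queuing delay minus a weighted model-quality term.
import numpy as np

r = 8.0                               # total request arrival rate (assumed)
C = np.array([12.0, 15.0, 10.0])      # service capacities, all > r so any split is stable
q = np.array([0.9, 0.6, 0.4])         # quality score of each hosted model (assumed)
alpha = 2.0                           # weight on service quality vs. latency

def cost(x):
    """Total M/M/1-like queuing delay minus weighted quality for routing fractions x."""
    load = r * x
    return np.sum(load / (C - load)) - alpha * np.dot(q, x)

def grad(x):
    load = r * x
    return r * C / (C - load) ** 2 - alpha * q

x = np.ones(3) / 3                    # start from a uniform split
for k in range(200):
    g = grad(x)
    s = np.zeros_like(x)
    s[np.argmin(g)] = 1.0             # linear minimization oracle over the simplex
    gamma = 2.0 / (k + 2.0)           # classic Frank-Wolfe step size
    x = x + gamma * (s - x)

print("routing fractions:", np.round(x, 3), "cost:", round(cost(x), 3))
```

In this toy instance the objective is convex, so the iterates converge to the global optimum; the joint placement, selection, and routing problem described above is nonconvex and further relies on the node-level KKT conditions and message exchanges developed in the paper.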

