Digital Twin Channel-Enabled Online Resource Allocation for 6G: Principle, Architecture and Application

Tongjie Li, Jianhua Zhang, Li Yu, Yuxiang Zhang, Yunlong Cai, Fan Xu, Guangyi Liu

arXiv.org Artificial Intelligence 

The emergence of sixth-generation (6G) networks is reshaping wireless communications to support mission-critical applications such as the Industrial Internet of Things (IIoT), autonomous driving, and smart manufacturing. Compared with 5G, 6G imposes significantly more stringent requirements on latency, reliability, adaptability, and end-to-end responsiveness [1, 2]. IIoT scenarios are particularly challenging due to the coexistence of complex radio propagation conditions and diverse service requirements. Dense deployments, metallic scatterers, and dynamic obstacles give rise to severe multipath fading, especially in high-frequency bands such as mmWave and terahertz, where signal stability is highly sensitive to physical structures [3, 4]. In parallel, service demands span multiple categories, such as periodic sensing, closed-loop control, event-triggered communication, and edge computing, each with distinct quality-of-service (QoS) requirements [5]. To address these multifaceted challenges, resource allocation mechanisms must be environment-aware, latency-sensitive, and capable of online adaptation across large-scale, dynamic deployments. Artificial intelligence (AI)-driven resource allocation has attracted growing interest due to its ability to learn underlying correlations from sensing data and historical records.
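The coexistence of service categories with distinct QoS requirements can be made concrete with a minimal sketch. The following is illustrative only and not from the paper: a greedy online allocator that splits a fixed resource budget across service classes ranked by an assumed QoS weight (all class names, demands, and weights are hypothetical placeholders), reflecting the heterogeneous demands described above.

```python
# Illustrative sketch (not the paper's method): greedy QoS-weighted
# allocation of a fixed resource budget across IIoT service classes.
from dataclasses import dataclass

@dataclass
class ServiceRequest:
    name: str          # service category, e.g. "closed-loop control"
    demand: float      # resource blocks requested (hypothetical units)
    qos_weight: float  # assumed priority: higher = more critical

def allocate(requests, budget):
    """Serve requests in descending QoS weight until the budget runs out."""
    allocation = {}
    remaining = budget
    for req in sorted(requests, key=lambda r: r.qos_weight, reverse=True):
        granted = min(req.demand, remaining)
        allocation[req.name] = granted
        remaining -= granted
    return allocation

requests = [
    ServiceRequest("periodic sensing", demand=30, qos_weight=0.2),
    ServiceRequest("closed-loop control", demand=50, qos_weight=0.9),
    ServiceRequest("event-triggered comms", demand=40, qos_weight=0.6),
]
print(allocate(requests, budget=100))
# → {'closed-loop control': 50, 'event-triggered comms': 40, 'periodic sensing': 10}
```

In an online setting this allocation would be recomputed as requests arrive and channel conditions change, which is where environment-aware inputs such as digital twin channel predictions would feed the decision.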