Optimizing Multi-DNN Inference on Mobile Devices through Heterogeneous Processor Co-Execution
Yunquan Gao, Zhiguo Zhang, Praveen Kumar Donta, Chinmaya Kumar Dehury, Xiujun Wang, Dusit Niyato, Qiyang Zhang
Abstract--Deep Neural Networks (DNNs) are increasingly deployed across diverse industries, driving a growing demand to enable their capabilities on mobile devices. However, existing mobile inference frameworks often rely on a single processor to handle each model's inference, limiting hardware utilization and leading to suboptimal performance and energy efficiency. Expanding DNN accessibility on mobile platforms requires more adaptive and resource-efficient solutions that meet increasing computational demands without compromising device functionality. Nevertheless, parallel inference of multiple DNNs on heterogeneous processors remains a significant challenge. Several works have explored partitioning DNN operations into subgraphs to enable parallel execution across heterogeneous processors. However, these approaches typically generate excessive subgraphs based solely on hardware compatibility, increasing scheduling complexity and memory management overhead. To address these limitations, we propose an Advanced Multi-DNN Model Scheduling (ADMS) strategy that optimizes multi-DNN inference across heterogeneous processors on mobile devices. ADMS constructs an optimal subgraph partitioning strategy offline, considering both hardware support of operations and scheduling granularity, while employing a processor-state-aware scheduling algorithm that dynamically balances workloads based on real-time operational conditions. This ensures efficient workload distribution and maximizes the utilization of available processors. Experimental results show that, compared to vanilla inference frameworks, ADMS reduced multi-DNN inference latency by 4.04.

To reduce interaction latency and lower server-side computing costs, an increasing number of applications are shifting inference tasks to mobile devices. In many real-world scenarios, multiple independent or related DNN models run concurrently on mobile devices. For instance, in smart agriculture, farmers capture video frames with a smartphone camera and perform real-time parallel inference with multiple DNN models, including crop identification [5], pest and disease detection [6], plant health assessment [7], and soil quality analysis [8].

Y. Gao and X. Wang are with the School of Computer Science and Technology, Anhui Engineering Research Center for Intelligent Applications and Security of Industrial Internet, Anhui University of Technology, Ma'anshan, Anhui, 243032, China.
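The abstract's central mechanism is a scheduler that weighs each processor's current backlog against a subgraph's cost when deciding where to place it. The following is a minimal, hypothetical Python sketch of such processor-state-aware greedy placement; the class names, the FLOPs-based cost model, and the `estimate_latency` helper are our own illustrative assumptions, not the ADMS implementation described in the paper.

```python
# Hedged sketch of processor-state-aware greedy scheduling.
# All names and the cost model are assumptions for illustration only.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Subgraph:
    name: str
    flops: float                 # estimated compute cost of this subgraph
    supported_on: List[str]      # processors whose op set covers this subgraph

@dataclass
class Processor:
    name: str                    # e.g. "CPU", "GPU", "NPU"
    throughput: float            # assumed profiled FLOPs per millisecond
    queued_ms: float = 0.0       # current backlog: the "processor state"

def estimate_latency(sg: Subgraph, p: Processor) -> float:
    """Finish time on p = existing backlog + this subgraph's run time."""
    return p.queued_ms + sg.flops / p.throughput

def schedule(subgraphs: List[Subgraph], procs: Dict[str, Processor]) -> Dict[str, str]:
    """Greedily map each subgraph to the compatible processor that finishes it soonest."""
    plan: Dict[str, str] = {}
    for sg in subgraphs:
        candidates = [procs[n] for n in sg.supported_on if n in procs]
        best = min(candidates, key=lambda p: estimate_latency(sg, p))
        best.queued_ms = estimate_latency(sg, best)  # update state for the next decision
        plan[sg.name] = best.name
    return plan

if __name__ == "__main__":
    procs = {"CPU": Processor("CPU", 1.0),
             "GPU": Processor("GPU", 4.0),
             "NPU": Processor("NPU", 8.0)}
    sgs = [Subgraph("conv_block", 80.0, ["GPU", "NPU"]),
           Subgraph("custom_op", 10.0, ["CPU"]),   # op unsupported on GPU/NPU stays on CPU
           Subgraph("fc_head", 20.0, ["CPU", "GPU", "NPU"])]
    print(schedule(sgs, procs))
```

Restricting each subgraph's candidate set to processors that support its operations mirrors the hardware-compatibility constraint the abstract describes, while updating the backlog after each assignment is what makes the placement state-aware rather than a static partition.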
arXiv.org Artificial Intelligence
Mar-26-2025
- Genre:
- Research Report > New Finding (0.87)
- Industry:
- Food & Agriculture > Agriculture (0.48)
- Information Technology (0.68)
- Technology:
- Information Technology > Architecture (1.00)
- Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
- Information Technology > Communications > Mobile (1.00)
- Information Technology > Hardware (1.00)