Optimizing Multi-DNN Inference on Mobile Devices through Heterogeneous Processor Co-Execution