Accelerating AI applications on Windows Subsystem for Linux with Intel's iGPU and OpenVINO toolkit
Are you tired of switching between Windows and Linux environments to perform machine learning (ML) tasks? Do you want an effective way to accelerate inference in your ML applications? This blog post is a guide to configuring your Windows-based system to get the most out of your Intel Integrated Graphics Processing Unit (iGPU). Let's see how Intel's iGPU works with a Linux distribution (such as Ubuntu, openSUSE, Kali, Debian, or Arch Linux) running on the Windows Subsystem for Linux (WSL), and look at the performance benefits OpenVINO brings. I gave this combination of tools a try, with the demo shown below, and was amazed by how seamlessly it works, not to mention the added acceleration!
Aug-12-2022, 12:28:23 GMT