Anaconda Leverages Containers to Accelerate AI Development - Container Journal


Anaconda Inc. announced today that it is leveraging Docker containers and Kubernetes clusters to accelerate the development of AI applications built and deployed using graphics processing units (GPUs) from NVIDIA. Anaconda previously added support for Docker and Kubernetes in version 5.0 of Anaconda Enterprise, a commercially supported instance of an open source platform for developing, governing and automating data science and AI pipelines on Intel processors. Version 5.2 of Anaconda Enterprise extends that platform with support for GPUs. Matthew Lodge, senior vice president of products and marketing at Anaconda, says training AI applications has proven to be significantly faster on GPUs. Over time, however, developers of AI applications will employ a broad range of algorithms across Intel processors, GPUs, field-programmable gate arrays (FPGAs) and new classes of processors, such as the TPUs developed by Google, which are designed specifically for AI applications.
