Microsoft* Turbocharges AI with Intel FPGAs. You Can, Too.


Today, Microsoft* announced a public preview of Azure Machine Learning Hardware Accelerated Models powered by Project Brainwave*, a new AI inferencing service. The service uses Intel Arria 10 FPGAs, configured as "soft DNN processing units" highly tuned to the ResNet-50 image recognition model, to deliver extraordinary throughput. Microsoft calls it "real-time AI."

One year ago, Microsoft Azure CTO Mark Russinovich described the company's plan to build the Azure cloud infrastructure with an FPGA in every node. Instead of creating node pools with specialized hardware accelerators for the wide-ranging workloads deployed in Azure, the Microsoft team chose the flexibility of FPGAs, which can be reconfigured to provide hardware acceleration tailored to nearly any task.
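From a developer's perspective, an inferencing service like this is consumed as a web endpoint: you send an image, and the FPGA-accelerated ResNet-50 model returns class scores. The sketch below illustrates that pattern in Python. It is a minimal, hypothetical example: the endpoint URL, the raw-bytes request format, and the JSON response shape (`{"predictions": [{"label": ..., "score": ...}]}`) are all assumptions for illustration, not the actual Azure Machine Learning API.

```python
import json
import urllib.request


def top_prediction(response_json):
    """Return (label, score) for the highest-scoring class in a
    scoring response. Assumes the hypothetical response shape
    {"predictions": [{"label": ..., "score": ...}, ...]}."""
    preds = json.loads(response_json)["predictions"]
    best = max(preds, key=lambda p: p["score"])
    return best["label"], best["score"]


def score_image(image_path, endpoint_url):
    """POST raw image bytes to a (hypothetical) Brainwave-backed
    scoring endpoint and return the top prediction."""
    with open(image_path, "rb") as f:
        req = urllib.request.Request(
            endpoint_url,
            data=f.read(),
            headers={"Content-Type": "application/octet-stream"},
        )
    with urllib.request.urlopen(req) as resp:
        return top_prediction(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # Parse a mock response locally, with no network call.
    sample = json.dumps({"predictions": [
        {"label": "tabby cat", "score": 0.91},
        {"label": "tiger cat", "score": 0.07},
    ]})
    print(top_prediction(sample))  # ('tabby cat', 0.91)
```

The point of the design is that the FPGA acceleration is invisible to the client: the caller sees an ordinary HTTP endpoint, while the low-latency "real-time AI" work happens on the soft DNN processing unit behind it.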