NetApp gives AI the FlexPod treatment
One of NetApp's biggest successes is its FlexPod line of reference architectures for on-premises infrastructure, which combine NetApp arrays and data fabric software with Cisco servers and networking kit. Analyst firm IDC says the product accounts for a third of the converged systems market and over US$2 billion of annual revenue, and FlexPods are so good at what they do that Microsoft will deploy them for its VMware-on-Azure service.

This time around NetApp has teamed with NVIDIA, whose AI-centric "DGX" servers pack a pair of Xeons and up to 16 Tesla V100 GPUs.

NetApp believes users want to start testing and/or using AI, but are held back by on-premises infrastructure that's not up to the job, plus a fear that building the right hardware stack will be complex and costly. The new "ONTAP AI proven architecture" attempts to change that by explaining how to build rigs based on NetApp's new high-end AFF A800 array, Cisco networking and NVIDIA's DGX servers.
Aug-8-2018, 23:53:36 GMT