AI, AI, Pure: Nvidia cooks deep learning GPU server chips with NetApp
NetApp and Nvidia have introduced a combined AI reference architecture to rival the Pure Storage–Nvidia AIRI system. It is aimed at deep learning and, unlike FlexPod (Cisco and NetApp's converged infrastructure), has no brand name. Unlike AIRI, it also has no enclosure of its own. A NetApp and Nvidia technical whitepaper – Scalable AI Infrastructure: Designing for Real-World Deep Learning Use Cases (PDF) – defines a reference architecture (RA) pairing a NetApp A800 all-flash storage array with Nvidia DGX-1 GPU server systems. A slower, less expensive RA based on the A700 array is also available.
Jun-16-2018, 03:22:16 GMT
- Industry:
- Information Technology > Hardware (1.00)
- Technology:
- Information Technology
- Artificial Intelligence > Machine Learning
- Neural Networks > Deep Learning (0.86)
- Graphics (0.64)
- Hardware (0.64)