Why successful AI needs fast data access

#artificialintelligence

Spending on artificial intelligence systems will grow from $37.5bn worldwide in 2019 to $97.9bn in 2023, according to IDC, and use cases cover everything from ERP, manufacturing software and content management to automated customer service agents, threat intelligence and fraud investigation. Hosting AI workloads in edge computing environments will also help enterprises navigate local data protection regulation and optimise application performance. At the same time, there are concerns that organisations cannot store, process and analyse sufficiently large quantities of good-quality information to build successful use cases. Gartner notes that many organisations struggle to scale their AI pilot projects into enterprise-wide production implementations, which inevitably limits the technology's business value.
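
For context, those IDC figures imply a compound annual growth rate of roughly 27 per cent. A minimal back-of-the-envelope check, using only the start and end values and the four-year span quoted above:

```python
# Illustrative arithmetic only: implied CAGR of the IDC figures quoted above
# ($37.5bn worldwide in 2019 growing to $97.9bn in 2023, i.e. four years).
start, end, years = 37.5, 97.9, 4  # billions of USD
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 27%
```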


Want optimized AI? Rethink your storage infrastructure and data pipeline

#artificialintelligence

Most discussions of AI infrastructure start and end with compute hardware -- the GPUs, general-purpose CPUs, FPGAs, and tensor processing units responsible for training complex algorithms and making predictions based on those models. But AI also demands a lot from your storage. Keeping a potent compute engine well-utilized requires feeding it with vast amounts of information as fast as possible. Anything less and you clog the works and create bottlenecks. Optimizing an AI solution for capacity and cost, while scaling for growth, means taking a fresh look at its data pipeline.
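
As one concrete illustration of that pipeline point, the sketch below shows a common pattern for keeping an accelerator fed: parallel loader workers prefetching batches ahead of the compute step. It assumes a PyTorch environment; the dataset is a synthetic stand-in, and the batch size, worker count and prefetch depth are placeholder values rather than recommendations.

```python
# Minimal sketch: overlapping storage reads with GPU compute in PyTorch.
# The dataset is synthetic and the batch size, worker count and prefetch
# depth are placeholder assumptions, not tuned recommendations.
import torch
from torch.utils.data import Dataset, DataLoader

class SyntheticDataset(Dataset):
    """Stand-in for a dataset whose __getitem__ would normally hit storage."""
    def __len__(self):
        return 10_000

    def __getitem__(self, idx):
        # In a real pipeline, disk or object-store I/O happens here.
        return torch.randn(3, 224, 224), idx % 10

if __name__ == "__main__":
    loader = DataLoader(
        SyntheticDataset(),
        batch_size=64,
        num_workers=4,      # parallel reader processes hide storage latency
        pin_memory=True,    # allows asynchronous host-to-GPU copies
        prefetch_factor=2,  # each worker keeps two batches queued ahead of the GPU
    )
    device = "cuda" if torch.cuda.is_available() else "cpu"
    for images, labels in loader:
        images = images.to(device, non_blocking=True)
        # The forward/backward pass would run here while workers read ahead.
        break
```

The design point is simply that I/O and compute overlap: while the accelerator works on one batch, the workers are already reading the next ones, so sustained storage throughput, rather than per-request latency, becomes the limiting factor.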


insideBIGDATA Guide to Optimized Storage for AI and Deep Learning Workloads - insideBIGDATA

#artificialintelligence

Artificial Intelligence (AI) and Deep Learning (DL) represent some of the most demanding workloads in modern computing history, presenting unique challenges to compute, storage and network resources. In this technology guide, insideBIGDATA Guide to Optimized Storage for AI and Deep Learning Workloads, we'll see how traditional file storage technologies and protocols like NFS restrict the flow of data to AI workloads, reducing application performance and impeding business innovation. A state-of-the-art AI-enabled data center should concurrently and efficiently service the entire spectrum of activities involved in DL workflows, including data ingest, data transformation, training, inference, and model evaluation. The intended audience for this technology guide includes enterprise thought leaders (CIOs, director-level IT, etc.), along with data scientists and data engineers who are seeking guidance on infrastructure and specialized hardware for AI and DL. The emphasis of the guide is on "real world" applications, workloads, and present-day challenges.
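
One hedged way to make the "restricting the flow of data" point concrete is to measure what a given file system actually delivers to a training node. The micro-benchmark below is a minimal sketch; the sample path and block size are assumptions, and a single large-block sequential read is typically the best case, so the small-file, highly concurrent access patterns of DL training will usually see less.

```python
# Hypothetical micro-benchmark: sequential read throughput from a given file
# (for example, a shard on an NFS share versus the same shard on local NVMe).
# The path and block size below are placeholder assumptions.
import os
import time

def read_throughput(path: str, block_size: int = 4 * 1024 * 1024) -> float:
    """Return MB/s for one sequential read pass over `path`."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while True:
            chunk = f.read(block_size)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return (total / (1024 * 1024)) / elapsed

if __name__ == "__main__":
    sample = "/mnt/training-data/shard-000.tar"  # placeholder path
    if os.path.exists(sample):
        print(f"{read_throughput(sample):.0f} MB/s from {sample}")
```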


Storage strategies for machine learning and AI workloads

#artificialintelligence

Businesses are increasingly using data assets to sharpen their competitiveness and drive greater revenue, and machine learning and AI tools and technologies are part of that strategy. But AI workloads have significantly different data storage and computing needs from generic workloads. AI and machine learning workloads require huge amounts of data, both to build and train the models and to keep them running. When it comes to storage for these workloads, high performance and long-term data retention are the most important concerns.
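
A minimal sketch of how those two concerns often coexist in practice, assuming a simple two-tier layout: a small, fast "hot" tier for datasets actively being trained on, and a capacity-oriented tier for long-term retention. The paths and the copy-based staging step are illustrative assumptions; real deployments typically pair an object store with a purpose-built caching or data-orchestration layer rather than plain copies.

```python
# Illustrative two-tier layout: fast hot tier for active training data,
# capacity-oriented tier for long-term retention. Paths are hypothetical.
import shutil
from pathlib import Path

LONG_TERM_TIER = Path("/archive/datasets")   # capacity-optimised, long retention
HOT_TIER = Path("/nvme/active-datasets")     # performance-optimised, limited size

def stage_for_training(dataset: str) -> Path:
    """Copy a dataset from the long-term tier into the hot tier before a run."""
    src, dst = LONG_TERM_TIER / dataset, HOT_TIER / dataset
    if not dst.exists():
        shutil.copytree(src, dst)
    return dst

def retire_after_training(dataset: str) -> None:
    """Free hot-tier capacity after training; the archived copy remains."""
    shutil.rmtree(HOT_TIER / dataset, ignore_errors=True)
```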