storage


The Era of Change – IoT and Machine Learning Trends in Industry for 2020

#artificialintelligence

IoT, or the Internet of Things, is slowly making its way into every aspect of our lives. If you don't own an IoT device yet, you will surely own one soon, and it is highly unlikely that you haven't at least heard of such devices. From smart televisions, fridges, and thermostats to smart coffee makers, IoT devices have infiltrated our daily lives and are slowly gaining mainstream recognition. According to recent studies, the number of active IoT devices reached a significant 26.66 billion in 2019.


All the Data Processing Terms You Need to Know, According to a Data Scientist

#artificialintelligence

Some people may even wonder what data science really means. At its core, data science seeks to answer the what and the why questions. This article aims to introduce all the branches of data science and explain its various phases. Below is a quick look at all the terms and techniques that I'll be reviewing in this article. Data access is the first step in any data science project: it refers to the data scientist's ability to read, write, or retrieve data from a database or a remote repository.
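Since data access is the first term on the list, here is a minimal, purely illustrative sketch of what it can look like in practice, using Python's standard sqlite3 module; the database, table, and column names are hypothetical and not taken from the article.

```python
import sqlite3

# A minimal sketch of "data access": reading from and writing to a database.
# The table and column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")  # stand-in for a real database or remote repository
cur = conn.cursor()

cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, monthly_spend REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, 37.5), (2, 54.0)])

# Read access: pull records for exploration.
for row in cur.execute("SELECT id, monthly_spend FROM customers"):
    print(row)

# Write access: persist a derived or corrected value.
cur.execute("UPDATE customers SET monthly_spend = ? WHERE id = ?", (42.0, 1))
conn.commit()
conn.close()
```

In a real project the connection would point at a production database or remote repository rather than an in-memory store, but the read/write pattern is the same.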


Beat the GPU Storage Bottleneck for AI and ML

#artificialintelligence

Data centers that support AI and ML deployments rely on Graphics Processing Unit (GPU)-based servers to power their computationally intensive architectures. Across multiple industries, expanding GPU use is behind the more than 31 percent CAGR projected for GPU servers through 2024. That means more system architects will be tasked with ensuring top performance and cost-efficiency from GPU systems. Yet optimizing storage for these GPU-based AI/ML workloads is no small feat. GPU servers are highly efficient at the matrix multiplication and convolution required to train models on large AI/ML datasets.
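As a purely illustrative aside (not from the article), here is a minimal sketch, assuming PyTorch, of the kind of matrix multiplication these GPU servers accelerate; it falls back to the CPU when no CUDA device is present. Once this arithmetic runs at GPU speed, keeping the device fed with training batches from storage tends to become the harder problem.

```python
import torch

# Minimal sketch: the matrix multiplication at the heart of most training
# workloads runs far faster on a GPU, which is why data delivery from storage,
# rather than compute, often becomes the limiting factor.
device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # one of the many matmuls a single training step performs
print(c.shape, "computed on", device)
```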


Newsmaker Interview: Derek Manky on 'Self-Organizing Botnet Swarms'

#artificialintelligence

For over five years, Derek Manky, global security strategist at Fortinet and FortiGuard Labs, has been helping the private and public sectors identify and fight cybercrime. His job also includes working with noted groups: Computer Emergency Response, NATO NICP, the INTERPOL Expert Working Group, and the Cyber Threat Alliance. Recently Threatpost caught up with Manky to discuss the latest developments around his research on botnet "swarm intelligence." That's a technique where criminals enlist artificial intelligence (AI) inside botnet nodes. Those nodes are then programmed to work toward a common goal of bolstering an attack chain and shortening the time it takes to breach an organization.


How AI Impacts Storage and IT

#artificialintelligence

Artificial intelligence (AI) and machine learning (ML) have had quite an impact on most industries in the last couple of years, but what about the effect on our own IT industry? On April 1, 2020, the SNIA Cloud Storage Technologies Initiative will host a live webcast, "The Impact of Artificial Intelligence on Storage and IT," where our experts will explore how AI is changing the nature of applications, the shape of the data center, and its demands on storage. Learn how the rise of ML can deliver new insights and capabilities for IT operations. Yes, we know this is on April 1st, but it's no joke! So don't be fooled; find out why everyone is talking about AI now.


Artificial Intelligence and Machine Learning Demand High Performance Storage Series, part three: Execute

#artificialintelligence

In this edition of my Artificial Intelligence series, I wanted to look at the third, and final, major phase of an AI implementation: execution (see the overall AI process in the diagram below). As you may recall, in my previous two blogs I reviewed the Ingest and Transformation and the Training phases of the AI process. As we saw, integrating flash, in the form of SSDs and memory, can have a dramatic impact on overall performance in those two phases. In the ingest/transform phase, the faster you can get raw data from myriad sources into the AI infrastructure and transform it into a usable, standardized format suitable for your modeling, the faster you can move to the train phase. While the train phase is CPU-centric, and more specifically GPU-centric, we showed that adding flash and memory to the solution still makes a big impact.
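To make the three phases concrete, here is a minimal, purely illustrative Python sketch of the ingest/transform, train, and execute flow described in the series; all function names and data are hypothetical stand-ins for far heavier real components.

```python
import random
import time

# Purely illustrative sketch of the three-phase flow described above:
# ingest/transform -> train -> execute (inference). Names are hypothetical.

def ingest_and_transform(raw_records):
    # Normalize raw records into a standardized numeric format.
    return [float(r) / 100.0 for r in raw_records]

def train(samples):
    # Stand-in for a GPU-heavy training loop; here just a mean "model".
    return sum(samples) / len(samples)

def execute(model, new_sample):
    # Inference: apply the trained model to fresh data.
    return "high" if new_sample > model else "low"

raw = [random.randint(0, 100) for _ in range(1_000)]
start = time.perf_counter()
data = ingest_and_transform(raw)   # in practice bound by storage/I/O throughput
model = train(data)                # in practice bound by CPU/GPU compute
print(execute(model, 0.7), f"({time.perf_counter() - start:.4f}s end to end)")
```

The article's argument is that speeding up the first stage with flash and memory shortens the whole end-to-end path, even though the middle stage is where the GPUs do their work.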


HPE makes Kubernetes-based container platform generally available

#artificialintelligence

Hewlett Packard Enterprise (HPE) has put its acquisitions of BlueData and MapR to good use, making the Kubernetes-based container platform built from the two companies' technologies generally available. Built on technology developed by BlueData and MapR, the HPE Container Platform was first announced in November last year, but until now it has only been available in a limited beta. HPE describes the offering as an integrated turnkey solution, with BlueData software as the container management control plane and the MapR distributed file system as the unified data fabric for persistent storage. Broadly, the HPE Container Platform is an enterprise-grade container platform designed to support both cloud-native and non-cloud-native applications using open source Kubernetes, running on bare metal or virtual machines (VMs), in the data centre, on any public cloud, or at the edge. "The next phase of enterprise container adoption requires breakthrough innovation and a new approach," said Kumar Sreekanti, senior vice president and chief technology officer of Hybrid IT at HPE.


HPE Announces the General Availability of its Container Platform

#artificialintelligence

"With the HPE Container Platform, GM Financial has deployed containerized applications for machine learning and data analytics running in production in a multi-tenant hybrid cloud architecture, for multiple use cases from credit risk analysis to improving customer experience," said Lynn Calvo, AVP of Emerging Data Technology at GM Financial. "The next phase of enterprise container adoption requires breakthrough innovation and a new approach," said Kumar Sreekanti, senior vice president and chief technology officer of Hybrid IT at HPE. "Our HPE Container Platform software brings agility and speed to accelerate application development with Kubernetes at scale. Customers benefit from greater cost efficiency by running containers on bare-metal, with the flexibility to run on VMs or in a cloud environment." "We're leveraging the innovations of the open source Kubernetes community, together with our own software innovations for multi-tenancy, security, and persistent data storage with containers," continued Sreekanti. "The new HPE Container Platform is designed to help customers as they expand their containerization deployments, for multiple large-scale Kubernetes clusters with use cases ranging from machine learning to CI / CD pipelines."


Catalysing India's e-economy

#artificialintelligence

India is quickly going digital on the back of a robust IT industry and is on the cusp of a data revolution. The average Indian today consumes approximately 8.3 GB of data per month, a 92 per cent increase over consumption four years ago, and by 2022 per capita data consumption is expected to touch 14 GB. By 2022, the next half billion Indians will come online for the first time through their mobile phones. The immense digital presence of Indians, in terms of 1.25 billion Aadhaar numbers, 1.2 billion mobile phones, and 1 billion bank accounts, requires colossal digital infrastructure for the storage and processing of data. This massive data appetite of a digital-first nation has opened up enormous opportunities for businesses and government alike.