IBM Unlocks New Storage Secrets

#artificialintelligence

For the first time, scientists at IBM Research have demonstrated reliably storing 3 bits of data per cell using a relatively new memory technology known as phase-change memory (PCM). The current memory landscape spans from venerable DRAM to hard disk drives to ubiquitous flash. But in the last several years PCM has attracted the industry's attention as a potential universal memory technology based on its combination of read/write speed, endurance, non-volatility and density. For example, PCM doesn't lose data when powered off, unlike DRAM, and the technology can endure at least 10 million write cycles, compared to an average flash USB stick, which tops out at 3,000 write cycles. This research breakthrough provides fast and easy storage to capture the exponential growth of data from mobile devices and the Internet of Things.
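To make the density claim concrete: storing 3 bits per cell means each PCM cell must reliably hold one of 2^3 = 8 distinguishable resistance levels, so the same number of cells stores three times as much data as single-bit cells. The sketch below is purely illustrative packing arithmetic, not IBM's actual cell encoding or drift-compensation scheme.

```python
# Illustrative sketch only: shows why 3 bits/cell implies 8 distinguishable
# levels per cell and what that buys in density. The mapping here is made up;
# IBM's real PCM encoding and drift-handling techniques are more involved.

BITS_PER_CELL = 3
LEVELS = 2 ** BITS_PER_CELL          # 8 resistance levels per cell

def encode(bits):
    """Pack a bit string into per-cell level indices (0..7)."""
    assert len(bits) % BITS_PER_CELL == 0
    return [int(bits[i:i + BITS_PER_CELL], 2)
            for i in range(0, len(bits), BITS_PER_CELL)]

def decode(levels):
    """Recover the original bit string from per-cell levels."""
    return "".join(format(level, f"0{BITS_PER_CELL}b") for level in levels)

data = "110010101111"
cells = encode(data)                 # 12 bits -> 4 cells instead of 12
assert decode(cells) == data
print(f"{len(data)} bits stored in {len(cells)} cells at {LEVELS} levels/cell")
```

The packing arithmetic is the easy part; what made the demonstration noteworthy is the harder engineering problem of keeping eight analogue levels reliably distinguishable as the cells drift over time and temperature.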


How AI In Edge Computing Drives 5G And The IoT

#artificialintelligence

Edge computing, which is the concept of processing and analyzing data on servers closer to the applications they serve, is growing in popularity and opening new markets for established telecom providers, semiconductor startups, and new software ecosystems. It's remarkable how technology has come together over the last several decades to enable this new space, starting with Big Data and the idea that, with lots of information now stored in mega-sized data centers, we can analyze the chaos in the world to provide new value to consumers. Combine this concept with the IoT, connecting everything from coffee cups to pill dispensers, oil refineries to paper mills, and smart goggles to watches, and the value to the consumer could be enormous. However, many argue the market didn't experience the hockey-stick growth curves expected for the Internet of Things: the connectivity of the IoT simply didn't bring enough consumer value, except in specific niches. Over the past five years, however, technology advances such as artificial intelligence (AI) have begun to revolutionize industries and to expand the value that connectivity can deliver to consumers. It's a very exciting time, as the market sees vast potential in the combination of big data, IoT, and AI, but we are only at the beginning of a long road.


Decision points in storage for artificial intelligence, machine learning and big data

#artificialintelligence

Data analytics has rarely been more newsworthy. Throughout the Covid-19 coronavirus pandemic, governments and bodies such as the World Health Organization (WHO) have produced a stream of statistics and mathematical models. Businesses have run models to test post-lockdown scenarios, planners have looked at traffic flows and public transport journeys, and firms have used artificial intelligence (AI) to reduce the workload for hard-pressed customer services teams and to handle record demand for e-commerce. Even before Covid-19, industry analysts at Gartner pointed out that the expansion of digital business would "result in the unprecedented growth of unstructured data within the enterprise in the next few years". Advanced analytics needs powerful computing to turn data into insights.


AI is data Pac-Man. Winning requires a flashy new storage strategy.

#artificialintelligence

When it comes to data, AI is like Pac-Man. Hard disk drives, NAS, and conventional data center and cloud-based storage schemes can't sate AI's voracious appetite for speed and capacity, especially in real time. Playing the game today requires a fundamental rethinking of storage as a foundation of machine learning, deep learning, image processing, and neural network success. "AI and Big Data are dominating every aspect of decision-making and operations," says Jeff Denworth, vice president of products and co-founder at Vast Data, a provider of all-flash storage and services. "The need for vast amounts of fast data is rendering the traditional storage pyramid obsolete. Applying new thinking to many of the toughest problems helps simplify the storage and access of huge reserves of data, in real time, leading to insights that were not possible before."
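A rough, purely illustrative calculation of why spinning media struggle here (all figures below are assumptions made for the sketch, not measurements from Vast Data or any vendor): training jobs typically read samples in shuffled order, so the storage layer must sustain small random reads at a rate far beyond what a hard disk's mechanical seek time allows.

```python
# Back-of-the-envelope sketch. All numbers are rough assumptions for
# illustration, not vendor specifications.

images_per_second = 2_000        # assumed ingest rate to keep one accelerator busy
bytes_per_image = 150 * 1024     # assumed ~150 KB per compressed image sample

required_mb_s = images_per_second * bytes_per_image / 1e6
print(f"required bandwidth: ~{required_mb_s:.0f} MB/s")

# Sequential bandwidth alone looks manageable...
hdd_seq_mb_s = 200               # rough sequential throughput of one HDD
print(f"HDDs by sequential bandwidth: {required_mb_s / hdd_seq_mb_s:.1f}")

# ...but shuffled training data means each sample is roughly one random read,
# and random-read rates are where HDDs and flash diverge sharply.
hdd_random_iops = 150            # rough random-read IOPS of one HDD
nvme_random_iops = 500_000       # rough random-read IOPS of one NVMe flash drive

print(f"HDDs to sustain {images_per_second} random reads/s: "
      f"{images_per_second / hdd_random_iops:.0f}")
print(f"NVMe drives for the same load: "
      f"{images_per_second / nvme_random_iops:.3f}")
```

Under these assumed numbers, a single flash device covers the random-read load that would take on the order of a dozen hard disks, which is the intuition behind the "traditional storage pyramid is obsolete" argument.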


Take your machine-learning workloads to the edge? Yes, says Intel

#artificialintelligence

Sponsored Artificial intelligence and machine learning hold out the promise of enabling businesses to work smarter and faster, by improving and streamlining operations or offering firms the chance to gain a competitive advantage over their rivals. But where is it best to host such applications – in the cloud, or locally, at the edge? Despite all the hype, it is early days for the technologies that we loosely label "AI", and many organisations lack the expertise and resources to really take advantage of them. Machine learning and deep learning often require teams of experts, for example, as well as access to large data sets for training and specialised infrastructure with a considerable amount of processing power. For many organisations, the cloud is therefore the natural place to start, because cloud service providers have a wealth of development tools and other resources readily available, such as pre-trained deep neural networks for voice, text, image, and translation processing, according to Moor Insights & Strategy Senior Analyst Karl Freund.
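To make the "pre-trained network" point concrete, here is a minimal sketch of reusing an off-the-shelf image classifier for local inference. It assumes PyTorch and torchvision are installed on the device and uses a hypothetical local file, camera_frame.jpg; it is a generic example, not Intel's or any cloud provider's specific toolchain.

```python
# Minimal sketch: run a pre-trained image classifier locally (e.g. on an edge
# device). Assumes torch/torchvision are installed; the first run downloads
# the pre-trained weights, so it needs network access once.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()                              # inference only, no training needed

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("camera_frame.jpg").convert("RGB")   # hypothetical local image
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)

top = torch.topk(probs, k=3)
print(top.indices, top.values)            # top-3 class ids and confidences
```

The point of the sketch is that inference against a ready-made model needs no training data, no expert team, and only modest compute, which is why the cloud-versus-edge question becomes mainly one of latency, bandwidth, and data locality.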