IBM Research explains how quantum computing works and why it matters

#artificialintelligence

As the technological progress codified in Moore's Law slows down, computer scientists are turning to alternative methods of computing, such as superconducting quantum processors, to deliver computing gains in the future. Jeffrey Welser, vice president and lab director at IBM Research at Almaden, spoke about quantum computing at the 49th annual Semicon West chip manufacturing show in San Francisco last week. I caught up with him to get his take on quantum computing for the layperson. IBM also displayed part of its IBM Q System at the show, giving us an idea of how much refrigeration technology has to be built around a current quantum processor to ensure its calculations are accurate. Binary digits -- ones and zeroes -- are the basic components of information in classical computers. Quantum bits, or qubits, are built on a much smaller scale.


IBM Research breakthroughs: Nanosheet to Phase-Change Memory

#artificialintelligence

The newly announced IBM Research innovations are intended to address critical issues in advancing artificial intelligence, including making hardware systems more efficient to keep pace with the demands of artificial intelligence software and data workloads. The announcements were made at the International Electron Devices Meeting (IEDM) conference in San Francisco. The event is an annual micro- and nanoelectronics conference, held each December. The conference serves as a forum for reporting technological breakthroughs in the areas of semiconductor and related device technologies, design, manufacturing, physics, modeling and circuit-device interaction. A key highlight from IBM is its nanosheet technology: IBM shared new features for high-performance computing that meet the massive data requirements of AI and 5G.


Micron debuts flash memory-optimized AI development platform - SiliconANGLE

#artificialintelligence

Computer chipmaker and storage company Micron Technology Inc. is pitching its hardware for artificial intelligence workloads after acquiring a startup called FWDNXT. The company announced the acquisition at its annual Micron Insight conference in San Francisco today, describing FWDNXT as a provider of AI hardware and software for deep learning, a subset of AI that tries to mimic the way the human brain solves problems. Micron's plan is to integrate FWDNXT's technology with its own optimized flash memory products to create what it says will be a "comprehensive AI development platform." "FWDNXT is an architecture designed to create fast-time-to-market edge AI solutions through an extremely easy to use software framework with broad modeling support and flexibility," Micron Executive Vice President and Chief Business Officer Sumit Sadana said in a statement. "FWDNXT's five generations of machine learning inference engine development and neural network algorithms, combined with Micron's deep memory expertise, unlocks new power and performance capabilities to enable innovation for the most complex and demanding edge applications."


Edge computing drives storage innovation while China edges its way into flash memory - SiliconANGLE News - UrIoTNews

#artificialintelligence

The superpowers of the new economy are also the buzzwords changing how the world interacts: Artificial intelligence, the "internet of things" and edge computing are the megatrends dominating the conversation on both a business and a personal level. "Practical everyday things are being done in AI," said David Floyer, co-host of theCUBE, SiliconANGLE Media's mobile livestreaming studio. "It's going from being a niche to being just everyday use, and its impact long-term is profound." TheCUBE co-host Dave Vellante joined Floyer during today's Micron Insight event in San Francisco. They discussed recent developments in storage and memory, as well as the challenges and opportunities facing Micron Technology Inc. in the marketplace.


Nvidia's New EGX Platform Brings Power of Accelerated AI to the Edge

#artificialintelligence

Nvidia has announced the launch of the EGX Edge Supercomputing Platform, designed to let organisations easily deploy the hardware and software necessary for high-performance, low-latency AI workloads. Instead of being deployed inside big data centres, an EGX deployment is designed to sit at the edge of the cloud, which, Nvidia believes, makes it ideal for the next generation of use cases. "We've entered a new era, where billions of always-on IoT sensors will be connected by 5G and processed by AI," Jensen Huang, Nvidia founder and CEO, said at a keynote ahead of MWC Los Angeles earlier this week. "Its foundation requires a new class of highly secure, networked computers operated with ease from far away. "We've created the Nvidia EGX Edge Supercomputing Platform for this world, where computing moves beyond personal and beyond the cloud to operate at planetary scale," he added. The EGX stack includes an Nvidia driver, Kubernetes plug-in, Nvidia container runtime, and GPU monitoring tools, delivered through the Nvidia GPU Operator, which allows you to standardise and automate the deployment of all components needed to provision GPU-enabled Kubernetes systems. Nvidia will certify hardware as 'NGC Ready for Edge' that customers will be able to buy from partners such as Advantech, Altos Computing, ASRock RACK, Atos, Dell Technologies, Fujitsu, GIGABYTE, Hewlett Packard Enterprise, Lenovo, MiTAC, QCT, Supermicro, and TYAN. Nvidia says EGX is already being used by customers. At Walmart's Intelligent Retail Lab in Levittown, New York, for example, EGX enables real-time processing of more than 1.6 terabytes of data generated each second to "automatically alert associates to restock shelves, open up new checkout lanes, retrieve shopping carts, and ensure product freshness in meat and produce departments."
The EGX platform features software to support a wide range of applications, including Nvidia Metropolis, which can be used to power smart cities and build intelligent video analytics applications. The city of Las Vegas, for example, is using EGX to capture vehicle and pedestrian data to make its streets safer. San Francisco's Union Square Business Improvement District is using EGX to capture real-time pedestrian counts for local retailers. "We use our smartphones sporadically -- we type into them, or watch a movie now and then -- and frankly there are only seven and a half billion of us," Huang said. "In the case of sensors, it will be streaming all the time."


VMware and Nvidia partner to simplify virtualised GPUs

#artificialintelligence

Nvidia announced its new enterprise software product, vComputeServer, which has been developed and optimised for use with VMware's vSphere. Last week, VMware announced its intention to acquire Carbon Black and Pivotal, in a massive deal that will expand the company's SaaS offerings while enhancing its ability to enable digital transformation for customers. Before the dust had even settled on that news, the company announced today (26 August) that it is set to launch a hybrid cloud on AWS (Amazon Web Services) in partnership with Nvidia, which will improve GPU (graphics processing unit) virtualisation. The two companies say that this is the first hybrid cloud service that lets enterprises accelerate AI, machine learning or deep learning workloads with GPUs. At the VMworld conference in San Francisco, Nvidia's VP of product management, John Fanelli, told reporters: "In a modern data centre, organisations are going to be using GPUs to power AI, deep learning and analytics. "Due to the scale of those types of workloads, they're going to be doing some processing on premise in data centres, some processing in clouds and continually iterating between them." The company said that this will make the completion of deep learning training up to 50 times faster than with a CPU alone. This product is aimed at people who may be using Nvidia's Rapids software, Fanelli explained, which is a suite of data processing and machine learning libraries used for GPU acceleration in data science workflows. Nvidia founder and CEO Jensen Huang said: "From operational intelligence to artificial intelligence, businesses rely on GPU-accelerated computing to make fast, accurate predictions that directly impact their bottom line."


I played Shadow of the Tomb Raider over 5G, and it didn't suck

PCWorld

Anyone who's experimented with a cloud gaming service knows that wired Ethernet is almost required. At AT&T's Spark conference in San Francisco on Monday, I had a chance to try out Nvidia's GeForce Now service for PCs running over AT&T's 5G service, playing the newly released Shadow of the Tomb Raider game on a generic Lenovo ThinkPad. The traditional way to run a PC game is locally, running the game off a hard drive or SSD on your PC, using the CPU and GPU to render the game as fast as it can. The downside, of course, is that you have to buy all of that hardware yourself. Cloud gaming flips that model: the 3D rendering takes place on a remote server -- a cheaper solution than buying a high-end graphics card, at least in the short term.


A quantum computing startup tries to live up to the hype

The Japan Times

SAN FRANCISCO – Few corners of the tech industry are as tantalizing or complex as quantum computing. For years its evangelists have promised machines capable of breaking the most impenetrable coded messages, unlocking the secret properties of the physical world and putting supercomputers to shame. Right now, Rigetti's challenge for itself is this: Can it solve one, single problem with a quantum computer that a conventional machine cannot? Even if it just meant answering a question more quickly or cheaply than a supercomputer, the team of physicists and mathematicians at the startup's Berkeley, California, office would be overjoyed. Today, your laptop can do pretty much everything one of the startup's quantum computers can, just as quickly.


IBM's Dario Gil says quantum computing promises to accelerate AI

#artificialintelligence

Speaking at MIT Technology Review's EmTech Digital conference in San Francisco, Dario Gil of IBM said that quantum computers, which take advantage of the mind-bending phenomena of quantum physics, could have a big impact on one of the hottest fields in technology: artificial intelligence. Unlike classical computers, which store information in bits that are either 1 or 0, quantum computers use qubits, which can exist in multiple states of 1 and 0 at the same time--a phenomenon known as "superposition." Qubits can also influence one another even when they're not physically connected, via a process known as "entanglement." Thanks to these exotic qualities, adding extra qubits to a quantum machine increases its computing power exponentially. There are still challenges to be overcome.
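The exponential scaling Gil describes follows from how quantum states are represented: an n-qubit register in superposition is described by 2^n complex amplitudes, so each added qubit doubles the state a classical simulator must track. A minimal sketch in plain Python (no quantum library assumed) illustrates the counting:

```python
def uniform_superposition(n_qubits):
    """State vector of n qubits after putting each in an equal
    superposition (a Hadamard on every qubit starting from |0...0>):
    2**n amplitudes, each equal to 1/sqrt(2**n)."""
    dim = 2 ** n_qubits
    amp = (1 / dim) ** 0.5  # probabilities |amp|^2 sum to 1 across all 2**n states
    return [amp] * dim

# Each extra qubit doubles the number of amplitudes to track.
for n in (1, 2, 10, 20):
    state = uniform_superposition(n)
    print(n, "qubits ->", len(state), "amplitudes")
```

At 20 qubits a simulator already tracks over a million amplitudes; this doubling is what makes classical simulation of large quantum machines intractable, and what quantum hardware exploits directly.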


Qubit announces AI-powered mobile commerce discovery platform at MB 2017

#artificialintelligence

Online retailers have a problem. Search is broken, and much of the available inventory is never found by consumers. In fact, 70 percent of the top 50 grossing US ecommerce sites still require "exact matching" to ensure a product appears in the results. Today Qubit -- the marketing personalization company -- announced at MobileBeat (MB 2017) in San Francisco an exclusive beta program for Aura, an AI-powered solution that aims to change how consumers discover the product catalog on mobile devices. Most mcommerce sites attempt to solve product discovery through search.
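The "exact matching" problem described above is easy to demonstrate: a single typo in a query returns nothing, even when the catalog contains a near-perfect match. A hypothetical sketch (the catalog items and similarity cutoff are illustrative, not Qubit's method) contrasts exact lookup with simple fuzzy matching using Python's standard-library difflib:

```python
import difflib

CATALOG = ["running shoes", "trail running shoes", "dress shoes", "sandals"]

def exact_search(query, catalog):
    # Only returns items whose name equals the query exactly --
    # the behavior most ecommerce search still exhibits.
    return [item for item in catalog if item == query]

def fuzzy_search(query, catalog, cutoff=0.5):
    # Ranks items by string similarity, so near-misses still surface.
    return difflib.get_close_matches(query, catalog, n=3, cutoff=cutoff)

print(exact_search("runing shoes", CATALOG))  # typo -> nothing found
print(fuzzy_search("runing shoes", CATALOG))  # typo -> close matches still surface
```

Real product-discovery systems go much further (synonyms, behavioral signals, learned rankings), but even this toy example shows why strict exact matching hides inventory from shoppers.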