

How can quantum computing be useful for Machine Learning

#artificialintelligence

If you've heard of quantum computing, you might be excited about the possibility of applying it to machine learning. I work at Springboard, and we recently launched a machine learning bootcamp that includes a job guarantee. We want to make sure our graduates are exposed to cutting-edge machine learning applications -- so we put together this article as part of our research into the intersection of quantum computing and machine learning. Let's start by examining the difference between quantum computing and classical computing. In classical computing, your data is stored in physical bits, and those bits are binary and mutually exclusive: a bit is either in the 0 state or the 1 state, and it cannot be both at the same time.
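To make the contrast concrete, here is a minimal sketch (my own illustration, not from the article, assuming NumPy is available) of a classical bit versus a qubit written as a two-amplitude state vector. The equal-superposition amplitudes and the simulated measurement are illustrative assumptions.

```python
# Sketch: classical bit vs. qubit state vector (illustrative only).
import numpy as np

# Classical bit: exactly one of two mutually exclusive states.
classical_bit = 0  # or 1, never both at once

# Qubit: a normalized vector of two complex amplitudes over the
# basis states |0> and |1>; both amplitudes can be nonzero at once.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)  # equal superposition
qubit = np.array([alpha, beta], dtype=complex)

# Measuring collapses the qubit to 0 or 1 with probabilities
# |alpha|^2 and |beta|^2 (here 50/50).
probs = np.abs(qubit) ** 2
outcome = np.random.choice([0, 1], p=probs)
print(f"P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f}, measured: {outcome}")
```

Running the sketch repeatedly shows the key difference the excerpt is pointing at: the classical bit is always definitely 0 or 1, while the qubit's state carries both amplitudes until it is measured.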


Emerging Trends In Cognitive Systems

#artificialintelligence

From automated to autonomous and now cognitive, a paradigm shift is taking place in the design principles of machines, matter, methods and more. So, what are the principles of cognitive design? And are they centered around the future of humanity? From tabulating systems to programmable systems and now with cognitive systems, the evolution in computing allows humans to move beyond numbers and data to knowledge and intelligence. It is no longer about the replacement of man with machine, but rather about intelligence augmentation.


Intel Extends FPGA Ecosystem: Edge, Network, Data Center

#artificialintelligence

The insatiable appetite for higher throughput and lower latency – particularly for edge analytics and AI, network functions, and a range of data center acceleration needs – has compelled IT managers and chip makers to venture, increasingly, beyond CPUs and GPUs. The "inherent parallelism" of FPGAs (see below) in handling specialized workloads in AI- and HPDA-related implementations has drawn greater investment from IT decision makers and vendors, who see increasing justification for taking on the challenge of FPGA programming. Of course, adoption of unfamiliar technologies is always painful and slow, particularly for those without a built-out ecosystem of frameworks and APIs that simplify their use. Why are FPGAs bursting out of their communication, industrial and military niches and into the data center? Partly because of the limits of CPUs, which have their roots on the desktop and were, said Steve Conway, senior research VP at Hyperion Research, never really intended for advanced computing.


28 Artificial Intelligence Terms You Need to Know - DZone AI

#artificialintelligence

As artificial intelligence becomes less of an ambiguous marketing buzzword and more of a precise ideology, it is increasingly challenging to understand all of the AI terms out there. So to kick off the brand new AI Zone, the Editorial Team here at DZone got together to define some of the biggest terms in the world of artificial intelligence for you. Algorithms: A set of rules or instructions given to an AI, neural network, or other machine to help it learn on its own; classification, clustering, recommendation, and regression are four of the most popular types. Artificial intelligence: A machine's ability to make decisions and perform tasks that simulate human intelligence and behavior. Artificial neural network (ANN): A learning model created to act like a human brain and solve tasks that are too difficult for traditional computer systems.
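As a quick illustration of two of the algorithm families named in the excerpt, here is a minimal sketch (my own example, assuming scikit-learn is installed; the synthetic blob dataset is an assumption) contrasting classification, which uses known labels, with clustering, which does not.

```python
# Sketch: classification vs. clustering on a small synthetic dataset.
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Synthetic 2-D points grouped into three blobs, with known labels y.
X, y = make_blobs(n_samples=150, centers=3, random_state=0)

# Classification: learn a mapping from points to their known labels.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("classification accuracy:", clf.score(X, y))

# Clustering: group the same points without looking at the labels.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster assignments (first 10):", km.labels_[:10])
```

The distinction is the one implied by the definitions above: classification is supervised (it needs labeled examples), while clustering discovers structure on its own.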


Machine-Learning on the Rise in Financial Services: Refinitiv

#artificialintelligence

Conducted via 447 telephone interviews with senior executives and data-science practitioners across various financial services firms, the survey also found that data quality is the primary barrier to machine-learning adoption. Machine learning has long been the mainstay of deep-pocketed hedge funds, which have combined complex algorithmic strategies with financial data to make big bets on markets. But with the growing use of cloud computing and the constant pressure on banks to reduce costs, machine-learning techniques have seen greater acceptance among banks. "Thanks to parallel computing and cloud computing, we are seeing the playing field being slowly leveled in terms of machine-learning strategies," said Tim Baker, global head of applied innovation at Refinitiv. The survey also found that foreign exchange ranked a distant fourth in terms of structured data by asset class, with stocks, fixed income and derivatives the top three.


Qualcomm President Amon intends to win in cloud where company failed in past

ZDNet

"Were you surprised we went into the cloud?" asks Qualcomm president Cristiano Amon during a chat by video conference. Amon was referring to how Qualcomm, a giant in mobile chips, is now hoping to make it big in machine learning "inference" for data centers. Qualcomm is not a presence in the data center. It entered that market in 2014 with lots of gusto, only to back out last year. When Amon became president, in December of 2017, his team took a look at the enormous cost to compete with the server CPU king, Intel, and how little it had produced in actual shipments for Qualcomm.


Intel unveils broad Xeon stack with dozens of workload-optimized processors

ZDNet

The Intel Xeon Family (from left): Intel Xeon Platinum 9200 processor, 2nd-Gen Intel Xeon Scalable Processor and Intel Xeon D-1600 Processor. Intel Corporation on April 2, 2019, introduced a portfolio of data-centric tools to help its customers extract more value from their data. Intel on Tuesday announced its broadest portfolio of Xeon processors to date, including more than 50 workload-optimized processors. The new Xeon chips, along with other new chips, memory and storage solutions, are all part of Intel's strategy to transform from a "PC-centric" company into a "data-centric" company. The products announced Tuesday amount to an "unmatched portfolio to move, store and process data," Navin Shenoy, Intel EVP and GM of the Data Center Group, said at a launch event.


'Godfathers of AI' Receive Turing Award, the Nobel Prize of Computing - AI Trends

#artificialintelligence

The 2018 Turing Award, known as the "Nobel Prize of computing," has been given to a trio of researchers who laid the foundations for the current boom in artificial intelligence. Yoshua Bengio, Geoffrey Hinton, and Yann LeCun -- sometimes called the 'godfathers of AI' -- have been recognized with the $1 million annual prize for their work developing the AI subfield of deep learning. The techniques the trio developed in the 1990s and 2000s enabled huge breakthroughs in tasks like computer vision and speech recognition. Their work underpins the current proliferation of AI technologies, from self-driving cars to automated medical diagnoses. In fact, you probably interacted with the descendants of Bengio, Hinton, and LeCun's algorithms today -- whether that was the facial recognition system that unlocked your phone, or the AI language model that suggested what to write in your last email.


Developments in Quantum Computing - Connected World

#artificialintelligence

One of the hallmarks of this century will be the progress made toward a new paradigm in computing: quantum computing. A quantum computer has the potential to quickly and efficiently solve problems that conventional computers can't tackle by leveraging principles of quantum physics, such as superposition. While still in its early stages, the quantum computing market is already expanding, and there's much more growth expected in the years to come. This expected growth is thanks to the efforts of computing companies that see the benefits of a quantum future and want to capitalize on it. Tractica says the enterprise quantum computing market will reach $2.2 billion by 2025, up from $39.2 million in 2017.


10 Books on AI Machine Learning You Shouldn't Miss Reading

#artificialintelligence

"If you program a machine, you know what it's capable of. If the machine is programming itself, who knows what it might do?" ― Garry Kasparov Artificial Intelligence is a complex subject. However, reading and acquiring knowledge through books written on Artificial Intelligence, Machine Learning, Data Science and other related topics can help technology enthusiasts to a great extent. Here is a list of ten books on AI and Machine Learning that provide the information on basics of technology, its present, the future paradigm and the most rabid fictionalized set-ups that are expected to arrive in the coming future. We have rated the books on a scale of 1-5, considering their depth, research, uniqueness, reader's review, and the AiThority News Quotient.