Results


Why it's time for CIOs to invest in machine learning

#artificialintelligence

To Dan Olley, machine learning fills a gap that has existed in technology for a long time: solving complex problems with pattern recognition. "With the majority of Elsevier's revenue coming from technology-based products and services, we started using machine learning in our commercial products, but it's equally applicable to internal IT platforms," Olley says. As part of the executive teams within RBI and Elsevier, Olley continues to drive organic online product growth across the portfolio. Prior to joining RELX Group, he held technology and product management leadership roles with GM Financial, Wunderman Cato Johnson, and IBM, as well as a number of software organizations in the United Kingdom and other international locales.


AI everywhere

#artificialintelligence

"We invented a computing model called GPU accelerated computing and we introduced it almost slightly over 10 years ago," Huang said, noting that while AI is only recently dominating tech news headlines, the company was working on the foundation long before that. Nvidia's tech now resides in many of the world's most powerful supercomputers, and the applications include fields that were once considered beyond the realm of modern computing capabilities. Now, Nvidia's graphics hardware occupies a more pivotal role, according to Huang – and the company's long list of high-profile partners, including Microsoft, Facebook and others, bears him out. GTC, in other words, has evolved into arguably the biggest developer event focused on artificial intelligence in the world.


How blockchain can create the world's biggest supercomputer

#artificialintelligence

Even as our desktop computers, laptops, mobile devices and other machines stand idle for a huge portion of the day, the need for computing resources is growing at a fast pace. Large IoT ecosystems, machine learning and deep learning algorithms, and other sophisticated solutions being deployed in every domain and industry are raising the demand for stronger cloud servers and more bandwidth to address the needs of enterprises and businesses. So how can we make more economical and efficient use of all the computing power that's going to waste? Blockchain, the distributed ledger that's gaining traction across various domains, might have the answer to the dilemma by providing a platform that enables participants to lend and borrow computing resources -- and make money in the process. "There is a growing demand for computing power from industries and scientific communities to run large applications and process huge volumes of data," says Gilles Fedak, co-founder of iEx.ec, a distributed cloud computing platform.
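
To make the lend-and-borrow idea concrete, here is a toy, centralized sketch in Python. The names (ComputeLedger, lend, borrow, the participants) are hypothetical illustrations rather than iEx.ec's actual API, and a real platform would replace this in-memory ledger with a distributed, verifiable one:

# Hypothetical, centralized toy ledger (illustration only, not iEx.ec's API).
from dataclasses import dataclass, field

@dataclass
class Offer:
    provider: str               # who is lending the idle machine
    core_hours: float           # capacity on offer
    price_per_core_hour: float  # asking price

@dataclass
class ComputeLedger:
    offers: list = field(default_factory=list)
    balances: dict = field(default_factory=dict)
    history: list = field(default_factory=list)  # append-only record of settled jobs

    def lend(self, provider, core_hours, price_per_core_hour):
        """A participant lists idle capacity on the ledger."""
        self.offers.append(Offer(provider, core_hours, price_per_core_hour))

    def borrow(self, buyer, core_hours):
        """Match a request against the cheapest sufficient offer and settle payment."""
        candidates = [o for o in self.offers if o.core_hours >= core_hours]
        if not candidates:
            raise RuntimeError("no single offer is large enough")
        offer = min(candidates, key=lambda o: o.price_per_core_hour)
        cost = core_hours * offer.price_per_core_hour
        offer.core_hours -= core_hours
        self.balances[offer.provider] = self.balances.get(offer.provider, 0.0) + cost
        self.balances[buyer] = self.balances.get(buyer, 0.0) - cost
        self.history.append((buyer, offer.provider, core_hours, cost))
        return offer.provider

ledger = ComputeLedger()
ledger.lend("alice_laptop", core_hours=8, price_per_core_hour=0.02)
ledger.lend("bob_workstation", core_hours=40, price_per_core_hour=0.01)
ledger.borrow("ml_lab", core_hours=10)  # only bob_workstation has enough capacity
print(ledger.balances)                  # the provider earns, the buyer pays

The interesting engineering lives in what this sketch waves away: in a system like the one Fedak describes, the ledger is replicated across participants and the results of off-chain computation have to be verified before payment is released.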


Smart machines primed to accelerate across business

#artificialintelligence

Almost one-third of larger companies will make use of smart machines by 2021, according to analyst Gartner. Such machines, which include cognitive computing, artificial intelligence (AI), intelligent automation, machine learning and deep learning, will change the way businesses operate, the analyst firm predicted.


From the Turing Test to Deep Learning: Artificial Intelligence Goes Mainstream - 7wData

@machinelearnbot

This year, the Association for Computing Machinery (ACM) celebrates 50 years of the ACM Turing Award, the most prestigious technical award in the computing industry. The Turing Award, generally regarded as the 'Nobel Prize of computing', is an annual prize awarded to "an individual selected for contributions of a technical nature made to the computing community". In celebration of the 50-year milestone, renowned computer scientist Melanie Mitchell spoke to CBR's Ellie Burns about artificial intelligence (AI) – the biggest breakthroughs, hurdles and myths surrounding the technology. EB: What are the most important examples of Artificial Intelligence in mainstream society today? MM: There are many important examples of AI in the mainstream; some very visible, others blended in so well with other methods that the AI part is nearly invisible.


Nvidia CEO's "Hyper-Moore's Law" Vision for Future Supercomputers

#artificialintelligence

Over the last year in particular, we have documented the merger between high performance computing and deep learning and its various shared hardware and software ties. The coming year promises far more on both fronts, and while GPU maker Nvidia might not have seen it coming to this extent when it was outfitting the former top "Titan" supercomputer with its first GPUs, the company sensed the convergence coming when the first hyperscale deep learning shops were deploying CUDA and GPUs to train neural networks. All of this portends an exciting year ahead, and for once the mighty CPU is not the subject of the keenest interest. Instead, the action is unfolding around the CPU's role alongside accelerators: everything from Intel's approach to integrating the Nervana deep learning chips with Xeons, to Pascal and future Volta GPUs, and other novel architectures that have made waves. While Moore's Law for traditional CPU-based computing is on the decline, Jen-Hsun Huang, CEO of GPU maker Nvidia, told The Next Platform at SC16 that we are just on the precipice of a new Moore's Law-like curve of innovation--one driven by traditional CPUs with accelerator kickers, mixed-precision capabilities, new distributed frameworks for managing both AI and supercomputing applications, and an unprecedented amount of data for training.
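
The "mixed precision" Huang refers to can be illustrated in a few lines of plain Python. The sketch below is only an assumption-laden stand-in for what the hardware does: keep the operands in low precision (float16) but carry the accumulation in higher precision (float32) so that rounding error does not swamp the result.

# Illustration only: NumPy as a stand-in for mixed-precision hardware.
import numpy as np

values = np.full(10_000, 0.1, dtype=np.float16)  # low-precision operands

# Naive approach: the running sum itself stays in float16.
acc16 = np.float16(0.0)
for v in values:
    acc16 = np.float16(acc16 + v)

# Mixed precision: float16 storage, float32 accumulator.
acc32 = np.float32(0.0)
for v in values:
    acc32 += np.float32(v)

print("float16 accumulator:", float(acc16))  # stalls far below the true sum
print("float32 accumulator:", float(acc32))  # close to 10,000 * 0.1, i.e. about 1,000

This is, roughly, the trade-off mixed-precision accelerators make: half-precision storage and arithmetic for throughput, with wider accumulators where precision actually matters.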


How Machine Learning is Changing the Face of the Data Center - 7wData

#artificialintelligence

Machine learning and artificial intelligence have arrived in the data center, changing the face of the hyperscale server farm as racks begin to fill with ASICs, GPUs, FPGAs and supercomputers. These technologies provide more computing horsepower to train machine learning systems, a process that involves enormous amounts of data-crunching. The end goal is to create smarter applications and improve the services you already use every day. "Artificial intelligence is now powering things like your Facebook Newsfeed," said Jay Parikh, Global Head of Engineering and Infrastructure for Facebook. "It is helping us serve better ads."


The Business Implications of Machine Learning

#artificialintelligence

As buzzwords become ubiquitous, they become easier to tune out. We've finely honed this defense mechanism, and for good reason. It's better to focus on what's in front of us than on the flavor of the week. CRISPR might change our lives, but knowing how it works doesn't help you. VR could eat all media, but its hardware requirements keep it many years away from common use.


Artificial Intelligence vs. Deep Learning vs. Big Data - Nanalyze

@machinelearnbot

Computing was some pretty exciting stuff for those of us back in the 80s who still remember the first time we booted up our 386DX. That's right, the "DX", not the "MX". While nobody could really say what the advantages of the "DX" were, better at math or something, we still ponied up the extra $200 to pick up that 386DX 16MHz along with a Super VGA graphics card, then hooked that bad boy up to CompuServe via our lightning-fast 14,400 baud U.S. Robotics "Sportster" modem. That was well before Al Gore created the Internet, and a lot has changed since then. Personal computing just isn't cool anymore, and it's all about "the cloud" and "big data" and "deep learning".