Results


Exploring the Artificially Intelligent Future of Finance

#artificialintelligence

With technological advances increasing computing power and decreasing its cost, easing access to big data, and improving algorithms, there has been a surge of interest in artificial intelligence, machine learning, and its subset deep learning in recent years. What have been the leading factors enabling recent advancements and uptake of deep learning? Yuanyuan: Customer experience could be significantly improved by using AI to analyze individual-level attributes and make traditional services much more tailor-made. Alesis: One of the main challenges for start-ups applying machine learning to financial services is educating customers on the importance of data, and of access to it.


IBM Watson: Not So Elementary

#artificialintelligence

As for what we would call unsupervised learning--which is to say, we're not training it to process data, but it's beginning to learn on its own--that is moving more in the direction of what some consider true artificial intelligence, or even AGI: artificial general intelligence. But you can begin to understand extended cold spells, extended warm spells, droughts--and all of these things help with water management and agriculture. And I would say we'll see some of this in other areas--traffic systems, logistics systems, et cetera. In America, we have a national weather service, we have NOAA [National Oceanic and Atmospheric Administration], and we've got a private sector offering forecasts.
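The distinction the excerpt draws is between supervised learning, where a model is trained on labeled examples, and unsupervised learning, where it finds structure on its own. A minimal unsupervised example is clustering; below is an illustrative scikit-learn sketch on synthetic data (the article does not describe Watson's actual algorithms, so this is only a generic example of the idea).

```python
import numpy as np
from sklearn.cluster import KMeans

# Unsupervised learning in miniature: no labels are given, yet the
# algorithm discovers structure (two clusters) in the data on its own.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (100, 2)),   # synthetic group A
                  rng.normal(5, 1, (100, 2))])  # synthetic group B

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
print(km.cluster_centers_)  # roughly (0, 0) and (5, 5), found without labels
```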


IBM Watson: Not So Elementary

#artificialintelligence

FORTUNE: We hear a lot of terms on the AI front these days--"artificial intelligence," "machine learning," "deep learning," "unsupervised learning," and the one IBM uses to describe Watson: "cognitive computing." So it's a system between machine computing and human interpretation, and we call those machine-human interactions cognitive systems. KENNY: It takes enormous, enormous amounts of computing power to do that, because you've got to leave Watson running at all times, just like the human brain, and that's why I believe cloud computing has been such an important enabler here, because prior to cloud computing--where you could access many machines at the same time--you were limited by a mainframe.


How quantum effects could improve artificial intelligence

#artificialintelligence

More recently, research has suggested that quantum effects could offer similar advantages for the emerging field of quantum machine learning (a subfield of artificial intelligence), leading to more intelligent machines that learn quickly and efficiently by interacting with their environments. As quantum technologies emerge, quantum machine learning will play an instrumental role in our society--including deepening our understanding of climate change, assisting in the development of new medicines and therapies, and also in settings that rely on learning through interaction, which is vital in automated cars and smart factories. In the new study, the researchers' main result is that quantum effects can help improve reinforcement learning, which is one of the three main branches of machine learning. But while quantum effects have the potential to offer great improvements in certain situations, in other cases classical machine learning likely performs just as well or better.
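The excerpt names reinforcement learning as one of the three main branches of machine learning (alongside supervised and unsupervised learning). As a rough illustration of the classical baseline that quantum approaches aim to speed up, here is a minimal tabular Q-learning sketch; the toy environment, state and action counts, and hyperparameters are invented for illustration and are not taken from the study.

```python
import numpy as np

# Toy chain environment: 5 states, reward only at the last state.
# Everything here (environment, hyperparameters) is illustrative.
n_states, n_actions = 5, 2     # actions: 0 = left, 1 = right
alpha, gamma, epsilon = 0.1, 0.9, 0.2
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def step(state, action):
    """Move left/right along the chain; reward 1.0 at the terminal state."""
    nxt = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if nxt == n_states - 1 else 0.0
    return nxt, reward, nxt == n_states - 1

for episode in range(500):
    state, done = 0, False
    while not done:
        # Epsilon-greedy: the agent learns by interacting with its environment.
        action = rng.integers(n_actions) if rng.random() < epsilon else int(Q[state].argmax())
        nxt, reward, done = step(state, action)
        # Standard Q-learning update rule.
        Q[state, action] += alpha * (reward + gamma * Q[nxt].max() - Q[state, action])
        state = nxt

print(Q)  # learned action values; "right" should dominate in every state
```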


Business is waking up to the idea of deep learning

#artificialintelligence

In the movie Transcendence, Johnny Depp plays Dr Will Caster, a researcher in artificial intelligence at Berkeley who is trying to build a sentient computer. Deep learning is transforming how computers transcribe speech into text, recognise images, rank search results, and perform many other tasks that require intelligence. It has its demands, though: deep learning requires lots of data, for instance. It is sure to play a critical role in driving autonomous cars, ranking search results, recommending products, identifying spam email, trading stocks, and interpreting medical images.
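As a heavily simplified illustration of the "lots of data" point, here is a minimal supervised deep-learning sketch in PyTorch: a small multilayer network fit to many synthetic labeled examples. The architecture, data, and sizes are arbitrary placeholders, not anything from the article.

```python
import torch
import torch.nn as nn

# Synthetic "labeled data": deep learning typically needs many such examples.
X = torch.randn(10_000, 20)    # 10k examples, 20 features each
y = (X.sum(dim=1) > 0).long()  # arbitrary binary labels for the demo

model = nn.Sequential(         # a tiny "deep" network: two hidden layers
    nn.Linear(20, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```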


Deep-learning artificial intelligence - Can We Open the Black Box of AI? (the plastic brain)

#artificialintelligence

"Sandia National Laboratories researchers are drawing inspiration from neurons in the brain, such as these green fluorescent protein-labeled neurons in a mouse neocortex, with the aim of developing neuro-inspired computing systems to reboot computing. "Summary: Researchers explore neural computing to extend Moore's Law. Sandia explores neural computing to extend Moore's Law. Historically, neural computing has been seen as approximate and fuzzy, he added; however, Sandia researchers in their papers aim to extend neural algorithms so they incorporate rigor and predictability, which shows they may have a role in high performance scientific computing.


Azure is becoming the first AI supercomputer, says Microsoft

ZDNet

You may have thought it was just a cloud computing service, but Microsoft's Azure cloud is on its way to becoming the first artificial intelligence supercomputer, according to the company's CEO, Satya Nadella. This means building cloud processing power based not just on traditional CPU architectures but also on GPUs. On top of this AI-oriented architecture, Nadella said Microsoft is offering higher-level services that help build AI applications, such as APIs for speech, image and object recognition, and natural language processing.
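The higher-level services Nadella describes are typically consumed as web APIs. As a rough sketch of what calling such a hosted recognition service can look like, here is a generic HTTP example; the endpoint URL, header name, query parameters, and response shape are hypothetical placeholders, not Microsoft's actual API.

```python
import requests

# Hypothetical endpoint and API-key header -- placeholders, not Azure's real API.
ENDPOINT = "https://example-cognitive-service/v1/analyze"
headers = {"Api-Key": "YOUR_KEY", "Content-Type": "application/octet-stream"}

with open("photo.jpg", "rb") as f:
    resp = requests.post(ENDPOINT, headers=headers, data=f.read(),
                         params={"features": "objects,description"})
resp.raise_for_status()
print(resp.json())  # e.g. detected objects and a caption, per the hypothetical schema
```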


AI Computing Takes Center Stage at GTC China (NVIDIA Blog)

#artificialintelligence

Kicking off the first in a series of global GPU Technology Conferences, NVIDIA co-founder and CEO Jen-Hsun Huang today at GTC China unveiled technology that will accelerate the deep learning revolution sweeping across industries. On stage he announced the Tesla P4 and P40 GPU accelerators for inference on production AI-service workloads, and a small, energy-efficient AI supercomputer for highway driving, the NVIDIA DRIVE PX 2 for AutoCruise. NVIDIA GPUs powering deep learning neural networks are the key enabling technology for the future development of AI, Huang said. In China specifically, Huang described firms that are using deep learning to provide real-time weather forecasting, eye tracking for human-machine interaction, medical imaging for early detection of disease, product recognition, detection and search, and personal concierge applications.
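The P4 and P40 target inference, meaning running an already-trained network forward on incoming data rather than training it. As a minimal illustration of that workload pattern, here is a PyTorch sketch; the model, batch size, and dimensions are arbitrary stand-ins, not from the announcement.

```python
import torch

# Illustrative only: "inference" means a forward pass through a trained network
# with no gradient bookkeeping -- the workload inference accelerators target.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Sequential(  # stand-in for a real trained model
    torch.nn.Linear(128, 256), torch.nn.ReLU(), torch.nn.Linear(256, 10)
).to(device).eval()

batch = torch.randn(64, 128, device=device)  # a batch of incoming requests
with torch.no_grad():                        # skip autograd for speed and memory
    scores = model(batch)
print(scores.argmax(dim=1)[:5])              # predicted classes
```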


Supermicro® Introduces NVIDIA® Pascal™ GPU-Enabled Server Solutions Featuring NVIDIA Tesla® P100 GPUs

#artificialintelligence

Super Micro Computer, Inc. (SMCI), a global leader in compute, storage, and networking technologies and green computing, today announced the general availability of its SuperServer solutions optimized for NVIDIA Tesla P100 accelerators with the new Pascal GPU architecture. "The new SuperServers deliver superior, energy-efficient performance for compute-intensive data analytics, deep learning and scientific applications while minimizing power consumption." With the convergence of big data analytics, the latest GPU architectures, and improved machine learning algorithms, deep learning applications require the processing power of multiple GPUs, which must communicate efficiently and effectively as the GPU network expands. Supermicro (SMCI), the leading innovator in high-performance, high-efficiency server technology, is a premier provider of advanced server Building Block Solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, HPC and Embedded Systems worldwide.
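As a rough sketch of the multi-GPU pattern the announcement alludes to, here is a minimal data-parallel example in PyTorch, which splits each batch across the available GPUs and gathers the results; the model and sizes are arbitrary, and large HPC deployments would typically use dedicated collective-communication libraries such as NCCL rather than this simple wrapper.

```python
import torch
import torch.nn as nn

# Illustrative model; in data parallelism each GPU processes a slice of the
# batch and results are combined, so inter-GPU communication matters.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)  # replicate across all visible GPUs
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

x = torch.randn(256, 512, device=device)  # the batch is split across GPUs
out = model(x)
print(out.shape)  # (256, 10), gathered back onto the default device
```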


Artificial Intelligence will rely on open models

#artificialintelligence

We are entering the "age of complexity", where computers can optimize processes by crunching massive data sets. It takes a lot of time to label the data needed to train machines, and high-quality labeled data sets are therefore difficult to come by. Skimming through these data sets requires huge computational power. This is why a worldwide computer grid enabling energy-efficient computing is critical to the development of Artificial Intelligence.
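To make the "labeled data" point concrete, here is a minimal supervised-learning sketch with scikit-learn: each training example must be paired with a human-provided label before a model can learn from it. The features, labels, and model are illustrative placeholders.

```python
from sklearn.linear_model import LogisticRegression

# Each example needs a label -- producing these at scale is the slow,
# expensive step the article describes.
features = [[0.9, 0.1], [0.8, 0.3], [0.2, 0.7], [0.1, 0.9]]
labels   = [1, 1, 0, 0]  # e.g. 1 = "fraud", 0 = "legitimate"

clf = LogisticRegression().fit(features, labels)
print(clf.predict([[0.85, 0.2]]))  # -> [1]
```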