Is Artificial Intelligence Finally Coming into Its Own?

#artificialintelligence

In March, Google bought a startup cofounded by Geoffrey Hinton, a University of Toronto computer science professor who was part of the team that won the Merck contest. Extending deep learning into applications beyond speech and image recognition will require more conceptual and software breakthroughs, not to mention many more advances in processing power. Programmers would train a neural network to detect an object or phoneme by blitzing the network with digitized versions of images containing those objects or sound waves containing those phonemes. A team led by Stanford computer science professor Andrew Ng and Google Fellow Jeff Dean showed one such system images from 10 million randomly selected YouTube videos.
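For readers who want the mechanics, here is a minimal sketch of the supervised training loop described above: show the network many labeled examples and nudge its weights toward fewer mistakes. It uses a single-layer detector and synthetic data standing in for real images; all names and sizes are illustrative, not taken from the systems in the article.

```python
import numpy as np

# Synthetic stand-in for a labeled image dataset: 1000 tiny "images"
# of 64 pixels each, labeled 1 if the "object" is present, else 0.
rng = np.random.default_rng(0)
n, d = 1000, 64
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = (X @ true_w > 0).astype(float)

# A single-layer detector trained by gradient descent on cross-entropy.
w, b = np.zeros(d), 0.0
lr = 0.1
for epoch in range(200):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # sigmoid "object present" score
    grad_w = X.T @ (p - y) / n           # gradient of the loss w.r.t. weights
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

p = 1 / (1 + np.exp(-(X @ w + b)))
print(f"training accuracy: {np.mean((p > 0.5) == y):.2%}")
```

The deep networks in the article stack many such layers, but the core loop -- present examples, measure error, adjust weights -- is the same.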


Sundar Pichai Sees Google's Future in the Smartest Cloud

#artificialintelligence

In other words, he hopes the new chip and the new service will set Google's cloud business apart from services offered by its main rivals, including Amazon and Microsoft, the unnamed competitive threats underlying his I/O keynote. Between its two AI labs--Google Brain, based at company headquarters in Silicon Valley, and DeepMind, a London AI startup Google purchased a little more than three years ago--Google is leading the new wave of artificial intelligence research and development that is so rapidly changing entire industries and economies. But the company believes cloud computing--where computing power is rented over the internet to businesses and software developers--could one day bring in far more. Google built its new chip as a better way of serving its own AI services, most notably Google Translate, says Jeff Dean, the uber-engineer who oversees Google Brain.


The New Intel: How Nvidia Went From Powering Video Games To Revolutionizing Artificial Intelligence

#artificialintelligence

It was in a dingy diner in April 1993 that three young electrical engineers--Chris Malachowsky, Curtis Priem and Nvidia's current CEO, Jen-Hsun Huang--started a company devoted to making specialized chips that would generate faster and more realistic graphics for video games. "We've been investing in a lot of startups applying deep learning to many areas, and every single one effectively comes in building on Nvidia's platform," says Marc Andreessen of venture capital firm Andreessen Horowitz. Starting in 2006, Nvidia released a programming tool kit called CUDA that let coders use its graphics chips' many parallel processors for general-purpose computation, not just rendering pixels. From his bedroom, Alex Krizhevsky had plugged 1.2 million images into a deep learning neural network powered by two Nvidia GeForce gaming cards.


WTF is machine learning?

#artificialintelligence

Though often poorly understood, neural networks, deep learning, and reinforcement learning are all forms of machine learning. Each layer of a deep learning model lets the computer identify another level of abstraction of the same object. Reinforcement learning takes ideas from game theory and includes a mechanism to assist learning through rewards. Because it is hard to explain how a trained model arrives at its decisions, researchers refer to this challenge as the black box problem of machine learning.
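To make the reward mechanism concrete, here is a toy Q-learning sketch, a standard reinforcement learning algorithm. The corridor environment and all constants are invented for illustration.

```python
import numpy as np

# An agent walks a 1-D corridor of 5 states and learns, from reward
# alone, that moving right (toward the goal) is the better action.
rng = np.random.default_rng(0)
n_states, n_actions = 5, 2           # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))  # learned value of each (state, action)
alpha, gamma, eps = 0.5, 0.9, 0.1    # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != n_states - 1:                      # goal is the last state
        # Occasionally explore a random action; otherwise exploit Q.
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s_next == n_states - 1 else 0.0   # reward only at the goal
        # Q-learning update: move estimate toward reward + discounted future.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

print(Q.round(2))   # higher values in the "right" column show the learned policy
```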


Alibaba to supply AI and data tech to Chinese deep space exploration and smart city projects

#artificialintelligence

Alibaba will be among 13 businesses working with the Hangzhou government on a 'brain' for the city, and will work with the National Astronomical Observatory of China (NAOC) on deep space exploration projects, it announced at its annual Computing Conference this week. According to the retail and cloud computing giant, it will supply a range of its technology services, such as AI, deep learning and data storage. The B2B technology supply side of the Alibaba business is growing fast and puts it in direct competition with Amazon on a global playing field. The Hangzhou City Brain project is a new government initiative to address urban problems such as traffic congestion. It will use Alibaba Cloud's AI program "ET" and big data analytics capabilities to perform real-time traffic prediction using its video and image recognition technologies.


The Rise of Deep Learning

#artificialintelligence

Deep learning is increasingly used throughout the world of technology, and there are now endless blogs, books, courses and other resources available for anyone who wants to learn it. If that's still not quite good enough for you and you don't want to implement deep learning yourself, there are now several machine learning API services that will do it for you. But where has this rise in deep learning stemmed from, you may be wondering? Big companies use deep learning techniques in various practices throughout their businesses. They generate lots of data, and this data is hugely valuable to them: learning from it ultimately leads to increased revenue.


Google Artificial Intelligence Guru Says A.I. Won't Eliminate Jobs

#artificialintelligence

Computers can more easily recognize cats in photos and translate text because of advances in artificial intelligence. Mustafa Suleyman, co-founder of artificial intelligence startup DeepMind, later acquired by Google, said on Monday that he has seen no evidence that advances in A.I. technologies are impacting the workforce. Nevertheless, it's something that people "should definitely pay attention to" as the technologies continue to mature. Suleyman predicted that humanity is still "many decades away from encountering that sort of labor replacement at scale." Instead, he said, the technology is best used to help humans with work-related tasks rather than to replace them outright.


Self-learning computer tackles problems beyond the reach of previous systems

#artificialintelligence

Experimental tests have shown that the new system, which is based on the artificial intelligence algorithm known as "reservoir computing," not only outperforms experimental reservoir computers that do not use the new algorithm on difficult computing tasks, but can also tackle tasks considered beyond the reach of traditional reservoir computing. The results highlight the potential advantages of self-learning hardware for performing complex tasks, and also support the possibility that self-learning systems--with their potential for high energy efficiency and ultrafast speeds--may provide an extension to the anticipated end of Moore's law. The researchers, Michiel Hermans, Piotr Antonik, Marc Haelterman, and Serge Massar at the Université Libre de Bruxelles in Brussels, Belgium, have published a paper on the self-learning hardware in a recent issue of Physical Review Letters. "On the one hand, over the past decade there has been remarkable progress in artificial intelligence, such as spectacular advances in image recognition, and a computer beating the human Go world champion for the first time, and this progress is largely based on the use of error backpropagation," Antonik told Phys.org. "On the other hand, there is growing interest, both in academia and industry (for example, by IBM and Hewlett Packard), in analog, brain-inspired computing as a possible route to circumvent the end of Moore's law."
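For intuition about what "reservoir computing" means, here is a minimal echo state network sketch: a fixed random recurrent reservoir is driven by an input signal, and only a linear readout is trained. This is the conventional software formulation, not the self-learning analog hardware in the paper, and all sizes and constants are illustrative assumptions.

```python
import numpy as np

# Task: predict the next value of a sine wave from the reservoir state.
rng = np.random.default_rng(0)
n_res, T = 200, 1000
u = np.sin(0.1 * np.arange(T + 1))               # input signal

# Fixed random input weights and recurrent weights (never trained).
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 for stability

# Drive the reservoir and record its states.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])             # nonlinear reservoir update
    states[t] = x

# Train only the linear readout (ridge regression): state -> next input.
targets = u[1:T + 1]
ridge = 1e-6 * np.eye(n_res)
W_out = np.linalg.solve(states.T @ states + ridge, states.T @ targets)
pred = states @ W_out
print(f"readout MSE: {np.mean((pred - targets) ** 2):.2e}")
```

Because only the readout is trained, the heavy recurrent part can in principle be any fixed physical system, which is what makes the approach attractive for analog hardware.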


The Conundrum of Machine Learning and Cognitive Biases

#artificialintelligence

Machine learning is on the rise due to the technological convergence of the growth of big data, decreasing data storage costs, increasing computing power, improved artificial intelligence algorithms and the acceleration of cloud computing. Machine learning is the ability of computers to learn without explicit programming. It's analogous to the human ability to identify an octopus based on the set of data inputs that go to the brain, such as eight arms, tentacles, lack of skeleton and other characteristics, without having prior knowledge of every type of cephalopod mollusk in existence. However, human decision-making is subject to numerous cognitive biases that can easily distort judgment. For example, iconoclastic author Tom Peters highlights 159 cognitive biases that impact management decision-making.
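To illustrate "learning without explicit programming," here is a tiny sketch using scikit-learn: instead of hand-coding rules for recognizing an octopus, a decision tree infers them from labeled feature data. The features and animals are made up for illustration.

```python
from sklearn.tree import DecisionTreeClassifier

# Each row is an animal: [number of arms, has tentacles?, has skeleton?].
X = [
    [8, 1, 0],   # octopus
    [8, 1, 0],   # another octopus
    [0, 0, 1],   # fish
    [4, 0, 1],   # cat
    [2, 0, 1],   # bird
]
y = [1, 1, 0, 0, 0]          # 1 = octopus, 0 = not an octopus

# No rules are written by hand; the tree derives them from the examples.
clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[8, 1, 0]]))   # -> [1]: learned, not explicitly programmed
```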


Artificial intelligence and the future of design

#artificialintelligence

As Augustus De Morgan put it in 1852: "If a figure be anyhow divided and the compartments differently coloured so that figures with any portion of common boundary line are differently coloured--four colours may be wanted, but not more--the following is the case in which four colours are wanted. Query cannot a necessity for five or more be invented." That is, you'll never need more than four colors on an ordinary two-dimensional map in order to color every country differently from the countries adjoining it. A proof for the four-color conjecture evaded mathematicians until 1976, when Kenneth Appel and Wolfgang Haken announced a solution.
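The theorem is easy to play with in code. Here is a small backtracking sketch that four-colors a made-up map, with countries as nodes and shared borders as edges; the map and color names are illustrative.

```python
# Adjacency map of an invented four-country map: an edge means the two
# countries share a border and therefore must receive different colors.
borders = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"A", "C"},
}

def four_color(countries, coloring=None):
    """Assign one of four colors to every country via backtracking."""
    coloring = coloring or {}
    if len(coloring) == len(countries):
        return coloring
    country = next(c for c in countries if c not in coloring)
    for color in ("red", "green", "blue", "yellow"):
        # A color is legal if no already-colored neighbor uses it.
        if all(coloring.get(nb) != color for nb in borders[country]):
            result = four_color(countries, {**coloring, country: color})
            if result:
                return result
    return None   # never reached for a planar map, per the theorem

print(four_color(list(borders)))
```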