Inside Microsoft's AI Comeback

#artificialintelligence

While his fellow deep-learning pioneers Yann LeCun and Geoffrey Hinton have signed on with Facebook and Google, respectively, Bengio, 53, has chosen to keep working from his small third-floor office on the hilltop campus of the University of Montreal. Shum, who is in charge of all of AI and research at Microsoft, has just finished a dress rehearsal for next week's Build developers conference, and he wants to show me demos. Shum has spent the past several years helping his boss, CEO Satya Nadella, make good on his promise to remake Microsoft around artificial intelligence. Bill Gates showed off a mapping technology in 1998, for example, but it never came to market; Google launched Maps in 2005.


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep-learning software to train it for a particular task is much more resource-intensive than running the system afterwards, but even that still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That appetite for computing power also motivates the startups--and giants such as Google--creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").
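The training-versus-inference gap the excerpt describes can be illustrated with a toy model. This is a hypothetical sketch in plain NumPy, not anything from Google's paper: training makes many full passes (forward plus gradient) over the data, while serving a trained model needs only a single forward pass per query.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))               # 1,000 samples, 20 features
true_w = rng.normal(size=20)
y = X @ true_w + 0.1 * rng.normal(size=1000)  # noisy linear targets

w = np.zeros(20)
passes = 0

# Training: hundreds of full passes over the dataset.
for epoch in range(500):
    pred = X @ w                          # forward pass
    grad = X.T @ (pred - y) / len(y)      # backward pass (gradient)
    w -= 0.1 * grad                       # gradient-descent update
    passes += 2                           # one forward + one backward

# Inference: a single forward pass for a new input.
new_x = rng.normal(size=20)
prediction = new_x @ w
passes_inference = 1

print("data passes during training:", passes)
print("data passes during inference:", passes_inference)
```

The 1000-to-1 ratio here is arbitrary, but the asymmetry is the point: the compute bill for training dwarfs the cost of running the finished system once.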


To democratise artificial intelligence, Intel launches educational programme for developers

#artificialintelligence

Reiterating its commitment to boost adoption of artificial intelligence (AI), Intel India today announced a developer community initiative – the AI Developer Education Programme – aimed at educating 15,000 scientists, developers, analysts, and engineers. The programme also covers deep learning and machine learning, the tech major said in a statement. It was announced at the first AI Day, held in Bengaluru, where thought leaders from government, industry, and academia congregated to discuss the potential of accelerating the AI revolution in the country. Under the programme, Intel will run 60 events across the year, ranging from workshops and roadshows to user-group and senior-technology-leader round-tables. Announcing the programme, Intel South Asia managing director Prakash Mallya said that data centres, and the intelligence derived from the data they collect, can enable government and industry to make effective, algorithm-driven decisions.


Intel bets on India to boost artificial intelligence usage

#artificialintelligence

Global chip maker Intel on Tuesday announced a string of initiatives to boost the usage of Artificial Intelligence (AI) in diverse sectors by collaborating with partners and customers across the country. "Our developer education programme will educate 15,000 scientists, developers, analysts and engineers on AI technologies, including Deep Learning and Machine Learning in India," said Intel South Asia Managing Director Prakash Mallya here. AI is software that makes computers and machines act intelligently and faster, with more predictability, than a human mind. AI is also the main workload in data centres, whose capacity has grown in line with Moore's Law of computing power doubling roughly every two years. By 2020, the industry expects more servers to process data analytics than any other workload, with analytics predictors built into every application.


Deep Learning Institute Workshop hosted by Dedicated Computing, NVIDIA and Milwaukee School of Engineering

#artificialintelligence

Dedicated Computing is co-hosting a Deep Learning Institute workshop in collaboration with NVIDIA and Milwaukee School of Engineering (MSOE). The workshop will take place at MSOE on April 13, 2017. Deep learning is a new area of machine learning that seeks to use algorithms, big data, and parallel computing to enable real-world applications and deliver results. Machines are now able to learn at the speed, accuracy, and scale required for true artificial intelligence. This technology is used to improve self-driving cars, aid mega-city planners, and help discover new drugs to cure disease.
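The "algorithms, big data, and parallel computing" the excerpt mentions come together in even the smallest neural network. As a hedged illustration (a minimal NumPy sketch, not the Deep Learning Institute's curriculum), the classic XOR problem shows why depth matters: no single-layer model can separate XOR, but one hidden layer trained by gradient descent can.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR truth table: not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network: 2 inputs -> 16 tanh hidden units -> 1 sigmoid output.
W1 = rng.normal(scale=1.0, size=(2, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=1.0, size=(16, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.3
for step in range(20000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the cross-entropy loss
    d_out = out - y
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # tanh derivative
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.ravel())   # target labels are [0, 1, 1, 0]
```

Real deep-learning workloads scale this same forward/backward loop to millions of parameters, which is where the big data and parallel hardware come in.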


Everything you want to know about how AI is transforming banking but were too afraid to ask

#artificialintelligence

The rise of artificial intelligence (AI) is set to change the way banks and financial services operate, as well as the way consumers approach their personal banking. Powerful AI can replace human labour with machines, improve customer experience, and provide simplified, cost-effective solutions for businesses. AI is the science of creating intelligent machines and computer programs that can take over human tasks. Computer programs can execute tasks far quicker than humans, with embedded algorithms that leave less room for human error. Robert Smith, chairman and CEO of Vista Equity Partners, said at the World Economic Forum: "Since the invention of computers, we have envisioned that computer systems will take the best of what we think and deliver real-time solutions that are more effective and efficient."


The New Intel: How Nvidia Went From Powering Video Games To Revolutionizing Artificial Intelligence

#artificialintelligence

Nvidia cofounder Chris Malachowsky is eating a sausage omelet and sipping burnt coffee in a Denny's off the Berryessa overpass in San Jose. It was in this same dingy diner in April 1993 that three young electrical engineers--Malachowsky, Curtis Priem and Nvidia's current CEO, Jen-Hsun Huang--started a company devoted to making specialized chips that would generate faster and more realistic graphics for video games. East San Jose was a rough part of town back then--the front of the restaurant was pocked with bullet holes from people shooting at parked cop cars--and no one could have guessed that the three men drinking endless cups of coffee were laying the foundation for a company that would define computing in the early 21st century in the same way that Intel did in the 1990s. "There was no market in 1993, but we saw a wave coming," Malachowsky says. "There's a California surfing competition that happens in a five-month window every year."


Where are developers looking next?

@machinelearnbot

As part of the research underpinning Developer Economics we actively monitor industry trends and opportunities, looking for new areas of significant developer interest. In our Developer Economics survey, we investigated trends in Data Science and Machine Learning among other areas of emerging tech; the latter is probably the least-hyped emerging-tech space with the most developer activity. A side effect of there now being a 1990s-level supercomputer in 2–3 billion pockets worldwide is that we're drowning in data. All of the data collected in human history, up to the turn of the millennium, is almost certainly less than we now generate every day. The Internet of Things is adding sensors to anything and everything, which will compound this problem.