
Inside Microsoft's AI Comeback

#artificialintelligence

But while his peer scientists Yann LeCun and Geoffrey Hinton have signed on to Facebook and Google, respectively, Bengio, 53, has chosen to continue working from his small third-floor office on the hilltop campus of the University of Montreal. Shum, who is in charge of all of AI and research at Microsoft, has just finished a dress rehearsal for next week's Build developers conference, and he wants to show me demos. Shum has spent the past several years helping his boss, CEO Satya Nadella, make good on his promise to remake Microsoft around artificial intelligence. Bill Gates showed off a mapping technology in 1998, for example, but it never came to market; Google launched Maps in 2005.


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep learning software to train it for a particular task is much more resource intensive than running the system afterwards, but inference still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That slowdown also motivates the startups--and giants such as Google--creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").


Where are developers looking next?

@machinelearnbot

As part of the research underpinning Developer Economics we actively monitor industry trends and opportunities, looking for new areas of significant developer interest. In our Developer Economics survey, we investigated trends in Data Science and Machine Learning among other areas of emerging tech, the latter probably being the least hyped emerging-tech space with the most developer activity. A side effect of there now being a 1990s-level supercomputer in 2-3 billion pockets worldwide is that we're drowning in data. All of the data collected in human history, up to the turn of the millennium, is almost certainly less than we now generate every day. The Internet of Things is adding sensors to anything and everything, which will compound this problem.


The New Intel: How Nvidia Went From Powering Video Games To Revolutionizing Artificial Intelligence

#artificialintelligence

Nvidia cofounder Chris Malachowsky is eating a sausage omelet and sipping burnt coffee in a Denny's off the Berryessa overpass in San Jose. It was in this same dingy diner in April 1993 that three young electrical engineers--Malachowsky, Curtis Priem and Nvidia's current CEO, Jen-Hsun Huang--started a company devoted to making specialized chips that would generate faster and more realistic graphics for video games. East San Jose was a rough part of town back then--the front of the restaurant was pocked with bullet holes from people shooting at parked cop cars--and no one could have guessed that the three men drinking endless cups of coffee were laying the foundation for a company that would define computing in the early 21st century in the same way that Intel did in the 1990s. "There was no market in 1993, but we saw a wave coming," Malachowsky says. "There's a California surfing competition that happens in a five-month window every year.


Will AI built by a 'sea of dudes' understand women? AI's inclusivity problem

#artificialintelligence

Only 26 percent of computer professionals were women in 2013, according to a recent review by the American Association of University Women. That figure is down nine percentage points from 1990. Some say the industry is masculine by design. Others claim computer culture is unwelcoming -- even hostile -- to women. So, while STEM fields like biology, chemistry, and engineering see increasing diversity, computing does not.


Call for Papers @CloudExpo #BigData #IoT #AI #DevOps #FinTech #Blockchain

#artificialintelligence

The 20th International Cloud Expo has announced that its Call for Papers is open. Cloud Expo, to be held June 6-8, 2017, at the Javits Center in New York City, brings together Cloud Computing, Big Data, Internet of Things, DevOps, Digital Transformation, Machine Learning and WebRTC in one location. Sponsor benefits include:

- Featured on-site presentation and ongoing on-demand webcast exposure to a captive audience of industry decision-makers
- Showcase exhibition during new extended dedicated expo hours
- Priority breakout-session scheduling for sponsors guaranteed a 35-minute technical session
- Online targeted advertising in SYS-CON's i-Technology Publications
- Comprehensive marketing efforts leading up to the show, with print mailings, e-newsletters and extensive online media coverage
- Editorial coverage on ITweetup to more than 100,000 followers, plus press releases sent on major wire services to over 500 industry analysts

All major researchers estimate there will be tens of billions of devices - computers, smartphones, tablets, and sensors - connected to the Internet by 2020. This number will continue to grow at a rapid pace for the next several decades. With major technology companies and startups seriously embracing Cloud strategies, now is the perfect time to attend @CloudExpo @ThingsExpo, June 6-8, 2017, at the Javits Center in New York City, NY, and October 31 - November 2, 2017, at the Santa Clara Convention Center, CA.


The New Intel: How Nvidia Went From Powering Video Games To Revolutionizing Artificial Intelligence

#artificialintelligence

It was in this same dingy diner in April 1993 that three young electrical engineers--Malachowsky, Curtis Priem and Nvidia's current CEO, Jen-Hsun Huang--started a company devoted to making specialized chips that would generate faster and more realistic graphics for video games. "We've been investing in a lot of startups applying deep learning to many areas, and every single one effectively comes in building on Nvidia's platform," says Marc Andreessen of venture capital firm Andreessen Horowitz. Starting in 2006, Nvidia released a programming toolkit called CUDA that allowed coders to easily program each individual pixel on a screen. From his bedroom, Alex Krizhevsky had plugged 1.2 million images into a deep learning neural network powered by two Nvidia GeForce gaming cards.