Smart Machines Need Smart Silicon

#artificialintelligence

It seems like even the biggest hyperscale platform developers who have long touted software-defined architectures as the key to computing nirvana are starting to learn a cardinal rule of infrastructure: No matter how much you try to abstract it, basic hardware still matters. A key example of this is Google's Tensor Processing Unit (TPU), which the company designed specifically for machine learning and other demanding workloads that were starting to push the limits of available CPUs and GPUs. In fact, the company says that without the TPU, it was looking at doubling its data center footprint in order to support applications like voice recognition and image search. The TPU is custom-designed to work with the TensorFlow software library, generating results 15 to 30 times faster than state-of-the-art Intel Haswell or Nvidia K80 devices. This may seem like a harbinger of bad times ahead for Intel and Nvidia, but the broader picture is a bit more muddled.


Unlocking the True Value of Finance as a Business Partner – Share Talk

#artificialintelligence

How can finance become a better business partner by utilizing emerging technologies? Here are seven recommendations on how to unlock finance's potential. Over the last couple of years, companies have started to prepare for the 2020s and beyond, constantly responding to their rapidly changing environment. These changes are powered by emerging technologies, macroeconomic trends, consumer expectations and business models. Until recently, developments have been traditional and linear, following an incremental pace.


Don't fall for the AI hype: Here are the ingredients you need to build an actual useful thing

#artificialintelligence

Artificial intelligence these days is sold as if it were a magic trick. Data is fed into a neural net – or black box – as a stream of jumbled numbers, and voilà! It comes out the other side completely transformed, like a rabbit pulled from a hat. That's possible in a lab, or even on a personal dev machine, with carefully cleaned and tuned data. However, it takes a lot, an awful lot, of effort to scale machine-learning algorithms up to something resembling a multiuser service – something useful, in other words.


Flipboard on Flipboard

#artificialintelligence

The global energy industry is facing disruption as it transitions from fossil fuels to renewables (and occasionally back again). Its challenges include balancing growing demand in developing nations with the need for sustainability, and predicting the effect of extreme weather conditions on supply and demand. Against this backdrop, GE Power – whose turbines and generators supply 30 per cent of the world's electricity – has been working on applying Big Data, machine learning and Internet of Things (IoT) technology to build an "internet of power" to replace the linear, one-way traditional model of energy delivery. Ganesh Bell, the first and current Chief Data Officer at GE Power, tells me: "The biggest opportunity is that, if you think about it, the electricity industry is still following a one-hundred-year-old model which our founder, Edison, helped to proliferate. It's the generation of electrons in one source which are then transmitted in a one-way linear model."


EMA Analyst's Corner

#artificialintelligence

While containers are still an important topic in enterprise IT today, there are two other trends that are stealing the show: serverless computing and artificial intelligence-driven cloud operations – let's call this concept "Cloud AI," and here is why: Your CFO is telling you that "we have to get out of the business of operating servers." Containerization of applications will not get you there, as managing container frameworks comes with significant management demands. This means that you will have to manage containers alongside your existing bare metal and virtual data center infrastructure. Containers, of course, come with specific requirements in terms of security, performance and availability management and monitoring. To add insult to injury, there is a good amount of uncertainty in terms of portability of your container environment from one cloud to another.


A 4-Step Innovation Framework for CIOs

#artificialintelligence

How to transform from a CIO to a CIO. Now that I have your attention: this is not a typo – I did mean from CIO to CIO. CIOs are good at transforming the business but not so good at transforming themselves. CIOs need to transform from Chief Information Officer to Chief Influence Officer and eventually to Chief Innovation Officer.


New Relic Previews Artificial Intelligence Technology: Project Seymour – Military Technologies

#artificialintelligence

SAN FRANCISCO–(BUSINESS WIRE)– NEW RELIC FUTURESTACK – Digital intelligence leader New Relic, Inc. (NYSE:NEWR) today shared a preview of its artificial intelligence (AI) technology, code-named "Project Seymour," at the company's fourth annual FutureStack event in San Francisco. Project Seymour is designed to deliver advanced AI and machine learning capabilities to help companies uncover the most interesting, most relevant, and most actionable insights to improve their customer experience, and the performance and availability of their digital initiatives. "Our customers have increasingly complex systems and often struggle to understand all of the facets of what's going on in their customer experience, in their applications, and in their infrastructure. Seymour is another manifestation of New Relic's continued obsession to make it easy for our customers to understand everything going on in their digital business," said Lew Cirne, CEO and founder, New Relic. "New Relic has a unique opportunity to leverage the power of AI because our cloud-based platform already analyzes billions of metrics for our customers every day.


The AI Takeover Is Coming. Let's Embrace It.

#artificialintelligence

On Tuesday, the White House released a chilling report on AI and the economy. It began by positing that "it is to be expected that machines will continue to reach and exceed human performance on more and more tasks," and it warned of massive job losses. Yet to counter this threat, the government made a recommendation that may sound absurd: we have to increase investment in AI. The risk to productivity and the US's competitive advantage is too high to do anything but double down on it. This approach not only makes sense, but is also the only approach that makes sense.


The AI Takeover Is Coming. Let's Embrace It.

#artificialintelligence

A few weeks ago, Google computer scientists working with medical researchers reported an algorithm that can detect diabetic retinopathy in images of the eye as well as an ophthalmologist can. The tool of choice in the aforementioned examples of successful AIs is deep learning: the artificial intelligence technique that's been rivaling habaneros in blistering hotness. With deep learning, there's essentially one algorithm (with many minor variants) that can adjust its own structure to solve a problem, directly from whatever massively large data set you feed it. In September, Google announced an enormous upgrade in the performance of Google Translate, using a system it's calling Google Neural Machine Translation (GNMT).
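The "one algorithm" idea above can be made concrete with a toy sketch. This is not Google's code or any specific GNMT detail – just a minimal, self-contained illustration of the common recipe deep learning variants share: a layered network whose weights are adjusted by gradient descent on training data. Here it learns XOR, a problem no single linear layer can solve; swap in a different data set and the same loop applies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: XOR. The "algorithm" never changes - only the data does.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A small two-layer network with sigmoid activations.
W1 = rng.normal(0.0, 1.0, (2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 1.0, (8, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass: compute the network's current predictions.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of mean squared error w.r.t. each weight.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent update - the shared core of every deep-learning variant.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

mse = float(np.mean((out - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

Real systems like GNMT use far deeper networks, different layer types, and vastly more data, but the structure – forward pass, loss, gradients, update – is the same loop scaled up.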