Accenture Combines AI And IoT To Build Technology That Serves Humans

#artificialintelligence

Global technology major Accenture is looking to use the combined power of the best technologies in the world to build solutions that are unique to its clients. "Accenture is like a glue organisation," says Marc Carrel-Billiard, global managing director for Technology R&D at Accenture. Accenture's technology R&D division is working to combine different technologies, with artificial intelligence (AI) and the Internet of Things (IoT) at the core. Its engineers are applying deep learning and machine learning to interpret the large amounts of data generated by the 20 billion connected devices in use today. "So, if you combine all the data coming from the sensors with all the computing power, machine learning... we have the best of both worlds."
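
To make the idea concrete, here is a minimal, hypothetical sketch of pairing sensor streams with machine learning: a generic anomaly detector (scikit-learn's IsolationForest) flags unusual readings from simulated devices. The data, model choice, and parameters are all invented for illustration and are not Accenture's actual pipeline.

# Hypothetical sketch: flagging anomalies in simulated IoT sensor readings
# with a generic machine-learning model. Not Accenture's actual system.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulate [temperature, vibration] readings from many connected devices,
# with a small number of faulty devices mixed in.
normal = rng.normal(loc=[70.0, 0.5], scale=[2.0, 0.1], size=(990, 2))
faulty = rng.normal(loc=[95.0, 2.0], scale=[3.0, 0.3], size=(10, 2))
readings = np.vstack([normal, faulty])

# Fit an unsupervised anomaly detector on the combined sensor stream.
model = IsolationForest(n_estimators=100, contamination=0.01, random_state=0)
flags = model.fit_predict(readings)  # -1 marks suspected anomalies

print(f"Flagged {np.sum(flags == -1)} of {len(readings)} readings as anomalous")

In practice the detector would run continuously over incoming device telemetry rather than a fixed batch, but the combination is the same: sensor data plus computing power plus a learned model.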


The Business Of HPC Is Evolving

#artificialintelligence

With most of the year behind us, a new one coming up fast, and a slew of new compute and networking technologies that have been ramping for the past year, with more on the horizon for a very exciting 2017, now is the natural time to take stock of what has happened in the HPC business and what is expected to happen in the coming years. The theme of the SC16 supercomputing conference this year is that HPC matters, and of course we have all known this since the first such machines became distinct from enterprise-class electronic computers back in the 1960s. HPC not only matters; the growing consensus at the conference is that HPC may be returning to its roots as a sector for innovation and specialization aimed at solving very specific, computationally intensive, and complex problems. We could be seeing the waning of the Era of General Purpose HPC even as the simulation, modeling, analytics, and machine learning workloads that comprise modern HPC continue to evolve at a rapid pace. It takes money to make HPC happen, and HPC also makes money happen; it is supposed to be a virtuous cycle in which more innovation in HPC systems drives more innovation in product design and in various kinds of simulation, such as weather forecasting, particle physics, and cosmology, to name a few.


Beyond the Cloud: When Industrial Data Centers Become Intelligent

#artificialintelligence

There are certain inflection points, transitions, and cyclical patterns in technology. Mainframe to personal computer, desktop to mobile, mobile to wearables: each shift pushes the center of computing back and forth. With each of these inflections come concurrent power shifts: IBM to Microsoft, Microsoft to Apple, Google to Amazon, and now, most recently, Apple to GE. "Technology innovation is table stakes for the internet of things transformation," GE's Chief Commercial Officer Kate Johnson said. "We will need new, innovative ways to ingest and analyze mass quantities of data. Many companies are rising to this challenge; we are pleased with the plethora of new technology that is available." What this means for everyone, from the consumer to the enterprise, is that soon every traffic light, washing machine, toaster, and train engine will be monitored, controlled, and built better on a daily basis. Over the past several years, in anticipation of a huge shift, GE has been patiently building products and a data processing platform to consume and process everything on the planet.


SAPPHIRE 2016: Machine learning and moving to the next level of computing

#artificialintelligence

In this context, the business application journey is not yet finished. On this subject, SAP believes in using machine learning technology to support business processes and achieve true "work liberation". The idea is to have the machine learn from input data and suggest the decision to make based on what is happening. SAP has expressed its commitment to supporting customers on their journey to obtain the best assets for real insight, through the pillars of innovation and empathy.
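
To illustrate that idea, here is a minimal, hypothetical sketch (not SAP's actual software): a generic classifier is trained on invented historical decisions and then suggests a decision for a new case. All feature names, data, and labels are assumptions made for the example.

# Hypothetical sketch of "learning from input data to suggest a decision":
# a classifier trained on invented invoice history recommends approve/review.
# This is a generic example, not SAP's product or API.
from sklearn.tree import DecisionTreeClassifier

# Invented history: [amount_usd, known_vendor (1/0)] -> past human decision.
X_history = [
    [120, 1], [450, 1], [9800, 0], [75, 1],
    [15000, 0], [300, 1], [8700, 1], [22000, 0],
]
y_history = ["approve", "approve", "review", "approve",
             "review", "approve", "approve", "review"]

model = DecisionTreeClassifier(random_state=0).fit(X_history, y_history)

# Suggest a decision for a new incoming invoice.
new_invoice = [[11000, 0]]  # large amount, unknown vendor
print("Suggested decision:", model.predict(new_invoice)[0])

The point of the sketch is the workflow, not the model: the machine observes what decisions were made in past situations and proposes one for the situation at hand, freeing people from routine judgments.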


After Moore's Law: Predicting The Future Beyond Silicon Chips

#artificialintelligence

For decades, the principle guiding much of the innovation in computing has been Moore's law: a prediction, made by Intel co-founder Gordon Moore, that the number of transistors on a microprocessor chip would double every two years or so. And we can use (it) for problems that today are very expensive to execute on modern computers, things like image recognition or voice recognition, things that today take an amazing amount of compute power to do well. "Roadmapping" Moore's law has really driven the industry in terms of making faster and smaller transistors. These domains are, for example, weather prediction, or what we call big data analytics, which is what Google does, or machine learning for recognition of voice or images, ... a lot of high-performance computing simulations, such as thermal process evolution simulations.