Among the many emerging trends in the technology sector, the rise of artificial intelligence (AI) is likely to be one of the most significant over the coming years. AI refers to the ability of machines to perform tasks typically associated with human cognition, such as answering questions, recognizing faces, playing video games or describing objects. In recent years, AI capability has improved to the point that a range of commercial applications is now possible in areas like consumer electronics, industrial automation and online retail. Technology companies of all sizes, in locations all around the world, are developing AI-driven products aimed at reducing operating costs, improving decision-making and enhancing consumer services across a range of client industries. And despite a decline in venture capital funding across industries overall in 2016, AI startups raised a record $5 billion globally last year – a 71% annualized growth rate and a near-tenfold rise over the 2012 level (see EXHIBIT 1).

Artificial Intelligence might soon be judging gymnastics


It's an unlikely pairing, as one of the most artistic sports might soon be judged by computers. Gymnastics has come a long way since Nadia Comaneci scored the first perfect 10 in 1976. It's become faster, more technical, and more competitive. In response, the scoring and judging system has changed too. Judges have had to improve, as the differences between athletes have become smaller and subtler.

For HPC and Deep Learning, GPUs are here to stay - insideHPC


In this special guest feature from Scientific Computing World, David Yip, HPC and Storage Business Development at OCF, provides his take on the place of GPU technology in HPC. There was an interesting story published earlier this week in which NVIDIA's founder and CEO, Jensen Huang, said: 'As advanced parallel-instruction architectures for CPUs can barely be worked out by designers, GPUs will soon replace CPUs'. There are only so many processing cores you can fit on a single CPU chip. Some optimized applications take advantage of multiple cores, but CPUs are typically used for sequential serial processing (although Intel is doing an excellent job of adding more and more cores to its CPUs and getting developers to program multicore systems). By contrast, a GPU has a massively parallel architecture consisting of many thousands of smaller, more efficient cores designed for handling multiple tasks simultaneously.
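The serial-versus-parallel contrast described above can be sketched in plain Python. This is an analogy of the two programming models, not real GPU code: the "serial" function walks elements one at a time the way a single CPU core would, while the "parallel" function applies one operation uniformly across all elements, mimicking the one-thread-per-element style a GPU encourages.

```python
# Illustrative analogy (not actual GPU code): CPU-style serial processing
# versus GPU-style data-parallel processing of the same workload.

def scale_serial(data, factor):
    """CPU-style: a single core visits the elements in order."""
    out = []
    for x in data:  # each step waits for the previous one to finish
        out.append(x * factor)
    return out

def scale_parallel(data, factor):
    """GPU-style: conceptually one lightweight thread per element, all
    executing the same instruction at once (simulated here with map)."""
    return list(map(lambda x: x * factor, data))

values = list(range(8))
assert scale_serial(values, 3) == scale_parallel(values, 3)
```

Both produce identical results; the difference a real GPU exploits is that the second formulation has no ordering dependence between elements, so thousands of cores can each handle one element simultaneously.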

Amazon brings machine learning to "everyday developers" » Banking Technology


Amazon Web Services (AWS) is looking to bring machine learning (ML) to ordinary developers, launching the SageMaker service to simplify building applications, reports Enterprise Cloud News (Banking Technology's sister publication). ML is too complicated for ordinary developers, AWS CEO Andy Jassy said in a keynote at the AWS re:Invent event. "If you want to enable most enterprises and companies to be able to use ML in an expansive way, we have to solve the problem of making it accessible to everyday developers and scientists," he said. Amazon has a long history of ML, Jassy said. "We've been doing ML at Amazon for 20 years," he said.

AI innovation will trigger the robotics network effect


Anyone who has thought about scaling a business or building a network is familiar with a dynamic referred to as the "network effect." The more buyers and sellers who use a marketplace like eBay, for example, the more useful it becomes. Well, the data network effect is a dynamic in which increased use of a service actually improves the service, such as how machine-learning models generally grow more accurate as a result of training from larger and larger volumes of data. Autonomous vehicles and other smart robots rely on sensors that generate increasingly massive volumes of highly varied data. This data is used to build better AI models that robots rely on to make real-time decisions and navigate real-world environments.
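The data network effect described above can be illustrated with a toy statistical sketch: the simplest possible "model" (a sample mean) estimating an underlying quantity from noisy readings, getting more accurate as the volume of data grows. The numbers and names here are hypothetical, chosen only to make the more-data-better-model dynamic concrete.

```python
# Toy sketch of the data network effect: more data, better estimates.
# TRUE_MEAN and the sample sizes are illustrative assumptions.
import random

random.seed(42)
TRUE_MEAN = 0.5  # the quantity the "model" is trying to learn

def estimate(n_samples):
    """Fit the simplest model there is: the mean of noisy sensor readings."""
    readings = [random.uniform(0.0, 1.0) for _ in range(n_samples)]
    return sum(readings) / len(readings)

error_small = abs(estimate(20) - TRUE_MEAN)       # a handful of readings
error_large = abs(estimate(200_000) - TRUE_MEAN)  # a flood of readings
print(f"error with 20 samples: {error_small:.4f}; "
      f"with 200,000 samples: {error_large:.4f}")
```

With 200,000 samples the estimate lands within a fraction of a percent of the true value, which is the same dynamic, writ small, as machine-learning models improving when fleets of robots contribute ever-larger training sets.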

With IBM POWER9, we're all riding the AI wave - IBM Systems Blog: In the Making


There's a big connection between my love for water sports and hardware design -- both involve observing waves and planning several moves ahead. Four years ago, when we started sketching the POWER9 chip from scratch, we saw an upsurge of modern workloads driven by artificial intelligence and massive data sets. We are now ready to ride this new tide of computing with POWER9. It is a transformational architecture and an evolutionary shift from the archaic ways of computing promoted by x86. POWER9 is loaded with industry-leading new technologies designed for AI to thrive.

These Smartphone Companies are Leveraging AI to Stay Ahead of the Game


You're reading Entrepreneur India, an international franchise of Entrepreneur Media. Technology has revolutionized the way our world operates, and the pace of its adoption is accelerating in every industry. For example, smartphones have started integrating virtual assistants to make life easier and save valuable time. Similarly, many companies are leveraging artificial intelligence to deliver the best possible experience by making smartphones smarter.

Australia just landed its first high performance centre for esports


Oceania's esports industry just took a huge step forward. Australia has opened its very first Esports High Performance Centre in Sydney, a new home base for Oceania's leading League of Legends team, the LG Dire Wolves. Established in Sydney's city sporting precinct, set into the side of Allianz Stadium and looking towards the Sydney Cricket Ground, the facility aims to drive growth and development in Australia's esports industry. The facility will be stocked with new technology in eye-tracking and performance analysis, as part of a partnership with the University of Technology Sydney. The Dire Wolves, alongside Australia's leading mixed-gender Counter-Strike team, Supa-Stellar, will train and develop surrounded by some of Sydney's traditional sports teams, also residents of the precinct, including the Sydney Swans, Sydney Sixers, Sydney Roosters, Sydney FC, Cricket NSW, and the NSW Waratahs.

Intel Will Ship First Neural Network Chip This Year


In an editorial posted on Intel's news site, Intel CEO Brian Krzanich announced the company would release its first AI processor before the end of 2017. The new chip, formerly codenamed "Lake Crest," will be officially known as the Nervana Neural Network Processor, or NNP for short. As its name implies, the chip will use technology from Nervana, an AI startup Intel acquired for more than $350 million last year. Unlike GPUs or FPGAs, the NNP is a custom-built coprocessor aimed specifically at deep learning, that is, at processing the neural networks upon which these applications are based. In that sense, Intel's NNP is much like Google's Tensor Processing Unit (TPU), a custom-built chip the search giant developed to handle much of its own deep learning work.

Storage will continue to play a role in the advancement of AI: Pure Storage


Storage is an important component underpinning artificial intelligence (AI) and other emerging technologies with similar infrastructure demands, according to Robert Lee, VP and chief architect at Pure Storage, and therefore needs to be included in discussions about such technologies. Lee told ZDNet that significant advancements in technology -- particularly around parallelisation, compute, and networking -- enable new algorithms to apply more compute power against data. "Historically, the limit to how much data has been able to be processed, the limit to how much insight we've been able to garner from data, has been bottlenecked by storage's ability to keep the compute fed," said Lee, who worked at Oracle before joining Pure Storage in 2013. "Somewhere around the early 2000s, the hardware part of compute, CPUs, started getting more parallel. It started doing multi-socket architectures, hyper-threading, multi-core."
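The "keep the compute fed" point above can be made concrete with a small prefetching sketch: instead of reading a chunk from storage, computing on it, and only then starting the next read, a pipeline fetches the next chunk while the current one is being processed. Everything here is a simulation under stated assumptions -- `read_chunk` fakes a storage read with a sleep, and the function names are illustrative, not any vendor's API.

```python
# Minimal sketch of overlapping storage I/O with compute, so the
# processor isn't left idle waiting on storage. Simulated I/O only.
import time
from concurrent.futures import ThreadPoolExecutor

def read_chunk(i):
    """Pretend storage read: returns the i-th chunk after a short delay."""
    time.sleep(0.05)
    return list(range(i * 4, i * 4 + 4))

def compute(chunk):
    """Stand-in for the real work done on each chunk of data."""
    return sum(x * x for x in chunk)

def serial(n):
    """Read a chunk, compute on it, repeat: compute stalls on every read."""
    return [compute(read_chunk(i)) for i in range(n)]

def pipelined(n):
    """Prefetch chunk i+1 in the background while computing on chunk i."""
    results = []
    with ThreadPoolExecutor(max_workers=2) as pool:
        nxt = pool.submit(read_chunk, 0)  # kick off the first read
        for i in range(n):
            chunk = nxt.result()          # wait only if the read isn't done
            if i + 1 < n:
                nxt = pool.submit(read_chunk, i + 1)  # fetch ahead
            results.append(compute(chunk))
    return results

assert serial(4) == pipelined(4)  # same answers, less waiting on storage
```

The two paths produce identical results; the pipelined version simply hides storage latency behind computation, which is the role faster, parallel storage plays at data-centre scale.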