The Most Common Misconceptions About Artificial Intelligence – IAM Network

#artificialintelligence

In a world where big data, automation, and neural networks have become everyday parlance, misconceptions about artificial intelligence and the processes behind it are spreading like wildfire. Naturally, the vast and unprecedented potential applications of AI tend to generate a lot of buzz, particularly where the economy is concerned. However, all too often people mischaracterize or misunderstand what AI is all about, which only undermines its potential as a liberating technology. Let's clear up the most common AI misconceptions in order to reach a more grounded understanding of this emerging technology and its potential use cases. Yes, automation is making a number of low-skilled jobs redundant, but this trend has been significantly overblown in recent years. In addition, most scientific estimates suggest that AI-driven automation will likely create more jobs than it displaces.


Using Continuous Machine Learning to Run Your ML Pipeline

#artificialintelligence

CI/CD is a key concept that is becoming increasingly popular and widely adopted in the software industry. Incorporating continuous integration and deployment into a software project that doesn't contain a machine learning component is fairly straightforward: the stages of the pipeline are somewhat standard, and the CI/CD pipeline is unlikely to change much over the course of development. When the project involves a machine learning component, however, this may not hold. Unlike traditional software development, building a pipeline for a machine learning component may involve many changes over time, mostly in response to observations made during past iterations of development. For ML projects, therefore, notebooks are widely used to get started, and once a stable foundation (base code for the different stages of the ML pipeline) is available to build upon, the code is pushed to a version control system and the pipeline is migrated to a CI/CD tool such as Jenkins or TravisCI.
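A minimal sketch of what such pipeline stages might look like as plain Python functions that a CI job (a Jenkins or TravisCI build step, say) could invoke in sequence; the stage names, the tiny in-memory dataset, and the threshold "model" are illustrative assumptions, not details from the article:

```python
# Toy ML pipeline split into the stages a CI/CD tool would run as steps.

def ingest():
    # Stand-in for a data pull; real pipelines would read from storage.
    return [(0.0, 0), (1.0, 1), (2.0, 1), (3.0, 0)]

def validate(rows):
    # Fail the build early on malformed data.
    assert all(len(r) == 2 for r in rows), "each row needs (feature, label)"
    return rows

def train(rows):
    # Trivial "model": classify by thresholding at the mean feature value.
    threshold = sum(x for x, _ in rows) / len(rows)
    return {"threshold": threshold}

def evaluate(model, rows):
    preds = [1 if x >= model["threshold"] else 0 for x, _ in rows]
    return sum(p == y for p, (_, y) in zip(preds, rows)) / len(rows)

def run_pipeline():
    rows = validate(ingest())
    model = train(rows)
    return evaluate(model, rows)

if __name__ == "__main__":
    print(f"pipeline accuracy: {run_pipeline():.2f}")
```

A CI tool would typically run each stage as its own step and fail the build if `evaluate` drops below an agreed quality gate; as the article notes, the stages themselves tend to be rewritten between iterations far more often than in conventional software pipelines.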


Machine & deep learning in mobile video game AI development

#artificialintelligence

Machine learning is advancing at a promising rate, but applying it to games remains challenging: supporting game design and development, and personalizing the gaming experience based on data collected about a player's behavior, are hard problems. AI developers are trying to use AI to make games look and feel more realistic, so that players can interact naturally with other players and the environment, while still allowing the developer to achieve the intended player experience. The motive is to enhance an individual player's experience during the game, and even after it.


IBM To Acquire WDG Automation To Advance AI-Infused Automation Capabilities For Enterprises - Liwaiwai

#artificialintelligence

IBM announced it has reached a definitive agreement to acquire Brazilian software provider of robotic process automation (RPA) WDG Soluções Em Sistemas E Automação De Processos LTDA (referred to as "WDG Automation" throughout). The acquisition further advances IBM's comprehensive AI-infused automation capabilities, spanning business processes to IT operations. Financial terms were not disclosed. In today's digital era, companies are looking for ways to create new business models, deliver new services, and lower costs. The need to drive this transformation is even greater now given the uncertainties of COVID-19.


Strong AI Versus Weak AI Is Completely Misunderstood, Including For AI Self-Driving Cars

#artificialintelligence

Or, if you prefer, you can state it as weak versus strong AI (it's okay to list them in either order). If you've been reading about AI in the popular press, the odds are that you've seen references to so-called strong AI and so-called weak AI, and the odds are, further, that both of those phrases have been used wrongly, leaving misleading and confounding impressions. Time to set the record straight. First, let's consider what is being incorrectly stated. Some speak of weak AI as though it is AI that is wimpy and not up to the same capabilities as strong AI, including that weak AI is decidedly slower, or much less optimized, or otherwise inevitably and unarguably feebler in its AI capacities.


Artie releases tool to measure bias in speech recognition models

#artificialintelligence

Artie, a startup developing a platform for mobile games on social media, today released a data set and tool for detecting demographic bias in voice apps. The Artie Bias Corpus (ABC), which consists of audio files along with their transcriptions, aims to diagnose and mitigate the impact of factors like age, gender, and accent in voice recognition systems. Speech recognition has come a long way since IBM's Shoebox machine and Worlds of Wonder's Julie doll. But despite progress made possible by AI, voice recognition systems today are at best imperfect -- and at worst discriminatory. In a study commissioned by the Washington Post, popular smart speakers made by Google and Amazon were 30% less likely to understand non-American accents than those of native-born users. More recently, the Algorithmic Justice League's Voice Erasure project found that speech recognition systems from Apple, Amazon, Google, IBM, and Microsoft collectively achieve word error rates of 35% for African American voices versus 19% for white voices.
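The word error rates cited above are conventionally computed as the word-level edit distance between a reference transcript and the recognizer's hypothesis, divided by the number of reference words. A minimal sketch (the function name and example strings are illustrative, not from Artie's tool):

```python
def word_error_rate(reference, hypothesis):
    """Word error rate: word-level edit distance over reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard Levenshtein dynamic program over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, recognizing "turn the lights off" as "turn lights off" is one deletion against four reference words, a WER of 0.25. Comparing average WER across demographic groups, as the Voice Erasure figures do, is exactly the kind of measurement a bias corpus like ABC enables.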


Advancing Azure Service Quality With Artificial Intelligence: AIOps - Liwaiwai

#artificialintelligence

"In the era of big data, insights collected from cloud services running at the scale of Azure quickly exceed the attention span of humans. It's critical to identify the right steps to maintain the highest possible quality of service based on the large volume of data collected. In applying this to Azure, we envision infusing AI into our cloud platform and DevOps process, becoming AIOps, to enable the Azure platform to become more self-adaptive, resilient, and efficient. AIOps will also support our engineers to take the right actions more effectively and in a timely manner to continue improving service quality and delighting our customers and partners. This post continues our Advancing Reliability series highlighting initiatives underway to keep improving the reliability of the Azure platform. The post that follows was written by Jian Zhang, our Program Manager overseeing these efforts, as she shares our vision for AIOps, and highlights areas of this AI infusion that are already a reality as part of our end-to-end cloud service management."--Mark


Tencent Unveils Plans For Artificial Intelligence, Integration With Industries - Liwaiwai

#artificialintelligence

Tencent unveiled a comprehensive blueprint for the development of its Artificial Intelligence technologies at the 2020 World Artificial Intelligence Conference (WAIC) in Shanghai. Tencent introduced a white paper, "Tencent AI: Ambient Intelligence", and announced the launch of the Light 2.0 Program and four new platforms: an AI pan-entertainment platform, an AI console for broadcasting and TV media, a content review platform, and an industrial AI platform. With the launch of a comprehensive and open AI ecosystem, Tencent is seeking to further unlock the value in the AI industry, reinforce the company's leadership in AI, and kickstart a new round of innovation in AI. The Light 2.0 Program and the four new platforms aim to unlock the benefits of accumulated technologies and capabilities, inspire and train young AI talent for tomorrow, and speed up the integration of AI with industries. The program, which follows its predecessor Light 1.0 launched in 2019, takes advantage of technologies, products, resources, and projects from Tencent and its partners in the area of AI. It also provides a communication and experimental platform that promotes cooperation between industry and higher education and facilitates the training and development of future leaders.


Machine Learning In The Enterprise: Where Will The Next Trillion Dollars Of Value Accrue?

#artificialintelligence

Every company will become an ML company. In the world of Harry Potter, the sorting hat serves as an algorithm that takes data from a student's behavioral history, preferences and personality and turns that into a decision on which Hogwarts house they should join. If the real world had sorting hats, it would take the form of machine learning (ML) applications that make autonomous decisions based on complex datasets. While software has been "eating the world," ML is starting to eat software, and it is supercharging trillion-dollar global industries such as healthcare, security and agriculture. If ML is expected to create significant value, the question becomes: where will this value accrue?


MIT researchers warn that deep learning is approaching computational limits

#artificialintelligence

That's according to researchers at the Massachusetts Institute of Technology, Underwood International College, and the University of Brasilia, who found in a recent study that progress in deep learning has been "strongly reliant" on increases in compute. It's their assertion that continued progress will require "dramatically" more computationally efficient deep learning methods, either through changes to existing techniques or via new as-yet-undiscovered methods. "We show deep learning is not computationally expensive by accident, but by design. The same flexibility that makes it excellent at modeling diverse phenomena and outperforming expert models also makes it dramatically more computationally expensive," the coauthors wrote. "Despite this, we find that the actual computational burden of deep learning models is scaling more rapidly than (known) lower bounds from theory, suggesting that substantial improvements might be possible."