If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Artificial intelligence (AI) can distinguish a dog from a cat, but the billions of calculations needed to do so demand quite a lot of energy. The human brain can do the same thing while using only a small fraction of this energy. Could this phenomenon inspire us to develop more energy-efficient AI systems? Our computational power has risen exponentially, enabling the widespread use of artificial intelligence, a technology that relies on processing huge amounts of data to recognize patterns. When we use the recommendation algorithm of our favorite streaming service, we usually don't realize the gigantic energy consumption behind it.
Understanding the set of elementary steps and the kinetics of each reaction is extremely valuable for making informed decisions about creating the next generation of catalytic materials. Given the physical and mechanistic complexity of industrial catalysts, it is critical to obtain kinetic information through experimental methods. As such, this work details a methodology that combines transient rate/concentration dependencies with machine learning to measure the number of active sites and the individual rate constants, and to gain insight into the mechanism underlying a complex set of elementary steps. This new methodology was applied to simulated transient responses to verify its ability to obtain correct estimates of the micro-kinetic coefficients. Furthermore, experimental CO oxidation data were analyzed to reveal the Langmuir-Hinshelwood mechanism driving the reaction. As oxygen accumulated on the catalyst, a transition in the mechanism was clearly resolved in the machine learning analysis, thanks to the large amount of kinetic information available from transient reaction techniques. This methodology is proposed as a new data-driven approach to characterizing how materials control complex reaction mechanisms, relying exclusively on experimental data.
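To make the idea of extracting rate constants from transient responses concrete, here is a minimal sketch (not the paper's actual pipeline, and far simpler than its machine-learning analysis): a simulated transient concentration response for a hypothetical first-order step A -> B is linearized, and the rate constant is recovered by a least-squares line fit.

```python
import numpy as np

# Simulate a transient response for a first-order step A -> B,
# where C(t) = C0 * exp(-k * t). The values of k and C0 are
# illustrative, not taken from the paper.
true_k = 0.8          # hypothetical rate constant (1/s)
C0 = 1.0              # initial concentration (arbitrary units)
t = np.linspace(0.0, 5.0, 200)
C = C0 * np.exp(-true_k * t)

# Linearize: ln C = ln C0 - k * t, then fit a line to estimate k.
slope, intercept = np.polyfit(t, np.log(C), 1)
k_est = -slope

print(f"true k = {true_k}, estimated k = {k_est:.3f}")
```

A real micro-kinetic model involves coupled steps and many coefficients, which is precisely why the authors turn to machine learning rather than a single linear fit.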
Information technology (IT) students in Denmark have created a software program that can determine the energy consumption and the amount of carbon dioxide generated by the development of deep learning algorithms. According to their estimates, the hardware used to train a deep learning algorithm can consume worrying amounts of energy from an environmental standpoint. Whether browsing movies suggested by Netflix based on your viewing history, asking your voice assistant a question or interacting with a chatbot on an e-commerce website, all of these everyday online processes rely on deep learning algorithms. However, developing these algorithms contributes to digital pollution. And it's precisely this environmental impact that students from the IT department of the University of Copenhagen have sought to quantify, using their Carbontracker software program.
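Setting Carbontracker's own measurement machinery aside, the underlying arithmetic is straightforward: energy drawn by the training hardware, converted to CO2 via a grid carbon-intensity factor. The sketch below uses hypothetical figures (a 250 W GPU, an assumed grid intensity of 0.3 kg CO2/kWh), not Carbontracker's model or API.

```python
# Back-of-the-envelope estimate of training emissions:
# energy (kWh) = power (kW) * time (h); CO2 = energy * grid intensity.
GRID_INTENSITY_KG_PER_KWH = 0.3   # assumed average grid mix

def training_co2_kg(power_watts, hours, intensity=GRID_INTENSITY_KG_PER_KWH):
    """Estimate kg of CO2 for a training run at constant power draw."""
    energy_kwh = power_watts / 1000.0 * hours
    return energy_kwh * intensity

# e.g. one 250 W GPU running for 100 hours: 25 kWh * 0.3 = 7.5 kg CO2
print(f"{training_co2_kg(250, 100):.1f} kg CO2")
```

Real tools refine both factors: they sample actual hardware power draw over time and use the local grid's live carbon intensity rather than a fixed average.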
Some of the biggest names in AI research have laid out a road map suggesting how machine learning can help save our planet and humanity from imminent peril. The report covers possible machine-learning interventions in 13 domains, from electricity systems to farms and forests to climate prediction. Within each domain, it breaks out the contributions for various subdisciplines within machine learning, including computer vision, natural-language processing, and reinforcement learning. Recommendations are also divided into three categories: "high leverage" for problems well suited to machine learning where such interventions may have an especially great impact; "long-term" for solutions that won't have payoffs until 2040; and "high risk" for pursuits that have less certain outcomes, either because the technology isn't mature or because not enough is known to assess the consequences. Many of the recommendations also summarize existing efforts that are already happening but not yet at scale.
Current commonsense reasoning research mainly focuses on developing models that use commonsense knowledge to answer multiple-choice questions. However, systems designed to answer multiple-choice questions may not be useful in applications that do not provide a small list of possible candidate answers to choose from. As a step towards making commonsense reasoning research more realistic, we propose to study open-ended commonsense reasoning (OpenCSR) -- the task of answering a commonsense question without any pre-defined choices, using as a resource only a corpus of commonsense facts written in natural language. The task is challenging due to a much larger decision space, and because many commonsense questions require multi-hop reasoning. We propose an efficient differentiable model for multi-hop reasoning over knowledge facts, named DrFact. We evaluate our approach on a collection of re-formatted, open-ended versions of popular tests targeting commonsense reasoning, and show that our approach outperforms strong baseline methods by a large margin.
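The multi-hop retrieval idea can be illustrated with a toy, non-differentiable stand-in for a model like DrFact: facts are linked when one fact's concept appears in another, and answering a question means following those links from the question's concepts for a fixed number of hops. The fact corpus below is invented for illustration.

```python
# Toy multi-hop reasoning over a fact corpus. Each fact is modeled as a
# (source concept, target concept) link, e.g. "rain comes from clouds".
FACTS = [
    ("rain", "clouds"),
    ("clouds", "water vapor"),
    ("water vapor", "evaporation"),
]

def multi_hop(start_concept, hops):
    """Return all concepts reachable from start_concept in <= hops steps."""
    frontier, reached = {start_concept}, {start_concept}
    for _ in range(hops):
        nxt = {b for a, b in FACTS if a in frontier}
        frontier = nxt - reached
        reached |= nxt
    return reached

# "water vapor" needs two hops from "rain" (rain -> clouds -> water vapor).
print(sorted(multi_hop("rain", 2)))
```

DrFact makes this traversal differentiable so the hop operations can be trained end-to-end, and it scores candidate concepts over the full decision space rather than enumerating a path as this toy does.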
Ashok, CEO of UnfoldLabs, is an innovation veteran who believes in making the world a better place with futuristic technology products. Australian researchers have suggested a doomsday scenario for humanity by 2050. Climate change is the biggest and toughest global problem humanity faces today, and global warming demands innovation from the brightest and the best. Scientists have turned to artificial intelligence (AI) for the best possible solutions because it can proactively predict outcomes and build models quickly.
Extremely energy-efficient artificial intelligence is now closer to reality after a study by UCL researchers found a way to improve the accuracy of a brain-inspired computing system. The system, which uses memristors to create artificial neural networks, is at least 1,000 times more energy efficient than conventional transistor-based AI hardware, but has until now been more prone to error. Existing AI is extremely energy-intensive--training one AI model can generate 284 tons of carbon dioxide, equivalent to the lifetime emissions of five cars. Replacing the transistors that make up all digital devices with memristors, novel electronic devices first built in 2008, could reduce this to a fraction of a ton of carbon dioxide--equivalent to the emissions generated in an afternoon's drive. Since memristors are so much more energy-efficient than existing computing systems, they could potentially pack huge amounts of computing power into hand-held devices, removing the need for an Internet connection.
It was the Australian bush fire that finally did it. For 12 years Adam Hearne had worked at companies that represented some of the world's largest sources of greenhouse gas emissions. First at Rio Tinto, one of the largest industrial miners, and then at Amazon, where he handled inbound delivery operations across the EU, Hearne was involved in ensuring that things flowed smoothly for companies whose operations spew millions of tons of carbon dioxide into the environment. Amazon's business alone was responsible for emitting 51.17 million metric tons of carbon dioxide last year -- the equivalent of 13 coal-burning power plants, according to a report from the company. Then, Hearne's home country burned.
Recent news about the benefits of Machine Learning (ML) and Deep Learning (DL) has taken a slightly downbeat turn, pointing out that there is a potential ecological cost associated with these systems. In particular, AI developers and AI researchers need to be mindful of the adverse and damaging carbon footprint that they generate while crafting ML/DL capabilities. It is a so-called "green" or environmental wake-up call for AI that is worth hearing. Let's first review the nature of carbon footprints (CFPs), which are already quite familiar to all of us from areas such as the carbon-belching transportation industry. A carbon footprint is usually expressed as the amount of carbon dioxide emissions spewed forth--for example, when you fly in a commercial plane from Los Angeles to New York, or when you drive your gasoline-powered car from Silicon Valley to Silicon Beach.
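For the driving example, the footprint arithmetic is simple: fuel burned times the CO2 released per liter of gasoline (roughly 2.3 kg/L from combustion). The distance and fuel-economy figures below are assumptions for illustration, not measurements of any particular trip.

```python
# Rough per-trip carbon footprint for a gasoline car.
CO2_PER_LITER_GASOLINE_KG = 2.3   # approximate combustion emissions

def drive_co2_kg(distance_km, liters_per_100km=8.0):
    """Estimate kg of CO2 for a drive at an assumed fuel economy."""
    liters = distance_km * liters_per_100km / 100.0
    return liters * CO2_PER_LITER_GASOLINE_KG

# e.g. an assumed ~560 km trip (very roughly Silicon Valley to
# Silicon Beach): 44.8 L of fuel, about 103 kg of CO2.
print(f"{drive_co2_kg(560):.0f} kg CO2")
```

The same footprint logic, applied to the electricity powering data centers, is what makes ML/DL training runs comparable to flights and road trips.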
The maker of the N95 respirator mask designed it for the mining and construction industry. It filters out 95 percent of airborne dust particles, keeping them from entering the lungs. It has vents so the wearers can exhale. Demolition crews use them, concrete laborers use them, and so do workers who sand wood floors, gypsum board walls, and plaster ceilings. Workers in other trades, such as electrical, plumbing, rough carpentry, and wood finishing, rarely wear masks.