How artificial intelligence could lower nuclear energy costs

#artificialintelligence

Argonne scientists are building systems to streamline operations and maintenance at reactors. Nuclear power plants provide large amounts of electricity without releasing planet-warming pollution. But the expense of running these plants has made it difficult for them to stay open. If nuclear is to play a role in the U.S. clean energy economy, costs must come down. Scientists at the U.S. Department of Energy's (DOE) Argonne National Laboratory are devising systems that could make nuclear energy more competitive using artificial intelligence. Nuclear power plants are expensive in part because they demand constant monitoring and maintenance to ensure consistent power flow and safety.




Data Analytics and Artificial Intelligence for Cognitive Procurement

#artificialintelligence

Data analytics is the process of examining data sets in order to draw conclusions about the information they contain, increasingly with the aid of specialized systems and software. Data analytics technologies and techniques are widely used in commercial industries to enable organizations to make more-informed business decisions. It was in Artificial Intelligence (AI) that we first heard the term cognitive technology. Technology experts began seeing the benefits of AI across consumer and business applications. Amazon's Alexa and chatbots are two examples that come to mind, and both are now ubiquitous in homes and businesses.


Artificial intelligence (AI) vs. machine learning (ML): Key comparisons

#artificialintelligence

Within the last decade, the terms artificial intelligence (AI) and machine learning (ML) have become buzzwords that are often used interchangeably. While AI and ML are inextricably linked and share similar characteristics, they are not the same thing. Rather, ML is a major subset of AI.


Chemistry and digital chemistry: healthcare needs both

#artificialintelligence

At first glance, it almost sounds like an oxymoron. Surely there is nothing more analogue than how the building blocks of our world interact? Chemistry is all about lab work and real-world experiments, no? All true, but digital chemistry – using computational techniques to solve complex chemical challenges – is a rapidly growing space (8,394 job postings worldwide on LinkedIn as I type). As we make advances in computer algorithms, artificial intelligence and machine learning, digital chemistry is becoming a vital tool for companies in any industry that relies on solving chemical problems, from packaging to paint.


Inventing the Future: Artificial Intelligence (AI): A Tool for a Better Future

#artificialintelligence

"The development of full artificial intelligence could spell the end of the human race…it would take off on its own, and re-design itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded." – Stephen Hawking. Artificial Intelligence is undoubtedly one of the key technologies that define the 21st century. Before throwing this two-word phrase around, it is important to have a general understanding of what Artificial Intelligence (AI) entails. To put it simply, AI is an attempt to emulate and simulate varied forms of human intelligence in machines.


Topics: Artificial intelligence

#artificialintelligence

Artificial intelligence (AI) is the ability of machines to mimic human capabilities in a way that we would consider 'smart'. You most likely have come across – or are aware of – AI applications such as self-driving cars, facial recognition, chess or Go players, security systems, or speech/voice recognition (for example, those used in an intelligent virtual assistant). In conventional computing, a programmer writes a computer program that precisely instructs a computer what to do to solve a particular problem. With AI, however, the programmer instead writes a program that allows the computer to learn to solve the problem by itself. That may sound like a roundabout approach, but this is genuinely how it is done.
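The contrast between a hand-written rule and a learned one can be sketched in a few lines of Python. This is a minimal, hypothetical toy: the function names, the "spam" framing, and the training data are all invented for illustration, and the "learning" here is just fitting a threshold from labeled examples rather than anything a real ML library would do.

```python
# Conventional programming: the programmer writes the rule directly.
def is_spam_rule(word_count):
    # The threshold 50 is chosen by hand, based on the programmer's judgment.
    return word_count > 50

# Machine learning (toy version): the rule is derived from labeled examples.
def learn_threshold(examples):
    """Learn a decision threshold as the midpoint between the class means."""
    pos = [x for x, label in examples if label]
    neg = [x for x, label in examples if not label]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

# Hypothetical training data: (word_count, is_spam) pairs.
training = [(10, False), (20, False), (80, True), (90, True)]
threshold = learn_threshold(training)

def is_spam_learned(word_count):
    # Same kind of rule as above, but its parameter came from the data.
    return word_count > threshold
```

The point of the sketch is the division of labor: in the first function the human supplies the decision boundary, while in the second the program extracts it from examples, which is the essence of "the computer learns to solve the problem by itself."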


Level Up Your AI Skillset and Dive Into The Deep End Of TinyML

#artificialintelligence

Machine learning (ML) is a growing field, gaining popularity in academia, industry, and among makers. We will take a look at some of the available tools to help make machine learning easier, but first, let's review some of the terms commonly used in machine learning. John McCarthy provides a definition of artificial intelligence (AI) in his 2007 Stanford paper, "What is Artificial Intelligence?" In it, he says AI "is the science and engineering of making intelligent machines, especially intelligent computer programs." This definition is extremely broad, as McCarthy defines intelligence as "the computational part of the ability to achieve goals in the world." As a result, any program that achieves some goal can easily be classified as artificial intelligence. In her article "Machine Learning on Microcontrollers" (Make: Vol.


A brief history of the Turing Test.

#artificialintelligence

The Turing Test was developed by Alan Turing in 1950. Alan Turing was a renowned English mathematician, logician, cryptographer, and computer scientist. He is widely known for developing the Universal Turing Machine in 1936, which is said to form the basis of the first computers. He also played a crucial role in breaking the German Enigma code during World War II. In 1950, Turing introduced the test in his paper "Computing Machinery and Intelligence."


AI Maturity in Banking Lags All Other Industries

#artificialintelligence

The need for financial institutions to quickly operationalize their artificial intelligence capabilities has moved beyond important to imperative. Beyond supporting risk and fraud analysis and increasing productivity, a higher level of AI maturity at banks and credit unions will be a competitive differentiator, increasing business value across the organization. The banking industry must move out of the formative stages of AI deployment to help enhance human intelligence. This includes uncovering the drivers of key performance measures such as revenue and profit, and propelling innovation in products, services, processes and customer service. All financial institutions have within reach the ability to harness insights at scale – leveraging the right information, from the right people, at the right time.