

A New Era for Mechanical CAD

Communications of the ACM

Computer-Aided Design (CAD) has been around since the 1950s. The first graphical CAD program, called Sketchpad, came out of MIT (designworldonline.com). Since then, CAD has become essential to designing and manufacturing hardware products. Today, there are multiple types of CAD. This article focuses on mechanical CAD, used for mechanical engineering. Digging into the history of computer graphics reveals some interesting connections between the most ambitious and notorious engineers. Ivan Sutherland, who received the Turing Award for Sketchpad in 1988, had Edwin Catmull as a student.


Physicists Simulate Artificial Brain Networks with New Quantum Materials

#artificialintelligence

Isaac Newton's groundbreaking scientific productivity while isolated from the spread of bubonic plague is legendary. University of California San Diego physicists can now claim a stake in the annals of pandemic-driven science. A team of UC San Diego researchers and colleagues at Purdue University have now simulated the foundation of new types of artificial intelligence computing devices that mimic brain functions, an achievement that resulted from the COVID-19 pandemic lockdown. By combining new supercomputing materials with specialized oxides, the researchers successfully demonstrated the backbone of networks of circuits and devices that mirror the connectivity of neurons and synapses in biologically based neural networks.

Figure: Like biologically based systems (left), complex emergent behaviors--which arise when separate components are merged together in a coordinated system--also result from neuromorphic networks made up of quantum-materials-based devices (right).


Meet the women making waves in AI ethics, research, and entrepreneurship

#artificialintelligence

Women in the AI field are making research breakthroughs, launching exciting companies, spearheading vital ethical discussions, and inspiring the next generation of AI professionals. And that's why we created the VentureBeat Women in AI Awards -- to emphasize the importance of their voices, work, and experiences, and to shine a light on some of these leaders. We first announced the six winners at Transform 2021 in July, and ever since, we've been catching up with each of them for deeper discussions around their work and emerging challenges in the field. Our conversations have touched on everything from regulation and dealing with messy real-world data to how to approach AI more responsibly.


Guest post: How artificial intelligence is fast becoming a key tool for climate science

#artificialintelligence

The extensive evidence feeding into the report includes observations collected from across land, ocean and atmosphere, as well as numerous simulations from the latest generation of climate models. In recent years, however, climate scientists have gained another tool, thanks to rapid advances in the development of artificial intelligence (AI) and, particularly, machine learning. In contrast to models that follow a set of explicit, pre-defined rules, machine learning aims to build systems that can learn and infer such rules from patterns in data. As a result, a new line of climate research is emerging that aims to complement and extend the use of observations and climate models. The overall goal is to tackle persistent challenges of climate research and to improve projections for the future.
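The contrast drawn above -- explicit, pre-defined rules versus rules inferred from data -- can be sketched in a few lines. This is a hypothetical illustration, not code or data from the article: the "observations", the hidden threshold, and the simple logistic model are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": x is some climate variable (say, a temperature
# anomaly) and y records whether an event of interest followed. The data
# are generated from a hidden threshold the modeler does not know.
x = rng.uniform(-2.0, 2.0, size=500)
y = (x > 0.5).astype(int)

# Explicit, pre-defined rule: an expert hard-codes a threshold by hand.
def explicit_rule(t, threshold=0.5):
    return int(t > threshold)

# Learned rule: fit a simple logistic model by gradient descent and let
# it infer the decision boundary from the data instead.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))  # predicted probabilities
    w -= 0.5 * np.mean((p - y) * x)          # gradient step on the weight
    b -= 0.5 * np.mean(p - y)                # gradient step on the bias

# The learned boundary is where w*x + b = 0, i.e. x = -b/w; with enough
# data it lands near the hidden threshold of 0.5.
learned_threshold = -b / w
```

The point of the sketch is only that the second model recovers a rule comparable to the expert's without ever being told the threshold, which is the sense in which machine learning "infers rules based on patterns in data."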


How Computationally Complex Is a Single Neuron?

#artificialintelligence

Our mushy brains seem a far cry from the solid silicon chips in computer processors, but scientists have a long history of comparing the two. As Alan Turing put it in 1952: "We are not interested in the fact that the brain has the consistency of cold porridge." Today, the most powerful artificial intelligence systems employ a type of machine learning called deep learning. Their algorithms learn by processing massive amounts of data through hidden layers of interconnected nodes, referred to as deep neural networks. As their name suggests, deep neural networks were inspired by the real neural networks in the brain, with the nodes modeled after real neurons -- or, at least, after what neuroscientists knew about neurons back in the 1950s, when an influential neuron model called the perceptron was born.
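The 1950s-era perceptron the excerpt refers to is simple enough to write out in full. The sketch below is illustrative (the training data and function names are not from the article): a single "neuron" weights its inputs, sums them, fires if the sum crosses a threshold, and nudges its weights whenever it misclassifies.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=0.1):
    """Rosenblatt-style perceptron learning rule for labels in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = int(w @ xi + b > 0)  # step activation: fire or not
            err = yi - pred             # 0 if correct, +/-1 if wrong
            w += lr * err * xi          # move weights toward the target
            b += lr * err
    return w, b

# Learn the logical AND function -- linearly separable, so the single
# perceptron converges; XOR, famously, would not.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
preds = [int(w @ xi + b > 0) for xi in X]  # matches y after training
```

Deep neural networks stack many such units into layers and replace the step activation and per-mistake update with differentiable activations trained by gradient descent, but the node-level picture is still recognizably this one.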


What is Artificial Intelligence?

#artificialintelligence

Artificial intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. The term may also be applied to any machine that exhibits traits associated with a human mind, such as learning and problem-solving. Who is the father of Artificial Intelligence? Artificial Intelligence was born as a field of research in the summer of 1956, during a landmark workshop at Dartmouth College in Hanover, New Hampshire. Just a year earlier, Marvin Minsky, Nathaniel Rochester, Claude Shannon, and John McCarthy had proposed holding a workshop to put together a roadmap for how to make machines think and learn similarly to human beings.


WarpDrive: Extremely Fast Reinforcement Learning on an NVIDIA GPU

#artificialintelligence

WarpDrive achieves orders-of-magnitude faster multi-agent RL training, running 2000 environments and 1000 agents in a simple Tag environment. It provides lightweight tools and workflow objects to build your own fast RL workflows. Check out the code, this blog, and the white paper for more details! The name WarpDrive is inspired by the science-fiction concept of a faster-than-light spacecraft propulsion system. Moreover, at the time of writing, a "warp" is a group of 32 threads that execute at the same time on (certain) GPUs.
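The speedup described above comes from stepping thousands of environments in parallel rather than one at a time: on a GPU, each thread can own one agent or environment. The sketch below is illustrative only -- it is not WarpDrive's actual API, and the sizes and movement rules are invented -- but it mimics that batched data layout with a single vectorized NumPy update on the CPU.

```python
import numpy as np

n_envs, n_agents = 2000, 4  # sizes chosen to echo the article; illustrative
rng = np.random.default_rng(0)

# One position array for *all* environments at once: shape (envs, agents, xy).
pos = rng.uniform(0.0, 1.0, size=(n_envs, n_agents, 2))

def step_all(pos, actions, speed=0.01):
    """Advance every agent in every environment in one batched operation."""
    # actions: shape (envs, agents), values 0..3 meaning up/down/left/right.
    moves = np.array([[0, 1], [0, -1], [-1, 0], [1, 0]], dtype=float)
    new_pos = pos + speed * moves[actions]  # fancy indexing, fully vectorized
    return np.clip(new_pos, 0.0, 1.0)       # keep agents inside the unit arena

actions = rng.integers(0, 4, size=(n_envs, n_agents))
pos = step_all(pos, actions)  # one call steps all 2000 environments
```

A Python loop over 2000 environments would repeat this tiny update 2000 times per step; laying the state out as one big array is what lets a GPU (or even vectorized CPU code) do it in a single pass.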


The Loss Function of Intelligence

#artificialintelligence

Simulating artificial general intelligence has turned out to be a harder problem than previously thought [1]: progress in the field of machine learning has so far proven insufficient to meet the challenge. This article suggests a way in which 'intelligence' can be simulated, arguing that an evolutionary approach is at least one option. What we as humans define as intelligence is hard to put into words. If one asked around to see how people define the term, one would end up with varying answers, as is the case for probably all concepts. Still, 'intelligence' is a relatively broad concept compared to most others. Without agreeing on one definition, one cannot easily simulate intelligence artificially in a way all observers would accept: we each attribute a different set of facets of human behaviour to 'intelligence', although those sets may be similar [2]. Notice, however, that it is human behaviour that embeds what we usually associate with intelligence. When we call other creatures intelligent, like elephants or dolphins, it is usually because we recognize human-like behaviour in theirs.
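The evolutionary approach the article gestures at can be sketched in miniature. This is an illustrative toy, not the author's method: candidate "behaviours" are bit-strings, a fitness function scores them against a stand-in target, and selection plus mutation improves the population over generations.

```python
import random

random.seed(0)
TARGET = [1] * 20  # hypothetical stand-in for a "desired behaviour"

def fitness(genome):
    # Score: how many positions match the target behaviour.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Flip each bit with small probability.
    return [1 - g if random.random() < rate else g for g in genome]

# Random initial population of 50 candidate behaviours.
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(50)]

for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # truncation selection: keep the fittest fifth
    pop = [mutate(random.choice(parents)) for _ in range(50)]

best = max(pop, key=fitness)  # fitness climbs toward the maximum of 20
```

The hard part the article is pointing at is, of course, hidden inside `fitness`: for a toy bit-string the score is trivial, whereas for 'intelligence' we cannot even agree on what the function should measure.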


IBM's fastest supercomputer will be used to find better ways to produce green electricity

ZDNet

The US Department of Energy awarded a total of over seven million node hours on the Summit supercomputer to 20 research teams. Energy giant General Electric (GE) will be using one of the world's most powerful supercomputers, IBM's Summit, to run two new research projects that could boost the production of cleaner power. Last month, the US Department of Energy (DoE), which hosts Summit in Oak Ridge National Laboratory, awarded a total of over seven million node hours on the supercomputer to 20 research teams, two of which belong to GE Research. The Summit supercomputing system is the second most powerful in the world, behind the Fugaku supercomputer located in Japan. Built by IBM, Summit boasts system power equivalent to 70 million iPhone 11s, which scientists can leverage to run large computations such as simulating systems' behavior or solving complex physics problems.