
Collaborating Authors

huerta


'Being on camera is no longer sensible': persecuted Venezuelan journalists turn to AI

The Guardian

The Colombian Nobel laureate Gabriel García Márquez, who spent some of his happiest years chronicling life in Caracas, once declared journalism "the best job in the world". Not so if you are reporting on today's Venezuela, where journalists are feeling the heat as the South American country lurches towards full-blown dictatorship under President Nicolás Maduro. In the four weeks since Venezuela's disputed election, local journalists have come up with a distinctly 21st-century tactic to avoid being arrested for reporting on 21st-century socialism: using artificial intelligence avatars to report all the news Maduro's regime deems unfit to print. In daily broadcasts, the AI-created newsreaders have been telling the world about the president's post-election crackdown on opponents, activists and the media, without putting the reporters behind the stories at risk. Carlos Eduardo Huertas, the director of Connectas, the Colombia-based journalism platform coordinating the initiative, said that, far from being a gimmick, the use of AI was a response to "the persecution and the growing repression that our colleagues are suffering in Venezuela, where the uncertainty over the safety of doing their job … grows by the minute".


Faced With A Data Deluge, Astronomers Turn To Automation - AI Summary

#artificialintelligence

Specifically, Huerta and his then-graduate student Daniel George pioneered the use of so-called convolutional neural networks (CNNs), which are a type of deep-learning algorithm, to detect and decipher gravitational-wave signals in real time. Roughly speaking, training or teaching a deep-learning system involves feeding it data that are already categorized (say, images of galaxies obscured by lots of noise) and getting the network to identify the patterns in the data correctly. After their initial success with CNNs, Huerta and George, along with Huerta's graduate student Hongyu Shen, scaled up this effort, designing deep-learning algorithms that were trained on supercomputers using millions of simulated signatures of gravitational waves mixed in with noise derived from previous observing runs of Advanced LIGO, an upgrade to LIGO completed in 2015. For instance, Adam Rebei, a high school student in Huerta's group, showed in a recent study that deep learning can identify the complex gravitational-wave signals produced by the merger of black holes in eccentric orbits, something LIGO's traditional algorithms cannot do in real time. In a preprint paper last September, Nicholas Choma of New York University and his colleagues reported the development of a special type of deep-learning algorithm called a graph neural network, whose connections and architecture take advantage of the spatial geometry of the sensors in the ice and the fact that only a few sensors see the light from any given muon track.
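The core operation a 1D CNN applies to strain data is a sliding dot product between the time series and a learned kernel. The toy numpy sketch below illustrates that operation with a hand-written chirp-like kernel standing in for a learned filter (it is an illustration of the convolution step, not the Huerta–George network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "kernel": a short chirp whose frequency sweeps upward, standing in
# for one learned convolutional filter in a trained CNN.
t = np.linspace(0.0, 1.0, 256)
kernel = np.sin(2 * np.pi * (4 + 12 * t) * t)
kernel /= np.linalg.norm(kernel)

def conv_response(strain, kernel):
    """Slide the kernel over the time series and return the peak response,
    the same sliding-dot-product operation a 1D CNN layer performs."""
    return np.max(np.abs(np.correlate(strain, kernel, mode="valid")))

noise = rng.normal(0.0, 1.0, 2048)
signal = np.zeros(2048)
signal[900:900 + kernel.size] = 10.0 * kernel  # inject a scaled copy of the chirp

resp_noise = conv_response(noise, kernel)
resp_signal = conv_response(noise + signal, kernel)
print(resp_signal > resp_noise)  # the injected chirp yields the larger peak
```

A real detection network stacks many such layers and learns the kernels from millions of labeled examples, but the reason a CNN suits this problem is visible even here: a matched shape buried in noise produces a sharp peak in the convolution output.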


Inference-optimized AI and high performance computing for gravitational wave detection at scale

Chaturvedi, Pranshu, Khan, Asad, Tian, Minyang, Huerta, E. A., Zheng, Huihuo

arXiv.org Artificial Intelligence

We introduce an ensemble of artificial intelligence models for gravitational wave detection that we trained on the Summit supercomputer using 32 nodes, equivalent to 192 NVIDIA V100 GPUs, within 2 hours. Once fully trained, we optimized these models for accelerated inference using NVIDIA TensorRT. We deployed our inference-optimized AI ensemble on the ThetaGPU supercomputer at the Argonne Leadership Computing Facility to conduct distributed inference. Using the entire ThetaGPU supercomputer, consisting of 20 nodes, each of which has 8 NVIDIA A100 Tensor Core GPUs and 2 AMD Rome CPUs, our NVIDIA TensorRT-optimized AI ensemble processed an entire month of advanced LIGO data (including Hanford and Livingston data streams) within 50 seconds. Our inference-optimized AI ensemble retains the same sensitivity as traditional AI models, namely, it identifies all known binary black hole mergers previously identified in this advanced LIGO dataset and reports no misclassifications, while also providing a 3X inference speedup compared to traditional artificial intelligence models. We used time slides to quantify the performance of our AI ensemble when processing up to 5 years' worth of advanced LIGO data. In this synthetically enhanced dataset, our AI ensemble reports an average of one misclassification for every month of searched advanced LIGO data. We also present the receiver operating characteristic curve of our AI ensemble using this 5-year-long advanced LIGO dataset. This approach provides the tools required to conduct accelerated, AI-driven gravitational wave detection at scale.
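Distributed inference of this kind starts by batching the long strain stream into overlapping windows and scoring every window with each ensemble member. The sketch below uses stand-in scoring functions in place of the trained networks, and the all-members-agree rule is an assumption for illustration, not a description of the paper's exact post-processing; it shows why an ensemble can suppress spurious single-model triggers:

```python
import numpy as np

def sliding_windows(strain, width, stride):
    """Batch a long strain series into overlapping windows for inference."""
    starts = np.arange(0, strain.size - width + 1, stride)
    return np.stack([strain[s:s + width] for s in starts]), starts

# Stand-in "models" that score each window; the real ensemble members are
# independently trained deep networks optimized with TensorRT.
def model_a(w):
    return np.max(np.abs(w), axis=1)          # peak-amplitude score

def model_b(w):
    return np.sqrt(np.mean(w**2, axis=1)) * 4.0  # scaled RMS-energy score

rng = np.random.default_rng(1)
strain = rng.normal(0.0, 1.0, 10_000)
strain[6000:6200] += 5.0                      # loud injected transient

windows, starts = sliding_windows(strain, width=400, stride=200)
scores = np.stack([model_a(windows), model_b(windows)])
flags = np.all(scores > 6.0, axis=0)          # require every member to agree
print(starts[flags])                          # only windows covering the injection
```

Each model alone can fluctuate above threshold on noise, but a coincident trigger across members is far rarer, which is the intuition behind the low misclassification rate reported for the ensemble.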


AI and extreme scale computing to learn and infer the physics of higher order gravitational wave modes of quasi-circular, spinning, non-precessing binary black hole mergers

Khan, Asad, Huerta, E. A.

arXiv.org Artificial Intelligence

We use artificial intelligence (AI) to learn and infer the physics of higher order gravitational wave modes of quasi-circular, spinning, non-precessing binary black hole mergers. We trained AI models using 14 million waveforms, produced with the surrogate model NRHybSur3dq8, that include modes up to $\ell \leq 4$ and $(5,5)$, except for $(4,0)$ and $(4,1)$, and that describe binaries with mass-ratios $q\leq8$ and individual spins $s^z_{\{1,2\}}\in[-0.8, 0.8]$. We use our AI models to obtain deterministic and probabilistic estimates of the mass-ratio, individual spins, effective spin, and inclination angle of numerical relativity waveforms that describe this signal manifold. Our studies indicate that AI provides informative estimates for these physical parameters. This work marks the first time AI is capable of characterizing this high-dimensional signal manifold. Our AI models were trained within 3.4 hours using distributed training on 256 nodes (1,536 NVIDIA V100 GPUs) in the Summit supercomputer.
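For context on the distinction between deterministic and probabilistic estimates: one standard way to make a regressor probabilistic is to have the network output a mean and a variance per parameter and train with the Gaussian negative log-likelihood, from which credible intervals follow directly. The sketch below illustrates that generic approach only; it is not claimed to be the specific method used in this paper:

```python
import numpy as np

def gaussian_nll(y_true, mu, log_var):
    """Per-sample Gaussian negative log-likelihood (up to a constant).
    Training a regression head with this loss yields both a point
    estimate (mu) and an uncertainty (variance = exp(log_var))."""
    return 0.5 * (log_var + (y_true - mu) ** 2 / np.exp(log_var))

# Toy predicted output for a mass-ratio q: mean and log-variance
mu, log_var = 4.2, np.log(0.09)               # i.e. sigma = 0.3
sigma = np.exp(0.5 * log_var)
lo, hi = mu - 1.96 * sigma, mu + 1.96 * sigma  # ~95% credible interval
print((round(lo, 2), round(hi, 2)))            # → (3.61, 4.79)

# A prediction near the truth scores a lower loss than a distant one
print(gaussian_nll(4.3, mu, log_var) < gaussian_nll(6.0, mu, log_var))
```

The deterministic estimate is just the predicted mean; the probabilistic estimate is the full Gaussian it parameterizes.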


3 space science questions that computing is helping to answer

MIT Technology Review

Scientists have since charted these observations and scrambled to learn all they can about these elusive forces. They've detected dozens more gravitational-wave signals, and advances in computing are helping them to keep up. As a postdoc, Huerta searched for gravitational waves by tediously trying to match data collected by detectors to a catalogue of potential waveforms. He wanted to find a better way. Earlier this year Huerta, who is now a computational scientist at Argonne National Laboratory near Chicago, created an AI ensemble that's capable of processing a month's worth of LIGO data in just seven minutes.


Interpretable AI forecasting for numerical relativity waveforms of quasi-circular, spinning, non-precessing binary black hole mergers

Khan, Asad, Huerta, E. A., Zheng, Huihuo

arXiv.org Artificial Intelligence

We present a deep-learning artificial intelligence model that is capable of learning and forecasting the late-inspiral, merger and ringdown of numerical relativity waveforms that describe quasi-circular, spinning, non-precessing binary black hole mergers. We used the NRHybSur3dq8 surrogate model to produce training, validation and test sets of $\ell=|m|=2$ waveforms that cover the parameter space of binary black hole mergers with mass-ratios $q\leq8$ and individual spins $|s^z_{\{1,2\}}| \leq 0.8$. These waveforms cover the time range $t\in[-5000\textrm{M}, 130\textrm{M}]$, where $t=0\textrm{M}$ marks the merger event, defined as the maximum value of the waveform amplitude. We harnessed the ThetaGPU supercomputer at the Argonne Leadership Computing Facility to train our AI model using a training set of 1.5 million waveforms. We used 16 NVIDIA DGX A100 nodes, each consisting of 8 NVIDIA A100 Tensor Core GPUs and 2 AMD Rome CPUs, to fully train our model within 3.5 hours. Our findings show that artificial intelligence can accurately forecast the dynamical evolution of numerical relativity waveforms in the time range $t\in[-100\textrm{M}, 130\textrm{M}]$. Sampling a test set of 190,000 waveforms, we find that the average overlap between target and predicted waveforms is $\gtrsim99\%$ over the entire parameter space under consideration. We also combined scientific visualization and accelerated computing to identify which components of our model take in knowledge from the early and late-time waveform evolution to accurately forecast the latter part of numerical relativity waveforms. This work aims to accelerate the creation of scalable, computationally efficient and interpretable artificial intelligence models for gravitational wave astrophysics.


Overcoming Labor Shortages with Self-Driving Forklifts

#artificialintelligence

The material handling industry has been plagued by labor shortages and high turnover rates, while simultaneously struggling to react to e-commerce demand reaching all-time highs. To address these problems, leading facilities in the industrial sector are increasingly turning to self-driving equipment like Autonomous Mobile Robots (AMRs) to increase adaptability and throughput. Listen to the recent Advanced Manufacturing Now podcast to hear from Jeff Huerta, VP of Sales at Vecna Robotics, about how profitable facilities are using intelligent autonomous equipment to not only outlast turbulent conditions, but come out ahead. Learn why autonomous vehicles like self-driving forklifts, pallet movers, and tuggers are more effective than ever before at creating a competitive advantage in manufacturing, distribution and warehousing. Over 4 billion pallets are moved every day on a global scale, with a good deal of that being non-value-added travel.


Scientists use artificial intelligence to detect gravitational waves

#artificialintelligence

When gravitational waves were first detected in 2015 by the advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), they sent a ripple through the scientific community, as they confirmed another of Einstein's theories and marked the birth of gravitational wave astronomy. Five years later, numerous gravitational wave sources have been detected, including the first observation of two colliding neutron stars in gravitational and electromagnetic waves. As LIGO and its international partners continue to upgrade their detectors' sensitivity to gravitational waves, they will be able to probe a larger volume of the universe, thereby making the detection of gravitational wave sources a daily occurrence. This discovery deluge will launch the era of precision astronomy that takes into consideration extrasolar messenger phenomena, including electromagnetic radiation, gravitational waves, neutrinos and cosmic rays. Realizing this goal, however, will require a radical re-thinking of existing methods used to search for and find gravitational waves.


Advances in Machine and Deep Learning for Modeling and Real-time Detection of Multi-Messenger Sources

Huerta, E. A., Zhao, Zhizhen

arXiv.org Artificial Intelligence

This chapter provides a summary of recent developments harnessing the data revolution to realize the science goals of Gravitational Wave Astrophysics. This is an exciting journey that is powered by the renaissance of artificial intelligence, and a new generation of researchers who are willing to embrace disruptive advances in innovative computing and signal processing tools. In this chapter, machine learning refers to a class of algorithms that can learn from data to solve new problems without being explicitly re-programmed. While traditional machine learning algorithms, e.g., random forests, nearest neighbors, etc., have been used successfully in many applications, they are limited in their ability to process raw data, usually requiring time-consuming feature engineering to preprocess data into a suitable representation for each application. On the other hand, deep learning algorithms can learn patterns from unstructured data, finding useful representations and automatically extracting relevant features for each application. The ability of deep learning to deal with poorly defined abstractions and problems has led to major advances in image recognition, speech processing, computer vision, and robotics, among others [1]. The following sections describe a few noteworthy applications of modern machine learning for gravitational wave modeling, detection and inference. It is the expectation that by the time this chapter is published, the ongoing developments at the interface of artificial intelligence and extreme-scale computing will have leapt forward, making this chapter a reminiscence of a fast-paced, evolving field of research. The chapter concludes with a summary of recent applications at the interface of deep learning and high performance computing to address computational grand challenges in Gravitational Wave Astrophysics.


Confluence of Artificial Intelligence and High Performance Computing for Accelerated, Scalable and Reproducible Gravitational Wave Detection

Huerta, E. A., Khan, Asad, Huang, Xiaobo, Tian, Minyang, Levental, Maksim, Chard, Ryan, Wei, Wei, Heflin, Maeve, Katz, Daniel S., Kindratenko, Volodymyr, Mu, Dawei, Blaiszik, Ben, Foster, Ian

arXiv.org Artificial Intelligence

Over the last five years, the advanced LIGO and advanced Virgo detectors have completed three observing runs, reporting over 50 gravitational wave sources [3, 4]. Significant improvements in the sensitivity of the advanced LIGO and advanced Virgo detectors during the last three observing runs have increased the observable volume they can probe, thereby increasing the number of gravitational wave observations [4]. As these observatories continue to enhance their detection capabilities, and other detectors join the international array of gravitational wave detectors, it is expected that gravitational wave sources will be observed at a rate of several per day [4, 5]. An ever-increasing catalog of gravitational wave sources will enable systematic studies that will refine and advance our understanding of stellar evolution, cosmology, and alternative theories of gravity, among others [6-11]. The combination of gravitational and electromagnetic waves, and cosmic neutrinos, will yield revolutionary insights into the nature of supranuclear matter in neutron stars [12-14] and the formation and evolution of black holes and neutron stars, providing new and detailed information about their astrophysical environments [15-18]. While all of these science goals are feasible in principle given the proven detection capabilities of astronomical observatories, it is equally true that established algorithms for the observation of multi-messenger sources, such as template matching and nearest neighbors, are compute-intensive and poorly scalable [19-23]. Furthermore, available computational resources will remain oversubscribed, and planned enhancements will be rapidly outstripped with the advent of next-generation detectors within the next couple of years [24, 25]. Thus, an urgent rethinking is critical if we are to realize the Multi-Messenger Astrophysics program in the big-data era [26-28].
To contend with these challenges, a number of researchers have been exploring the application of deep learning and GPU-accelerated computing.
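The compute burden of template matching comes from correlating the data stream against every member of a large template bank, so the cost grows linearly with bank size before any parameter-space refinement. A toy numpy sketch (synthetic chirps, not real LIGO templates) of the per-template correlation that a full search must repeat across hundreds of thousands of waveforms:

```python
import numpy as np

rng = np.random.default_rng(2)

def matched_filter_peak(data, template):
    """Correlate the data against one normalized template and return the
    peak statistic. A full search repeats this for every bank member,
    hence the linear-in-bank-size compute cost."""
    template = template / np.linalg.norm(template)
    return np.max(np.abs(np.correlate(data, template, mode="valid")))

# A toy "bank" of chirps with different starting frequencies.
t = np.linspace(0.0, 1.0, 512)
bank = [np.sin(2 * np.pi * (f0 + 10 * t) * t) for f0 in range(5, 30, 5)]

data = rng.normal(0.0, 1.0, 8192)
data[3000:3512] += 8.0 * bank[2] / np.linalg.norm(bank[2])  # inject template 2

stats = np.array([matched_filter_peak(data, h) for h in bank])
print(int(np.argmax(stats)))  # the injected template gives the loudest peak
```

A trained network amortizes this cost: after training, one forward pass scores a data segment against the whole signal manifold at once, which is the scalability argument made above.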