hassanzadeh
WikiCausal: Corpus and Evaluation Framework for Causal Knowledge Graph Construction
Recently, there has been an increasing interest in the construction of general-domain and domain-specific causal knowledge graphs. Such knowledge graphs enable reasoning for causal analysis and event prediction, and so have a range of applications across different domains. While great progress has been made toward automated construction of causal knowledge graphs, the evaluation of such solutions has either focused on low-level tasks (e.g., cause-effect phrase extraction) or on ad hoc evaluation data and small manual evaluations. In this Resource Track paper, we present a corpus, task, and evaluation framework for causal knowledge graph construction. Our corpus consists of Wikipedia articles for a collection of event-related concepts in Wikidata. The task is to extract causal relations between event concepts from the corpus. The evaluation is performed in part using existing causal relations in Wikidata to measure recall, and in part using Large Language Models to avoid the need for manual or crowd-sourced evaluation. We evaluate a pipeline for causal knowledge graph construction that relies on neural models for question answering and concept linking, and show how the corpus and the evaluation framework allow us to effectively find the right model for each task. The corpus and the evaluation framework are publicly available.
- North America > United States > California > San Francisco County > San Francisco (0.14)
- Europe > Austria > Vienna (0.14)
- North America > United States > Texas > Travis County > Austin (0.04)
- Health & Medicine > Epidemiology (0.68)
- Health & Medicine > Therapeutic Area (0.46)
- Banking & Finance > Economy (0.46)
- Law Enforcement & Public Safety > Crime Prevention & Enforcement (0.46)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Semantic Networks (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (0.48)
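The recall evaluation the abstract describes, checking extracted (cause, effect) pairs against causal relations already present in Wikidata, can be sketched in a few lines. This is only an illustrative sketch, not the authors' evaluation code; the function name and the concept pairs below are made up.

```python
# Illustrative sketch: recall of extracted causal relations against a gold
# set of (cause, effect) pairs taken from Wikidata. Names and data are
# hypothetical, not from the WikiCausal corpus itself.

def recall_against_wikidata(extracted, gold):
    """Fraction of gold (cause, effect) pairs recovered by the pipeline."""
    extracted = set(extracted)
    found = sum(1 for pair in gold if pair in extracted)
    return found / len(gold) if gold else 0.0

# Toy example with made-up event concepts
gold = {("earthquake", "tsunami"), ("drought", "famine")}
extracted = [("earthquake", "tsunami"), ("rain", "flood")]
print(recall_against_wikidata(extracted, gold))  # 0.5
```

Precision is harder to measure this way, since Wikidata's causal relations are incomplete; per the abstract, that is where the LLM-based judging comes in.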
Beyond Suspension: A Two-phase Methodology for Concluding Sports Leagues
Hassanzadeh, Ali, Hosseini, Mojtaba, Turner, John G.
Problem definition: Professional sports leagues may be suspended due to various reasons such as the recent COVID-19 pandemic. A critical question the league must address when re-opening is how to appropriately select a subset of the remaining games to conclude the season in a shortened time frame. Academic/practical relevance: Despite the rich literature on scheduling an entire season starting from a blank slate, concluding an existing season is quite different. Our approach attempts to achieve team rankings similar to those that would have resulted had the season been played out in full. Methodology: We propose a data-driven model which exploits predictive and prescriptive analytics to produce a schedule for the remainder of the season comprising a subset of originally-scheduled games. Our model introduces novel rankings-based objectives within a stochastic optimization model, whose parameters are first estimated using a predictive model. We introduce a deterministic equivalent reformulation along with a tailored Frank-Wolfe algorithm to efficiently solve our problem, as well as a robust counterpart based on min-max regret. Results: We present simulation-based numerical experiments from previous National Basketball Association (NBA) seasons 2004--2019, and show that our models are computationally efficient, outperform a greedy benchmark that approximates a non-rankings-based scheduling policy, and produce interpretable results. Managerial implications: Our data-driven decision-making framework may be used to produce a shortened season with 25-50% fewer games while still producing an end-of-season ranking similar to that of the full season, had it been played.
- North America > United States > California > Orange County > Irvine (0.14)
- North America > United States > California > Los Angeles County > Los Angeles (0.04)
- North America > United States > Wisconsin > Milwaukee County > Milwaukee (0.04)
- Leisure & Entertainment > Sports > Football (1.00)
- Leisure & Entertainment > Sports > Basketball (1.00)
- Health & Medicine > Therapeutic Area > Infections and Infectious Diseases (1.00)
- Information Technology > Data Science > Data Mining (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (1.00)
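The paper's tailored Frank-Wolfe algorithm is specific to its scheduling formulation, but the basic mechanics can be illustrated with the textbook variant on a toy problem. The sketch below is a generic Frank-Wolfe iteration over the probability simplex, not the authors' method; the objective and step-size schedule are standard illustrative choices.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=2000):
    """Textbook Frank-Wolfe over the probability simplex: each iteration
    solves a linear subproblem (pick the vertex whose gradient coordinate
    is smallest) and moves toward that vertex with a diminishing step."""
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # linear minimization oracle: best vertex
        gamma = 2.0 / (t + 2.0)        # standard step-size schedule
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy objective f(x) = 0.5 * ||x - c||^2 with c inside the simplex; the
# iterate stays feasible (a convex combination of vertices) and approaches c.
c = np.array([0.2, 0.5, 0.3])
x = frank_wolfe_simplex(lambda x: x - c, np.ones(3) / 3)
```

The appeal of Frank-Wolfe here is that it never leaves the feasible set and only needs a linear subproblem per step, which is why it suits structured constraint sets like game-selection polytopes.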
Fourier Transformations Reveal How AI Learns Complex Physics
A new study has found that Fourier analysis, a mathematical technique that has been around for 200 years, can be used to reveal important information about how deep neural networks learn to perform complex physics tasks, such as climate and turbulence modeling. This research highlights the potential of Fourier analysis as a tool for gaining insights into the inner workings of artificial intelligence and could have significant implications for the development of more effective machine learning algorithms. One of the oldest tools in computational physics -- a 200-year-old mathematical technique known as Fourier analysis -- can reveal crucial information about how a form of artificial intelligence called a deep neural network learns to perform tasks involving complex physics like climate and turbulence modeling, according to a new study. The discovery by mechanical engineering researchers at Rice University is described in an open-access study published in the journal PNAS Nexus, a sister publication of the Proceedings of the National Academy of Sciences.
- North America > United States > New York (0.05)
- North America > United States > California > Santa Clara County > Palo Alto (0.05)
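The core move the article describes, examining a network's learned filters through their Fourier spectra, can be sketched with NumPy's FFT routines. This is only a sketch of the idea: the kernel below is a made-up smoothing filter, not one taken from the trained networks in the study.

```python
import numpy as np

# Illustrative sketch: inspect a convolution kernel's spectral response via
# a 2-D Fourier transform. The kernel here is a hypothetical 5x5 averaging
# (low-pass) filter, standing in for a filter learned by a network.
kernel = np.ones((5, 5)) / 25.0
spectrum = np.fft.fftshift(np.abs(np.fft.fft2(kernel, s=(64, 64))))

# A low-pass filter concentrates energy at low wavenumbers, i.e. at the
# center of the shifted spectrum; high frequencies (corners) are damped.
center = spectrum[32, 32]   # DC component, equals the kernel's sum (1.0)
edge = spectrum[0, 0]       # response near the Nyquist frequency
print(center, edge)
```

Applying the same transform to a trained network's kernels shows which spatial scales each filter passes or suppresses, which is the kind of insight the study draws on.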
What's inside a deep neural network
Say you have a cutting-edge gadget that can crack any safe in the world--but you haven't got a clue how it works. You could take a much older safe-cracking tool--a trusty crowbar, perhaps. You could use that lever to pry open your gadget, peek at its innards, and try to reverse-engineer it. As it happens, that's what scientists have just done with mathematics. Researchers have examined a deep neural network--one type of artificial intelligence, a type that's notoriously enigmatic on the inside--with a well-worn type of mathematical analysis that physicists and engineers have used for decades.
Closed-form discovery of structural errors in models of chaotic systems by integrating Bayesian sparse regression and data assimilation
Mojgani, Rambod, Chattopadhyay, Ashesh, Hassanzadeh, Pedram
Models used for many important engineering and natural systems are imperfect. The discrepancy between the mathematical representations of a true physical system and its imperfect model is called the model error. These model errors can lead to substantial differences between the numerical solutions of the model and the observations of the system, particularly in systems involving nonlinear, multi-scale phenomena. Thus, there is substantial interest in reducing model errors, particularly through understanding their physics and sources and leveraging the rapid growth of observational data. Here we introduce a framework named MEDIDA: Model Error Discovery with Interpretability and Data Assimilation. MEDIDA only requires a working numerical solver of the model and a small number of noise-free or noisy sporadic observations of the system. In MEDIDA, first the model error is estimated from differences between the observed states and model-predicted states (the latter are obtained from a number of one-time-step numerical integrations from the previous observed states). If observations are noisy, a data assimilation (DA) technique such as the ensemble Kalman filter (EnKF) is first used to provide a noise-free analysis state of the system, which is then used in estimating the model error. Finally, an equation-discovery technique, such as the relevance vector machine (RVM), a sparsity-promoting Bayesian method, is used to identify an interpretable, parsimonious, closed-form representation of the model error. Using the chaotic Kuramoto-Sivashinsky (KS) system as the test case, we demonstrate the excellent performance of MEDIDA in discovering different types of structural/parametric model errors, representing different types of missing physics, using noise-free and noisy observations.
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.69)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.34)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.34)
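The equation-discovery step, fitting a sparse combination of candidate terms to the estimated model error, can be illustrated on synthetic data. The paper uses a relevance vector machine; the sketch below substitutes sequential thresholded least squares (as in SINDy) for simplicity, and the library terms, coefficients, and data are all made up.

```python
import numpy as np

# Illustrative sketch of sparse equation discovery: given samples of an
# estimated "model error" and a library of candidate terms, recover a
# parsimonious closed form by iteratively zeroing small coefficients and
# refitting (sequential thresholded least squares, not the paper's RVM).
rng = np.random.default_rng(0)
u = rng.standard_normal(200)

# Candidate library of terms the model error might contain
library = np.column_stack([u, u**2, u**3, np.sin(u)])

# Synthetic model error: 0.8*u^2 - 0.4*sin(u), plus small noise
error = 0.8 * u**2 - 0.4 * np.sin(u) + 0.001 * rng.standard_normal(200)

coef, *_ = np.linalg.lstsq(library, error, rcond=None)
for _ in range(5):                      # zero out small terms, refit the rest
    small = np.abs(coef) < 0.05
    coef[small] = 0.0
    big = ~small
    coef[big], *_ = np.linalg.lstsq(library[:, big], error, rcond=None)
print(coef)   # ~ [0, 0.8, 0, -0.4]: the u^2 and sin(u) terms survive
```

The recovered coefficient vector is itself the closed-form result, which is what makes this family of methods interpretable compared with a black-box correction.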
This neural network accurately predicts extreme weather events
Researchers from Rice University have developed a deep learning system that can predict deadly heat waves and winter storms before they happen. The system was trained on hundreds of pairs of maps that showed surface temperatures and air pressures. These included the hot and cold spells that typically lead to extreme weather events. Each pair showed these conditions in the same geographical area, but several days apart. After training, the system was applied to maps that it had never seen before and tasked with making five-day forecasts of extreme weather.
Deep Learning Accurately Forecasts Heat Waves, Cold Spells
Rice University engineers have created a deep learning computer system that taught itself to accurately predict extreme weather events, like heat waves, up to five days in advance using minimal information about current weather conditions. Ironically, Rice's self-learning "capsule neural network" uses an analog method of weather forecasting that computers made obsolete in the 1950s. During training, it examines hundreds of pairs of maps. Each map shows surface temperatures and air pressures at a height of five kilometers, and each pair shows those conditions several days apart. The training includes scenarios that produced extreme weather -- extended hot and cold spells that can lead to deadly heat waves and winter storms.
- Europe > France (0.07)
- North America > United States > California (0.05)
- Europe > Russia (0.05)
- Asia > Russia (0.05)
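The "analog method" both articles mention has a simple classical form: find the past weather map most similar to today's, and use what followed that map as the forecast. The sketch below shows only that nearest-neighbor baseline on made-up data; the actual Rice system is a trained capsule neural network, not a lookup.

```python
import numpy as np

# Illustrative sketch of classical analog forecasting on hypothetical data:
# historical condition maps, each labeled with whether an extreme event
# followed several days later.
rng = np.random.default_rng(1)
past_maps = rng.standard_normal((100, 16, 16))   # historical condition maps
outcomes = rng.integers(0, 2, size=100)          # 1 = extreme event followed

def analog_forecast(today, past_maps, outcomes):
    """Predict the outcome of the closest historical analog (L2 distance)."""
    dists = np.linalg.norm(past_maps - today, axis=(1, 2))
    return outcomes[np.argmin(dists)]

# A map nearly identical to historical map 42 retrieves that analog
today = past_maps[42] + 0.01 * rng.standard_normal((16, 16))
print(analog_forecast(today, past_maps, outcomes))
```

The neural-network version learns which features of the maps matter for similarity rather than comparing raw pixels, which is why it can work with far less data than numerical simulation requires.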
And now, here's Cli-Mate 9000 with the weather... Pattern-recognizing neural network tries its hand at forecasting
Deep-learning software may help scientists predict extreme weather patterns more accurately than relying on today's weather prediction models alone. Simulations involving complex differential equations are run on supercomputers to predict the weather. The accuracy of forecasts using this approach has improved over time, though it's still tricky to pinpoint extreme events like cold spells or heat waves. "It may be that we need faster supercomputers to solve the governing equations of the numerical weather prediction models at higher resolutions," Pedram Hassanzadeh, an assistant professor in the Department of Mechanical Engineering at Rice University in the United States, said on Tuesday. "But because we don't fully understand the physics and precursor conditions of extreme-causing weather patterns, it's also possible that the equations aren't fully accurate, and they won't produce better forecasts, no matter how much computing power we put in." Here's where AI may come in handy.