DUE: A Deep Learning Framework and Library for Modeling Unknown Equations
Chen, Junfeng; Wu, Kailiang; Xiu, Dongbin
Equations, particularly differential equations, are fundamental for understanding natural phenomena and predicting complex dynamics across various scientific and engineering disciplines. However, the governing equations for many complex systems remain unknown due to intricate underlying mechanisms. Recent advancements in machine learning and data science offer a new paradigm for modeling unknown equations from measurement or simulation data. This paradigm shift, known as data-driven discovery or modeling, stands at the forefront of AI for science, with significant progress made in recent years. In this paper, we introduce a systematic framework for data-driven modeling of unknown equations using deep learning. This versatile framework is capable of learning unknown ODEs, PDEs, DAEs, IDEs, SDEs, reduced or partially observed systems, and non-autonomous differential equations. Based on this framework, we have developed Deep Unknown Equations (DUE), an open-source software package designed to facilitate the data-driven modeling of unknown equations using modern deep learning techniques. DUE serves as an educational tool for classroom instruction, enabling students and newcomers to gain hands-on experience with differential equations, data-driven modeling, and contemporary deep learning approaches such as FNN, ResNet, generalized ResNet, operator semigroup networks (OSG-Net), and Transformers. Additionally, DUE is a versatile and accessible toolkit for researchers across various scientific and engineering fields. It is applicable not only for learning unknown equations from data but also for surrogate modeling of known, yet complex, equations that are costly to solve using traditional numerical methods. We provide detailed descriptions of DUE and demonstrate its capabilities through diverse examples, which serve as templates that can be easily adapted for other applications.
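The flow-map idea behind this kind of data-driven ODE learning can be sketched in a few lines. The toy below is not the DUE API; it is a hypothetical, minimal illustration of the ResNet-style residual update $x_{n+1} = x_n + \Delta t\, f(x_n)$, with the unknown right-hand side $f$ fitted from trajectory snapshots by one-dimensional least squares instead of a neural network:

```python
# Minimal sketch (NOT the DUE API) of learning an unknown ODE from data.
# Snapshots come from the "unknown" system dx/dt = -x; we fit the residual
# map x_{n+1} ~ x_n + dt * f(x_n) with f(x) = a * x via least squares.
import math

dt = 0.1
# Trajectory snapshots of the exact flow x(t) = exp(-t), sampled every dt.
xs = [math.exp(-dt * n) for n in range(50)]
pairs = [(xs[n], xs[n + 1]) for n in range(len(xs) - 1)]

# ResNet-like target: the increments (x_{n+1} - x_n) / dt approximate f(x_n).
# One-dimensional least squares: a = sum(x * y) / sum(x * x).
num = sum(x * ((y - x) / dt) for x, y in pairs)
den = sum(x * x for x, _ in pairs)
a = num / den  # close to -1, the true right-hand side (finite-dt bias)

def predict(x0, steps):
    """Roll the learned flow map forward: x <- x + dt * f(x)."""
    x = x0
    for _ in range(steps):
        x = x + dt * a * x
    return x
```

In DUE the linear model `a * x` would be replaced by an FNN or ResNet trained on the same snapshot pairs; the learned map is then iterated forward in exactly this way to predict the dynamics.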
$\alpha$-Flow: A Unified Framework for Continuous-State Discrete Flow Matching Models
Cheng, Chaoran; Li, Jiahan; Fan, Jiajun; Liu, Ge
Recent efforts have extended the flow-matching framework to discrete generative modeling. One strand of models directly works with the continuous probabilities instead of discrete tokens, which we colloquially refer to as Continuous-State Discrete Flow Matching (CS-DFM). Existing CS-DFM models differ significantly in their representations and geometric assumptions. This work presents a unified framework for CS-DFM models, under which the existing variants can be understood as operating on different $\alpha$-representations of probabilities. Building upon the theory of information geometry, we introduce $\alpha$-Flow, a family of CS-DFM models that adheres to the canonical $\alpha$-geometry of the statistical manifold, and demonstrate its optimality in minimizing the generalized kinetic energy. Theoretically, we show that the flow matching loss for $\alpha$-flow establishes a unified variational bound for the discrete negative log-likelihood. We comprehensively evaluate different instantiations of $\alpha$-flow on various discrete generation domains to demonstrate their effectiveness in discrete generative modeling, including intermediate values whose geometries have never been explored before. $\alpha$-flow significantly outperforms its discrete-state counterpart in image and protein sequence generation and better captures the entropy in language modeling.
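For readers unfamiliar with the $\alpha$-representations the abstract refers to, the standard convention from Amari's information geometry (the paper's exact scaling may differ) maps a probability $p$ to
$$
\ell^{(\alpha)}(p) =
\begin{cases}
\dfrac{2}{1-\alpha}\, p^{\frac{1-\alpha}{2}}, & \alpha \neq 1,\\[4pt]
\log p, & \alpha = 1,
\end{cases}
$$
so that $\alpha = -1$ recovers the mixture (linear) representation $p$ itself and $\alpha = 1$ the exponential (logarithmic) one, with intermediate $\alpha$ interpolating between the two geometries of the statistical manifold.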
Conditional Distribution Compression via the Kernel Conditional Mean Embedding
Broadbent, Dominic; Whiteley, Nick; Allison, Robert; Lovett, Tom
Existing distribution compression methods, like Kernel Herding (KH), were originally developed for unlabelled data. However, no existing approach directly compresses the conditional distribution of labelled data. To address this gap, we first introduce the Average Maximum Conditional Mean Discrepancy (AMCMD), a natural metric for comparing conditional distributions. We then derive a consistent estimator for the AMCMD and establish its rate of convergence. Next, we make a key observation: in the context of distribution compression, the cost of constructing a compressed set targeting the AMCMD can be reduced from $\mathcal{O}(n^3)$ to $\mathcal{O}(n)$. Building on this, we extend the idea of KH to develop Average Conditional Kernel Herding (ACKH), a linear-time greedy algorithm that constructs a compressed set targeting the AMCMD. To better understand the advantages of directly compressing the conditional distribution rather than doing so via the joint distribution, we introduce Joint Kernel Herding (JKH), a straightforward adaptation of KH designed to compress the joint distribution of labelled data. While herding methods provide a simple and interpretable selection process, they rely on a greedy heuristic. To explore alternative optimisation strategies, we propose Joint Kernel Inducing Points (JKIP) and Average Conditional Kernel Inducing Points (ACKIP), which jointly optimise the compressed set while maintaining linear complexity. Experiments show that directly preserving conditional distributions with ACKIP outperforms both joint distribution compression (via JKH and JKIP) and the greedy selection used in ACKH. Moreover, we see that JKIP consistently outperforms JKH.
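The baseline the abstract builds on, vanilla Kernel Herding for unlabelled data, is simple enough to sketch. The toy below is a naive quadratic-time illustration (not the paper's AMCMD/ACKH machinery): each greedy pick maximizes mean kernel similarity to the data while penalizing similarity to points already chosen. Kernel, bandwidth, and candidate pool are illustrative choices, not the paper's:

```python
# Minimal sketch of vanilla Kernel Herding (KH) on unlabelled 1-D data.
# Greedily select points that "attract" toward the data distribution and
# "repel" from points already in the compressed set.
import math
import random

def gauss_k(x, y, bw=1.0):
    """Gaussian kernel; bandwidth bw=1.0 is an arbitrary illustrative choice."""
    return math.exp(-((x - y) ** 2) / (2.0 * bw * bw))

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(200)]

def kernel_herding(data, m):
    chosen = []
    for _ in range(m):
        def score(x):
            # Mean similarity to the full dataset ...
            attract = sum(gauss_k(x, d) for d in data) / len(data)
            # ... minus mean similarity to points already selected.
            repel = sum(gauss_k(x, c) for c in chosen) / (len(chosen) + 1)
            return attract - repel
        # Naive O(n^2)-per-pick search over the data as the candidate pool;
        # the paper's point is that linear-time constructions are possible.
        chosen.append(max(data, key=score))
    return chosen

compressed = kernel_herding(data, 5)
```

The compressed set roughly tracks the data distribution; the paper's contribution is to target the *conditional* distribution of labelled data instead, via the AMCMD, and to do so in linear rather than cubic time.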
Towards Weaker Variance Assumptions for Stochastic Optimization
Alacaoglu, Ahmet; Malitsky, Yura; Wright, Stephen J.
We revisit a classical assumption for analyzing stochastic gradient algorithms where the squared norm of the stochastic subgradient (or the variance for smooth problems) is allowed to grow as fast as the squared norm of the optimization variable. We contextualize this assumption in view of its inception in the 1960s, its seemingly independent appearance in the recent literature, its relationship to weakest-known variance assumptions for analyzing stochastic gradient algorithms, and its relevance in deterministic problems for non-Lipschitz nonsmooth convex optimization. We build on and extend a connection recently made between this assumption and the Halpern iteration. For convex nonsmooth, and potentially stochastic, optimization, we analyze horizon-free, anytime algorithms with last-iterate rates. For problems beyond simple constrained optimization, such as convex problems with functional constraints or regularized convex-concave min-max problems, we obtain rates for optimality measures that do not require boundedness of the feasible set.
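In my notation (the paper's constants and symbols may differ), the classical growth assumption revisited here reads: for a stochastic subgradient oracle $g(x,\xi)$,
$$
\mathbb{E}_{\xi}\big[\|g(x,\xi)\|^2\big] \;\le\; c_0 + c_1 \|x\|^2 \qquad \text{for all } x,
$$
for some constants $c_0, c_1 \ge 0$. This is strictly weaker than assuming a uniform bound $\mathbb{E}\|g(x,\xi)\|^2 \le G^2$, since the second moment may grow quadratically with the iterate rather than stay bounded.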
Artificial intelligence transforms patient care and reduces burnout, physician says
DENVER – Artificial intelligence is quietly transforming how doctors interact with patients -- and it might already be in use during your next visit to the doctor's office. Thousands of physicians across the country are using a form of AI called ambient listening, surveys show. This technology listens to conversations between doctors and patients, creates real-time transcriptions, and then compiles detailed clinical notes -- all without disrupting the flow of the appointment. Dr. Daniel Kortsch, associate chief of artificial intelligence and digital health at Denver Health, said that ambient listening technology has made a big difference since his practice began using it in fall 2024.
Netflix tests out new AI search engine for movies and TV shows powered by OpenAI
Black Mirror may be able to draw inspiration for future episodes from the very platform it streams on. Netflix has recently rolled out access to a new AI search engine tool to some of its subscribers, according to a report from Bloomberg. The AI search engine, which is powered by ChatGPT creator OpenAI, takes Netflix's search capabilities beyond looking up movies and TV shows by title, genre, or actor. The tool allows users to search for content using numerous other search queries, such as mood. Because the feature is powered by OpenAI, it appears likely that users will be able to use natural language in their searches.
Is your phone secretly listening to you? Here's a simple way to find out
If you're a smartphone owner--and chances are that's everyone reading this--you've probably encountered an eerie, but all too common scenario: One day you're talking about a random topic while your phone is next to you, and the following day you notice ads popping up related to that same topic. How do these ads know what you were talking about? Your smartphone may be the culprit. Every smartphone keeps its built-in microphone constantly on so the virtual assistant can hear your voice commands. So, could it be that these devices are also secretly eavesdropping on your conversations in order to serve you ads? Here's everything you need to know, plus a simple test to find out.
Upgrade to Windows 11 Pro for less than a movie ticket
TL;DR: Upgrade to Microsoft Windows 11 Pro for just $14.97 (regularly $199) and enjoy enhanced security, productivity features, and the AI-powered Copilot assistant. In the ever-evolving world of technology, keeping your operating system current is essential for optimal performance and security. For a limited time, you can upgrade to Microsoft Windows 11 Pro for just $14.97, a significant reduction from its regular price of $199. Windows 11 Pro offers a sleek and user-friendly interface designed to enhance your computing experience. Features like Snap Layouts and Virtual Desktops allow for efficient multitasking and enable you to organize your workspace with ease.
Let AI help you tackle work tasks with this tool that combines the major models, now $80 for life
Curious about using AI to help with work tasks but aren't sure where to start? Let 1min.AI serve as your one-stop shop. This handy platform combines several popular AI models -- including ChatGPT, Gemini, and Midjourney -- into one app, letting you test out their unique features without hopping between services. Right now, a lifetime subscription to the 1min.AI Advanced Business Plan can be yours for just $79.97 (reg. Once it's done, just give it a human once-over, and it will be good to go.
The rise of end times fascism
The movement for corporate city states cannot believe its good luck. For years, it has been pushing the extreme notion that wealthy, tax-averse people should up and start their own high-tech fiefdoms, whether new countries on artificial islands in international waters ("seasteading") or pro-business "freedom cities" such as Próspera, a glorified gated community combined with a wild west med spa on a Honduran island. Yet despite backing from the heavy-hitter venture capitalists Peter Thiel and Marc Andreessen, their extreme libertarian dreams kept bogging down: it turns out most self-respecting rich people don't actually want to live on floating oil rigs, even if it means lower taxes, and while Próspera might be nice for a holiday and some body "upgrades", its extra-national status is currently being challenged in court. Now, all of a sudden, this once-fringe network of corporate secessionists finds itself knocking on open doors at the dead center of global power. The first sign that fortunes were shifting came in 2023, when a campaigning Donald Trump, seemingly out of nowhere, promised to hold a contest that would lead to the creation of 10 "freedom cities" on federal lands. The trial balloon barely registered at the time, lost in the daily deluge of outrageous claims. Since the new administration took office, however, would-be country starters have been on a lobbying blitz, determined to turn Trump's pledge into reality. "The energy in DC is absolutely electric," Trey Goff, the chief of staff of Próspera, recently enthused after a trip to Capitol Hill.