Towards Anytime-Valid Statistical Watermarking
Huang, Baihe, Xu, Eric, Ramchandran, Kannan, Jiao, Jiantao, Jordan, Michael I.
The proliferation of Large Language Models (LLMs) necessitates efficient mechanisms to distinguish machine-generated content from human text. While statistical watermarking has emerged as a promising solution, existing methods suffer from two critical limitations: the lack of a principled approach for selecting sampling distributions, and the reliance on fixed-horizon hypothesis testing, which precludes valid early stopping. In this paper, we address both limitations by developing the first e-value-based watermarking framework, Anchored E-Watermarking, which unifies optimal sampling with anytime-valid inference. Unlike traditional approaches, where optional stopping invalidates Type-I error guarantees, our framework enables valid anytime inference by constructing a test supermartingale for the detection process. By leveraging an anchor distribution to approximate the target model, we characterize the optimal e-value with respect to the worst-case log-growth rate and derive the optimal expected stopping time. Our theoretical claims are substantiated by simulations and evaluations on established benchmarks, showing that our framework significantly enhances sample efficiency, reducing the average token budget required for detection by 13-15% relative to state-of-the-art baselines.
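The abstract's central device, a test supermartingale built from e-values so that stopping at any time preserves the Type-I error guarantee (via Ville's inequality), can be sketched in a few lines. The per-token p-values, the power calibrator e(p) = κ·p^(κ−1), and the toy score distributions below are illustrative assumptions for the sketch, not the paper's actual construction:

```python
import math
import random

def e_calibrator(p, kappa=0.5):
    # Power calibrator e(p) = kappa * p^(kappa - 1): integrates to 1 on [0, 1],
    # so E[e(P)] = 1 when P is uniform, i.e. under the null of human text.
    return kappa * p ** (kappa - 1)

def detect(pvals, alpha=0.01, kappa=0.5):
    # Multiply e-values into a test supermartingale M_t. Ville's inequality
    # gives P(sup_t M_t >= 1/alpha) <= alpha under the null, so stopping the
    # first time M_t crosses 1/alpha is a valid anytime test.
    M = 1.0
    for t, p in enumerate(pvals, start=1):
        M *= e_calibrator(p, kappa)
        if M >= 1.0 / alpha:
            return t  # reject the null (declare watermarked) at token t
    return None  # never rejected within the observed tokens

random.seed(0)
human = [random.random() for _ in range(500)]        # uniform p-values under H0
marked = [random.random() ** 3 for _ in range(500)]  # skewed toward 0 under H1

print("human:", detect(human))
print("marked:", detect(marked))
```

With this calibrator, small p-values contribute e-values above one, so the product drifts upward on watermarked text and downward on human text; the same martingale structure is what the paper's framework optimizes via the worst-case log-growth rate.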
- Asia > Middle East > Jordan (0.41)
- North America > United States > California > Alameda County > Berkeley (0.04)
- North America > United States > Massachusetts > Middlesex County > Burlington (0.04)
- Europe > United Kingdom > Scotland > City of Edinburgh > Edinburgh (0.04)
- North America > United States > California > San Francisco County > San Francisco (0.14)
- North America > Canada (0.04)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.68)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.50)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning > Regression (0.40)
By the Markovian assumption for latent state vectors, the Hessian matrix is tri-block diagonal. To facilitate convergence, we initialize the Newton update with a smoothing estimate by local Gaussian approximation. The forward filtering for a dynamic Poisson model has been previously described (Eden et al., 2004), and we use an additional backward pass to smooth (Rauch et al., 1965). Without constraints, the sampling of h(j), g(j) and σ²(j) is the same as shown previously. The update of A(j), b(j) and Q(j) is the standard multivariate Bayesian linear regression.
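The filtering and smoothing steps referenced above (forward filtering for a dynamic Poisson model, Eden et al., 2004; backward smoothing, Rauch et al., 1965) can be illustrated with a minimal one-dimensional sketch. The state-space model x_t = a·x_{t−1} + w_t with y_t ~ Poisson(exp(x_t)), and all parameter values, are illustrative assumptions; the Gaussian (Laplace) update step stands in for the local Gaussian approximation mentioned in the text, not the authors' exact implementation:

```python
import math
import random

def poisson_filter_smoother(ys, a=0.95, q=0.05, m0=0.0, P0=1.0):
    """Forward Gaussian-approximation filter for y_t ~ Poisson(exp(x_t)),
    x_t = a*x_{t-1} + N(0, q), followed by a Rauch-Tung-Striebel
    backward smoothing pass. All 1-D for clarity."""
    ms, Ps, mpred, Ppred = [], [], [], []
    m, P = m0, P0
    for y in ys:
        mp = a * m                    # predicted mean
        Pp = a * a * P + q            # predicted variance
        lam = math.exp(mp)            # Poisson rate at the predicted mean
        P = 1.0 / (1.0 / Pp + lam)    # Laplace update: Fisher info = lam
        m = mp + P * (y - lam)        # approximate posterior mean
        ms.append(m); Ps.append(P); mpred.append(mp); Ppred.append(Pp)
    # Backward (RTS) smoothing pass
    ms_s, Ps_s = ms[:], Ps[:]
    for t in range(len(ys) - 2, -1, -1):
        G = Ps[t] * a / Ppred[t + 1]
        ms_s[t] = ms[t] + G * (ms_s[t + 1] - mpred[t + 1])
        Ps_s[t] = Ps[t] + G * G * (Ps_s[t + 1] - Ppred[t + 1])
    return ms_s, Ps_s

# Simulate a latent AR(1) trajectory with Poisson counts, then smooth it.
random.seed(1)
x, ys = 0.0, []
for _ in range(200):
    x = 0.95 * x + random.gauss(0.0, math.sqrt(0.05))
    # Poisson sampling by inversion (stdlib has no Poisson sampler)
    lam, u, k = math.exp(x), random.random(), 0
    p = math.exp(-lam)
    c = p
    while u > c:
        k += 1
        p *= lam / k
        c += p
    ys.append(k)

ms, Ps = poisson_filter_smoother(ys)
```

Smoothed variances are never larger than the filtered ones, reflecting the extra information the backward pass propagates from future observations.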
- Europe > France (0.04)
- Asia > Middle East > Jordan (0.04)
- Europe > Switzerland > Zürich > Zürich (0.13)
- Asia > Japan > Honshū > Kantō > Tokyo Metropolis Prefecture > Tokyo (0.04)
- Asia > China > Guangdong Province > Shenzhen (0.04)
- Asia > China > Heilongjiang Province > Harbin (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.14)
- North America > United States > Wisconsin > Dane County > Madison (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- (2 more...)
- North America > United States > Colorado (0.04)
- Asia > Middle East > Jordan (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- (2 more...)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.92)
- Asia > Afghanistan > Parwan Province > Charikar (0.04)
- Europe > Russia (0.04)
- Asia > Russia (0.04)
- Europe > France > Grand Est > Meurthe-et-Moselle > Nancy (0.04)
- Health & Medicine > Therapeutic Area (0.67)
- Health & Medicine > Pharmaceuticals & Biotechnology (0.46)
- Health & Medicine > Diagnostic Medicine (0.45)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Data Science (0.92)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.67)
- Information Technology > Artificial Intelligence > Vision (0.67)