IXAM: Interactive Explainability for Authorship Attribution Models
Milad Alshomary, Anisha Bhatnagar, Peter Zeng, Smaranda Muresan, Owen Rambow, Kathleen McKeown
We present IXAM, an Interactive eXplainability framework for Authorship Attribution Models. Given an authorship attribution (AA) task and an embedding-based AA model, our tool enables users to interactively explore the model's embedding space and construct an explanation of the model's prediction as a set of writing style features at different levels of granularity. Through a user evaluation, we demonstrate the value of our framework compared to predefined stylistic explanations.
How many measurements are enough? Bayesian recovery in inverse problems with general distributions
We study the sample complexity of Bayesian recovery for solving inverse problems with general prior, forward operator and noise distributions. We consider posterior sampling according to an approximate prior $\mathcal{P}$, and establish sufficient conditions for stable and accurate recovery with high probability. Our main result is a non-asymptotic bound that shows that the sample complexity depends on (i) the intrinsic complexity of $\mathcal{P}$, quantified by its so-called approximate covering number, and (ii) concentration bounds for the forward operator and noise distributions. As a key application, we specialize to generative priors, where $\mathcal{P}$ is the pushforward of a latent distribution via a Deep Neural Network (DNN). We show that the sample complexity scales log-linearly with the latent dimension $k$, thus establishing the efficacy of DNN-based priors. Generalizing existing results on deterministic (i.e., non-Bayesian) recovery for the important problem of random sampling with an orthogonal matrix $U$, we show how the sample complexity is determined by the coherence of $U$ with respect to the support of $\mathcal{P}$. Hence, we establish that coherence plays a fundamental role in Bayesian recovery as well. Overall, our framework unifies and extends prior work, providing rigorous guarantees for the sample complexity of solving Bayesian inverse problems with arbitrary distributions.
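The generative prior in the abstract is the pushforward of a low-dimensional latent distribution through a DNN: to sample from $\mathcal{P}$, draw $z \sim \mathcal{N}(0, I_k)$ and output $G(z)$. A minimal sketch of this construction, using a toy two-layer ReLU network with random (hypothetical) weights in place of a trained generator, and illustrative dimensions $k = 8$, $n = 256$:

```python
import numpy as np

rng = np.random.default_rng(0)

k, n = 8, 256  # latent and ambient dimensions (illustrative values only)

# Stand-in generator G: two ReLU layers with random weights, purely to
# illustrate the pushforward construction (a real G would be trained).
W1 = rng.standard_normal((64, k))
W2 = rng.standard_normal((n, 64))

def G(z):
    """Push a latent vector through the toy network."""
    return W2 @ np.maximum(W1 @ z, 0.0)

# Sampling from the pushforward prior P = G_# N(0, I_k):
z = rng.standard_normal(k)  # latent draw
x = G(z)                    # sample in the ambient space R^n

assert x.shape == (n,)
```

The point of the paper's log-linear scaling in $k$ is that, although samples $x$ live in $\mathbb{R}^n$, the prior's intrinsic complexity is governed by the latent dimension $k \ll n$.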