Comprehensive Description of Uncertainty in Measurement for Representation and Propagation with Scalable Precision

Darijani, Ali, Beyerer, Jürgen, Nasrollah, Zahra Sadat Hajseyed, Hoffmann, Luisa, Heizmann, Michael

arXiv.org Machine Learning

Probability theory has become the predominant framework for quantifying uncertainty across scientific and engineering disciplines, with a particular focus on measurement and control systems. However, the widespread reliance on simple Gaussian assumptions--particularly in control theory, manufacturing, and measurement systems--can result in incomplete representations and multistage lossy approximations of complex phenomena, including inaccurate propagation of uncertainty through multistage processes. This work proposes a comprehensive yet computationally tractable framework for representing and propagating quantitative attributes arising in measurement systems using Probability Density Functions (PDFs). Recognizing the constraints imposed by finite memory in software systems, we advocate for the use of Gaussian Mixture Models (GMMs), a principled extension of the familiar Gaussian framework, as they are universal approximators of PDFs whose complexity can be tuned to trade off approximation accuracy against memory and computation. From both mathematical and computational perspectives, GMMs enable high performance and, in many cases, closed-form solutions of essential operations in control and measurement. The paper presents practical applications within manufacturing and measurement contexts, especially the circular factory, demonstrating how the GMM framework supports accurate representation and propagation of measurement uncertainty and offers improved accuracy--compared to the traditional Gaussian framework--while keeping the computations tractable.
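One reason GMMs admit the closed-form operations the abstract mentions is that each mixture component transforms exactly under an affine measurement model. The sketch below is an illustration of that idea, not the paper's implementation; the representation (a list of weight/mean/covariance triples) and the function names are our own assumptions.

```python
import numpy as np

# A GMM represented as [(weight, mean, cov), ...] (illustrative convention,
# not from the paper). Under an affine map y = A x + b, each Gaussian
# component N(mu, Sigma) maps exactly to N(A mu + b, A Sigma A^T), so the
# whole mixture propagates in closed form with no approximation loss.

def propagate_affine(gmm, A, b):
    """Propagate every mixture component through y = A x + b."""
    return [(w, A @ mu + b, A @ cov @ A.T) for (w, mu, cov) in gmm]

def mixture_mean(gmm):
    """Mean of the mixture: weight-averaged component means."""
    return sum(w * mu for (w, mu, _) in gmm)

# Toy 1-D mixture with two components.
gmm = [(0.6, np.array([0.0]), np.array([[1.0]])),
       (0.4, np.array([3.0]), np.array([[0.5]]))]
A, b = np.array([[2.0]]), np.array([1.0])
out = propagate_affine(gmm, A, b)
```

Because the map is affine, the propagated mixture mean equals the affine image of the input mean (here 2 * 1.2 + 1 = 3.4), which makes the closed-form claim easy to check numerically.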


Expectation Propagation for t-Exponential Family Using q-Algebra

Neural Information Processing Systems

Exponential family distributions are highly useful in machine learning since their calculations can be performed efficiently through natural parameters. The exponential family has recently been extended to the t-exponential family, which contains Student-t distributions as family members and thus allows us to handle noisy data well. However, since the t-exponential family is defined by the deformed exponential, an efficient learning algorithm for the t-exponential family such as expectation propagation (EP) cannot be derived in the same way as for the ordinary exponential family. In this paper, we borrow the mathematical tools of q-algebra from statistical physics and show that the pseudo-additivity of distributions allows us to perform calculations on t-exponential family distributions through natural parameters. We then develop an expectation propagation (EP) algorithm for the t-exponential family, which provides a deterministic approximation to the posterior or predictive distribution with simple moment matching. We finally apply the proposed EP algorithm to the Bayes point machine and Student-t process classification, and demonstrate their performance numerically.
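The "simple moment matching" at the heart of EP can be illustrated in the ordinary (non-deformed) setting: replace an intractable factor by the Gaussian with the same first two moments. The sketch below is a toy example of that step, matching a Student-t density (which the t-exponential family contains) with a Gaussian; it is not the paper's q-algebra algorithm, and the function name is our own.

```python
# Illustrative moment matching: a Student-t with location mu, scale sigma,
# and degrees of freedom nu > 2 has mean mu and variance sigma^2 * nu / (nu - 2).
# EP-style moment matching picks the Gaussian with exactly these two moments.

def match_student_t(mu, sigma, nu):
    """Return (mean, variance) of the Gaussian moment-matched to a Student-t."""
    if nu <= 2:
        raise ValueError("Student-t variance is undefined for nu <= 2")
    return mu, sigma**2 * nu / (nu - 2)

mean, var = match_student_t(0.0, 1.0, 4.0)  # heavy tails inflate the variance
```

As nu grows, the matched variance shrinks toward sigma^2, recovering the Gaussian limit of the Student-t.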


You Can Approximate Pi by Dropping Needles on the Floor

WIRED

Who needs a supercomputer when you can calculate pi with a box of sewing needles? Happy Pi Day! March 14 is the date that otherwise rational people celebrate this irrational number, because 3/14 contains the first three digits of pi. And hey, pi deserves a day. By definition, it's the ratio of the circumference and diameter of a circle, but it shows up in all kinds of places that seem to have nothing to do with circles, from music to quantum mechanics. Pi is an infinitely long decimal number that never repeats.
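The needle-dropping trick is Buffon's needle: for a needle of length l dropped on lines spaced t apart (with l <= t), the crossing probability is 2l / (pi * t), so counting crossings lets you back out pi. A minimal Monte Carlo sketch (our own code, not from the article):

```python
import math
import random

def estimate_pi(n=200_000, needle=1.0, spacing=1.0):
    """Buffon's needle: estimate pi from the fraction of needles crossing a line.

    A drop is summarized by the distance y from the needle's center to the
    nearest line (uniform on [0, spacing/2]) and its angle theta to the lines
    (uniform on [0, pi/2]); it crosses when y <= (needle/2) * sin(theta).
    """
    hits = 0
    for _ in range(n):
        y = random.uniform(0.0, spacing / 2.0)
        theta = random.uniform(0.0, math.pi / 2.0)
        if y <= (needle / 2.0) * math.sin(theta):
            hits += 1
    # P(cross) = 2*needle / (pi*spacing), so pi ~ 2*needle*n / (spacing*hits)
    return 2.0 * needle * n / (spacing * hits)
```

With a few hundred thousand drops the estimate typically lands within a couple of hundredths of pi; convergence is slow (error shrinks like 1/sqrt(n)), which is why this is a party trick rather than how pi's digits are actually computed.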


Chemistry may not be the 'killer app' for quantum computers after all

New Scientist

Quantum chemistry calculations that could advance drug development or agriculture have recently emerged as a promising "killer application" of quantum computers, but a new analysis suggests this is unlikely to be the case. Progress in building quantum computers has greatly accelerated in recent years, but it remains an open question what uses are most likely to justify the ongoing investment in this technology. One popular contender is solving problems in quantum chemistry, such as calculating the energy levels of molecules relevant for biomedicine or industry. This requires accounting for the behavior of many quantum particles - electrons in the molecule - simultaneously, so it seems like a good match for computers made from many quantum parts. However, Xavier Waintal at CEA Grenoble in France and his colleagues have now shown that two leading quantum computing algorithms for this task may actually have, at best, limited use.





Appendix for Bayesian Active Causal Discovery with Multi-Fidelity Experiments

Neural Information Processing Systems

Next, we calculate the constraint part. The algorithm for the Licence method in the single-target intervention scenario is shown in Algorithm 1. The experimental baselines are detailed as follows. AIT [11] is an active learning method that utilizes the f-score to select intervention queries. REAL fidelity means the model always chooses the highest fidelity to conduct experiments.