Causal Discovery using Compression-Complexity Measures
The task of learning a causal model from observational data, or from a combination of observational and interventional data, is commonly referred to as causal discovery or causal structure learning [1]. Causal discovery between two variables from observational data alone, in the absence of time series or controlled interventions, is a challenging problem and necessitates additional assumptions [2]. The problem is ubiquitous across the sciences, but particularly acute in econometrics, meteorology, biology and medicine, where interventional approaches are difficult or often infeasible. Model-free, data-driven approaches to causal discovery, such as Granger Causality (GC) [3], Transfer Entropy (TE) [4] and Compression-Complexity Causality (CCC) [5], have developed significantly over the past decade or so. These methods have been applied across disciplines including neuroscience, climatology and econometrics, and rely on properties of time-series data. Both GC and TE impose assumptions that must be met for satisfactory inference, whereas CCC is assumption-free and robust to many artefacts and nuisance variables; all three, however, require careful parameter calibration and selection for optimal accuracy.

A separate class of model-free causal discovery methods does not assume a temporal structure in the data and is rooted in algorithmic information theory, chiefly the notion of Kolmogorov complexity. The Kolmogorov complexity of a finite binary string is the length of the shortest binary program that generates that string, and it reflects the computational resources needed to specify it.
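Kolmogorov complexity is uncomputable, so methods in this family approximate it with the output length of a practical compressor. The sketch below is a minimal illustration of that idea, not an implementation of any of the cited methods: the function names are ours, zlib is an arbitrary choice of compressor, and the chain-rule approximation of conditional complexity is a crude stand-in. It decides a direction between two byte strings by preferring the factorization with the shorter total description length.

```python
import zlib

def C(s: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: the length of the
    zlib-compressed string (a real compressor stands in for the
    uncomputable shortest program)."""
    return len(zlib.compress(s, 9))

def cond_C(s: bytes, given: bytes) -> int:
    """Approximate conditional complexity C(s | given) via the chain
    rule, C(given + s) - C(given)."""
    return C(given + s) - C(given)

def infer_direction(x: bytes, y: bytes) -> str:
    """Algorithmic-independence heuristic: infer the direction whose
    factorization has the shorter total description length."""
    x_to_y = C(x) + cond_C(y, given=x)  # describe x, then y given x
    y_to_x = C(y) + cond_C(x, given=y)  # describe y, then x given y
    if x_to_y < y_to_x:
        return "x -> y"
    if y_to_x < x_to_y:
        return "y -> x"
    return "undecided"

# Toy example: y is a deterministic function of x, so describing y
# given x ought to be cheap. With so crude a proxy, the verdict is
# illustrative only and depends on the compressor.
x = bytes(i % 251 for i in range(2000))
y = bytes((3 * b + 7) % 251 for b in x)
print(infer_direction(x, y))
```

On real data, this heuristic is only as good as the compressor's ability to expose the functional relationship between the variables; CCC [5], for instance, is built on purpose-designed compression-complexity measures rather than general-purpose compressors.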