Max Little
Causal Bootstrapping (2020)
Status: Available Now

To draw scientifically meaningful conclusions and reliable statistical inferences about quantitative phenomena, signal processing must take cause and effect into consideration (either implicitly or explicitly). This is particularly challenging when the relevant measurements are not obtained from controlled experimental (interventional) settings, so that cause and effect can be obscured by spurious, indirect influences. Modern predictive techniques from machine learning are capable of capturing high-dimensional, complex, nonlinear relationships between variables while relying on few parametric or probabilistic modelling assumptions. However, since these techniques are associational, they are prone to picking up spurious influences when applied to non-experimental (observational) data, making their predictions unreliable. Techniques from causal inference, such as probabilistic causal diagrams and do-calculus, provide powerful (nonparametric) tools for drawing causal inferences from such observational data. However, these techniques are often incompatible with modern, nonparametric machine learning algorithms, since they typically require explicit probabilistic models.

In this talk I'll describe causal bootstrapping, a new set of techniques we have developed for augmenting classical nonparametric bootstrap resampling with information about the causal relationships between variables. This makes it possible to resample observational data such that, if an interventional relationship is identifiable from that data, new data representing that relationship can be simulated from the original observational data. In this way, we can use modern statistical machine learning and signal processing algorithms, unaltered, to make statistically powerful yet causally robust inferences.
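The abstract does not give implementation details, so the following is only a minimal illustrative sketch of the general idea in Python, assuming a simple back-door setting with a single discrete confounder: observational samples are reweighted by p(z)/p(z|x) and resampled with replacement within each value of the cause, so the resample approximates draws from the interventional distribution p(y, z | do(x)). The toy data-generating process, variable names, and inverse-probability weighting scheme here are assumptions for illustration, not the exact construction presented in the talk.

```python
# Illustrative sketch of back-door-style causal bootstrapping (not the talk's exact method).
# Assumed toy causal graph: Z -> X, Z -> Y, X -> Y, with discrete Z (confounder) and X (cause).
# Observational samples are reweighted by p(z) / p(z | x) and resampled with replacement,
# approximating draws from p(y, z | do(x)).
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate confounded observational data (toy example) ---
n = 10_000
z = rng.integers(0, 2, size=n)                      # confounder
x = rng.binomial(1, 0.2 + 0.6 * z)                  # "cause", influenced by z
y = 1.0 * x + 2.0 * z + rng.normal(0, 0.5, size=n)  # effect of both x and z

# --- Estimate the weights p(z) / p(z | x) from the observational sample ---
p_z = np.bincount(z) / n
counts_xz = np.zeros((2, 2))
np.add.at(counts_xz, (x, z), 1)
p_z_given_x = counts_xz / counts_xz.sum(axis=1, keepdims=True)
weights = p_z[z] / p_z_given_x[x, z]

# --- Causal-bootstrap resample: within each x stratum, draw with these weights ---
def causal_bootstrap(x, y, z, weights, rng):
    xs, ys, zs = [], [], []
    for x_val in np.unique(x):
        idx = np.flatnonzero(x == x_val)
        p = weights[idx] / weights[idx].sum()
        pick = rng.choice(idx, size=idx.size, replace=True, p=p)
        xs.append(x[pick]); ys.append(y[pick]); zs.append(z[pick])
    return np.concatenate(xs), np.concatenate(ys), np.concatenate(zs)

xb, yb, zb = causal_bootstrap(x, y, z, weights, rng)

# The naive observational contrast is confounded by z; the resampled contrast
# is close to the true interventional effect of x on y (1.0 in this toy model).
print("observational E[y|x=1] - E[y|x=0]:", y[x == 1].mean() - y[x == 0].mean())
print("causal-bootstrap estimate        :", yb[xb == 1].mean() - yb[xb == 0].mean())
```

The resampled data (xb, yb, zb) can then be fed to any standard machine learning or signal processing algorithm unaltered, which is the point of the approach: the causal adjustment happens in the resampling step, not inside the learning algorithm.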