Identifying Confounding from Causal Mechanism Shifts

Abstract. Causal discovery methods commonly assume that all data is independently and identically distributed (i.i.d.) and that there are no unmeasured confounding variables. In practice, neither is likely to hold, and detecting confounding in non-i.i.d. settings poses a significant challenge. Motivated by this, we explore how to discover confounders from data in multiple environments with causal mechanism shifts. We show that the mechanism changes of observed variables can reveal which variable sets are confounded. Based on this idea, we propose an empirically testable criterion based on mutual information, show under which conditions it can identify confounding, and introduce Coco to discover confounders from data in multiple contexts. Our experiments confirm that Coco works well on synthetic and real-world data.
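The core idea, that dependent mechanism shifts across contexts hint at a shared confounder, can be illustrated with a small sketch. This is not the Coco implementation; the function name, the plug-in mutual-information estimator, and the toy per-context shift indicators below are all illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in nats for two discrete arrays."""
    _, xi = np.unique(np.asarray(x), return_inverse=True)
    _, yi = np.unique(np.asarray(y), return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1.0)        # contingency table
    joint /= joint.sum()                   # joint distribution
    px = joint.sum(axis=1, keepdims=True)  # marginal of x
    py = joint.sum(axis=0, keepdims=True)  # marginal of y
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

# Toy example (illustrative, not from the paper): binary indicators of
# whether each variable's causal mechanism shifted in a given context.
shifts_a = [0, 0, 1, 1, 0, 1, 0, 1]  # variable A
shifts_b = [0, 0, 1, 1, 0, 1, 0, 1]  # variable B: shifts together with A
shifts_c = [0, 1, 0, 1, 0, 1, 0, 1]  # variable C: shifts mostly independently

print(mutual_information(shifts_a, shifts_b))  # high: dependent shifts
print(mutual_information(shifts_a, shifts_c))  # low: near-independent shifts
```

High mutual information between the shift patterns of two variables is evidence that their mechanisms change together, as one would expect under a shared latent confounder; near-zero mutual information is consistent with independent mechanism changes.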

Implementation

Python source code of Coco (March 2024) by Sarah Mameche.
Git repository of Coco, maintained by Sarah Mameche.

Related Publications

Mameche, S., Vreeken, J. & Kaltenpoth, D. Identifying Confounding from Causal Mechanism Shifts. In: Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR, 2024. (27.6% acceptance rate)