The Science of Cause and Effect: From Deep Learning to Deep Understanding - 2022

Details

Title : The Science of Cause and Effect: From Deep Learning to Deep Understanding
Author(s): Simons Institute
Link(s) : https://www.youtube.com/watch?v=ze5-AqY_k6A

Rough Notes

  • Pearl praises George Boole for providing a formalization of a logic that had been around for millennia, and Sewall Wright for his contributions to path diagrams.
  • Structural Causal Models (SCMs) act as an oracle about the world; Pearl describes an SCM as a society of listening variables, where each variable listens to its parents and decides its own value.
  • The fundamental laws of causal inference are:
    • Law 1, Law of counterfactuals and interventions: \(Y_x(u)=Y_{M_x}(u)\), where \(M\) is an SCM, \(x\) is a value of variable \(X\), and \(Y_{M_x}(u)\) is the value \(Y\) would have taken had \(X\) been \(x\), for an individual with exogenous setting \(u\). \(M_x\) is the SCM obtained from \(M\) by removing the equation for \(X\) and replacing it with the constant \(x\) (see the first sketch after these notes).
    • Law 2, Law of conditional independence (d-separation): \(\mathbb{I}(X,Y|Z)_{\mathcal{G}(M)} \implies \mathbb{I}(X,Y|Z)_{\mathbb{P}}\). That is, d-separation in the graph implies conditional independence in the distribution. This law follows from the first law, but the derivation is not trivial (see the second sketch below).
  • Counterfactuals and interventions are all about solving equations.
  • Interventions are a special kind of counterfactual: \(\mathbb{P}(Y=y|do(X=x))=\mathbb{P}(Y_x=y)\). The "do" operator is needed to distinguish what a randomized experiment estimates (intervening, "doing") from what we can infer by passively conditioning on observations ("seeing"); see the last sketch below.
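
A minimal sketch of Law 1, assuming a toy SCM of my own (the equations and variable names here are illustrative, not from the talk): a counterfactual \(Y_x(u)\) is computed by mutilating the model (replacing the equation for \(X\) with the constant \(x\)) and re-solving the equations for the same exogenous setting \(u\).

```python
# Law 1 sketch: Y_x(u) = Y_{M_x}(u). Hypothetical toy model, not from the talk.

def solve(scm, u):
    """Evaluate the structural equations in order for one unit u."""
    values = {"U": u}
    for var, eq in scm.items():
        values[var] = eq(values)
    return values

# Original model M: X listens to U, Y listens to X and U.
M = {
    "X": lambda v: 1 if v["U"] > 0.5 else 0,
    "Y": lambda v: 2 * v["X"] + v["U"],
}

def do(scm, var, value):
    """Return the mutilated model M_x: var's equation replaced by a constant."""
    M_x = dict(scm)
    M_x[var] = lambda v: value
    return M_x

u = 0.3                                            # one individual (exogenous setting)
y_factual = solve(M, u)["Y"]                       # Y(u): what actually happened
y_counterfactual = solve(do(M, "X", 1), u)["Y"]    # Y_{X=1}(u) = Y_{M_{X=1}}(u)
print(y_factual, y_counterfactual)                 # 0.3 vs 2.3
```

This also illustrates the "all about solving equations" point: both quantities come from solving the same equations, once in \(M\) and once in \(M_x\).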

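A second sketch, for Law 2, on an assumed chain \(X \to Z \to Y\) with linear-Gaussian equations (again illustrative, not from the talk): \(X\) and \(Y\) are d-separated by \(Z\) in the graph, so any distribution generated by a compatible SCM must satisfy \(X \perp\!\!\!\perp Y \mid Z\).

```python
# Law 2 sketch: d-separation in the graph implies conditional independence
# in the generated distribution. Hypothetical chain X -> Z -> Y.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
z = 2 * x + rng.normal(size=n)       # Z listens to X
y = -3 * z + rng.normal(size=n)      # Y listens to Z only

def partial_corr(a, b, c):
    """Correlation of a and b after linearly regressing out c."""
    ra = a - np.polyval(np.polyfit(c, a, 1), c)
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return np.corrcoef(ra, rb)[0, 1]

print(np.corrcoef(x, y)[0, 1])   # strongly correlated marginally
print(partial_corr(x, y, z))     # ~0: X is independent of Y given Z
```
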
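Last sketch, for the do-operator, assuming a toy confounded model (illustrative, not from the talk): conditioning on \(X=1\) ("seeing") mixes in the effect of the confounder, while intervening \(do(X=1)\), i.e. what a randomized experiment estimates, does not.

```python
# do-operator sketch: P(Y | X=1) vs P(Y | do(X=1)). Hypothetical confounded model.
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
u = rng.normal(size=n)                                 # unobserved confounder
x_obs = (u + rng.normal(size=n) > 0).astype(float)     # X listens to U
y_obs = 1.0 * x_obs + 2.0 * u + rng.normal(size=n)     # Y listens to X and U

# Observational ("seeing"): E[Y | X = 1], biased upward by the confounder U.
print(y_obs[x_obs == 1].mean())                        # ~2.1

# Interventional ("doing"): E[Y | do(X = 1)], mutilate the model and set X = 1.
x_do = np.ones(n)
y_do = 1.0 * x_do + 2.0 * u + rng.normal(size=n)
print(y_do.mean())                                     # ~1.0, the true causal effect
```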