
Towards Causal Representation Learning


31 August 2021


Article Source


Towards Causal Representation Learning

  • In this guest talk for causalcourse.com, Yoshua Bengio discusses causal representation learning.

Understanding and generalizing beyond the training distribution are regarded as major challenges in modern machine learning (ML), and Yoshua Bengio argues it is time to look to causal learning for possible solutions. In the paper Towards Causal Representation Learning, Turing Award laureate Bengio and his research team work to unite causality and ML research approaches, delineate some implications of causality for ML, and propose critical areas for future research.

Abstract

The two fields of machine learning and graphical causality arose and developed separately. However, there is now cross-pollination and increasing interest in both fields to benefit from the advances of the other. In the present paper, we review fundamental concepts of causal inference and relate them to crucial open problems of machine learning, including transfer and generalization, thereby assaying how causality can contribute to modern machine learning research. This also applies in the opposite direction: we note that most work in causality starts from the premise that the causal variables are given. A central problem for AI and causality is, thus, causal representation learning, the discovery of high-level causal variables from low-level observations. Finally, we delineate some implications of causality for machine learning and propose key research areas at the intersection of both communities.
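To make the phrase "discovery of high-level causal variables from low-level observations" concrete, below is a minimal toy sketch (not from the paper) of the problem setup: two high-level causal variables generated by a simple structural causal model are mixed into low-level observations, and a generic projection (PCA, used here only as a stand-in for a learned encoder) tries to recover a high-level representation. The variable names, the linear "renderer" A, and the use of PCA are all illustrative assumptions; real causal representation learning methods must also identify the causal relations among the recovered variables, which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy structural causal model (SCM) over high-level variables ---
# z1 is a root cause; z2 is causally influenced by z1 plus independent noise.
n = 1000
z1 = rng.normal(size=n)
z2 = 0.8 * z1 + 0.6 * rng.normal(size=n)
Z = np.stack([z1, z2], axis=1)          # high-level causal variables (unobserved)

# --- Low-level observations: an unknown mixing of the causal variables ---
# In practice this might be pixels; here it is a random linear "renderer" A (assumed).
A = rng.normal(size=(2, 10))
X = Z @ A + 0.05 * rng.normal(size=(n, 10))   # observed low-level data

# --- Recovering a high-level representation (illustration only) ---
# Project the observations onto their top-2 principal components as a
# stand-in for a learned encoder; this recovers a subspace spanning the
# causal factors but says nothing about the causal graph between them.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z_hat = Xc @ Vt[:2].T                   # candidate high-level representation

# Check how well the recovered components correlate with the true factors.
corr = np.corrcoef(np.hstack([Z, Z_hat]).T)[:2, 2:]
print("correlation between true factors and recovered components:\n", corr)
```

In this toy setting the recovered components correlate strongly with the true factors because the mixing is linear and low-dimensional; the hard part the abstract points to is doing this from realistic observations while also learning which recovered variable causes which.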
