
Quantifying and Understanding Memorization in Deep Neural Networks


25 July 2023


Article Source


Quantifying and Understanding Memorization in Deep Neural Networks

  • Chiyuan Zhang, Google

Abstract

Deep learning algorithms are well known for their propensity to fit the training data very well and to memorize idiosyncratic properties of the training examples. From a scientific perspective, understanding memorization in deep neural networks sheds light on how those models generalize. From a practical perspective, understanding memorization is crucial for addressing privacy and security issues that arise when deploying models in real-world applications. In this talk, we present a series of studies centered on quantifying memorization in neural language models. We explain why, in many real-world tasks, memorization is necessary for optimal generalization. We also present quantitative studies of memorization, forgetting, and unlearning in both vision and language models, to better understand the behavior and implications of memorization in these models.
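The idea of quantifying memorization can be made concrete with the held-in versus held-out accuracy gap used in the related Feldman–Zhang line of work: an example is memorized to the extent that models trained with it predict its label far more often than models trained without it. The sketch below is a minimal, self-contained illustration of that estimator on a toy dataset; the synthetic data, the 1-nearest-neighbour "model", and all function names are assumptions made for illustration, not the models or experiments discussed in the talk.

```python
# Minimal sketch (not the speaker's code) of a memorization score:
#   mem(i) = Pr[f_S(x_i) = y_i | i in S] - Pr[f_S(x_i) = y_i | i not in S],
# estimated by Monte Carlo over random training subsets S.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: two Gaussian blobs, with the first five points deliberately
# mislabeled so they can only be fit by memorization.
n, d = 200, 2
X = np.vstack([rng.normal(-2.0, 1.0, (n // 2, d)),
               rng.normal(2.0, 1.0, (n // 2, d))])
y = np.array([0] * (n // 2) + [1] * (n // 2))
y[:5] = 1  # atypical, mislabeled examples

def fit_predict(X_tr, y_tr, X_te):
    """1-nearest-neighbour 'model': copy the label of the closest training point."""
    dists = ((X_te[:, None, :] - X_tr[None, :, :]) ** 2).sum(axis=-1)
    return y_tr[dists.argmin(axis=1)]

def memorization_scores(X, y, n_trials=200, subset_frac=0.7):
    """Monte Carlo estimate of mem(i), averaging over random training subsets S."""
    n = len(y)
    correct_in, count_in = np.zeros(n), np.zeros(n)
    correct_out, count_out = np.zeros(n), np.zeros(n)
    for _ in range(n_trials):
        in_S = rng.random(n) < subset_frac        # membership of each example in S
        preds = fit_predict(X[in_S], y[in_S], X)  # train on S, evaluate on every example
        hit = (preds == y).astype(float)
        correct_in += hit * in_S
        count_in += in_S
        correct_out += hit * (~in_S)
        count_out += ~in_S
    return correct_in / np.maximum(count_in, 1) - correct_out / np.maximum(count_out, 1)

scores = memorization_scores(X, y)
print("mislabeled examples:", scores[:5].round(2))   # high: fit only when included in S
print("typical examples:   ", scores[5:10].round(2)) # near 0: predicted correctly either way
```

Under this toy setup, the deliberately mislabeled points score close to 1 (they are classified correctly only when they themselves appear in the training subset), while typical points score near 0, which is the kind of separation between memorized and well-generalized examples that the quantitative studies in the talk examine at scale.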

