
Next-Generation Recurrent Network Models


27 June 2021


Article Source


Next-generation recurrent network models for cognitive neuroscience

  • Dr. Guangyu Robert Yang, Dept. of Brain and Cognitive Sciences (BCS) and EECS Dept., Schwarzman College of Computing (SCC), MIT

Abstract

Recurrent Neural Networks (RNNs) trained with machine learning techniques on cognitive tasks have become a widely accepted tool for neuroscientists. Compared with traditional computational models in neuroscience, RNNs offer substantial advantages in explaining complex behavior and neural activity patterns, and their use allows rapid generation of mechanistic hypotheses for cognitive computations. RNNs further provide a natural way to flexibly combine bottom-up biological knowledge with top-down computational goals in network models. However, early work with this approach faces fundamental challenges. In this talk, I will discuss some of these challenges, and several recent steps that we took to partly address them and to build next-generation RNN models for cognitive neuroscience.
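
For readers unfamiliar with this recipe, below is a minimal sketch (not taken from the talk) of what "training an RNN with machine learning techniques on a cognitive task" typically looks like in practice: a vanilla RNN, a toy noisy-evidence integration task loosely modeled on perceptual decision-making, and a standard gradient-based training loop. The task design, network size, and hyperparameters are illustrative assumptions, not the speaker's actual setup.

```python
# Minimal sketch: train a vanilla RNN on a toy cognitive task (assumed setup).
import torch
import torch.nn as nn

class CognitiveRNN(nn.Module):
    def __init__(self, n_input=1, n_hidden=64, n_output=2):
        super().__init__()
        self.rnn = nn.RNN(n_input, n_hidden, nonlinearity="tanh", batch_first=True)
        self.readout = nn.Linear(n_hidden, n_output)

    def forward(self, x):
        h, _ = self.rnn(x)             # h: (batch, time, n_hidden)
        return self.readout(h[:, -1])  # decision read out at the final time step

def make_trial(batch=64, steps=50, coherence=0.2):
    """Noisy-evidence integration: report the sign of the mean input."""
    sign = torch.randint(0, 2, (batch, 1, 1)) * 2 - 1       # -1 or +1 per trial
    x = coherence * sign + torch.randn(batch, steps, 1)     # weak signal + noise
    y = (sign.squeeze() > 0).long()                         # target class 0 or 1
    return x, y

model = CognitiveRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(1000):
    x, y = make_trial()
    loss = loss_fn(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, the hidden states `h` can be analyzed much like recorded neural activity, which is the sense in which such networks serve as mechanistic hypotheses for cognitive computations.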

