Paper Discussion: The Importance of Being Recurrent for Modeling Hierarchical Structure

Date:

You may, in fact, need more than attention. This paper compares recurrent and non-recurrent (i.e., transformer) neural network architectures on their ability to model hierarchical structure in natural language. The authors found that on both subject-verb agreement and logical inference tasks, RNNs outperformed transformers. While there is limited theoretical support for these findings, the empirical results are compelling.
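To make the subject-verb agreement task concrete, here is a minimal sketch (not the paper's actual code) of how it is typically framed: given the words of a sentence up to a verb, classify whether that verb should be singular or plural. The toy vocabulary, model sizes, and label convention below are all hypothetical.

```python
import torch
import torch.nn as nn

class AgreementLSTM(nn.Module):
    """Toy recurrent classifier for verb-number prediction."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 2)  # 0 = singular, 1 = plural (convention here)

    def forward(self, token_ids):
        embedded = self.embed(token_ids)
        _, (final_hidden, _) = self.lstm(embedded)
        # Classify from the final hidden state, i.e. the summary of the prefix.
        return self.out(final_hidden.squeeze(0))

# Toy usage: "the keys to the cabinet ..." -> a plural verb ("are") should follow,
# even though the nearest noun ("cabinet") is singular -- the hierarchical case.
vocab = {"the": 0, "keys": 1, "to": 2, "cabinet": 3}
prefix = torch.tensor([[vocab[w] for w in "the keys to the cabinet".split()]])
model = AgreementLSTM(vocab_size=len(vocab))
logits = model(prefix)  # shape (1, 2)
```

Examples like the one above, with a distracting noun between subject and verb, are exactly where the hierarchical (rather than linear) structure of the sentence matters.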