Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks

Yikang Shen; Shawn Tan; Alessandro Sordoni; Aaron Courville

Abstract

Natural language is hierarchically structured: smaller units (e.g., phrases) are nested within larger units (e.g., clauses). When a larger constituent ends, all of the smaller constituents that are nested within it must also be closed. While the standard LSTM architecture allows different neurons to track information at different time scales, it does not have an explicit bias towards modeling a hierarchy of constituents. This paper proposes to add such an inductive bias by ordering the neurons; a vector of master input and forget gates ensures that when a given neuron is updated, all the neurons that follow it in the ordering are also updated. Our novel recurrent architecture, ordered neurons LSTM (ON-LSTM), achieves good performance on four different tasks: language modeling, unsupervised parsing, targeted syntactic evaluation, and logical inference.
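
The master gates described in the abstract can be realized with a cumulative softmax ("cumax") activation, which produces a monotonically non-decreasing vector in [0, 1]. Below is a minimal PyTorch sketch of a single ON-LSTM cell step based on the gating equations in the paper; the class and variable names are illustrative, and the paper's optimization of computing master gates at a coarser chunk granularity is omitted for brevity.

```python
import torch
import torch.nn as nn


def cumax(x, dim=-1):
    # Cumulative softmax: cumsum(softmax(x)) yields a monotonically
    # non-decreasing vector in [0, 1], used for the master gates.
    return torch.cumsum(torch.softmax(x, dim=dim), dim=dim)


class ONLSTMCell(nn.Module):
    # Sketch of a single-step ON-LSTM cell (illustrative, not the
    # authors' reference implementation).
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # One linear map producing all six gates: four standard LSTM
        # gates plus the two master gates.
        self.linear = nn.Linear(input_size + hidden_size, 6 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x, state):
        h_prev, c_prev = state
        gates = self.linear(torch.cat([x, h_prev], dim=-1))
        f, i, o, g, mf, mi = gates.chunk(6, dim=-1)

        f = torch.sigmoid(f)      # standard forget gate
        i = torch.sigmoid(i)      # standard input gate
        o = torch.sigmoid(o)      # output gate
        g = torch.tanh(g)         # candidate cell update

        master_f = cumax(mf)      # master forget gate, rises 0 -> 1
        master_i = 1 - cumax(mi)  # master input gate, falls 1 -> 0

        # The standard gates only act inside the overlap of the two
        # master gates; outside it, neurons are either fully copied
        # from c_prev or fully overwritten by the new content g.
        omega = master_f * master_i
        f_hat = f * omega + (master_f - omega)
        i_hat = i * omega + (master_i - omega)

        c = f_hat * c_prev + i_hat * g
        h = o * torch.tanh(c)
        return h, (h, c)
```

In this sketch, the monotone structure of cumax means that whenever a low-index neuron is preserved across a step, all higher-index neurons are preserved as well; high-index neurons therefore tend to carry long-lived, high-level constituents while low-index neurons are overwritten frequently, which is the ordering bias the paper introduces.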

Benchmarks

Benchmark: Constituency Grammar Induction on PTB
Methodology: ON-LSTM
Metrics:
  Max F1 (WSJ): 49.4
  Max F1 (WSJ10): 66.8
  Mean F1 (WSJ): 47.7
  Mean F1 (WSJ10): 65.1

Benchmark: Constituency Grammar Induction on PTB
Methodology: ON-LSTM (tuned)
Metrics:
  Max F1 (WSJ): 50.0
  Mean F1 (WSJ): 48.1
