Unsupervised Recurrent Neural Network Grammars

Yoon Kim, Alexander M. Rush, Lei Yu, Adhiguna Kuncoro, Chris Dyer, Gábor Melis

Abstract

Recurrent neural network grammars (RNNG) are generative models of language which jointly model syntax and surface structure by incrementally generating a syntax tree and sentence in a top-down, left-to-right order. Supervised RNNGs achieve strong language modeling and parsing performance, but require an annotated corpus of parse trees. In this work, we experiment with unsupervised learning of RNNGs. Since directly marginalizing over the space of latent trees is intractable, we instead apply amortized variational inference. To maximize the evidence lower bound, we develop an inference network parameterized as a neural CRF constituency parser. On language modeling, unsupervised RNNGs perform as well as their supervised counterparts on benchmarks in English and Chinese. On constituency grammar induction, they are competitive with recent neural language models that induce tree structures from words through attention mechanisms.
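The training objective described in the abstract is the evidence lower bound (ELBO), which couples the RNNG generative model p(x, z) with the amortized inference network q(z | x) over latent trees z. The following is a minimal, illustrative PyTorch sketch of a Monte Carlo ELBO estimate; `generative_model.log_joint`, `inference_network.sample`, and `inference_network.log_prob` are hypothetical interfaces assumed here for illustration, not the API of the official harvardnlp/urnng code.

```python
import torch

def elbo_estimate(x, generative_model, inference_network, num_samples=8):
    """Monte Carlo estimate of the evidence lower bound (ELBO):

        ELBO(x) = E_{z ~ q(z|x)} [ log p(x, z) - log q(z | x) ]

    where z is a latent parse tree, p(x, z) is the RNNG's joint
    probability of the sentence and its tree, and q(z | x) is the
    amortized inference network (a neural CRF parser in the paper).
    """
    terms = []
    for _ in range(num_samples):
        z = inference_network.sample(x)               # draw a tree from q(z | x)
        log_joint = generative_model.log_joint(x, z)  # log p(x, z) under the RNNG
        log_q = inference_network.log_prob(z, x)      # log q(z | x)
        terms.append(log_joint - log_q)
    return torch.stack(terms).mean()
```

Because the latent trees are discrete, gradients for the inference-network parameters cannot flow through `sample`; the paper handles this with score-function-style gradient estimators, and the CRF parameterization keeps the required quantities over the space of trees tractable to compute.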

Code Repositories

harvardnlp/urnng (official PyTorch implementation)

Benchmarks

Benchmark: constituency-grammar-induction-on-ptb
Methodology: URNNG
Metric: Max F1 (WSJ): 52.4
