HyperAI

Compound Probabilistic Context-Free Grammars for Grammar Induction

Yoon Kim; Chris Dyer; Alexander M. Rush

Abstract

We study a formalization of the grammar induction problem that models sentences as being generated by a compound probabilistic context-free grammar. In contrast to traditional formulations which learn a single stochastic grammar, our grammar's rule probabilities are modulated by a per-sentence continuous latent variable, which induces marginal dependencies beyond the traditional context-free assumptions. Inference in this grammar is performed by collapsed variational inference, in which an amortized variational posterior is placed on the continuous variable, and the latent trees are marginalized out with dynamic programming. Experiments on English and Chinese show the effectiveness of our approach compared to recent state-of-the-art methods when evaluated on unsupervised parsing.
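The abstract mentions that latent parse trees are marginalized out with dynamic programming. The standard way to do this for a PCFG in Chomsky normal form is the inside algorithm. The sketch below is a minimal, illustrative implementation over a toy hand-written grammar (the grammar, rule probabilities, and sentence are made up for the example and are not the paper's learned model, where rule probabilities would instead be produced by a neural network conditioned on the per-sentence latent variable):

```python
# Minimal inside algorithm for a PCFG in Chomsky normal form.
# Toy grammar for illustration only -- not the paper's learned grammar.
binary_rules = {  # (A, B, C): prob of rule A -> B C
    ("S", "NP", "VP"): 1.0,
    ("NP", "D", "N"): 1.0,
    ("VP", "V", "NP"): 1.0,
}
lexical_rules = {  # (A, w): prob of rule A -> w
    ("D", "the"): 1.0,
    ("N", "dog"): 0.5,
    ("N", "cat"): 0.5,
    ("V", "saw"): 1.0,
}

def inside(sentence):
    """Sum over all parse trees of the sentence (CKY-style dynamic program)."""
    n = len(sentence)
    # chart[i][j] maps nonterminal A -> total probability of A spanning words i..j
    chart = [[{} for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(sentence):
        for (A, word), p in lexical_rules.items():
            if word == w:
                chart[i][i][A] = chart[i][i].get(A, 0.0) + p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):  # split: B covers i..k, C covers k+1..j
                for (A, B, C), p in binary_rules.items():
                    pb = chart[i][k].get(B)
                    pc = chart[k + 1][j].get(C)
                    if pb and pc:
                        chart[i][j][A] = chart[i][j].get(A, 0.0) + p * pb * pc
    return chart[0][n - 1].get("S", 0.0)  # marginal likelihood p(sentence)

print(inside(["the", "dog", "saw", "the", "cat"]))  # → 0.25
```

In the compound PCFG this marginal is computed for each sample of the continuous latent variable, so the inside pass stays tractable even though trees are never enumerated explicitly.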

Code Repositories

harvardnlp/compound-pcfg (official implementation, PyTorch)

Benchmarks

Constituency Grammar Induction on PTB

Method         | Max F1 (WSJ) | Max F1 (WSJ10) | Mean F1 (WSJ)
Compound PCFG  | 60.1         | 68.8           | 55.2
Neural PCFG    | 52.6         | n/a            | 50.8
