HyperAI



Ensemble Distillation for Unsupervised Constituency Parsing

Behzad Shayegh; Yanshuai Cao; Xiaodan Zhu; Jackie C.K. Cheung; Lili Mou


Abstract

We investigate the unsupervised constituency parsing task, which organizes the words and phrases of a sentence into a hierarchical structure without using linguistically annotated data. We observe that existing unsupervised parsers capture differing aspects of parse structures, which can be leveraged to enhance unsupervised parsing performance. To this end, we propose a notion of "tree averaging," based on which we further propose a novel ensemble method for unsupervised parsing. To improve inference efficiency, we further distill the ensemble knowledge into a student model; such an ensemble-then-distill process is an effective way to mitigate the over-smoothing problem found in common multi-teacher distillation methods. Experiments show that our method surpasses all previous approaches, consistently demonstrating its effectiveness and robustness across runs, with different ensemble components, and under domain-shift conditions.
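The "tree averaging" ensemble can be illustrated with a small sketch. The benchmark table below mentions a "Selective MBR" variant; the snippet assumes (as one natural reading, not a claim about the authors' exact implementation) that each teacher's parse is a set of constituent spans, and the ensemble selects the teacher tree with the highest average bracketing F1 against all teacher trees. Function names here are illustrative.

```python
def f1(tree_a, tree_b):
    """Bracketing F1 between two parses, each given as a set of (start, end) spans."""
    overlap = len(tree_a & tree_b)
    if overlap == 0:
        return 0.0
    precision = overlap / len(tree_a)
    recall = overlap / len(tree_b)
    return 2 * precision * recall / (precision + recall)

def selective_mbr(teacher_trees):
    """Selective MBR sketch: among the teachers' own trees, pick the one with
    the highest total F1 against all teacher trees ("tree averaging" restricted
    to candidates proposed by the teachers)."""
    return max(teacher_trees, key=lambda t: sum(f1(t, u) for u in teacher_trees))
```

For a four-word sentence, given a left-branching teacher `{(0,1), (0,2), (0,3)}`, a right-branching teacher `{(2,3), (1,3), (0,3)}`, and a balanced teacher `{(0,1), (2,3), (0,3)}`, the balanced tree wins because it shares spans with both of the others. The generative-MBR variant in the table instead searches over all candidate trees rather than only the teachers' outputs.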

Benchmarks

Benchmark                               Methodology                  Metrics
constituency-grammar-induction-on-ptb   Ensemble (Selective MBR)     Mean F1 (WSJ): 66.2
constituency-grammar-induction-on-ptb   Ensemble (Generative MBR)    Max F1 (WSJ): 71.9; Mean F1 (WSJ): 70.4
