HyperAI

Neural Language Modeling by Jointly Learning Syntax and Lexicon

Yikang Shen, Zhouhan Lin, Chin-Wei Huang, Aaron Courville

Abstract

We propose a neural language model capable of unsupervised syntactic structure induction. The model leverages structural information to form better semantic representations and to improve language modeling. Standard recurrent neural networks are limited by their sequential structure and cannot efficiently exploit syntactic information. Tree-structured recursive networks, on the other hand, usually require structural supervision obtained at the cost of expert annotation. In this paper, we propose a novel neural language model, the Parsing-Reading-Predict Networks (PRPN), that can simultaneously induce syntactic structure from unannotated sentences and leverage the inferred structure to learn a better language model. In our model, the gradient can be back-propagated directly from the language-model loss into the neural parsing network. Experiments show that the proposed model discovers the underlying syntactic structure and achieves state-of-the-art performance on word- and character-level language modeling tasks.
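The key idea connecting the parsing network to the reading network is a soft structural gate: the parser assigns each token a syntactic distance, and attention from the current position back to earlier tokens is masked by a cumulative product of comparisons against the intervening distances, so a large distance (a constituent boundary) closes off everything before it. The sketch below illustrates that gating mechanism in plain NumPy; it is a simplified illustration under assumed forms, not the authors' implementation — the function name, the exact clipping-based alpha, and the `tau` temperature are illustrative stand-ins for the paper's hardtanh-style formulation.

```python
import numpy as np

def syntactic_gates(d, t, tau=10.0):
    """Soft structural gates for attending from position t back to i < t.

    d   : 1-D array of syntactic distances, one per token (in PRPN these
          come from the parsing network; here they are just given).
    t   : current position.
    tau : temperature; larger values push the gates toward hard 0/1.
    """
    # alpha[j] in [0, 1]: ~1 when d[t] >= d[j], ~0 when d[j] >> d[t]
    alpha = np.clip((d[t] - d[:t]) * tau, -1.0, 1.0) / 2.0 + 0.5
    gates = np.empty(t)
    for i in range(t):
        # product over positions strictly between i and t: one small
        # alpha (a boundary) zeroes out every earlier position's gate
        gates[i] = np.prod(alpha[i + 1:])
    return gates

# toy example: a large distance at position 2 acts as a boundary
d = np.array([0.1, 0.2, 5.0, 0.1, 0.3])
g = syntactic_gates(d, t=4)
# positions 0 and 1 (behind the boundary) are gated to ~0;
# the boundary token itself and position 3 remain reachable (~1)
```

Because the gates are products of differentiable comparisons rather than hard decisions, the language-model loss can back-propagate through them into the distance predictions, which is what lets PRPN train the parser without tree annotations.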

Code Repositories

nyu-mll/PRPN-Analysis (PyTorch)

Benchmarks

Benchmark                              | Methodology  | Metrics
constituency-grammar-induction-on-ptb  | PRPN         | Max F1 (WSJ): 38.1
constituency-grammar-induction-on-ptb  | PRPN (tuned) | Max F1 (WSJ): 47.9; Mean F1 (WSJ): 47.3
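The F1 figures above are unlabeled bracketing F1 of induced trees against WSJ gold trees ("Max" and "Mean" are taken over runs). A minimal sketch of how such a score is computed from gold and predicted constituent spans is shown below; this is illustrative only — actual evaluation uses EVALB-style tooling with specific conventions about which spans (e.g. trivial or punctuation-only ones) are counted, and the spans here are made up.

```python
def bracket_f1(gold_spans, pred_spans):
    """Unlabeled bracketing F1 between two collections of (start, end) spans."""
    gold, pred = set(gold_spans), set(pred_spans)
    if not gold or not pred:
        return 0.0
    matched = len(gold & pred)          # spans present in both trees
    precision = matched / len(pred)
    recall = matched / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# hypothetical gold spans for a 5-token sentence vs. an induced tree
gold = [(0, 5), (0, 2), (2, 5), (3, 5)]
pred = [(0, 5), (0, 2), (2, 4), (3, 5)]
score = bracket_f1(gold, pred)  # 3 of 4 spans match: P = R = 0.75, F1 = 0.75
```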
