Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale

Xiang Hu; Pengyu Ji; Qingyang Zhu; Wei Wu; Kewei Tu

Abstract

A syntactic language model (SLM) incrementally generates a sentence together with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale that can be pre-trained from scratch on raw text with high parallelism. GPST circumvents the limitations of previous SLMs, such as reliance on gold trees and sequential training. It consists of two components: a standard SLM supervised by a uni-directional language modeling loss, and an additional composition model, which induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus of 9 billion tokens, and demonstrate its superiority over GPT-2 of comparable size on numerous tasks covering both language understanding and language generation. GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while achieving a substantial acceleration in training.
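
The abstract describes a two-component architecture trained jointly in a hard-EM fashion, with a representation surrogate decoupling the two losses. The sketch below is a minimal, self-contained PyTorch illustration of that training step; every name in it (CompositionModel, SyntacticLM, the left-branching toy parser, the root-prediction "bidirectional" loss, the detached surrogate) is a hypothetical stand-in under our own assumptions, not the authors' implementation, which lives in ant-research/structuredlm_rtdt.

```python
# Minimal sketch of GPST-style joint training (hard-EM flavor).
# All names here are hypothetical stand-ins, NOT the authors' API;
# see ant-research/structuredlm_rtdt for the real implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, D = 1000, 64

class CompositionModel(nn.Module):
    """Induces a parse and builds constituent representations (toy version)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D)
        self.compose = nn.Sequential(nn.Linear(2 * D, D), nn.Tanh())
        self.out = nn.Linear(D, VOCAB)

    def forward(self, tokens):                      # tokens: (B, T)
        spans = list(self.embed(tokens).unbind(1))  # T leaf vectors of (B, D)
        reps = []
        while len(spans) > 1:                       # toy hard "E-step": always
            merged = self.compose(torch.cat([spans[0], spans[1]], -1))
            reps.append(merged)                     # merge the leftmost pair
            spans = [merged] + spans[2:]
        reps = torch.stack(reps, 1)                 # (B, T-1, D) constituents
        # Toy stand-in for the bi-directional LM loss: predict every token
        # from the root constituent representation.
        logits = self.out(reps[:, -1]).unsqueeze(1).expand(-1, tokens.size(1), -1)
        loss = F.cross_entropy(logits.reshape(-1, VOCAB), tokens.reshape(-1))
        return reps, loss

class SyntacticLM(nn.Module):
    """Left-to-right LM; consumes detached constituent reps as a surrogate."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D)
        layer = nn.TransformerEncoderLayer(D, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.out = nn.Linear(D, VOCAB)

    def forward(self, tokens, surrogate):
        x = self.embed(tokens[:, :-1]) + surrogate[:, : tokens.size(1) - 1]
        T = x.size(1)                               # causal (uni-directional) mask
        mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.encoder(x, mask=mask)
        logits = self.out(h)                        # next-token prediction
        return F.cross_entropy(logits.reshape(-1, VOCAB),
                               tokens[:, 1:].reshape(-1))

comp, slm = CompositionModel(), SyntacticLM()
opt = torch.optim.Adam(list(comp.parameters()) + list(slm.parameters()), lr=3e-4)
tokens = torch.randint(0, VOCAB, (2, 8))            # a toy batch

opt.zero_grad()
reps, bidir_loss = comp(tokens)                     # induce tree + compose
surrogate = reps.detach()                           # "representation surrogate"
uni_loss = slm(tokens, surrogate)                   # uni-directional LM loss
(bidir_loss + uni_loss).backward()                  # joint update of both models
opt.step()
```

The detach on the composition model's output is this sketch's loose stand-in for the surrogate idea: the SLM trains against fixed constituent representations in parallel with the composition model, rather than the two being chained sequentially.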

Code Repositories

alipay/StructuredLM_RTDT (PyTorch)
ant-research/structuredlm_rtdt (PyTorch, official)

Benchmarks

Benchmark | Methodology | Metrics
constituency-grammar-induction-on-ptb | GPST (left-to-right parsing) | Mean F1 (WSJ): 55.2
natural-language-inference-on-multinli | GPST (unsupervised generative syntactic LM) | Matched: 81.8, Mismatched: 82.0
