Sentence-State LSTM for Text Representation

Yue Zhang, Qi Liu, Linfeng Song

Abstract

Bi-directional LSTMs are a powerful tool for text representation. On the other hand, they have been shown to suffer various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incrementally reading a sequence of words. Results on various classification and sequence labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performance compared to stacked BiLSTM models with similar numbers of parameters.
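The recurrence described above can be illustrated with a minimal numpy sketch. This is a hypothetical simplification, not the paper's implementation: the learned weight matrices and gates of the full S-LSTM are omitted, neighbour and global contributions are simply summed, and the global sentence state is taken as a mean over word states. It only shows the key structural idea: every word state is updated in parallel from its left and right neighbours plus a global state, rather than by a left-to-right scan.

```python
import numpy as np

def s_lstm_step(h, g):
    """One simplified S-LSTM-style recurrent step (illustrative sketch).

    h: (n, d) array of per-word hidden states
    g: (d,) global sentence state

    All words are updated simultaneously from their immediate
    neighbours and the global state; the global state then
    aggregates the updated word states. Gates and learned
    parameters from the paper are omitted for brevity.
    """
    n, d = h.shape
    left = np.vstack([np.zeros((1, d)), h[:-1]])   # h_{i-1}, zero-padded at the start
    right = np.vstack([h[1:], np.zeros((1, d))])   # h_{i+1}, zero-padded at the end
    # local + global information exchange, applied to every word in parallel
    h_new = np.tanh(h + left + right + g)
    # global state: aggregate over all updated word states
    g_new = np.tanh(h_new.mean(axis=0))
    return h_new, g_new

# Toy usage: 5 words, hidden size 4, three recurrent steps.
rng = np.random.default_rng(0)
h = rng.normal(scale=0.1, size=(5, 4))
g = np.zeros(4)
for _ in range(3):
    h, g = s_lstm_step(h, g)
print(h.shape, g.shape)
```

After each step, information has propagated one position further in both directions locally and everywhere via the global state, which is why a small, fixed number of recurrent steps can suffice regardless of sentence length.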

Benchmarks

Benchmark                                  Methodology   Metric
Named Entity Recognition on CoNLL 2003     S-LSTM        F1: 91.57
Part-of-Speech Tagging on Penn Treebank    S-LSTM        Accuracy: 97.55
Sentiment Analysis on IMDb                 S-LSTM        Accuracy: 87.15
Sentiment Analysis on MR                   S-LSTM        Accuracy: 76.2
