Chinese NER Using Lattice LSTM

Yue Zhang; Jie Yang

Abstract

We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon. Compared with character-based methods, our model explicitly leverages word and word sequence information. Compared with word-based methods, lattice LSTM does not suffer from segmentation errors. Gated recurrent cells allow our model to choose the most relevant characters and words from a sentence for better NER results. Experiments on various datasets show that lattice LSTM outperforms both word-based and character-based LSTM baselines, achieving the best results.
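The lattice in the abstract is built by matching every lexicon word against character spans of the input sentence; the LSTM then consumes the characters plus these word spans. A minimal sketch of the lattice-construction step (names and the toy lexicon are hypothetical, not the paper's code):

```python
def build_lattice(chars, lexicon, max_word_len=4):
    """Enumerate every lexicon word that matches a contiguous span of the
    character sequence. These (start, end, word) spans are the word paths
    that a lattice LSTM would consume alongside the characters.
    Illustrative sketch only, not the authors' implementation."""
    spans = []
    n = len(chars)
    for start in range(n):
        # Words of length 2 .. max_word_len starting at `start`.
        for end in range(start + 2, min(n, start + max_word_len) + 1):
            word = "".join(chars[start:end])
            if word in lexicon:
                spans.append((start, end, word))
    return spans

# Toy example: "南京市长江大桥" (Nanjing Yangtze River Bridge), a classic
# segmentation-ambiguity sentence. A word-based tagger that segments
# "市长" (mayor) propagates that error; the lattice keeps all matches.
chars = list("南京市长江大桥")
lexicon = {"南京", "南京市", "市长", "长江", "大桥", "长江大桥"}
for span in build_lattice(chars, lexicon):
    print(span)
```

Each span above becomes an extra gated path into the cell at its end character, which is how the model can prefer the word reading "南京市 / 长江大桥" over the spurious "市长" without committing to a single segmentation.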

Code Repositories

- jiesutd/LatticeLSTM (official, PyTorch)
- Houlong66/lattice_lstm_with_pytorch (PyTorch)
- LeeSureman/Batch_Parallel_LatticeLSTM (PyTorch)
