
Abstract
Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure explored so far has been a linear chain. However, natural language exhibits syntactic properties that naturally combine words into phrases. We introduce the Tree-LSTM, a generalization of LSTMs to tree-structured network topologies. Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank).
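The paper's Child-Sum Tree-LSTM composes each node's state from the node's own input and the states of an arbitrary number of children, with a separate forget gate for each child and the remaining gates conditioned on the sum of the child hidden states. The PyTorch sketch below is a minimal rendering of that cell under those equations; the class and variable names are our own and are not taken from any of the repositories listed below.

```python
# Minimal sketch of a Child-Sum Tree-LSTM cell (Tai et al., 2015).
# Illustrative only: names and structure are ours, not from the listed repos.
import torch
import torch.nn as nn


class ChildSumTreeLSTMCell(nn.Module):
    """Composes a node's (h, c) from its input x_j and its children's states."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # Input, output, and candidate gates share the summed child hidden state.
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # The forget gate is computed separately for each child.
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,); child_h, child_c: (num_children, hidden_dim).
        h_tilde = child_h.sum(dim=0)                  # h~_j = sum_k h_k
        iou = self.W_iou(x) + self.U_iou(h_tilde)
        i, o, u = iou.chunk(3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        # One forget gate per child: f_jk = sigmoid(W_f x_j + U_f h_k + b_f).
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))
        c = i * u + (f * child_c).sum(dim=0)          # c_j
        h = o * torch.tanh(c)                         # h_j
        return h, c
```

At a leaf, `child_h` and `child_c` can be zero-size tensors of shape `(0, hidden_dim)`, so both the summed child state and the forget term reduce to zero and the update is driven by the input alone, recovering the standard LSTM case.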
Code Repositories
| Repository | Framework | Notes |
|---|---|---|
| inyukwo1/tree-lstm | PyTorch | Mentioned in GitHub |
| rohitguptacs/ReVal | PyTorch | Mentioned in GitHub |
| tensorflow/fold | TensorFlow | Mentioned in GitHub |
| Mind23-2/MindCode-17 | MindSpore | |
| jayanti-prasad/TreeLSTM | PyTorch | Mentioned in GitHub |
| zxk19981227/LSTM-SST | PyTorch | Mentioned in GitHub |
| munashe5/SemanticTreeLSTM | TensorFlow | Mentioned in GitHub |
| EmilReinert/DeepLearningPipelines | PyTorch | Mentioned in GitHub |
| stanfordnlp/treelstm | PyTorch | Official; Mentioned in GitHub |
| vastsak/tree_structured_gru | TensorFlow | Mentioned in GitHub |
| tomekkorbak/treehopper | PyTorch | Mentioned in GitHub |
| Vivswan/Sentiment-Analysis-TreeLSTM | PyTorch | Mentioned in GitHub |
| dasguptar/treelstm.pytorch | PyTorch | Mentioned in GitHub |
| ttpro1995/TreeLSTMSentiment | PyTorch | Mentioned in GitHub |
Benchmarks
| Benchmark | Method | Metrics |
|---|---|---|
| semantic-similarity-on-sick | Bidirectional LSTM (Tai et al., 2015) | MSE: 0.2736, Pearson Correlation: 0.8567, Spearman Correlation: 0.7966 |
| semantic-similarity-on-sick | LSTM (Tai et al., 2015) | MSE: 0.2831, Pearson Correlation: 0.8528, Spearman Correlation: 0.7911 |
| semantic-similarity-on-sick | Dependency Tree-LSTM (Tai et al., 2015) | MSE: 0.2532, Pearson Correlation: 0.8676, Spearman Correlation: 0.8083 |
| sentiment-analysis-on-sst-2-binary | 2-layer LSTM (Tai et al., 2015) | Accuracy: 86.3 |
| sentiment-analysis-on-sst-2-binary | Constituency Tree-LSTM with tuned GloVe vectors (Tai et al., 2015) | Accuracy: 88.0 |
| sentiment-analysis-on-sst-5-fine-grained | Constituency Tree-LSTM | Accuracy: 51.0 |
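For the SICK relatedness rows above, the paper scores a sentence pair by combining the two Tree-LSTM sentence representations through their element-wise product and absolute difference, then predicting a distribution over the 1-5 relatedness scale whose expected value is the reported score. The sketch below is a minimal version of that similarity head; the layer size and names are illustrative, not the paper's exact hyperparameters.

```python
# Minimal sketch of the sentence-pair relatedness head used for SICK:
# element-wise product and absolute difference of the two sentence vectors
# feed a small classifier over the score range 1..5. Illustrative only.
import torch
import torch.nn as nn


class RelatednessHead(nn.Module):
    def __init__(self, hidden_dim: int, num_classes: int = 5):
        super().__init__()
        self.fc = nn.Linear(2 * hidden_dim, 50)   # 50 is an assumed size
        self.out = nn.Linear(50, num_classes)
        # Score values 1..5; the expected value under the predicted
        # distribution is the real-valued relatedness prediction.
        self.register_buffer("scores", torch.arange(1, num_classes + 1).float())

    def forward(self, h_left, h_right):
        mult = h_left * h_right             # element-wise product
        diff = (h_left - h_right).abs()     # absolute difference
        hs = torch.sigmoid(self.fc(torch.cat([mult, diff], dim=-1)))
        p = torch.softmax(self.out(hs), dim=-1)
        return (p * self.scores).sum(dim=-1), p
```

The Pearson/Spearman correlations and MSE in the table are computed between this expected score and the gold relatedness ratings.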