Semantic Textual Similarity On Sts Benchmark

Evaluation Metric

Spearman Correlation
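The benchmark scores below are Spearman rank correlations between human-annotated similarity scores and model-predicted similarities. A minimal sketch of computing this metric, assuming `scipy` is available and using made-up gold/predicted values for illustration:

```python
from scipy.stats import spearmanr

# Hypothetical gold STS scores (0-5 scale) and model-predicted similarities.
# Spearman correlation compares the *rankings* of the two lists, so the
# different scales do not matter.
gold = [4.8, 3.2, 1.0, 0.4, 2.5]
pred = [0.92, 0.65, 0.30, 0.35, 0.55]

rho, p_value = spearmanr(gold, pred)
print(f"Spearman correlation: {rho:.4f}")  # → 0.9000
```

Because only ranks are compared, Spearman correlation rewards models whose similarity scores order sentence pairs the same way humans do, regardless of the absolute values produced.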

Evaluation Results

Performance of each model on this benchmark.

| Model | Spearman Correlation | Paper Title | Repository |
| --- | --- | --- | --- |
| Mnet-Sim | 0.931 | MNet-Sim: A Multi-layered Semantic Similarity Network to Evaluate Sentence Similarity | - |
| MT-DNN-SMART | 0.925 | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | |
| StructBERT RoBERTa ensemble | 0.924 | StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding | - |
| T5-11B | 0.921 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| RealFormer | 0.8988 | RealFormer: Transformer Likes Residual Attention | |
| T5-3B | 0.898 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| AnglE-LLaMA-13B | 0.8969 | AnglE-optimized Text Embeddings | |
| ASA + RoBERTa | 0.892 | Adversarial Self-Attention for Language Understanding | |
| PromptEOL+CSE+LLaMA-30B | 0.8914 | Scaling Sentence Embeddings with Large Language Models | |
| AnglE-LLaMA-7B | 0.8897 | AnglE-optimized Text Embeddings | |
| AnglE-LLaMA-7B-v2 | 0.8897 | AnglE-optimized Text Embeddings | |
| T5-Large 770M | 0.886 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| PromptEOL+CSE+OPT-13B | 0.8856 | Scaling Sentence Embeddings with Large Language Models | |
| PromptEOL+CSE+OPT-2.7B | 0.8833 | Scaling Sentence Embeddings with Large Language Models | |
| PromCSE-RoBERTa-large (0.355B) | 0.8787 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning | |
| BigBird | 0.878 | Big Bird: Transformers for Longer Sequences | |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.867 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | |
| SimCSE-RoBERTa-large | 0.867 | SimCSE: Simple Contrastive Learning of Sentence Embeddings | |
| Trans-Encoder-RoBERTa-large-bi (unsup.) | 0.8655 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | |
| ASA + BERT-base | 0.865 | Adversarial Self-Attention for Language Understanding | |