Semantic Textual Similarity on STS15

Evaluation Metric

Spearman Correlation
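STS benchmarks are typically scored by ranking: each model assigns a similarity score (commonly the cosine of the two sentence embeddings) to every sentence pair, and the Spearman correlation is computed between those scores and the gold human ratings. A minimal sketch of that scoring step, with hypothetical embeddings and ratings (not the official SentEval harness):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def ranks(values):
    """1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical sentence-embedding pairs and gold ratings (0-5 scale).
pairs = [
    ([1.0, 0.0], [0.9, 0.1]),
    ([1.0, 1.0], [-1.0, 1.0]),
    ([0.2, 0.8], [0.3, 0.7]),
]
gold = [4.8, 1.0, 4.2]

pred = [cosine(u, v) for u, v in pairs]
rho = spearman(pred, gold)
print(round(rho, 4))  # → 1.0 (model ranking matches the human ranking)
```

Because Spearman only compares rankings, it is insensitive to any monotonic rescaling of the model's similarity scores, which is why it is preferred over Pearson correlation for comparing embedding models.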

Evaluation Results

Performance of each model on this benchmark.

| Model | Spearman Correlation | Paper Title |
| --- | --- | --- |
| PromptEOL+CSE+LLaMA-30B | 0.9004 | Scaling Sentence Embeddings with Large Language Models |
| AnglE-LLaMA-13B | 0.8956 | AnglE-optimized Text Embeddings |
| PromptEOL+CSE+OPT-13B | 0.8952 | Scaling Sentence Embeddings with Large Language Models |
| PromptEOL+CSE+OPT-2.7B | 0.8951 | Scaling Sentence Embeddings with Large Language Models |
| AnglE-LLaMA-7B-v2 | 0.8943 | AnglE-optimized Text Embeddings |
| Trans-Encoder-RoBERTa-large-cross (unsup.) | 0.8863 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-large-bi (unsup.) | 0.8816 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| PromCSE-RoBERTa-large (0.355B) | 0.8808 | Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning |
| SimCSE-RoBERTa-large | 0.8666 | SimCSE: Simple Contrastive Learning of Sentence Embeddings |
| Trans-Encoder-RoBERTa-base-cross (unsup.) | 0.8577 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-base-bi (unsup.) | 0.8508 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| Trans-Encoder-BERT-base-cross (unsup.) | 0.8444 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations |
| DiffCSE-BERT-base | 0.8390 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| DiffCSE-RoBERTa-base | 0.8281 | DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings |
| SRoBERTa-NLI-large | 0.8185 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks |
| Mirror-BERT-base (unsup.) | 0.814 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| Dino (STSb/) | 0.8049 | Generating Datasets with Pretrained Language Models |
| Mirror-RoBERTa-base (unsup.) | 0.798 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders |
| IS-BERT-NLI | 0.7523 | An Unsupervised Sentence Embedding Method by Mutual Information Maximization |
| BERT-large-flow (target) | 0.7492 | On the Sentence Embeddings from Pre-trained Language Models |