SBERT (Sentence Transformers) is not BERT Sentence Embedding: Intro & Tutorial

Published: 26 October 2024
on the channel: Discover AI
12,779 views
278 likes


Quite a few viewers ask about SBERT Sentence Transformers and confuse it with BERT sentence vectors or embeddings. Here I clarify both systems and explain why one outperforms the other for sentence semantic similarity.
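For reference, sentence semantic similarity between two embedding vectors is typically scored with cosine similarity. A minimal NumPy sketch (the vectors and their values are purely illustrative, not real model outputs):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "sentence embeddings" (illustrative values only).
emb_cat = np.array([0.9, 0.1, 0.0, 0.3])
emb_kitten = np.array([0.8, 0.2, 0.1, 0.4])
emb_car = np.array([0.0, 0.9, 0.8, 0.1])

print(cosine_similarity(emb_cat, emb_kitten))  # high: semantically similar
print(cosine_similarity(emb_cat, emb_car))     # low: semantically distant
```

A score near 1.0 means the two sentences point in nearly the same direction in embedding space; the whole point of SBERT is to train the space so that this score tracks human judgments of similarity.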

Simplifying systems for a clear presentation is never easy, but I try to incorporate visualizations for better understanding and, for the moment, do not focus on every little detail. If you are interested in a Python implementation, there are 36 SBERT videos with code on my channel.

Both systems provide a sentence embedding for a given sentence. So what is the difference?

SBERT = Sentence Embeddings using Siamese BERT-Networks (Bi-Encoder)

All SBERT credits go to:
Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks
Nils Reimers, Iryna Gurevych
https://arxiv.org/abs/1908.10084


#bert
#sbert
#ai