How to use the 256-dim embedding

#16
by knielsen - opened

The readme mentions the 256-dimensional embedding a few times, but does not show how to use it. Would it be possible to add such an example? I am especially interested in how it works with the Transformers framework.

Nomic AI org

I added sections for SentenceTransformers and Transformers. Basically, all you need for Transformers is:

```python
import torch.nn.functional as F

embeddings = embeddings[:, :matryoshka_dim]  # keep only the first matryoshka_dim dimensions (e.g. 256)
embeddings = F.normalize(embeddings, p=2, dim=1)  # re-normalize after truncation
```
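For end-to-end context, here is a minimal sketch of how those two lines fit into a full Transformers pipeline. The model id, the "search_query: " prefix, and the mean-pooling helper are assumptions based on common usage of Nomic's embedding models, not taken from this thread; the README sections mentioned above are authoritative (the model card may also apply a layer norm before truncation).

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Assumed model id; the thread does not name it explicitly.
model_name = "nomic-ai/nomic-embed-text-v1.5"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
model.eval()

# Assumed task prefix; Nomic embedding models expect one (e.g. "search_query: ").
sentences = ["search_query: What is a Matryoshka embedding?"]
encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    output = model(**encoded)

# Mean-pool the token embeddings, masking out padding tokens.
token_embeddings = output[0]
mask = encoded["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

# Truncate to the Matryoshka dimension, then re-normalize.
matryoshka_dim = 256
embeddings = embeddings[:, :matryoshka_dim]
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 256])
```

If you prefer SentenceTransformers, recent versions also accept a truncate_dim argument on the SentenceTransformer constructor, which performs the truncation for you.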
zpn changed discussion status to closed