---
language:
  - en
license: apache-2.0
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:3615666
  - loss:CachedMultipleNegativesSymmetricRankingLoss
  - loss:CachedMultipleNegativesRankingLoss
widget:
  - source_sentence: what is the difference between body spray and eau de toilette?
    sentences:
      - >-
        Eau de Toilette (EDT) is ideal for those that may find the EDP or
        Perfume oil too strong, with 7%-12% fragrance concentration in alcohol.
        Gives four to five hours wear. Body Mist is a light refreshing fragrance
        perfect for layering with other products from the same family. 3-5%
        fragrance concentration in alcohol.
      - >-
        To join the Army as an enlisted member you must usually take the Armed
        Services Vocational Aptitude Battery (ASVAB) test and get a good score.
        The maximum ASVAB score is 99. For enlistment into the Army you must get
        a minimum ASVAB score of 31.
      - >-
        Points needed to redeem rewards with Redbox Perks: 1,500 points = FREE
        1-night DVD rental. 1,750 points = FREE Blu-ray rental. 2,500 points =
        FREE 1-night Game rental.
  - source_sentence: who sings soa theme song
    sentences:
      - >-
        unpleasant. Not pleasant or agreeable: bad, disagreeable, displeasing,
        offensive, uncongenial, unsympathetic. Informal: icky.npleasant. Not
        pleasant or agreeable: bad, disagreeable, displeasing, offensive,
        uncongenial, unsympathetic. Informal: icky.
      - "â\x80\x9C Song from M*A*S*H (Suicide Is Painless) â\x80\x9D is a song written by Johnny Mandel (music) and Mike Altman (lyrics), which was the theme song for both the movie and TV series M*A*S*H. Mike Altman is the son of the original filmâ\x80\x99s director, Robert Altman, and was 14 years old when he wrote the songâ\x80\x99s lyrics. On the compilation the song is titled Theme from M.A.S.H. . 2  Killarmy sampled the music for their 1997 track 5 Stars from the Silent Weapons for Quiet Wars album. 3  Edgar Cruz recorded an instrumental cover of the song for his 1997 album Reminiscence titled M*A*S*H Theme."
      - >-
        (2) This Life is the theme song for the FX television series Sons of
        Anarchy, written by singer-songwriter Curtis Stigers, Velvet Revolver
        guitarist Dave Kushner, producer Bob Thiele Jr. and show creator Kurt
        Sutter while it was performed by Curtis Stigers & The Forest Rangers.
  - source_sentence: what is virility mean
    sentences:
      - >-
        That leads us to income taxes. Income tax rates. The top income tax rate
        in the United States is 39.6 percent. That ranked 33rd highest on a list
        of the top rates in 116 nations compiled this year by KPMG, an
        international tax advisory corporation.
      - >-
        Princeton's WordNet(0.00 / 0 votes)Rate this definition: virility(noun)
        the masculine property of being capable of copulation and procreation.
        manfulness, manliness, virility(noun) the trait of being manly; having
        the characteristics of an adult male.
      - >-
        Click on the thesaurus category heading under the button in an entry to
        see the synonyms and related words for that meaning. the strength and
        power that are considered typical qualities of a man, especially sexual
        energy virility symbol: His car is a red turbo-charged Porsche, the
        classic virility symbol. Synonyms and related words. Physical
        strength:strength, force, energy...
  - source_sentence: what does the greek term demokratia translate to mean
    sentences:
      - >-
        From Wikipedia, the free encyclopedia. A democracy (Greek, demokratia)
        means rule by the people. The name is used for different forms of
        government, where the people can take part in the decisions that affect
        the way their community is run. In modern times, there are different
        ways this can be done:rom Wikipedia, the free encyclopedia. A democracy
        (Greek, demokratia) means rule by the people. The name is used for
        different forms of government, where the people can take part in the
        decisions that affect the way their community is run. In modern times,
        there are different ways this can be done:
      - >-
        Freebase(0.00 / 0 votes)Rate this definition: Demokratia is a direct
        democracy, as opposed to the modern representative democracy. It was
        used in ancient Greece, most notably Athens, and began its use around
        500 BCE. In a participant government, citizens who wish to have a say in
        government can participate in it. Demokratia excluded women, foreigners,
        and slaves.
      - >-
        sal volatile. 1. an aromatic alcoholic solution of ammonium carbonate,
        the chief ingredient in smelling salts. Some say water was thrown into
        the flame, others that it was Spirits of sal volatile. He disengaged one
        of them presently, and felt in his pocket for the sal volatile.
  - source_sentence: how did triangular trade benefit european colonies in the americas
    sentences:
      - >-
        Road to Perdition (soundtrack) Road to Perdition is the soundtrack, on
        the Decca Records label, of the 2002 Academy Award-winning and Golden
        Globe-nominated film Road to Perdition starring Tyler Hoechlin, Tom
        Hanks, Jennifer Jason Leigh, Jude Law, Daniel Craig and Paul Newman. The
        original score was composed by Thomas Newman.[4]
      - >-
        Ohm's law Ohm's law states that the current through a conductor between
        two points is directly proportional to the voltage across the two
        points. Introducing the constant of proportionality, the resistance,[1]
        one arrives at the usual mathematical equation that describes this
        relationship:[2]
      - >-
        Triangular trade New England also benefited from the trade, as many
        merchants from New England, especially the state of Rhode Island,
        replaced the role of Europe in the triangle. New England also made rum
        from the Caribbean sugar and molasses, which it shipped to Africa as
        well as within the New World.[7] Yet, the "triangle trade" as considered
        in relation to New England was a piecemeal operation. No New England
        traders are known to have completed a sequential circuit of the full
        triangle, which took a calendar year on average, according to historian
        Clifford Shipton.[8] The concept of the New England Triangular trade was
        first suggested, inconclusively, in an 1866 book by George H. Moore, was
        picked up in 1872 by historian George C. Mason, and reached full
        consideration from a lecture in 1887 by American businessman and
        historian William B. Weeden.[9] The song "Molasses to Rum" from the
        musical 1776 vividly describes this form of the triangular trade.
datasets:
  - sentence-transformers/msmarco-msmarco-distilbert-base-v3
  - sentence-transformers/gooaq
  - sentence-transformers/natural-questions
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: ModernBERT-small-Retrieval-BEIR-Tuned
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoMSMARCO
          type: NanoMSMARCO
        metrics:
          - type: cosine_accuracy@1
            value: 0.12
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.26
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.34
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.54
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.12
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.08666666666666666
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.068
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.054000000000000006
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.12
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.26
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.34
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.54
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.3042672965887713
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.2326269841269841
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.25001278764546914
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoNQ
          type: NanoNQ
        metrics:
          - type: cosine_accuracy@1
            value: 0.22
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.46
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.56
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.58
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.22
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.15333333333333332
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.11200000000000002
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.05800000000000001
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.2
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.42
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.52
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.54
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.38133852434929755
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.3473333333333333
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.33718328116218177
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoHotpotQA
          type: NanoHotpotQA
        metrics:
          - type: cosine_accuracy@1
            value: 0.64
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.76
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.86
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.64
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.29333333333333333
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.20799999999999996
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.126
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.32
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.44
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.52
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.63
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.5626299119428719
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.6950555555555556
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.4762967674536366
            name: Cosine Map@100
      - task:
          type: nano-beir
          name: Nano BEIR
        dataset:
          name: NanoBEIR mean
          type: NanoBEIR_mean
        metrics:
          - type: cosine_accuracy@1
            value: 0.32666666666666666
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.47333333333333333
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.5533333333333333
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.66
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.32666666666666666
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.17777777777777778
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.12933333333333333
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.07933333333333334
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.21333333333333335
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.3733333333333333
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.46
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.57
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.41607857762698025
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.42500529100529105
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.3544976120870958
            name: Cosine Map@100
---

ModernBERT-small-Retrieval-BEIR-Tuned

This is a sentence-transformers model trained on the msmarco, gooaq and natural_questions datasets. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

This model is based on the wide architecture of johnnyboycurtis/ModernBERT-small, configured as follows:

from transformers import ModernBertConfig, ModernBertModel

modernbert_small_config = ModernBertConfig(
    hidden_size=384,                  # A common dimension for small embedding models
    num_hidden_layers=12,             # Significantly fewer layers than the base's 22
    num_attention_heads=6,            # Must be a divisor of hidden_size
    intermediate_size=1536,           # 4 * hidden_size -- VERY WIDE!!
    max_position_embeddings=1024,     # Max sequence length for the model; originally 8192
)

model = ModernBertModel(modernbert_small_config)
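
For reference, a backbone like this can be wrapped into a Sentence Transformer with mean pooling, matching the module layout shown under Full Model Architecture below. A minimal sketch, assuming the base checkpoint johnnyboycurtis/ModernBERT-small referenced above:

from sentence_transformers import SentenceTransformer, models

# Load the ModernBERT backbone as a word-embedding module
word_embedding_model = models.Transformer("johnnyboycurtis/ModernBERT-small", max_seq_length=1024)
# Mean pooling over token embeddings produces one 384-dim vector per input
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), pooling_mode="mean")
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])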

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Maximum Sequence Length: 1024 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets: msmarco, gooaq, natural_questions
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
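
The Pooling module above uses mean pooling: token embeddings are averaged, with padding positions excluded via the attention mask. A minimal sketch of the operation (illustrative, not the library's exact code):

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Zero out padding tokens, then average over the sequence dimension
    mask = attention_mask.unsqueeze(-1).float()      # [batch, seq, 1]
    summed = (token_embeddings * mask).sum(dim=1)    # [batch, dim]
    counts = mask.sum(dim=1).clamp(min=1e-9)         # avoid division by zero
    return summed / counts                           # [batch, dim]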

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'how did triangular trade benefit european colonies in the americas',
    'Triangular trade New England also benefited from the trade, as many merchants from New England, especially the state of Rhode Island, replaced the role of Europe in the triangle. New England also made rum from the Caribbean sugar and molasses, which it shipped to Africa as well as within the New World.[7] Yet, the "triangle trade" as considered in relation to New England was a piecemeal operation. No New England traders are known to have completed a sequential circuit of the full triangle, which took a calendar year on average, according to historian Clifford Shipton.[8] The concept of the New England Triangular trade was first suggested, inconclusively, in an 1866 book by George H. Moore, was picked up in 1872 by historian George C. Mason, and reached full consideration from a lecture in 1887 by American businessman and historian William B. Weeden.[9] The song "Molasses to Rum" from the musical 1776 vividly describes this form of the triangular trade.',
    "Ohm's law Ohm's law states that the current through a conductor between two points is directly proportional to the voltage across the two points. Introducing the constant of proportionality, the resistance,[1] one arrives at the usual mathematical equation that describes this relationship:[2]",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000,  0.4886, -0.0460],
#         [ 0.4886,  1.0000, -0.0096],
#         [-0.0460, -0.0096,  1.0000]])
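
For retrieval over a larger corpus, embed the documents once and search by cosine similarity. A minimal sketch using the library's semantic_search utility; the corpus and query strings are illustrative:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("johnnyboycurtis/ModernBERT-small-Retrieval-BEIR-Tuned")

corpus = [
    "Eau de Toilette has a 7%-12% fragrance concentration in alcohol.",
    "Ohm's law states that current is proportional to voltage.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embeddings = model.encode(["what is eau de toilette?"], convert_to_tensor=True)
hits = util.semantic_search(query_embeddings, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))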

Evaluation

Metrics

Information Retrieval

Metric NanoMSMARCO NanoNQ NanoHotpotQA
cosine_accuracy@1 0.12 0.22 0.64
cosine_accuracy@3 0.26 0.46 0.7
cosine_accuracy@5 0.34 0.56 0.76
cosine_accuracy@10 0.54 0.58 0.86
cosine_precision@1 0.12 0.22 0.64
cosine_precision@3 0.0867 0.1533 0.2933
cosine_precision@5 0.068 0.112 0.208
cosine_precision@10 0.054 0.058 0.126
cosine_recall@1 0.12 0.2 0.32
cosine_recall@3 0.26 0.42 0.44
cosine_recall@5 0.34 0.52 0.52
cosine_recall@10 0.54 0.54 0.63
cosine_ndcg@10 0.3043 0.3813 0.5626
cosine_mrr@10 0.2326 0.3473 0.6951
cosine_map@100 0.25 0.3372 0.4763

Nano BEIR

  • Dataset: NanoBEIR_mean
  • Evaluated with NanoBEIREvaluator with these parameters:
    {
        "dataset_names": [
            "MSMARCO",
            "NQ",
            "HotpotQA"
        ]
    }
    
Metric Value
cosine_accuracy@1 0.3267
cosine_accuracy@3 0.4733
cosine_accuracy@5 0.5533
cosine_accuracy@10 0.66
cosine_precision@1 0.3267
cosine_precision@3 0.1778
cosine_precision@5 0.1293
cosine_precision@10 0.0793
cosine_recall@1 0.2133
cosine_recall@3 0.3733
cosine_recall@5 0.46
cosine_recall@10 0.57
cosine_ndcg@10 0.4161
cosine_mrr@10 0.425
cosine_map@100 0.3545
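
The NanoBEIR numbers above can be reproduced with the library's NanoBEIREvaluator. A sketch, assuming the repo id from this card; the evaluator downloads the Nano subsets from the Hub:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("johnnyboycurtis/ModernBERT-small-Retrieval-BEIR-Tuned")

# The three subsets reported in this card
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "nq", "hotpotqa"])
results = evaluator(model)
print(results["NanoBEIR_mean_cosine_ndcg@10"])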

Training Details

Training Datasets

msmarco

  • Dataset: msmarco at 28ff31e
  • Size: 502,939 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    anchor:   string, min: 4 tokens, mean: 9.17 tokens, max: 29 tokens
    positive: string, min: 18 tokens, mean: 79.48 tokens, max: 211 tokens
    negative: string, min: 17 tokens, mean: 81.21 tokens, max: 230 tokens
  • Samples:
    anchor: what are the liberal arts?
    positive: liberal arts. 1. the academic course of instruction at a college intended to provide general knowledge and comprising the arts, humanities, natural sciences, and social sciences, as opposed to professional or technical subjects.
    negative: liberal arts definition. The areas of learning that cultivate general intellectual ability rather than technical or professional skills. The term liberal arts is often used as a synonym for humanities, although the liberal arts also include the sciences. The word liberal comes from the Latin liberalis, meaning suitable for a free man, as opposed to a slave.

    anchor: what is the mechanism of action of fibrinolytic or thrombolytic drugs?
    positive: Baillière's Clinical Haematology. 6 Mechanism of action of the thrombolytic agents. 6 Mechanism of action of the thrombolytic agents JEFFREY I. WEITZ Fibrin formed during the haemostatic, inflammatory or tissue repair process serves a temporary role, and must be degraded to restore normal tissue function and structure.
    negative: Fibrinolytic drug. Fibrinolytic drug, also called thrombolytic drug, any agent that is capable of stimulating the dissolution of a blood clot (thrombus). Fibrinolytic drugs work by activating the so-called fibrinolytic pathway.

    anchor: what is normal plat count
    positive: 78 Followers. A. Platelets are the tiny blood cells that help stop bleeding by binding together to form a clump or plug at sites of injury inside blood vessels. A normal platelet count is between 150,000 and 450,000 platelets per microliter (one-millionth of a liter, abbreviated mcL).The average platelet count is 237,000 per mcL in men and 266,000 per mcL in women.8 Followers. A. Platelets are the tiny blood cells that help stop bleeding by binding together to form a clump or plug at sites of injury inside blood vessels. A normal platelet count is between 150,000 and 450,000 platelets per microliter (one-millionth of a liter, abbreviated mcL).
    negative: Platelet Count. A platelet count is part of a complete blood count test. It is ordered when a patient experiences unexplainable bruises or when small cuts and wounds take longer time to heal. A normal platelet count of an average person is 150,000 to 450,000 platelets per microliter of blood.Others may have abnormal platelet count but it doesn’t indicate any abnormality.latelets, red cells, and plasma are the major components that form the human blood. Platelets are irregular shaped molecules with a colorless body and sticky surface that forms into clots to help stop the bleeding. When a person’s normal platelet count is compromised that person’s life might be put in danger.
  • Loss: CachedMultipleNegativesSymmetricRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "mini_batch_size": 64
    }
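
For reference, this dataset can be loaded from the Hub at the pinned commit. A sketch; the "triplet" config name is an assumption about how the (anchor, positive, negative) columns are exposed:

from datasets import load_dataset

# revision pins the commit listed above
msmarco = load_dataset(
    "sentence-transformers/msmarco-msmarco-distilbert-base-v3",
    "triplet",  # assumed config name
    revision="28ff31e",
    split="train",
)
print(msmarco[0]["anchor"])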
    

gooaq

  • Dataset: gooaq at b089f72
  • Size: 3,012,496 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    anchor:   string, min: 8 tokens, mean: 12.19 tokens, max: 22 tokens
    positive: string, min: 13 tokens, mean: 58.34 tokens, max: 124 tokens
  • Samples:
    anchor: is toprol xl the same as metoprolol?
    positive: Metoprolol succinate is also known by the brand name Toprol XL. It is the extended-release form of metoprolol. Metoprolol succinate is approved to treat high blood pressure, chronic chest pain, and congestive heart failure.

    anchor: are you experienced cd steve hoffman?
    positive: The Are You Experienced album was apparently mastered from the original stereo UK master tapes (according to Steve Hoffman - one of the very few who has heard both the master tapes and the CDs produced over the years). ... The CD booklets were a little sparse, but at least they stayed true to the album's original design.

    anchor: how are babushka dolls made?
    positive: Matryoshka dolls are made of wood from lime, balsa, alder, aspen, and birch trees; lime is probably the most common wood type. ... After cutting, the trees are stripped of most of their bark, although a few inner rings of bark are left to bind the wood and keep it from splitting.
  • Loss: CachedMultipleNegativesSymmetricRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "mini_batch_size": 64
    }
    

natural_questions

  • Dataset: natural_questions at f9e894e
  • Size: 100,231 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    anchor:   string, min: 10 tokens, mean: 12.47 tokens, max: 23 tokens
    positive: string, min: 17 tokens, mean: 138.32 tokens, max: 556 tokens
  • Samples:
    anchor: when did richmond last play in a preliminary final
    positive: Richmond Football Club Richmond began 2017 with 5 straight wins, a feat it had not achieved since 1995. A series of close losses hampered the Tigers throughout the middle of the season, including a 5-point loss to the Western Bulldogs, 2-point loss to Fremantle, and a 3-point loss to the Giants. Richmond ended the season strongly with convincing victories over Fremantle and St Kilda in the final two rounds, elevating the club to 3rd on the ladder. Richmond's first final of the season against the Cats at the MCG attracted a record qualifying final crowd of 95,028; the Tigers won by 51 points. Having advanced to the first preliminary finals for the first time since 2001, Richmond defeated Greater Western Sydney by 36 points in front of a crowd of 94,258 to progress to the Grand Final against Adelaide, their first Grand Final appearance since 1982. The attendance was 100,021, the largest crowd to a grand final since 1986. The Crows led at quarter time and led by as many as 13, but the Tig...

    anchor: who sang what in the world's come over you
    positive: Jack Scott (singer) At the beginning of 1960, Scott again changed record labels, this time to Top Rank Records.[1] He then recorded four Billboard Hot 100 hits – "What in the World's Come Over You" (#5), "Burning Bridges" (#3) b/w "Oh Little One" (#34), and "It Only Happened Yesterday" (#38).[1] "What in the World's Come Over You" was Scott's second gold disc winner.[6] Scott continued to record and perform during the 1960s and 1970s.[1] His song "You're Just Gettin' Better" reached the country charts in 1974.[1] In May 1977, Scott recorded a Peel session for BBC Radio 1 disc jockey, John Peel.

    anchor: who produces the most wool in the world
    positive: Wool Global wool production is about 2 million tonnes per year, of which 60% goes into apparel. Wool comprises ca 3% of the global textile market, but its value is higher owing to dying and other modifications of the material.[1] Australia is a leading producer of wool which is mostly from Merino sheep but has been eclipsed by China in terms of total weight.[30] New Zealand (2016) is the third-largest producer of wool, and the largest producer of crossbred wool. Breeds such as Lincoln, Romney, Drysdale, and Elliotdale produce coarser fibers, and wool from these sheep is usually used for making carpets.
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "mini_batch_size": 64
    }
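
For reference, the three losses used above can be constructed as follows. A sketch with the parameters listed in this section; the base model id is an assumed starting point:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import (
    CachedMultipleNegativesRankingLoss,
    CachedMultipleNegativesSymmetricRankingLoss,
)
from sentence_transformers.util import cos_sim

model = SentenceTransformer("johnnyboycurtis/ModernBERT-small")  # assumed base model

# Symmetric variant for msmarco and gooaq; plain variant for natural_questions
sym_loss = CachedMultipleNegativesSymmetricRankingLoss(model, scale=20.0, similarity_fct=cos_sim, mini_batch_size=64)
nq_loss = CachedMultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim, mini_batch_size=64)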
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 256
  • weight_decay: 0.01
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • bf16_full_eval: True
  • load_best_model_at_end: True
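
A sketch of the corresponding training arguments; output_dir is illustrative, and the remaining values are the defaults listed under All Hyperparameters below:

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="modernbert-small-retrieval",  # illustrative
    eval_strategy="steps",
    per_device_train_batch_size=256,
    weight_decay=0.01,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    bf16_full_eval=True,
    load_best_model_at_end=True,
)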

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.01
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: True
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss NanoMSMARCO_cosine_ndcg@10 NanoNQ_cosine_ndcg@10 NanoHotpotQA_cosine_ndcg@10 NanoBEIR_mean_cosine_ndcg@10
-1 -1 - 0.2890 0.3652 0.5065 0.3869
0.0708 1000 1.0794 - - - -
0.1416 2000 1.007 0.2944 0.3596 0.5221 0.3920
0.2124 3000 0.9012 - - - -
0.2832 4000 0.8323 0.3070 0.3797 0.5284 0.4050
0.3540 5000 0.7548 - - - -
0.4248 6000 0.6963 0.2909 0.3768 0.5560 0.4079
0.4956 7000 0.6655 - - - -
0.5664 8000 0.6235 0.3049 0.3778 0.5497 0.4108
0.6372 9000 0.6202 - - - -
0.7080 10000 0.6276 0.3072 0.3778 0.5613 0.4154
0.7788 11000 0.6101 - - - -
0.8496 12000 0.6016 0.3049 0.3756 0.5635 0.4147
0.9204 13000 0.6063 - - - -
0.9912 14000 0.5905 0.3043 0.3813 0.5626 0.4161
1.0619 15000 0.5734 - - - -
1.1327 16000 0.581 0.3119 0.3764 0.5555 0.4146
1.2035 17000 0.5744 - - - -
1.2743 18000 0.5769 0.3121 0.3682 0.5566 0.4123
1.3451 19000 0.5773 - - - -
1.4159 20000 0.5767 0.3132 0.3656 0.5602 0.4130
1.4867 21000 0.5662 - - - -
1.5575 22000 0.5662 0.3204 0.3656 0.5557 0.4139
1.6283 23000 0.5586 - - - -
1.6991 24000 0.5659 0.3209 0.3664 0.5599 0.4157
1.7699 25000 0.578 - - - -
1.8407 26000 0.5749 0.3132 0.3656 0.5599 0.4129
1.9115 27000 0.5845 - - - -
1.9823 28000 0.5769 0.3132 0.3664 0.5611 0.4136
2.0531 29000 0.5714 - - - -
2.1239 30000 0.5696 0.3132 0.3673 0.5606 0.4137
2.1947 31000 0.568 - - - -
2.2655 32000 0.5767 0.3209 0.3664 0.5602 0.4158
2.3363 33000 0.5785 - - - -
2.4071 34000 0.5666 0.3206 0.3664 0.5604 0.4158
2.4779 35000 0.5608 - - - -
2.5487 36000 0.5563 0.3206 0.3656 0.5602 0.4155
2.6195 37000 0.5768 - - - -
2.6903 38000 0.569 0.3206 0.3664 0.5602 0.4158
2.7611 39000 0.5723 - - - -
2.8319 40000 0.5714 0.3206 0.3664 0.5606 0.4159
2.9027 41000 0.5621 - - - -
2.9735 42000 0.5724 0.3206 0.3664 0.5602 0.4158
-1 -1 - 0.3043 0.3813 0.5626 0.4161
  • The saved checkpoint (per load_best_model_at_end) is from step 14000 (epoch 0.9912), the row with the best NanoBEIR_mean cosine_ndcg@10 of 0.4161.

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.0.0
  • Transformers: 4.53.1
  • PyTorch: 2.7.1+cu128
  • Accelerate: 1.8.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.2

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CachedMultipleNegativesRankingLoss

@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}