nomic-embed-text-v2-moe-msmarco-bpr
This is a sentence-transformers model finetuned from nomic-ai/nomic-embed-text-v2-moe. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Full model architecture:

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NomicBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
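The Pooling module above performs CLS pooling: the sentence embedding is simply the hidden state of the first token of the sequence. Below is a minimal sketch of that reduction, with a random tensor standing in for real NomicBertModel outputs:

```python
import torch

def cls_pool(token_embeddings: torch.Tensor) -> torch.Tensor:
    # CLS pooling: keep only the first token's hidden state per sequence,
    # matching pooling_mode_cls_token=True in the architecture above.
    return token_embeddings[:, 0]

# Random tensor standing in for transformer outputs:
# a batch of 3 sequences, 10 tokens each, 768-dimensional hidden states.
hidden = torch.randn(3, 10, 768)
print(cls_pool(hidden).shape)  # torch.Size([3, 768])
```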
First install the Sentence Transformers library:
```
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("BlackBeenie/nomic-embed-text-v2-moe-msmarco-bpr")

# Run inference
sentences = [
    'what services are offered by adult day care',
    'Consumer Guide to Long Term Care. Adult Day Care. Adult day care is a planned program offered in a group setting that provides services that improve or maintain health or functioning, and social activities for seniors and persons with disabilities.',
    'The Met Life Market survey of 2008 on adult day services states the average cost for adult day care services is $64 per day. There has been an increase of 5% in these services in the past year.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
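The same two calls implement semantic search: embed a query and a corpus, then rank the passages by score. Here is a minimal sketch with a hypothetical two-passage corpus (abridged from the training samples further below); model.similarity applies the model's configured similarity function, which is cosine similarity by default in Sentence Transformers:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BlackBeenie/nomic-embed-text-v2-moe-msmarco-bpr")

# Hypothetical mini-corpus; real corpora are just longer lists of strings.
query = "what services are offered by adult day care"
corpus = [
    "Adult day care is a planned program offered in a group setting that provides health services and social activities for seniors and persons with disabilities.",
    "The Met Life Market survey of 2008 states the average cost for adult day care services is $64 per day.",
]

query_emb = model.encode([query])
corpus_emb = model.encode(corpus)

# Rank corpus passages by similarity to the query.
scores = model.similarity(query_emb, corpus_emb)[0]
for idx in scores.argsort(descending=True):
    print(f"{scores[idx]:.4f}  {corpus[idx]}")
```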
The training data has three string columns: sentence_0 (the query), sentence_1 (a positive passage), and sentence_2 (a negative passage). Three samples:

sentence_0 | sentence_1 | sentence_2 |
---|---|---|
what the history of bluetooth | When asked about the name Bluetooth, I explained that Bluetooth was borrowed from the 10th century, second King of Denmark, King Harald Bluetooth; who was famous for uniting Scandinavia just as we intended to unite the PC and cellular industries with a short-range wireless link. | Technology: 1 How secure is a Bluetooth network? 2 What is Frequency-Hopping Spread Spectrum (FHSS)? 3 Will other RF (Radio Frequency) devices interfere with Bluetooth Devices? 4 Will Bluetooth and Wireless LAN (WLAN) interfere with each other? 5 What is the data throughput speed of a Bluetooth connection? 6 What is the range of Bluetooth 7 ... What kind of ... |
how thin can a concrete slab be | Another issue that must be addressed is the added weight of the thin-slab. Poured gypsum thin-slabs typically add 13 to 15 pounds per square foot to the dead loading of a floor structure. Standard weight concrete thin slabs add about 18 pounds per square foot (at 1.5 thickness). | Find the Area in square feet: We will use a concrete slab pour for our example. Let's say that we need to figure out the yardage for a slab that will be 15 feet long by 10 feet wide and 4 inches thick. First we find the area by multiplying the length times the width. 1 15 feet X 10 feet = 150 square feet. |
how long to cook eggs to hard boil | This method works best if the eggs are in a single layer, but you can double them up as well, you'll just need to add more time to the steaming time. 3 Set your timer for 6 minutes for soft boiled, 10 minutes for hard boiled with a still translucent and bright yolk, or 12-15 minutes for cooked-through hard boiled. | Hard-Steamed Eggs. Fill a pot that can comfortably hold your steamer with the lid on with 1 to 2 inches of water. Bring to a rolling boil, 212 degrees Fahrenheit. Place your eggs in a metal steamer, and lower the basket into the pot. The eggs should sit above the boiling water. Cover and cook for 12 minutes. Hard-steamed eggs, like hard-boiled eggs, are eggs that are cooked until the egg yolk is fully set and has turned to a chalky texture. |
Loss function: beir.losses.bpr_loss.BPRLoss
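BPRLoss follows Binary Passage Retrieval (Yamada et al., 2021), which trains continuous embeddings and binary hash codes jointly: a cross-entropy term over dense in-batch scores (used for reranking) plus a margin ranking term over hashed scores (used for fast candidate generation). The following is an illustrative PyTorch sketch of that objective, not the exact beir implementation; the margin value and the straight-through binarization are assumptions:

```python
import torch
import torch.nn.functional as F

def bpr_style_loss(q: torch.Tensor, p: torch.Tensor, margin: float = 0.1):
    # q, p: (batch, dim) query and passage embeddings; row i of p is the
    # positive passage for query i, all other rows serve as negatives.
    labels = torch.arange(q.size(0), device=q.device)

    # Dense term: softmax cross-entropy over dot-product scores.
    rerank_loss = F.cross_entropy(q @ p.t(), labels)

    # Hash term: binarize with a straight-through estimator (sign in the
    # forward pass, identity gradient in the backward pass), then push
    # positive pairs above in-batch negatives by a margin.
    q_hash = q + (torch.sign(q) - q).detach()
    p_hash = p + (torch.sign(p) - p).detach()
    hash_scores = q_hash @ p_hash.t()
    pos = hash_scores.diagonal().unsqueeze(1)  # (batch, 1) positive scores
    off_diag = ~torch.eye(q.size(0), dtype=torch.bool, device=q.device)
    cand_loss = F.relu(margin - pos + hash_scores)[off_diag].mean()

    return rerank_loss + cand_loss

# Toy usage with random embeddings standing in for model outputs.
q = torch.randn(4, 768, requires_grad=True)
p = torch.randn(4, 768)
print(bpr_style_loss(q, p))
```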
Non-default training hyperparameters:

- eval_strategy: steps
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- num_train_epochs: 5
- fp16: True
- multi_dataset_batch_sampler: round_robin

All hyperparameters (defaults included):

- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 5
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin

Training logs (epoch-boundary rows have no logged loss):

Epoch | Step | Training Loss |
---|---|---|
0.0321 | 500 | 0.3396 |
0.0641 | 1000 | 0.2094 |
0.0962 | 1500 | 0.21 |
0.1283 | 2000 | 0.1955 |
0.1603 | 2500 | 0.1989 |
0.1924 | 3000 | 0.1851 |
0.2245 | 3500 | 0.1839 |
0.2565 | 4000 | 0.1859 |
0.2886 | 4500 | 0.1892 |
0.3207 | 5000 | 0.1865 |
0.3527 | 5500 | 0.1773 |
0.3848 | 6000 | 0.1796 |
0.4169 | 6500 | 0.1929 |
0.4489 | 7000 | 0.1829 |
0.4810 | 7500 | 0.172 |
0.5131 | 8000 | 0.1792 |
0.5451 | 8500 | 0.1747 |
0.5772 | 9000 | 0.1802 |
0.6092 | 9500 | 0.1856 |
0.6413 | 10000 | 0.1751 |
0.6734 | 10500 | 0.173 |
0.7054 | 11000 | 0.1774 |
0.7375 | 11500 | 0.1722 |
0.7696 | 12000 | 0.1825 |
0.8016 | 12500 | 0.1714 |
0.8337 | 13000 | 0.1732 |
0.8658 | 13500 | 0.167 |
0.8978 | 14000 | 0.1792 |
0.9299 | 14500 | 0.1697 |
0.9620 | 15000 | 0.1682 |
0.9940 | 15500 | 0.1764 |
1.0 | 15593 | - |
1.0261 | 16000 | 0.0875 |
1.0582 | 16500 | 0.0798 |
1.0902 | 17000 | 0.0764 |
1.1223 | 17500 | 0.0783 |
1.1544 | 18000 | 0.0759 |
1.1864 | 18500 | 0.0834 |
1.2185 | 19000 | 0.082 |
1.2506 | 19500 | 0.0827 |
1.2826 | 20000 | 0.0876 |
1.3147 | 20500 | 0.0819 |
1.3468 | 21000 | 0.0841 |
1.3788 | 21500 | 0.0815 |
1.4109 | 22000 | 0.0819 |
1.4430 | 22500 | 0.0883 |
1.4750 | 23000 | 0.0826 |
1.5071 | 23500 | 0.0837 |
1.5392 | 24000 | 0.086 |
1.5712 | 24500 | 0.0806 |
1.6033 | 25000 | 0.0918 |
1.6353 | 25500 | 0.0885 |
1.6674 | 26000 | 0.0885 |
1.6995 | 26500 | 0.088 |
1.7315 | 27000 | 0.0843 |
1.7636 | 27500 | 0.0915 |
1.7957 | 28000 | 0.0843 |
1.8277 | 28500 | 0.0868 |
1.8598 | 29000 | 0.0857 |
1.8919 | 29500 | 0.0931 |
1.9239 | 30000 | 0.0852 |
1.9560 | 30500 | 0.0913 |
1.9881 | 31000 | 0.0857 |
2.0 | 31186 | - |
2.0201 | 31500 | 0.0547 |
2.0522 | 32000 | 0.0459 |
2.0843 | 32500 | 0.0451 |
2.1163 | 33000 | 0.0407 |
2.1484 | 33500 | 0.0469 |
2.1805 | 34000 | 0.0459 |
2.2125 | 34500 | 0.0508 |
2.2446 | 35000 | 0.0508 |
2.2767 | 35500 | 0.0518 |
2.3087 | 36000 | 0.0552 |
2.3408 | 36500 | 0.0491 |
2.3729 | 37000 | 0.0575 |
2.4049 | 37500 | 0.0558 |
2.4370 | 38000 | 0.0475 |
2.4691 | 38500 | 0.0486 |
2.5011 | 39000 | 0.0536 |
2.5332 | 39500 | 0.0559 |
2.5653 | 40000 | 0.0524 |
2.5973 | 40500 | 0.0496 |
2.6294 | 41000 | 0.0486 |
2.6615 | 41500 | 0.0526 |
2.6935 | 42000 | 0.0443 |
2.7256 | 42500 | 0.058 |
2.7576 | 43000 | 0.0543 |
2.7897 | 43500 | 0.0527 |
2.8218 | 44000 | 0.0528 |
2.8538 | 44500 | 0.0573 |
2.8859 | 45000 | 0.0628 |
2.9180 | 45500 | 0.0443 |
2.9500 | 46000 | 0.0531 |
2.9821 | 46500 | 0.0554 |
3.0 | 46779 | - |
3.0142 | 47000 | 0.0346 |
3.0462 | 47500 | 0.0288 |
3.0783 | 48000 | 0.0219 |
3.1104 | 48500 | 0.0259 |
3.1424 | 49000 | 0.0237 |
3.1745 | 49500 | 0.0307 |
3.2066 | 50000 | 0.0234 |
3.2386 | 50500 | 0.0312 |
3.2707 | 51000 | 0.0297 |
3.3028 | 51500 | 0.0299 |
3.3348 | 52000 | 0.0326 |
3.3669 | 52500 | 0.0266 |
3.3990 | 53000 | 0.0296 |
3.4310 | 53500 | 0.0289 |
3.4631 | 54000 | 0.0216 |
3.4952 | 54500 | 0.0289 |
3.5272 | 55000 | 0.033 |
3.5593 | 55500 | 0.0248 |
3.5914 | 56000 | 0.0246 |
3.6234 | 56500 | 0.0287 |
3.6555 | 57000 | 0.0267 |
3.6876 | 57500 | 0.0285 |
3.7196 | 58000 | 0.0288 |
3.7517 | 58500 | 0.0283 |
3.7837 | 59000 | 0.0283 |
3.8158 | 59500 | 0.029 |
3.8479 | 60000 | 0.0327 |
3.8799 | 60500 | 0.0239 |
3.9120 | 61000 | 0.0356 |
3.9441 | 61500 | 0.0323 |
3.9761 | 62000 | 0.0213 |
4.0 | 62372 | - |
4.0082 | 62500 | 0.0275 |
4.0403 | 63000 | 0.0125 |
4.0723 | 63500 | 0.0183 |
4.1044 | 64000 | 0.0138 |
4.1365 | 64500 | 0.0174 |
4.1685 | 65000 | 0.0088 |
4.2006 | 65500 | 0.0126 |
4.2327 | 66000 | 0.0134 |
4.2647 | 66500 | 0.0099 |
4.2968 | 67000 | 0.0188 |
4.3289 | 67500 | 0.0112 |
4.3609 | 68000 | 0.0156 |
4.3930 | 68500 | 0.0175 |
4.4251 | 69000 | 0.0128 |
4.4571 | 69500 | 0.0154 |
4.4892 | 70000 | 0.0127 |
4.5213 | 70500 | 0.0131 |
4.5533 | 71000 | 0.017 |
4.5854 | 71500 | 0.0116 |
4.6175 | 72000 | 0.0137 |
4.6495 | 72500 | 0.0156 |
4.6816 | 73000 | 0.0155 |
4.7137 | 73500 | 0.0078 |
4.7457 | 74000 | 0.0152 |
4.7778 | 74500 | 0.0089 |
4.8099 | 75000 | 0.0116 |
4.8419 | 75500 | 0.0144 |
4.8740 | 76000 | 0.0112 |
4.9060 | 76500 | 0.0108 |
4.9381 | 77000 | 0.0188 |
4.9702 | 77500 | 0.0109 |
5.0 | 77965 | - |
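For context, here is a minimal sketch of how a fine-tuning run with the key hyperparameters above could be launched through the Sentence Transformers trainer. The inline three-column dataset is a toy stand-in for the MS MARCO triplets, and MultipleNegativesRankingLoss substitutes for beir's BPRLoss, which Sentence Transformers does not ship:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Toy (query, positive, negative) triplets in the column layout shown earlier.
train_dataset = Dataset.from_dict({
    "sentence_0": ["how long to cook eggs to hard boil"],
    "sentence_1": ["Set your timer for 10 minutes for hard boiled eggs."],
    "sentence_2": ["Fill a pot with 1 to 2 inches of water and bring to a boil."],
})

model = SentenceTransformer("nomic-ai/nomic-embed-text-v2-moe", trust_remote_code=True)

args = SentenceTransformerTrainingArguments(
    output_dir="nomic-embed-text-v2-moe-msmarco-bpr",
    num_train_epochs=5,               # as in the hyperparameters above
    per_device_train_batch_size=32,   # as in the hyperparameters above
    learning_rate=5e-5,               # as in the hyperparameters above
    fp16=True,                        # as above; requires a GPU
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),  # stand-in for beir's BPRLoss
)
trainer.train()
```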
Citation (BibTeX):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
Base model: FacebookAI/xlm-roberta-base