Wabifocus committed
Commit 14ac905 · verified · 1 Parent(s): 4d00312

Upload folder using huggingface_hub

README.md CHANGED
@@ -1,10 +1,2528 @@
 ---
 license: apache-2.0
 language:
 - en
-base_model:
-- mixedbread-ai/mxbai-embed-large-v1
 pipeline_tag: feature-extraction
-library_name: mlx
 ---
-metadata
 ---
+tags:
+- mteb
+- transformers.js
+- transformers
+- mlx
 license: apache-2.0
 language:
 - en
+library_name: sentence-transformers
 pipeline_tag: feature-extraction
+model-index:
+- name: mxbai-angle-large-v1
+  results:
+  - task:
+      type: Classification
+    dataset:
+      name: MTEB AmazonCounterfactualClassification (en)
+      type: mteb/amazon_counterfactual
+      config: en
+      split: test
+      revision: e8379541af4e31359cca9fbcf4b00f2671dba205
+    metrics:
+    - type: accuracy
+      value: 75.044776119403
+    - type: ap
+      value: 37.7362433623053
+    - type: f1
+      value: 68.92736573359774
+  - task:
+      type: Classification
+    dataset:
+      name: MTEB AmazonPolarityClassification
+      type: mteb/amazon_polarity
+      config: default
+      split: test
+      revision: e2d317d38cd51312af73b3d32a06d1a08b442046
+    metrics:
+    - type: accuracy
+      value: 93.84025000000001
+    - type: ap
+      value: 90.93190875404055
+    - type: f1
+      value: 93.8297833897293
+  - task:
+      type: Classification
+    dataset:
+      name: MTEB AmazonReviewsClassification (en)
+      type: mteb/amazon_reviews_multi
+      config: en
+      split: test
+      revision: 1399c76144fd37290681b995c656ef9b2e06e26d
+    metrics:
+    - type: accuracy
+      value: 49.184
+    - type: f1
+      value: 48.74163227751588
+  - task:
+      type: Retrieval
+    dataset:
+      name: MTEB ArguAna
+      type: arguana
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 41.252
+    - type: map_at_10
+      value: 57.778
+    - type: map_at_100
+      value: 58.233000000000004
+    - type: map_at_1000
+      value: 58.23700000000001
+    - type: map_at_3
+      value: 53.449999999999996
+    - type: map_at_5
+      value: 56.376000000000005
+    - type: mrr_at_1
+      value: 41.679
+    - type: mrr_at_10
+      value: 57.92699999999999
+    - type: mrr_at_100
+      value: 58.389
+    - type: mrr_at_1000
+      value: 58.391999999999996
+    - type: mrr_at_3
+      value: 53.651
+    - type: mrr_at_5
+      value: 56.521
+    - type: ndcg_at_1
+      value: 41.252
+    - type: ndcg_at_10
+      value: 66.018
+    - type: ndcg_at_100
+      value: 67.774
+    - type: ndcg_at_1000
+      value: 67.84400000000001
+    - type: ndcg_at_3
+      value: 57.372
+    - type: ndcg_at_5
+      value: 62.646
+    - type: precision_at_1
+      value: 41.252
+    - type: precision_at_10
+      value: 9.189
+    - type: precision_at_100
+      value: 0.991
+    - type: precision_at_1000
+      value: 0.1
+    - type: precision_at_3
+      value: 22.902
+    - type: precision_at_5
+      value: 16.302
+    - type: recall_at_1
+      value: 41.252
+    - type: recall_at_10
+      value: 91.892
+    - type: recall_at_100
+      value: 99.14699999999999
+    - type: recall_at_1000
+      value: 99.644
+    - type: recall_at_3
+      value: 68.706
+    - type: recall_at_5
+      value: 81.50800000000001
+  - task:
+      type: Clustering
+    dataset:
+      name: MTEB ArxivClusteringP2P
+      type: mteb/arxiv-clustering-p2p
+      config: default
+      split: test
+      revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
+    metrics:
+    - type: v_measure
+      value: 48.97294504317859
+  - task:
+      type: Clustering
+    dataset:
+      name: MTEB ArxivClusteringS2S
+      type: mteb/arxiv-clustering-s2s
+      config: default
+      split: test
+      revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
+    metrics:
+    - type: v_measure
+      value: 42.98071077674629
+  - task:
+      type: Reranking
+    dataset:
+      name: MTEB AskUbuntuDupQuestions
+      type: mteb/askubuntudupquestions-reranking
+      config: default
+      split: test
+      revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
+    metrics:
+    - type: map
+      value: 65.16477858490782
+    - type: mrr
+      value: 78.23583080508287
+  - task:
+      type: STS
+    dataset:
+      name: MTEB BIOSSES
+      type: mteb/biosses-sts
+      config: default
+      split: test
+      revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
+    metrics:
+    - type: cos_sim_pearson
+      value: 89.6277629421789
+    - type: cos_sim_spearman
+      value: 88.4056288400568
+    - type: euclidean_pearson
+      value: 87.94871847578163
+    - type: euclidean_spearman
+      value: 88.4056288400568
+    - type: manhattan_pearson
+      value: 87.73271254229648
+    - type: manhattan_spearman
+      value: 87.91826833762677
+  - task:
+      type: Classification
+    dataset:
+      name: MTEB Banking77Classification
+      type: mteb/banking77
+      config: default
+      split: test
+      revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
+    metrics:
+    - type: accuracy
+      value: 87.81818181818181
+    - type: f1
+      value: 87.79879337316918
+  - task:
+      type: Clustering
+    dataset:
+      name: MTEB BiorxivClusteringP2P
+      type: mteb/biorxiv-clustering-p2p
+      config: default
+      split: test
+      revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
+    metrics:
+    - type: v_measure
+      value: 39.91773608582761
+  - task:
+      type: Clustering
+    dataset:
+      name: MTEB BiorxivClusteringS2S
+      type: mteb/biorxiv-clustering-s2s
+      config: default
+      split: test
+      revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
+    metrics:
+    - type: v_measure
+      value: 36.73059477462478
+  - task:
+      type: Retrieval
+    dataset:
+      name: MTEB CQADupstackAndroidRetrieval
+      type: BeIR/cqadupstack
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 32.745999999999995
+    - type: map_at_10
+      value: 43.632
+    - type: map_at_100
+      value: 45.206
+    - type: map_at_1000
+      value: 45.341
+    - type: map_at_3
+      value: 39.956
+    - type: map_at_5
+      value: 42.031
+    - type: mrr_at_1
+      value: 39.485
+    - type: mrr_at_10
+      value: 49.537
+    - type: mrr_at_100
+      value: 50.249
+    - type: mrr_at_1000
+      value: 50.294000000000004
+    - type: mrr_at_3
+      value: 46.757
+    - type: mrr_at_5
+      value: 48.481
+    - type: ndcg_at_1
+      value: 39.485
+    - type: ndcg_at_10
+      value: 50.058
+    - type: ndcg_at_100
+      value: 55.586
+    - type: ndcg_at_1000
+      value: 57.511
+    - type: ndcg_at_3
+      value: 44.786
+    - type: ndcg_at_5
+      value: 47.339999999999996
+    - type: precision_at_1
+      value: 39.485
+    - type: precision_at_10
+      value: 9.557
+    - type: precision_at_100
+      value: 1.552
+    - type: precision_at_1000
+      value: 0.202
+    - type: precision_at_3
+      value: 21.412
+    - type: precision_at_5
+      value: 15.479000000000001
+    - type: recall_at_1
+      value: 32.745999999999995
+    - type: recall_at_10
+      value: 62.056
+    - type: recall_at_100
+      value: 85.088
+    - type: recall_at_1000
+      value: 96.952
+    - type: recall_at_3
+      value: 46.959
+    - type: recall_at_5
+      value: 54.06999999999999
+    - type: map_at_1
+      value: 31.898
+    - type: map_at_10
+      value: 42.142
+    - type: map_at_100
+      value: 43.349
+    - type: map_at_1000
+      value: 43.483
+    - type: map_at_3
+      value: 39.18
+    - type: map_at_5
+      value: 40.733000000000004
+    - type: mrr_at_1
+      value: 39.617999999999995
+    - type: mrr_at_10
+      value: 47.922
+    - type: mrr_at_100
+      value: 48.547000000000004
+    - type: mrr_at_1000
+      value: 48.597
+    - type: mrr_at_3
+      value: 45.86
+    - type: mrr_at_5
+      value: 46.949000000000005
+    - type: ndcg_at_1
+      value: 39.617999999999995
+    - type: ndcg_at_10
+      value: 47.739
+    - type: ndcg_at_100
+      value: 51.934999999999995
+    - type: ndcg_at_1000
+      value: 54.007000000000005
+    - type: ndcg_at_3
+      value: 43.748
+    - type: ndcg_at_5
+      value: 45.345
+    - type: precision_at_1
+      value: 39.617999999999995
+    - type: precision_at_10
+      value: 8.962
+    - type: precision_at_100
+      value: 1.436
+    - type: precision_at_1000
+      value: 0.192
+    - type: precision_at_3
+      value: 21.083
+    - type: precision_at_5
+      value: 14.752
+    - type: recall_at_1
+      value: 31.898
+    - type: recall_at_10
+      value: 57.587999999999994
+    - type: recall_at_100
+      value: 75.323
+    - type: recall_at_1000
+      value: 88.304
+    - type: recall_at_3
+      value: 45.275
+    - type: recall_at_5
+      value: 49.99
+    - type: map_at_1
+      value: 40.458
+    - type: map_at_10
+      value: 52.942
+    - type: map_at_100
+      value: 53.974
+    - type: map_at_1000
+      value: 54.031
+    - type: map_at_3
+      value: 49.559999999999995
+    - type: map_at_5
+      value: 51.408
+    - type: mrr_at_1
+      value: 46.27
+    - type: mrr_at_10
+      value: 56.31699999999999
+    - type: mrr_at_100
+      value: 56.95099999999999
+    - type: mrr_at_1000
+      value: 56.98
+    - type: mrr_at_3
+      value: 53.835
+    - type: mrr_at_5
+      value: 55.252
+    - type: ndcg_at_1
+      value: 46.27
+    - type: ndcg_at_10
+      value: 58.964000000000006
+    - type: ndcg_at_100
+      value: 62.875
+    - type: ndcg_at_1000
+      value: 63.969
+    - type: ndcg_at_3
+      value: 53.297000000000004
+    - type: ndcg_at_5
+      value: 55.938
+    - type: precision_at_1
+      value: 46.27
+    - type: precision_at_10
+      value: 9.549000000000001
+    - type: precision_at_100
+      value: 1.2409999999999999
+    - type: precision_at_1000
+      value: 0.13799999999999998
+    - type: precision_at_3
+      value: 23.762
+    - type: precision_at_5
+      value: 16.262999999999998
+    - type: recall_at_1
+      value: 40.458
+    - type: recall_at_10
+      value: 73.446
+    - type: recall_at_100
+      value: 90.12400000000001
+    - type: recall_at_1000
+      value: 97.795
+    - type: recall_at_3
+      value: 58.123000000000005
+    - type: recall_at_5
+      value: 64.68
+    - type: map_at_1
+      value: 27.443
+    - type: map_at_10
+      value: 36.081
+    - type: map_at_100
+      value: 37.163000000000004
+    - type: map_at_1000
+      value: 37.232
+    - type: map_at_3
+      value: 33.308
+    - type: map_at_5
+      value: 34.724
+    - type: mrr_at_1
+      value: 29.492
+    - type: mrr_at_10
+      value: 38.138
+    - type: mrr_at_100
+      value: 39.065
+    - type: mrr_at_1000
+      value: 39.119
+    - type: mrr_at_3
+      value: 35.593
+    - type: mrr_at_5
+      value: 36.785000000000004
+    - type: ndcg_at_1
+      value: 29.492
+    - type: ndcg_at_10
+      value: 41.134
+    - type: ndcg_at_100
+      value: 46.300999999999995
+    - type: ndcg_at_1000
+      value: 48.106
+    - type: ndcg_at_3
+      value: 35.77
+    - type: ndcg_at_5
+      value: 38.032
+    - type: precision_at_1
+      value: 29.492
+    - type: precision_at_10
+      value: 6.249
+    - type: precision_at_100
+      value: 0.9299999999999999
+    - type: precision_at_1000
+      value: 0.11199999999999999
+    - type: precision_at_3
+      value: 15.065999999999999
+    - type: precision_at_5
+      value: 10.373000000000001
+    - type: recall_at_1
+      value: 27.443
+    - type: recall_at_10
+      value: 54.80199999999999
+    - type: recall_at_100
+      value: 78.21900000000001
+    - type: recall_at_1000
+      value: 91.751
+    - type: recall_at_3
+      value: 40.211000000000006
+    - type: recall_at_5
+      value: 45.599000000000004
+    - type: map_at_1
+      value: 18.731
+    - type: map_at_10
+      value: 26.717999999999996
+    - type: map_at_100
+      value: 27.897
+    - type: map_at_1000
+      value: 28.029
+    - type: map_at_3
+      value: 23.91
+    - type: map_at_5
+      value: 25.455
+    - type: mrr_at_1
+      value: 23.134
+    - type: mrr_at_10
+      value: 31.769
+    - type: mrr_at_100
+      value: 32.634
+    - type: mrr_at_1000
+      value: 32.707
+    - type: mrr_at_3
+      value: 28.938999999999997
+    - type: mrr_at_5
+      value: 30.531000000000002
+    - type: ndcg_at_1
+      value: 23.134
+    - type: ndcg_at_10
+      value: 32.249
+    - type: ndcg_at_100
+      value: 37.678
+    - type: ndcg_at_1000
+      value: 40.589999999999996
+    - type: ndcg_at_3
+      value: 26.985999999999997
+    - type: ndcg_at_5
+      value: 29.457
+    - type: precision_at_1
+      value: 23.134
+    - type: precision_at_10
+      value: 5.8709999999999996
+    - type: precision_at_100
+      value: 0.988
+    - type: precision_at_1000
+      value: 0.13799999999999998
+    - type: precision_at_3
+      value: 12.852
+    - type: precision_at_5
+      value: 9.428
+    - type: recall_at_1
+      value: 18.731
+    - type: recall_at_10
+      value: 44.419
+    - type: recall_at_100
+      value: 67.851
+    - type: recall_at_1000
+      value: 88.103
+    - type: recall_at_3
+      value: 29.919
+    - type: recall_at_5
+      value: 36.230000000000004
+    - type: map_at_1
+      value: 30.324
+    - type: map_at_10
+      value: 41.265
+    - type: map_at_100
+      value: 42.559000000000005
+    - type: map_at_1000
+      value: 42.669000000000004
+    - type: map_at_3
+      value: 38.138
+    - type: map_at_5
+      value: 39.881
+    - type: mrr_at_1
+      value: 36.67
+    - type: mrr_at_10
+      value: 46.774
+    - type: mrr_at_100
+      value: 47.554
+    - type: mrr_at_1000
+      value: 47.593
+    - type: mrr_at_3
+      value: 44.338
+    - type: mrr_at_5
+      value: 45.723
+    - type: ndcg_at_1
+      value: 36.67
+    - type: ndcg_at_10
+      value: 47.367
+    - type: ndcg_at_100
+      value: 52.623
+    - type: ndcg_at_1000
+      value: 54.59
+    - type: ndcg_at_3
+      value: 42.323
+    - type: ndcg_at_5
+      value: 44.727
+    - type: precision_at_1
+      value: 36.67
+    - type: precision_at_10
+      value: 8.518
+    - type: precision_at_100
+      value: 1.2890000000000001
+    - type: precision_at_1000
+      value: 0.163
+    - type: precision_at_3
+      value: 19.955000000000002
+    - type: precision_at_5
+      value: 14.11
+    - type: recall_at_1
+      value: 30.324
+    - type: recall_at_10
+      value: 59.845000000000006
+    - type: recall_at_100
+      value: 81.77499999999999
+    - type: recall_at_1000
+      value: 94.463
+    - type: recall_at_3
+      value: 46.019
+    - type: recall_at_5
+      value: 52.163000000000004
+    - type: map_at_1
+      value: 24.229
+    - type: map_at_10
+      value: 35.004000000000005
+    - type: map_at_100
+      value: 36.409000000000006
+    - type: map_at_1000
+      value: 36.521
+    - type: map_at_3
+      value: 31.793
+    - type: map_at_5
+      value: 33.432
+    - type: mrr_at_1
+      value: 30.365
+    - type: mrr_at_10
+      value: 40.502
+    - type: mrr_at_100
+      value: 41.372
+    - type: mrr_at_1000
+      value: 41.435
+    - type: mrr_at_3
+      value: 37.804
+    - type: mrr_at_5
+      value: 39.226
+    - type: ndcg_at_1
+      value: 30.365
+    - type: ndcg_at_10
+      value: 41.305
+    - type: ndcg_at_100
+      value: 47.028999999999996
+    - type: ndcg_at_1000
+      value: 49.375
+    - type: ndcg_at_3
+      value: 35.85
+    - type: ndcg_at_5
+      value: 38.12
+    - type: precision_at_1
+      value: 30.365
+    - type: precision_at_10
+      value: 7.808
+    - type: precision_at_100
+      value: 1.228
+    - type: precision_at_1000
+      value: 0.161
+    - type: precision_at_3
+      value: 17.352
+    - type: precision_at_5
+      value: 12.42
+    - type: recall_at_1
+      value: 24.229
+    - type: recall_at_10
+      value: 54.673
+    - type: recall_at_100
+      value: 78.766
+    - type: recall_at_1000
+      value: 94.625
+    - type: recall_at_3
+      value: 39.602
+    - type: recall_at_5
+      value: 45.558
+    - type: map_at_1
+      value: 26.695
+    - type: map_at_10
+      value: 36.0895
+    - type: map_at_100
+      value: 37.309416666666664
+    - type: map_at_1000
+      value: 37.42558333333334
+    - type: map_at_3
+      value: 33.19616666666666
+    - type: map_at_5
+      value: 34.78641666666667
+    - type: mrr_at_1
+      value: 31.486083333333337
+    - type: mrr_at_10
+      value: 40.34774999999999
+    - type: mrr_at_100
+      value: 41.17533333333333
+    - type: mrr_at_1000
+      value: 41.231583333333326
+    - type: mrr_at_3
+      value: 37.90075
+    - type: mrr_at_5
+      value: 39.266999999999996
+    - type: ndcg_at_1
+      value: 31.486083333333337
+    - type: ndcg_at_10
+      value: 41.60433333333334
+    - type: ndcg_at_100
+      value: 46.74525
+    - type: ndcg_at_1000
+      value: 48.96166666666667
+    - type: ndcg_at_3
+      value: 36.68825
+    - type: ndcg_at_5
+      value: 38.966499999999996
+    - type: precision_at_1
+      value: 31.486083333333337
+    - type: precision_at_10
+      value: 7.29675
+    - type: precision_at_100
+      value: 1.1621666666666666
+    - type: precision_at_1000
+      value: 0.1545
+    - type: precision_at_3
+      value: 16.8815
+    - type: precision_at_5
+      value: 11.974583333333333
+    - type: recall_at_1
+      value: 26.695
+    - type: recall_at_10
+      value: 53.651916666666665
+    - type: recall_at_100
+      value: 76.12083333333332
+    - type: recall_at_1000
+      value: 91.31191666666668
+    - type: recall_at_3
+      value: 40.03575
+    - type: recall_at_5
+      value: 45.876666666666665
+    - type: map_at_1
+      value: 25.668000000000003
+    - type: map_at_10
+      value: 32.486
+    - type: map_at_100
+      value: 33.371
+    - type: map_at_1000
+      value: 33.458
+    - type: map_at_3
+      value: 30.261
+    - type: map_at_5
+      value: 31.418000000000003
+    - type: mrr_at_1
+      value: 28.988000000000003
+    - type: mrr_at_10
+      value: 35.414
+    - type: mrr_at_100
+      value: 36.149
+    - type: mrr_at_1000
+      value: 36.215
+    - type: mrr_at_3
+      value: 33.333
+    - type: mrr_at_5
+      value: 34.43
+    - type: ndcg_at_1
+      value: 28.988000000000003
+    - type: ndcg_at_10
+      value: 36.732
+    - type: ndcg_at_100
+      value: 41.331
+    - type: ndcg_at_1000
+      value: 43.575
+    - type: ndcg_at_3
+      value: 32.413
+    - type: ndcg_at_5
+      value: 34.316
+    - type: precision_at_1
+      value: 28.988000000000003
+    - type: precision_at_10
+      value: 5.7059999999999995
+    - type: precision_at_100
+      value: 0.882
+    - type: precision_at_1000
+      value: 0.11299999999999999
+    - type: precision_at_3
+      value: 13.65
+    - type: precision_at_5
+      value: 9.417
+    - type: recall_at_1
+      value: 25.668000000000003
+    - type: recall_at_10
+      value: 47.147
+    - type: recall_at_100
+      value: 68.504
+    - type: recall_at_1000
+      value: 85.272
+    - type: recall_at_3
+      value: 35.19
+    - type: recall_at_5
+      value: 39.925
+    - type: map_at_1
+      value: 17.256
+    - type: map_at_10
+      value: 24.58
+    - type: map_at_100
+      value: 25.773000000000003
+    - type: map_at_1000
+      value: 25.899
+    - type: map_at_3
+      value: 22.236
+    - type: map_at_5
+      value: 23.507
+    - type: mrr_at_1
+      value: 20.957
+    - type: mrr_at_10
+      value: 28.416000000000004
+    - type: mrr_at_100
+      value: 29.447000000000003
+    - type: mrr_at_1000
+      value: 29.524
+    - type: mrr_at_3
+      value: 26.245
+    - type: mrr_at_5
+      value: 27.451999999999998
+    - type: ndcg_at_1
+      value: 20.957
+    - type: ndcg_at_10
+      value: 29.285
+    - type: ndcg_at_100
+      value: 35.003
+    - type: ndcg_at_1000
+      value: 37.881
+    - type: ndcg_at_3
+      value: 25.063000000000002
+    - type: ndcg_at_5
+      value: 26.983
+    - type: precision_at_1
+      value: 20.957
+    - type: precision_at_10
+      value: 5.344
+    - type: precision_at_100
+      value: 0.958
+    - type: precision_at_1000
+      value: 0.13799999999999998
+    - type: precision_at_3
+      value: 11.918
+    - type: precision_at_5
+      value: 8.596
+    - type: recall_at_1
+      value: 17.256
+    - type: recall_at_10
+      value: 39.644
+    - type: recall_at_100
+      value: 65.279
+    - type: recall_at_1000
+      value: 85.693
+    - type: recall_at_3
+      value: 27.825
+    - type: recall_at_5
+      value: 32.792
+    - type: map_at_1
+      value: 26.700000000000003
+    - type: map_at_10
+      value: 36.205999999999996
+    - type: map_at_100
+      value: 37.316
+    - type: map_at_1000
+      value: 37.425000000000004
+    - type: map_at_3
+      value: 33.166000000000004
+    - type: map_at_5
+      value: 35.032999999999994
+    - type: mrr_at_1
+      value: 31.436999999999998
+    - type: mrr_at_10
+      value: 40.61
+    - type: mrr_at_100
+      value: 41.415
+    - type: mrr_at_1000
+      value: 41.48
+    - type: mrr_at_3
+      value: 37.966
+    - type: mrr_at_5
+      value: 39.599000000000004
+    - type: ndcg_at_1
+      value: 31.436999999999998
+    - type: ndcg_at_10
+      value: 41.771
+    - type: ndcg_at_100
+      value: 46.784
+    - type: ndcg_at_1000
+      value: 49.183
+    - type: ndcg_at_3
+      value: 36.437000000000005
+    - type: ndcg_at_5
+      value: 39.291
+    - type: precision_at_1
+      value: 31.436999999999998
+    - type: precision_at_10
+      value: 6.987
+    - type: precision_at_100
+      value: 1.072
+    - type: precision_at_1000
+      value: 0.13899999999999998
+    - type: precision_at_3
+      value: 16.448999999999998
+    - type: precision_at_5
+      value: 11.866
+    - type: recall_at_1
+      value: 26.700000000000003
+    - type: recall_at_10
+      value: 54.301
+    - type: recall_at_100
+      value: 75.871
+    - type: recall_at_1000
+      value: 92.529
+    - type: recall_at_3
+      value: 40.201
+    - type: recall_at_5
+      value: 47.208
+    - type: map_at_1
+      value: 24.296
+    - type: map_at_10
+      value: 33.116
+    - type: map_at_100
+      value: 34.81
+    - type: map_at_1000
+      value: 35.032000000000004
+    - type: map_at_3
+      value: 30.105999999999998
+    - type: map_at_5
+      value: 31.839000000000002
+    - type: mrr_at_1
+      value: 29.051
+    - type: mrr_at_10
+      value: 37.803
+    - type: mrr_at_100
+      value: 38.856
+    - type: mrr_at_1000
+      value: 38.903999999999996
+    - type: mrr_at_3
+      value: 35.211
+    - type: mrr_at_5
+      value: 36.545
+    - type: ndcg_at_1
+      value: 29.051
+    - type: ndcg_at_10
+      value: 39.007
+    - type: ndcg_at_100
+      value: 45.321
+    - type: ndcg_at_1000
+      value: 47.665
+    - type: ndcg_at_3
+      value: 34.1
+    - type: ndcg_at_5
+      value: 36.437000000000005
+    - type: precision_at_1
+      value: 29.051
+    - type: precision_at_10
+      value: 7.668
+    - type: precision_at_100
+      value: 1.542
+    - type: precision_at_1000
+      value: 0.24
+    - type: precision_at_3
+      value: 16.14
+    - type: precision_at_5
+      value: 11.897
+    - type: recall_at_1
+      value: 24.296
+    - type: recall_at_10
+      value: 49.85
+    - type: recall_at_100
+      value: 78.457
+    - type: recall_at_1000
+      value: 92.618
+    - type: recall_at_3
+      value: 36.138999999999996
+    - type: recall_at_5
+      value: 42.223
+    - type: map_at_1
+      value: 20.591
+    - type: map_at_10
+      value: 28.902
+    - type: map_at_100
+      value: 29.886000000000003
+    - type: map_at_1000
+      value: 29.987000000000002
+    - type: map_at_3
+      value: 26.740000000000002
+    - type: map_at_5
+      value: 27.976
+    - type: mrr_at_1
+      value: 22.366
+    - type: mrr_at_10
+      value: 30.971
+    - type: mrr_at_100
+      value: 31.865
+    - type: mrr_at_1000
+      value: 31.930999999999997
+    - type: mrr_at_3
+      value: 28.927999999999997
+    - type: mrr_at_5
+      value: 30.231
+    - type: ndcg_at_1
+      value: 22.366
+    - type: ndcg_at_10
+      value: 33.641
+    - type: ndcg_at_100
+      value: 38.477
+    - type: ndcg_at_1000
+      value: 41.088
+    - type: ndcg_at_3
+      value: 29.486
+    - type: ndcg_at_5
+      value: 31.612000000000002
+    - type: precision_at_1
+      value: 22.366
+    - type: precision_at_10
+      value: 5.3420000000000005
+    - type: precision_at_100
+      value: 0.828
+    - type: precision_at_1000
+      value: 0.11800000000000001
+    - type: precision_at_3
+      value: 12.939
+    - type: precision_at_5
+      value: 9.094
+    - type: recall_at_1
+      value: 20.591
+    - type: recall_at_10
+      value: 46.052
+    - type: recall_at_100
+      value: 68.193
+    - type: recall_at_1000
+      value: 87.638
+    - type: recall_at_3
+      value: 34.966
+    - type: recall_at_5
+      value: 40.082
+  - task:
+      type: Retrieval
+    dataset:
+      name: MTEB ClimateFEVER
+      type: climate-fever
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 15.091
+    - type: map_at_10
+      value: 26.38
+    - type: map_at_100
+      value: 28.421999999999997
+    - type: map_at_1000
+      value: 28.621999999999996
+    - type: map_at_3
+      value: 21.597
+    - type: map_at_5
+      value: 24.12
+    - type: mrr_at_1
+      value: 34.266999999999996
+    - type: mrr_at_10
+      value: 46.864
+    - type: mrr_at_100
+      value: 47.617
+    - type: mrr_at_1000
+      value: 47.644
+    - type: mrr_at_3
+      value: 43.312
+    - type: mrr_at_5
+      value: 45.501000000000005
+    - type: ndcg_at_1
+      value: 34.266999999999996
+    - type: ndcg_at_10
+      value: 36.095
+    - type: ndcg_at_100
+      value: 43.447
+    - type: ndcg_at_1000
+      value: 46.661
+    - type: ndcg_at_3
+      value: 29.337999999999997
+    - type: ndcg_at_5
+      value: 31.824
+    - type: precision_at_1
+      value: 34.266999999999996
+    - type: precision_at_10
+      value: 11.472
+    - type: precision_at_100
+      value: 1.944
+    - type: precision_at_1000
+      value: 0.255
+    - type: precision_at_3
+      value: 21.933
+    - type: precision_at_5
+      value: 17.224999999999998
+    - type: recall_at_1
+      value: 15.091
+    - type: recall_at_10
+      value: 43.022
+    - type: recall_at_100
+      value: 68.075
+    - type: recall_at_1000
+      value: 85.76
+    - type: recall_at_3
+      value: 26.564
+    - type: recall_at_5
+      value: 33.594
+  - task:
+      type: Retrieval
+    dataset:
+      name: MTEB DBPedia
+      type: dbpedia-entity
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 9.252
+    - type: map_at_10
+      value: 20.923
+    - type: map_at_100
+      value: 30.741000000000003
+    - type: map_at_1000
+      value: 32.542
+    - type: map_at_3
+      value: 14.442
+    - type: map_at_5
+      value: 17.399
+    - type: mrr_at_1
+      value: 70.25
+    - type: mrr_at_10
+      value: 78.17
+    - type: mrr_at_100
+      value: 78.444
+    - type: mrr_at_1000
+      value: 78.45100000000001
+    - type: mrr_at_3
+      value: 76.958
+    - type: mrr_at_5
+      value: 77.571
+    - type: ndcg_at_1
+      value: 58.375
+    - type: ndcg_at_10
+      value: 44.509
+    - type: ndcg_at_100
+      value: 49.897999999999996
+    - type: ndcg_at_1000
+      value: 57.269999999999996
+    - type: ndcg_at_3
+      value: 48.64
+    - type: ndcg_at_5
+      value: 46.697
+    - type: precision_at_1
+      value: 70.25
+    - type: precision_at_10
+      value: 36.05
+    - type: precision_at_100
+      value: 11.848
+    - type: precision_at_1000
+      value: 2.213
+    - type: precision_at_3
+      value: 52.917
+    - type: precision_at_5
+      value: 45.7
+    - type: recall_at_1
+      value: 9.252
+    - type: recall_at_10
+      value: 27.006999999999998
+    - type: recall_at_100
+      value: 57.008
+    - type: recall_at_1000
+      value: 80.697
+    - type: recall_at_3
+      value: 15.798000000000002
+    - type: recall_at_5
+      value: 20.4
+  - task:
+      type: Classification
+    dataset:
+      name: MTEB EmotionClassification
+      type: mteb/emotion
+      config: default
+      split: test
+      revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
+    metrics:
+    - type: accuracy
+      value: 50.88
+    - type: f1
1157
+ value: 45.545495028653384
1158
+ - task:
1159
+ type: Retrieval
1160
+ dataset:
1161
+ name: MTEB FEVER
1162
+ type: fever
1163
+ config: default
1164
+ split: test
1165
+ revision: None
1166
+ metrics:
1167
+ - type: map_at_1
1168
+ value: 75.424
1169
+ - type: map_at_10
1170
+ value: 83.435
1171
+ - type: map_at_100
1172
+ value: 83.66900000000001
1173
+ - type: map_at_1000
1174
+ value: 83.685
1175
+ - type: map_at_3
1176
+ value: 82.39800000000001
1177
+ - type: map_at_5
1178
+ value: 83.07
1179
+ - type: mrr_at_1
1180
+ value: 81.113
1181
+ - type: mrr_at_10
1182
+ value: 87.77199999999999
1183
+ - type: mrr_at_100
1184
+ value: 87.862
1185
+ - type: mrr_at_1000
1186
+ value: 87.86500000000001
1187
+ - type: mrr_at_3
1188
+ value: 87.17099999999999
1189
+ - type: mrr_at_5
1190
+ value: 87.616
1191
+ - type: ndcg_at_1
1192
+ value: 81.113
1193
+ - type: ndcg_at_10
1194
+ value: 86.909
1195
+ - type: ndcg_at_100
1196
+ value: 87.746
1197
+ - type: ndcg_at_1000
1198
+ value: 88.017
1199
+ - type: ndcg_at_3
1200
+ value: 85.368
1201
+ - type: ndcg_at_5
1202
+ value: 86.28099999999999
1203
+ - type: precision_at_1
1204
+ value: 81.113
1205
+ - type: precision_at_10
1206
+ value: 10.363
1207
+ - type: precision_at_100
1208
+ value: 1.102
1209
+ - type: precision_at_1000
1210
+ value: 0.11399999999999999
1211
+ - type: precision_at_3
1212
+ value: 32.507999999999996
1213
+ - type: precision_at_5
1214
+ value: 20.138
1215
+ - type: recall_at_1
1216
+ value: 75.424
1217
+ - type: recall_at_10
1218
+ value: 93.258
1219
+ - type: recall_at_100
1220
+ value: 96.545
1221
+ - type: recall_at_1000
1222
+ value: 98.284
1223
+ - type: recall_at_3
1224
+ value: 89.083
1225
+ - type: recall_at_5
1226
+ value: 91.445
1227
+ - task:
1228
+ type: Retrieval
1229
+ dataset:
1230
+ name: MTEB FiQA2018
1231
+ type: fiqa
1232
+ config: default
1233
+ split: test
1234
+ revision: None
1235
+ metrics:
1236
+ - type: map_at_1
1237
+ value: 22.532
1238
+ - type: map_at_10
1239
+ value: 37.141999999999996
1240
+ - type: map_at_100
1241
+ value: 39.162
1242
+ - type: map_at_1000
1243
+ value: 39.322
1244
+ - type: map_at_3
1245
+ value: 32.885
1246
+ - type: map_at_5
1247
+ value: 35.093999999999994
1248
+ - type: mrr_at_1
1249
+ value: 44.29
1250
+ - type: mrr_at_10
1251
+ value: 53.516
1252
+ - type: mrr_at_100
1253
+ value: 54.24
1254
+ - type: mrr_at_1000
1255
+ value: 54.273
1256
+ - type: mrr_at_3
1257
+ value: 51.286
1258
+ - type: mrr_at_5
1259
+ value: 52.413
1260
+ - type: ndcg_at_1
1261
+ value: 44.29
1262
+ - type: ndcg_at_10
1263
+ value: 45.268
1264
+ - type: ndcg_at_100
1265
+ value: 52.125
1266
+ - type: ndcg_at_1000
1267
+ value: 54.778000000000006
1268
+ - type: ndcg_at_3
1269
+ value: 41.829
1270
+ - type: ndcg_at_5
1271
+ value: 42.525
1272
+ - type: precision_at_1
1273
+ value: 44.29
1274
+ - type: precision_at_10
1275
+ value: 12.5
1276
+ - type: precision_at_100
1277
+ value: 1.9720000000000002
1278
+ - type: precision_at_1000
1279
+ value: 0.245
1280
+ - type: precision_at_3
1281
+ value: 28.035
1282
+ - type: precision_at_5
1283
+ value: 20.093
1284
+ - type: recall_at_1
1285
+ value: 22.532
1286
+ - type: recall_at_10
1287
+ value: 52.419000000000004
1288
+ - type: recall_at_100
1289
+ value: 77.43299999999999
1290
+ - type: recall_at_1000
1291
+ value: 93.379
1292
+ - type: recall_at_3
1293
+ value: 38.629000000000005
1294
+ - type: recall_at_5
1295
+ value: 43.858000000000004
1296
+ - task:
1297
+ type: Retrieval
1298
+ dataset:
1299
+ name: MTEB HotpotQA
1300
+ type: hotpotqa
1301
+ config: default
1302
+ split: test
1303
+ revision: None
1304
+ metrics:
1305
+ - type: map_at_1
1306
+ value: 39.359
1307
+ - type: map_at_10
1308
+ value: 63.966
1309
+ - type: map_at_100
1310
+ value: 64.87
1311
+ - type: map_at_1000
1312
+ value: 64.92599999999999
1313
+ - type: map_at_3
1314
+ value: 60.409
1315
+ - type: map_at_5
1316
+ value: 62.627
1317
+ - type: mrr_at_1
1318
+ value: 78.717
1319
+ - type: mrr_at_10
1320
+ value: 84.468
1321
+ - type: mrr_at_100
1322
+ value: 84.655
1323
+ - type: mrr_at_1000
1324
+ value: 84.661
1325
+ - type: mrr_at_3
1326
+ value: 83.554
1327
+ - type: mrr_at_5
1328
+ value: 84.133
1329
+ - type: ndcg_at_1
1330
+ value: 78.717
1331
+ - type: ndcg_at_10
1332
+ value: 72.03399999999999
1333
+ - type: ndcg_at_100
1334
+ value: 75.158
1335
+ - type: ndcg_at_1000
1336
+ value: 76.197
1337
+ - type: ndcg_at_3
1338
+ value: 67.049
1339
+ - type: ndcg_at_5
1340
+ value: 69.808
1341
+ - type: precision_at_1
1342
+ value: 78.717
1343
+ - type: precision_at_10
1344
+ value: 15.201
1345
+ - type: precision_at_100
1346
+ value: 1.764
1347
+ - type: precision_at_1000
1348
+ value: 0.19
1349
+ - type: precision_at_3
1350
+ value: 43.313
1351
+ - type: precision_at_5
1352
+ value: 28.165000000000003
1353
+ - type: recall_at_1
1354
+ value: 39.359
1355
+ - type: recall_at_10
1356
+ value: 76.003
1357
+ - type: recall_at_100
1358
+ value: 88.197
1359
+ - type: recall_at_1000
1360
+ value: 95.003
1361
+ - type: recall_at_3
1362
+ value: 64.97
1363
+ - type: recall_at_5
1364
+ value: 70.41199999999999
1365
+ - task:
1366
+ type: Classification
1367
+ dataset:
1368
+ name: MTEB ImdbClassification
1369
+ type: mteb/imdb
1370
+ config: default
1371
+ split: test
1372
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1373
+ metrics:
1374
+ - type: accuracy
1375
+ value: 92.83200000000001
1376
+ - type: ap
1377
+ value: 89.33560571859861
1378
+ - type: f1
1379
+ value: 92.82322915005167
1380
+ - task:
1381
+ type: Retrieval
1382
+ dataset:
1383
+ name: MTEB MSMARCO
1384
+ type: msmarco
1385
+ config: default
1386
+ split: dev
1387
+ revision: None
1388
+ metrics:
1389
+ - type: map_at_1
1390
+ value: 21.983
1391
+ - type: map_at_10
1392
+ value: 34.259
1393
+ - type: map_at_100
1394
+ value: 35.432
1395
+ - type: map_at_1000
1396
+ value: 35.482
1397
+ - type: map_at_3
1398
+ value: 30.275999999999996
1399
+ - type: map_at_5
1400
+ value: 32.566
1401
+ - type: mrr_at_1
1402
+ value: 22.579
1403
+ - type: mrr_at_10
1404
+ value: 34.882999999999996
1405
+ - type: mrr_at_100
1406
+ value: 35.984
1407
+ - type: mrr_at_1000
1408
+ value: 36.028
1409
+ - type: mrr_at_3
1410
+ value: 30.964999999999996
1411
+ - type: mrr_at_5
1412
+ value: 33.245000000000005
1413
+ - type: ndcg_at_1
1414
+ value: 22.564
1415
+ - type: ndcg_at_10
1416
+ value: 41.258
1417
+ - type: ndcg_at_100
1418
+ value: 46.824
1419
+ - type: ndcg_at_1000
1420
+ value: 48.037
1421
+ - type: ndcg_at_3
1422
+ value: 33.17
1423
+ - type: ndcg_at_5
1424
+ value: 37.263000000000005
1425
+ - type: precision_at_1
1426
+ value: 22.564
1427
+ - type: precision_at_10
1428
+ value: 6.572
1429
+ - type: precision_at_100
1430
+ value: 0.935
1431
+ - type: precision_at_1000
1432
+ value: 0.104
1433
+ - type: precision_at_3
1434
+ value: 14.130999999999998
1435
+ - type: precision_at_5
1436
+ value: 10.544
1437
+ - type: recall_at_1
1438
+ value: 21.983
1439
+ - type: recall_at_10
1440
+ value: 62.775000000000006
1441
+ - type: recall_at_100
1442
+ value: 88.389
1443
+ - type: recall_at_1000
1444
+ value: 97.603
1445
+ - type: recall_at_3
1446
+ value: 40.878
1447
+ - type: recall_at_5
1448
+ value: 50.690000000000005
1449
+ - task:
1450
+ type: Classification
1451
+ dataset:
1452
+ name: MTEB MTOPDomainClassification (en)
1453
+ type: mteb/mtop_domain
1454
+ config: en
1455
+ split: test
1456
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1457
+ metrics:
1458
+ - type: accuracy
1459
+ value: 93.95120839033288
1460
+ - type: f1
1461
+ value: 93.73824125055208
1462
+ - task:
1463
+ type: Classification
1464
+ dataset:
1465
+ name: MTEB MTOPIntentClassification (en)
1466
+ type: mteb/mtop_intent
1467
+ config: en
1468
+ split: test
1469
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1470
+ metrics:
1471
+ - type: accuracy
1472
+ value: 76.78978568171455
1473
+ - type: f1
1474
+ value: 57.50180552858304
1475
+ - task:
1476
+ type: Classification
1477
+ dataset:
1478
+ name: MTEB MassiveIntentClassification (en)
1479
+ type: mteb/amazon_massive_intent
1480
+ config: en
1481
+ split: test
1482
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1483
+ metrics:
1484
+ - type: accuracy
1485
+ value: 76.24411566913248
1486
+ - type: f1
1487
+ value: 74.37851403532832
1488
+ - task:
1489
+ type: Classification
1490
+ dataset:
1491
+ name: MTEB MassiveScenarioClassification (en)
1492
+ type: mteb/amazon_massive_scenario
1493
+ config: en
1494
+ split: test
1495
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1496
+ metrics:
1497
+ - type: accuracy
1498
+ value: 79.94620040349699
1499
+ - type: f1
1500
+ value: 80.21293397970435
1501
+ - task:
1502
+ type: Clustering
1503
+ dataset:
1504
+ name: MTEB MedrxivClusteringP2P
1505
+ type: mteb/medrxiv-clustering-p2p
1506
+ config: default
1507
+ split: test
1508
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1509
+ metrics:
1510
+ - type: v_measure
1511
+ value: 33.44403096245675
1512
+ - task:
1513
+ type: Clustering
1514
+ dataset:
1515
+ name: MTEB MedrxivClusteringS2S
1516
+ type: mteb/medrxiv-clustering-s2s
1517
+ config: default
1518
+ split: test
1519
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1520
+ metrics:
1521
+ - type: v_measure
1522
+ value: 31.659594631336812
1523
+ - task:
1524
+ type: Reranking
1525
+ dataset:
1526
+ name: MTEB MindSmallReranking
1527
+ type: mteb/mind_small
1528
+ config: default
1529
+ split: test
1530
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1531
+ metrics:
1532
+ - type: map
1533
+ value: 32.53833075108798
1534
+ - type: mrr
1535
+ value: 33.78840823218308
1536
+ - task:
1537
+ type: Retrieval
1538
+ dataset:
1539
+ name: MTEB NFCorpus
1540
+ type: nfcorpus
1541
+ config: default
1542
+ split: test
1543
+ revision: None
1544
+ metrics:
1545
+ - type: map_at_1
1546
+ value: 7.185999999999999
1547
+ - type: map_at_10
1548
+ value: 15.193999999999999
1549
+ - type: map_at_100
1550
+ value: 19.538
1551
+ - type: map_at_1000
1552
+ value: 21.178
1553
+ - type: map_at_3
1554
+ value: 11.208
1555
+ - type: map_at_5
1556
+ value: 12.745999999999999
1557
+ - type: mrr_at_1
1558
+ value: 48.916
1559
+ - type: mrr_at_10
1560
+ value: 58.141
1561
+ - type: mrr_at_100
1562
+ value: 58.656
1563
+ - type: mrr_at_1000
1564
+ value: 58.684999999999995
1565
+ - type: mrr_at_3
1566
+ value: 55.521
1567
+ - type: mrr_at_5
1568
+ value: 57.239
1569
+ - type: ndcg_at_1
1570
+ value: 47.059
1571
+ - type: ndcg_at_10
1572
+ value: 38.644
1573
+ - type: ndcg_at_100
1574
+ value: 36.272999999999996
1575
+ - type: ndcg_at_1000
1576
+ value: 44.996
1577
+ - type: ndcg_at_3
1578
+ value: 43.293
1579
+ - type: ndcg_at_5
1580
+ value: 40.819
1581
+ - type: precision_at_1
1582
+ value: 48.916
1583
+ - type: precision_at_10
1584
+ value: 28.607
1585
+ - type: precision_at_100
1586
+ value: 9.195
1587
+ - type: precision_at_1000
1588
+ value: 2.225
1589
+ - type: precision_at_3
1590
+ value: 40.454
1591
+ - type: precision_at_5
1592
+ value: 34.985
1593
+ - type: recall_at_1
1594
+ value: 7.185999999999999
1595
+ - type: recall_at_10
1596
+ value: 19.654
1597
+ - type: recall_at_100
1598
+ value: 37.224000000000004
1599
+ - type: recall_at_1000
1600
+ value: 68.663
1601
+ - type: recall_at_3
1602
+ value: 12.158
1603
+ - type: recall_at_5
1604
+ value: 14.674999999999999
1605
+ - task:
1606
+ type: Retrieval
1607
+ dataset:
1608
+ name: MTEB NQ
1609
+ type: nq
1610
+ config: default
1611
+ split: test
1612
+ revision: None
1613
+ metrics:
1614
+ - type: map_at_1
1615
+ value: 31.552000000000003
1616
+ - type: map_at_10
1617
+ value: 47.75
1618
+ - type: map_at_100
1619
+ value: 48.728
1620
+ - type: map_at_1000
1621
+ value: 48.754
1622
+ - type: map_at_3
1623
+ value: 43.156
1624
+ - type: map_at_5
1625
+ value: 45.883
1626
+ - type: mrr_at_1
1627
+ value: 35.66
1628
+ - type: mrr_at_10
1629
+ value: 50.269
1630
+ - type: mrr_at_100
1631
+ value: 50.974
1632
+ - type: mrr_at_1000
1633
+ value: 50.991
1634
+ - type: mrr_at_3
1635
+ value: 46.519
1636
+ - type: mrr_at_5
1637
+ value: 48.764
1638
+ - type: ndcg_at_1
1639
+ value: 35.632000000000005
1640
+ - type: ndcg_at_10
1641
+ value: 55.786
1642
+ - type: ndcg_at_100
1643
+ value: 59.748999999999995
1644
+ - type: ndcg_at_1000
1645
+ value: 60.339
1646
+ - type: ndcg_at_3
1647
+ value: 47.292
1648
+ - type: ndcg_at_5
1649
+ value: 51.766999999999996
1650
+ - type: precision_at_1
1651
+ value: 35.632000000000005
1652
+ - type: precision_at_10
1653
+ value: 9.267
1654
+ - type: precision_at_100
1655
+ value: 1.149
1656
+ - type: precision_at_1000
1657
+ value: 0.12
1658
+ - type: precision_at_3
1659
+ value: 21.601
1660
+ - type: precision_at_5
1661
+ value: 15.539
1662
+ - type: recall_at_1
1663
+ value: 31.552000000000003
1664
+ - type: recall_at_10
1665
+ value: 77.62400000000001
1666
+ - type: recall_at_100
1667
+ value: 94.527
1668
+ - type: recall_at_1000
1669
+ value: 98.919
1670
+ - type: recall_at_3
1671
+ value: 55.898
1672
+ - type: recall_at_5
1673
+ value: 66.121
1674
+ - task:
1675
+ type: Retrieval
1676
+ dataset:
1677
+ name: MTEB QuoraRetrieval
1678
+ type: quora
1679
+ config: default
1680
+ split: test
1681
+ revision: None
1682
+ metrics:
1683
+ - type: map_at_1
1684
+ value: 71.414
1685
+ - type: map_at_10
1686
+ value: 85.37400000000001
1687
+ - type: map_at_100
1688
+ value: 86.01100000000001
1689
+ - type: map_at_1000
1690
+ value: 86.027
1691
+ - type: map_at_3
1692
+ value: 82.562
1693
+ - type: map_at_5
1694
+ value: 84.284
1695
+ - type: mrr_at_1
1696
+ value: 82.24000000000001
1697
+ - type: mrr_at_10
1698
+ value: 88.225
1699
+ - type: mrr_at_100
1700
+ value: 88.324
1701
+ - type: mrr_at_1000
1702
+ value: 88.325
1703
+ - type: mrr_at_3
1704
+ value: 87.348
1705
+ - type: mrr_at_5
1706
+ value: 87.938
1707
+ - type: ndcg_at_1
1708
+ value: 82.24000000000001
1709
+ - type: ndcg_at_10
1710
+ value: 88.97699999999999
1711
+ - type: ndcg_at_100
1712
+ value: 90.16
1713
+ - type: ndcg_at_1000
1714
+ value: 90.236
1715
+ - type: ndcg_at_3
1716
+ value: 86.371
1717
+ - type: ndcg_at_5
1718
+ value: 87.746
1719
+ - type: precision_at_1
1720
+ value: 82.24000000000001
1721
+ - type: precision_at_10
1722
+ value: 13.481000000000002
1723
+ - type: precision_at_100
1724
+ value: 1.534
1725
+ - type: precision_at_1000
1726
+ value: 0.157
1727
+ - type: precision_at_3
1728
+ value: 37.86
1729
+ - type: precision_at_5
1730
+ value: 24.738
1731
+ - type: recall_at_1
1732
+ value: 71.414
1733
+ - type: recall_at_10
1734
+ value: 95.735
1735
+ - type: recall_at_100
1736
+ value: 99.696
1737
+ - type: recall_at_1000
1738
+ value: 99.979
1739
+ - type: recall_at_3
1740
+ value: 88.105
1741
+ - type: recall_at_5
1742
+ value: 92.17999999999999
1743
+ - task:
1744
+ type: Clustering
1745
+ dataset:
1746
+ name: MTEB RedditClustering
1747
+ type: mteb/reddit-clustering
1748
+ config: default
1749
+ split: test
1750
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1751
+ metrics:
1752
+ - type: v_measure
1753
+ value: 60.22146692057259
1754
+ - task:
1755
+ type: Clustering
1756
+ dataset:
1757
+ name: MTEB RedditClusteringP2P
1758
+ type: mteb/reddit-clustering-p2p
1759
+ config: default
1760
+ split: test
1761
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1762
+ metrics:
1763
+ - type: v_measure
1764
+ value: 65.29273320614578
1765
+ - task:
1766
+ type: Retrieval
1767
+ dataset:
1768
+ name: MTEB SCIDOCS
1769
+ type: scidocs
1770
+ config: default
1771
+ split: test
1772
+ revision: None
1773
+ metrics:
1774
+ - type: map_at_1
1775
+ value: 5.023
1776
+ - type: map_at_10
1777
+ value: 14.161000000000001
1778
+ - type: map_at_100
1779
+ value: 16.68
1780
+ - type: map_at_1000
1781
+ value: 17.072000000000003
1782
+ - type: map_at_3
1783
+ value: 9.763
1784
+ - type: map_at_5
1785
+ value: 11.977
1786
+ - type: mrr_at_1
1787
+ value: 24.8
1788
+ - type: mrr_at_10
1789
+ value: 37.602999999999994
1790
+ - type: mrr_at_100
1791
+ value: 38.618
1792
+ - type: mrr_at_1000
1793
+ value: 38.659
1794
+ - type: mrr_at_3
1795
+ value: 34.117
1796
+ - type: mrr_at_5
1797
+ value: 36.082
1798
+ - type: ndcg_at_1
1799
+ value: 24.8
1800
+ - type: ndcg_at_10
1801
+ value: 23.316
1802
+ - type: ndcg_at_100
1803
+ value: 32.613
1804
+ - type: ndcg_at_1000
1805
+ value: 38.609
1806
+ - type: ndcg_at_3
1807
+ value: 21.697
1808
+ - type: ndcg_at_5
1809
+ value: 19.241
1810
+ - type: precision_at_1
1811
+ value: 24.8
1812
+ - type: precision_at_10
1813
+ value: 12.36
1814
+ - type: precision_at_100
1815
+ value: 2.593
1816
+ - type: precision_at_1000
1817
+ value: 0.402
1818
+ - type: precision_at_3
1819
+ value: 20.767
1820
+ - type: precision_at_5
1821
+ value: 17.34
1822
+ - type: recall_at_1
1823
+ value: 5.023
1824
+ - type: recall_at_10
1825
+ value: 25.069999999999997
1826
+ - type: recall_at_100
1827
+ value: 52.563
1828
+ - type: recall_at_1000
1829
+ value: 81.525
1830
+ - type: recall_at_3
1831
+ value: 12.613
1832
+ - type: recall_at_5
1833
+ value: 17.583
1834
+ - task:
1835
+ type: STS
1836
+ dataset:
1837
+ name: MTEB SICK-R
1838
+ type: mteb/sickr-sts
1839
+ config: default
1840
+ split: test
1841
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1842
+ metrics:
1843
+ - type: cos_sim_pearson
1844
+ value: 87.71506247604255
1845
+ - type: cos_sim_spearman
1846
+ value: 82.91813463738802
1847
+ - type: euclidean_pearson
1848
+ value: 85.5154616194479
1849
+ - type: euclidean_spearman
1850
+ value: 82.91815254466314
1851
+ - type: manhattan_pearson
1852
+ value: 85.5280917850374
1853
+ - type: manhattan_spearman
1854
+ value: 82.92276537286398
1855
+ - task:
1856
+ type: STS
1857
+ dataset:
1858
+ name: MTEB STS12
1859
+ type: mteb/sts12-sts
1860
+ config: default
1861
+ split: test
1862
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1863
+ metrics:
1864
+ - type: cos_sim_pearson
1865
+ value: 87.43772054228462
1866
+ - type: cos_sim_spearman
1867
+ value: 78.75750601716682
1868
+ - type: euclidean_pearson
1869
+ value: 85.76074482955764
1870
+ - type: euclidean_spearman
1871
+ value: 78.75651057223058
1872
+ - type: manhattan_pearson
1873
+ value: 85.73390291701668
1874
+ - type: manhattan_spearman
1875
+ value: 78.72699385957797
1876
+ - task:
1877
+ type: STS
1878
+ dataset:
1879
+ name: MTEB STS13
1880
+ type: mteb/sts13-sts
1881
+ config: default
1882
+ split: test
1883
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1884
+ metrics:
1885
+ - type: cos_sim_pearson
1886
+ value: 89.58144067172472
1887
+ - type: cos_sim_spearman
1888
+ value: 90.3524512966946
1889
+ - type: euclidean_pearson
1890
+ value: 89.71365391594237
1891
+ - type: euclidean_spearman
1892
+ value: 90.35239632843408
1893
+ - type: manhattan_pearson
1894
+ value: 89.66905421746478
1895
+ - type: manhattan_spearman
1896
+ value: 90.31508211683513
1897
+ - task:
1898
+ type: STS
1899
+ dataset:
1900
+ name: MTEB STS14
1901
+ type: mteb/sts14-sts
1902
+ config: default
1903
+ split: test
1904
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
1905
+ metrics:
1906
+ - type: cos_sim_pearson
1907
+ value: 87.77692637102102
1908
+ - type: cos_sim_spearman
1909
+ value: 85.45710562643485
1910
+ - type: euclidean_pearson
1911
+ value: 87.42456979928723
1912
+ - type: euclidean_spearman
1913
+ value: 85.45709386240908
1914
+ - type: manhattan_pearson
1915
+ value: 87.40754529526272
1916
+ - type: manhattan_spearman
1917
+ value: 85.44834854173303
1918
+ - task:
1919
+ type: STS
1920
+ dataset:
1921
+ name: MTEB STS15
1922
+ type: mteb/sts15-sts
1923
+ config: default
1924
+ split: test
1925
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
1926
+ metrics:
1927
+ - type: cos_sim_pearson
1928
+ value: 88.28491331695997
1929
+ - type: cos_sim_spearman
1930
+ value: 89.62037029566964
1931
+ - type: euclidean_pearson
1932
+ value: 89.02479391362826
1933
+ - type: euclidean_spearman
1934
+ value: 89.62036733618466
1935
+ - type: manhattan_pearson
1936
+ value: 89.00394756040342
1937
+ - type: manhattan_spearman
1938
+ value: 89.60867744215236
1939
+ - task:
1940
+ type: STS
1941
+ dataset:
1942
+ name: MTEB STS16
1943
+ type: mteb/sts16-sts
1944
+ config: default
1945
+ split: test
1946
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
1947
+ metrics:
1948
+ - type: cos_sim_pearson
1949
+ value: 85.08911381280191
1950
+ - type: cos_sim_spearman
1951
+ value: 86.5791780765767
1952
+ - type: euclidean_pearson
1953
+ value: 86.16063473577861
1954
+ - type: euclidean_spearman
1955
+ value: 86.57917745378766
1956
+ - type: manhattan_pearson
1957
+ value: 86.13677924604175
1958
+ - type: manhattan_spearman
1959
+ value: 86.56115615768685
1960
+ - task:
1961
+ type: STS
1962
+ dataset:
1963
+ name: MTEB STS17 (en-en)
1964
+ type: mteb/sts17-crosslingual-sts
1965
+ config: en-en
1966
+ split: test
1967
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
1968
+ metrics:
1969
+ - type: cos_sim_pearson
1970
+ value: 89.58029496205235
1971
+ - type: cos_sim_spearman
1972
+ value: 89.49551253826998
1973
+ - type: euclidean_pearson
1974
+ value: 90.13714840963748
1975
+ - type: euclidean_spearman
1976
+ value: 89.49551253826998
1977
+ - type: manhattan_pearson
1978
+ value: 90.13039633601363
1979
+ - type: manhattan_spearman
1980
+ value: 89.4513453745516
1981
+ - task:
1982
+ type: STS
1983
+ dataset:
1984
+ name: MTEB STS22 (en)
1985
+ type: mteb/sts22-crosslingual-sts
1986
+ config: en
1987
+ split: test
1988
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
1989
+ metrics:
1990
+ - type: cos_sim_pearson
1991
+ value: 69.01546399666435
1992
+ - type: cos_sim_spearman
1993
+ value: 69.33824484595624
1994
+ - type: euclidean_pearson
1995
+ value: 70.76511642998874
1996
+ - type: euclidean_spearman
1997
+ value: 69.33824484595624
1998
+ - type: manhattan_pearson
1999
+ value: 70.84320785047453
2000
+ - type: manhattan_spearman
2001
+ value: 69.54233632223537
2002
+ - task:
2003
+ type: STS
2004
+ dataset:
2005
+ name: MTEB STSBenchmark
2006
+ type: mteb/stsbenchmark-sts
2007
+ config: default
2008
+ split: test
2009
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
2010
+ metrics:
2011
+ - type: cos_sim_pearson
2012
+ value: 87.26389196390119
2013
+ - type: cos_sim_spearman
2014
+ value: 89.09721478341385
2015
+ - type: euclidean_pearson
2016
+ value: 88.97208685922517
2017
+ - type: euclidean_spearman
2018
+ value: 89.09720927308881
2019
+ - type: manhattan_pearson
2020
+ value: 88.97513670502573
2021
+ - type: manhattan_spearman
2022
+ value: 89.07647853984004
2023
+ - task:
2024
+ type: Reranking
2025
+ dataset:
2026
+ name: MTEB SciDocsRR
2027
+ type: mteb/scidocs-reranking
2028
+ config: default
2029
+ split: test
2030
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
2031
+ metrics:
2032
+ - type: map
2033
+ value: 87.53075025771936
2034
+ - type: mrr
2035
+ value: 96.24327651288436
2036
+ - task:
2037
+ type: Retrieval
2038
+ dataset:
2039
+ name: MTEB SciFact
2040
+ type: scifact
2041
+ config: default
2042
+ split: test
2043
+ revision: None
2044
+ metrics:
2045
+ - type: map_at_1
2046
+ value: 60.428000000000004
2047
+ - type: map_at_10
2048
+ value: 70.088
2049
+ - type: map_at_100
2050
+ value: 70.589
2051
+ - type: map_at_1000
2052
+ value: 70.614
2053
+ - type: map_at_3
2054
+ value: 67.191
2055
+ - type: map_at_5
2056
+ value: 68.515
2057
+ - type: mrr_at_1
2058
+ value: 63.333
2059
+ - type: mrr_at_10
2060
+ value: 71.13000000000001
2061
+ - type: mrr_at_100
2062
+ value: 71.545
2063
+ - type: mrr_at_1000
2064
+ value: 71.569
2065
+ - type: mrr_at_3
2066
+ value: 68.944
2067
+ - type: mrr_at_5
2068
+ value: 70.078
2069
+ - type: ndcg_at_1
2070
+ value: 63.333
2071
+ - type: ndcg_at_10
2072
+ value: 74.72800000000001
2073
+ - type: ndcg_at_100
2074
+ value: 76.64999999999999
2075
+ - type: ndcg_at_1000
2076
+ value: 77.176
2077
+ - type: ndcg_at_3
2078
+ value: 69.659
2079
+ - type: ndcg_at_5
2080
+ value: 71.626
2081
+ - type: precision_at_1
2082
+ value: 63.333
2083
+ - type: precision_at_10
2084
+ value: 10
2085
+ - type: precision_at_100
2086
+ value: 1.09
2087
+ - type: precision_at_1000
2088
+ value: 0.11299999999999999
2089
+ - type: precision_at_3
2090
+ value: 27.111
2091
+ - type: precision_at_5
2092
+ value: 17.666999999999998
2093
+ - type: recall_at_1
2094
+ value: 60.428000000000004
2095
+ - type: recall_at_10
2096
+ value: 87.98899999999999
2097
+ - type: recall_at_100
2098
+ value: 96.167
2099
+ - type: recall_at_1000
2100
+ value: 100
2101
+ - type: recall_at_3
2102
+ value: 74.006
2103
+ - type: recall_at_5
2104
+ value: 79.05
2105
+ - task:
2106
+ type: PairClassification
2107
+ dataset:
2108
+ name: MTEB SprintDuplicateQuestions
2109
+ type: mteb/sprintduplicatequestions-pairclassification
2110
+ config: default
2111
+ split: test
2112
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2113
+ metrics:
2114
+ - type: cos_sim_accuracy
2115
+ value: 99.87326732673267
2116
+ - type: cos_sim_ap
2117
+ value: 96.81770773701805
2118
+ - type: cos_sim_f1
2119
+ value: 93.6318407960199
2120
+ - type: cos_sim_precision
2121
+ value: 93.16831683168317
2122
+ - type: cos_sim_recall
2123
+ value: 94.1
2124
+ - type: dot_accuracy
2125
+ value: 99.87326732673267
2126
+ - type: dot_ap
2127
+ value: 96.8174218946665
2128
+ - type: dot_f1
2129
+ value: 93.6318407960199
2130
+ - type: dot_precision
2131
+ value: 93.16831683168317
2132
+ - type: dot_recall
2133
+ value: 94.1
2134
+ - type: euclidean_accuracy
2135
+ value: 99.87326732673267
2136
+ - type: euclidean_ap
2137
+ value: 96.81770773701807
2138
+ - type: euclidean_f1
2139
+ value: 93.6318407960199
2140
+ - type: euclidean_precision
2141
+ value: 93.16831683168317
2142
+ - type: euclidean_recall
2143
+ value: 94.1
2144
+ - type: manhattan_accuracy
2145
+ value: 99.87227722772278
2146
+ - type: manhattan_ap
2147
+ value: 96.83164126821747
2148
+ - type: manhattan_f1
2149
+ value: 93.54677338669335
2150
+ - type: manhattan_precision
2151
+ value: 93.5935935935936
2152
+ - type: manhattan_recall
2153
+ value: 93.5
2154
+ - type: max_accuracy
2155
+ value: 99.87326732673267
2156
+ - type: max_ap
2157
+ value: 96.83164126821747
2158
+ - type: max_f1
2159
+ value: 93.6318407960199
2160
+ - task:
2161
+ type: Clustering
2162
+ dataset:
2163
+ name: MTEB StackExchangeClustering
2164
+ type: mteb/stackexchange-clustering
2165
+ config: default
2166
+ split: test
2167
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
2168
+ metrics:
2169
+ - type: v_measure
2170
+ value: 65.6212042420246
2171
+ - task:
2172
+ type: Clustering
2173
+ dataset:
2174
+ name: MTEB StackExchangeClusteringP2P
2175
+ type: mteb/stackexchange-clustering-p2p
2176
+ config: default
2177
+ split: test
2178
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
2179
+ metrics:
2180
+ - type: v_measure
2181
+ value: 35.779230635982564
2182
+ - task:
2183
+ type: Reranking
2184
+ dataset:
2185
+ name: MTEB StackOverflowDupQuestions
2186
+ type: mteb/stackoverflowdupquestions-reranking
2187
+ config: default
2188
+ split: test
2189
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
2190
+ metrics:
2191
+ - type: map
2192
+ value: 55.217701909036286
2193
+ - type: mrr
2194
+ value: 56.17658995416349
2195
+ - task:
2196
+ type: Summarization
2197
+ dataset:
2198
+ name: MTEB SummEval
2199
+ type: mteb/summeval
2200
+ config: default
2201
+ split: test
2202
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
2203
+ metrics:
2204
+ - type: cos_sim_pearson
2205
+ value: 30.954206018888453
2206
+ - type: cos_sim_spearman
2207
+ value: 32.71062599450096
2208
+ - type: dot_pearson
2209
+ value: 30.95420929056943
2210
+ - type: dot_spearman
2211
+ value: 32.71062599450096
2212
+ - task:
2213
+ type: Retrieval
2214
+ dataset:
2215
+ name: MTEB TRECCOVID
2216
+ type: trec-covid
2217
+ config: default
2218
+ split: test
2219
+ revision: None
2220
+ metrics:
2221
+ - type: map_at_1
2222
+ value: 0.22699999999999998
2223
+ - type: map_at_10
2224
+ value: 1.924
2225
+ - type: map_at_100
2226
+ value: 10.525
2227
+ - type: map_at_1000
2228
+ value: 24.973
2229
+ - type: map_at_3
2230
+ value: 0.638
2231
+ - type: map_at_5
2232
+ value: 1.0659999999999998
2233
+ - type: mrr_at_1
2234
+ value: 84
2235
+ - type: mrr_at_10
2236
+ value: 91.067
2237
+ - type: mrr_at_100
2238
+ value: 91.067
2239
+ - type: mrr_at_1000
2240
+ value: 91.067
2241
+ - type: mrr_at_3
2242
+ value: 90.667
2243
+ - type: mrr_at_5
2244
+ value: 91.067
2245
+ - type: ndcg_at_1
2246
+ value: 81
2247
+ - type: ndcg_at_10
2248
+ value: 75.566
2249
+ - type: ndcg_at_100
2250
+ value: 56.387
2251
+ - type: ndcg_at_1000
2252
+ value: 49.834
2253
+ - type: ndcg_at_3
2254
+ value: 80.899
2255
+ - type: ndcg_at_5
2256
+ value: 80.75099999999999
2257
+ - type: precision_at_1
2258
+ value: 84
2259
+ - type: precision_at_10
2260
+ value: 79
2261
+ - type: precision_at_100
2262
+ value: 57.56
2263
+ - type: precision_at_1000
2264
+ value: 21.8
2265
+ - type: precision_at_3
2266
+ value: 84.667
2267
+ - type: precision_at_5
2268
+ value: 85.2
2269
+ - type: recall_at_1
2270
+ value: 0.22699999999999998
2271
+ - type: recall_at_10
2272
+ value: 2.136
2273
+ - type: recall_at_100
2274
+ value: 13.861
2275
+ - type: recall_at_1000
2276
+ value: 46.299
2277
+ - type: recall_at_3
2278
+ value: 0.6649999999999999
2279
+ - type: recall_at_5
2280
+ value: 1.145
2281
+ - task:
2282
+ type: Retrieval
2283
+ dataset:
2284
+ name: MTEB Touche2020
2285
+ type: webis-touche2020
2286
+ config: default
2287
+ split: test
2288
+ revision: None
2289
+ metrics:
2290
+ - type: map_at_1
2291
+ value: 2.752
2292
+ - type: map_at_10
2293
+ value: 9.951
2294
+ - type: map_at_100
2295
+ value: 16.794999999999998
2296
+ - type: map_at_1000
2297
+ value: 18.251
2298
+ - type: map_at_3
2299
+ value: 5.288
2300
+ - type: map_at_5
2301
+ value: 6.954000000000001
2302
+ - type: mrr_at_1
2303
+ value: 38.775999999999996
2304
+ - type: mrr_at_10
2305
+ value: 50.458000000000006
2306
+ - type: mrr_at_100
2307
+ value: 51.324999999999996
2308
+ - type: mrr_at_1000
2309
+ value: 51.339999999999996
2310
+ - type: mrr_at_3
2311
+ value: 46.939
2312
+ - type: mrr_at_5
2313
+ value: 47.857
2314
+ - type: ndcg_at_1
2315
+ value: 36.735
2316
+ - type: ndcg_at_10
2317
+ value: 25.198999999999998
2318
+ - type: ndcg_at_100
2319
+ value: 37.938
2320
+ - type: ndcg_at_1000
2321
+ value: 49.145
2322
+ - type: ndcg_at_3
2323
+ value: 29.348000000000003
2324
+ - type: ndcg_at_5
2325
+ value: 25.804
2326
+ - type: precision_at_1
2327
+ value: 38.775999999999996
2328
+ - type: precision_at_10
2329
+ value: 22.041
2330
+ - type: precision_at_100
2331
+ value: 7.939
2332
+ - type: precision_at_1000
2333
+ value: 1.555
2334
+ - type: precision_at_3
2335
+ value: 29.932
2336
+ - type: precision_at_5
2337
+ value: 24.490000000000002
2338
+ - type: recall_at_1
2339
+ value: 2.752
2340
+ - type: recall_at_10
2341
+ value: 16.197
2342
+ - type: recall_at_100
2343
+ value: 49.166
2344
+ - type: recall_at_1000
2345
+ value: 84.18900000000001
2346
+ - type: recall_at_3
2347
+ value: 6.438000000000001
2348
+ - type: recall_at_5
2349
+ value: 9.093
2350
+ - task:
2351
+ type: Classification
2352
+ dataset:
2353
+ name: MTEB ToxicConversationsClassification
2354
+ type: mteb/toxic_conversations_50k
2355
+ config: default
2356
+ split: test
2357
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
2358
+ metrics:
2359
+ - type: accuracy
2360
+ value: 71.47980000000001
2361
+ - type: ap
2362
+ value: 14.605194452178754
2363
+ - type: f1
2364
+ value: 55.07362924988948
2365
+ - task:
2366
+ type: Classification
2367
+ dataset:
2368
+ name: MTEB TweetSentimentExtractionClassification
2369
+ type: mteb/tweet_sentiment_extraction
2370
+ config: default
2371
+ split: test
2372
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
2373
+ metrics:
2374
+ - type: accuracy
2375
+ value: 59.708545557441994
2376
+ - type: f1
2377
+ value: 60.04751270975683
2378
+ - task:
2379
+ type: Clustering
2380
+ dataset:
2381
+ name: MTEB TwentyNewsgroupsClustering
2382
+ type: mteb/twentynewsgroups-clustering
2383
+ config: default
2384
+ split: test
2385
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
2386
+ metrics:
2387
+ - type: v_measure
2388
+ value: 53.21105960597211
2389
+ - task:
2390
+ type: PairClassification
2391
+ dataset:
2392
+ name: MTEB TwitterSemEval2015
2393
+ type: mteb/twittersemeval2015-pairclassification
2394
+ config: default
2395
+ split: test
2396
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
2397
+ metrics:
2398
+ - type: cos_sim_accuracy
2399
+ value: 87.58419264469214
2400
+ - type: cos_sim_ap
2401
+ value: 78.55300004517404
2402
+ - type: cos_sim_f1
2403
+ value: 71.49673530889001
2404
+ - type: cos_sim_precision
2405
+ value: 68.20795400095831
2406
+ - type: cos_sim_recall
2407
+ value: 75.11873350923483
2408
+ - type: dot_accuracy
2409
+ value: 87.58419264469214
2410
+ - type: dot_ap
2411
+ value: 78.55297659559511
2412
+ - type: dot_f1
2413
+ value: 71.49673530889001
2414
+ - type: dot_precision
2415
+ value: 68.20795400095831
2416
+ - type: dot_recall
2417
+ value: 75.11873350923483
2418
+ - type: euclidean_accuracy
2419
+ value: 87.58419264469214
2420
+ - type: euclidean_ap
2421
+ value: 78.55300477331477
2422
+ - type: euclidean_f1
2423
+ value: 71.49673530889001
2424
+ - type: euclidean_precision
2425
+ value: 68.20795400095831
2426
+ - type: euclidean_recall
2427
+ value: 75.11873350923483
2428
+ - type: manhattan_accuracy
2429
+ value: 87.5663110210407
2430
+ - type: manhattan_ap
2431
+ value: 78.49982050876562
2432
+ - type: manhattan_f1
2433
+ value: 71.35488740722104
2434
+ - type: manhattan_precision
2435
+ value: 68.18946862226497
2436
+ - type: manhattan_recall
2437
+ value: 74.82849604221636
2438
+ - type: max_accuracy
2439
+ value: 87.58419264469214
2440
+ - type: max_ap
2441
+ value: 78.55300477331477
2442
+ - type: max_f1
2443
+ value: 71.49673530889001
2444
+ - task:
2445
+ type: PairClassification
2446
+ dataset:
2447
+ name: MTEB TwitterURLCorpus
2448
+ type: mteb/twitterurlcorpus-pairclassification
2449
+ config: default
2450
+ split: test
2451
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
2452
+ metrics:
2453
+ - type: cos_sim_accuracy
2454
+ value: 89.09069740365584
2455
+ - type: cos_sim_ap
2456
+ value: 86.22749303724757
2457
+ - type: cos_sim_f1
2458
+ value: 78.36863452005407
2459
+ - type: cos_sim_precision
2460
+ value: 76.49560117302053
2461
+ - type: cos_sim_recall
2462
+ value: 80.33569448721897
2463
+ - type: dot_accuracy
2464
+ value: 89.09069740365584
2465
+ - type: dot_ap
2466
+ value: 86.22750233655673
2467
+ - type: dot_f1
2468
+ value: 78.36863452005407
2469
+ - type: dot_precision
2470
+ value: 76.49560117302053
2471
+ - type: dot_recall
2472
+ value: 80.33569448721897
2473
+ - type: euclidean_accuracy
2474
+ value: 89.09069740365584
2475
+ - type: euclidean_ap
2476
+ value: 86.22749355597347
2477
+ - type: euclidean_f1
2478
+ value: 78.36863452005407
2479
+ - type: euclidean_precision
2480
+ value: 76.49560117302053
2481
+ - type: euclidean_recall
2482
+ value: 80.33569448721897
2483
+ - type: manhattan_accuracy
2484
+ value: 89.08293553770326
2485
+ - type: manhattan_ap
2486
+ value: 86.21913616084771
2487
+ - type: manhattan_f1
2488
+ value: 78.3907031479847
2489
+ - type: manhattan_precision
2490
+ value: 75.0352013517319
2491
+ - type: manhattan_recall
2492
+ value: 82.06036341238065
2493
+ - type: max_accuracy
2494
+ value: 89.09069740365584
2495
+ - type: max_ap
2496
+ value: 86.22750233655673
2497
+ - type: max_f1
2498
+ value: 78.3907031479847
2499
  ---
2500
+
2501
+ # mlx-community/mxbai-embed-large-v1
2502
+
2503
+ The model [mlx-community/mxbai-embed-large-v1](https://huggingface.co/mlx-community/mxbai-embed-large-v1) was converted to MLX format from [mixedbread-ai/mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) using mlx-lm version **0.0.3**.
2504
+
2505
+ ## Use with mlx
2506
+
2507
+ ```bash
2508
+ pip install mlx-embeddings
2509
+ ```
2510
+
2511
+ ```python
2512
+ from mlx_embeddings import load, generate
2513
+ import mlx.core as mx
2514
+
2515
+ model, tokenizer = load("mlx-community/mxbai-embed-large-v1")
2516
+
2517
+ # For text embeddings
2518
+ output = generate(model, tokenizer, texts=["I like grapes", "I like fruits"])
2519
+ embeddings = output.text_embeds # Normalized embeddings
2520
+
2521
+ # Compute dot product between normalized embeddings
2522
+ similarity_matrix = mx.matmul(embeddings, embeddings.T)
2523
+
2524
+ print("Similarity matrix between texts:")
2525
+ print(similarity_matrix)
2528
+ ```
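
For retrieval, the upstream model distinguishes queries from passages: `config_sentence_transformers.json` (included in this upload) defines a prompt that is prepended to queries only, while passages are embedded as-is. A minimal sketch of preparing texts before embedding — the example queries and passages here are illustrative:

```python
# Query prompt taken from config_sentence_transformers.json; passages get no prompt.
QUERY_PROMPT = "Represent this sentence for searching relevant passages: "

def prepare_texts(queries, passages):
    """Prefix each query with the retrieval prompt; leave passages unchanged."""
    return [QUERY_PROMPT + q for q in queries] + list(passages)

texts = prepare_texts(
    ["What fruit is purple?"],
    ["Grapes are purple.", "The sky is blue."],
)
print(texts[0])
```

The prepared list can then be passed to the snippet above, e.g. `generate(model, tokenizer, texts=texts)`, so that query embeddings and passage embeddings are comparable.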
config.json ADDED
@@ -0,0 +1,29 @@
1
+ {
2
+ "architectures": [
3
+ "BertModel"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "classifier_dropout": null,
7
+ "gradient_checkpointing": false,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 1024,
11
+ "initializer_range": 0.02,
12
+ "intermediate_size": 4096,
13
+ "layer_norm_eps": 1e-12,
14
+ "max_position_embeddings": 512,
15
+ "model_type": "bert",
16
+ "num_attention_heads": 16,
17
+ "num_hidden_layers": 24,
18
+ "pad_token_id": 0,
19
+ "position_embedding_type": "absolute",
20
+ "quantization": {
21
+ "group_size": 64,
22
+ "bits": 4
23
+ },
24
+ "torch_dtype": "float16",
25
+ "transformers_version": "4.38.2",
26
+ "type_vocab_size": 2,
27
+ "use_cache": false,
28
+ "vocab_size": 30522
29
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,12 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "2.5.1",
4
+ "transformers": "4.37.0",
5
+ "pytorch": "2.1.0+cu121"
6
+ },
7
+ "prompts": {
8
+ "query": "Represent this sentence for searching relevant passages: ",
9
+ "passage": ""
10
+ },
11
+ "default_prompt_name": null
12
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:136d41998211d77b0b5e998538c7892547672ecbdc8701e48f4427269b7d7c13
3
+ size 189058926
model.safetensors.index.json ADDED
@@ -0,0 +1,694 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 188980992
4
+ },
5
+ "weight_map": {
6
+ "embeddings.LayerNorm.bias": "model.safetensors",
7
+ "embeddings.LayerNorm.weight": "model.safetensors",
8
+ "embeddings.position_embeddings.biases": "model.safetensors",
9
+ "embeddings.position_embeddings.scales": "model.safetensors",
10
+ "embeddings.position_embeddings.weight": "model.safetensors",
11
+ "embeddings.token_type_embeddings.biases": "model.safetensors",
12
+ "embeddings.token_type_embeddings.scales": "model.safetensors",
13
+ "embeddings.token_type_embeddings.weight": "model.safetensors",
14
+ "embeddings.word_embeddings.biases": "model.safetensors",
15
+ "embeddings.word_embeddings.scales": "model.safetensors",
16
+ "embeddings.word_embeddings.weight": "model.safetensors",
17
+ "encoder.layer.0.attention.output.LayerNorm.bias": "model.safetensors",
18
+ "encoder.layer.0.attention.output.LayerNorm.weight": "model.safetensors",
19
+ "encoder.layer.0.attention.output.dense.bias": "model.safetensors",
20
+ "encoder.layer.0.attention.output.dense.biases": "model.safetensors",
21
+ "encoder.layer.0.attention.output.dense.scales": "model.safetensors",
22
+ "encoder.layer.0.attention.output.dense.weight": "model.safetensors",
23
+ "encoder.layer.0.attention.self.key.bias": "model.safetensors",
24
+ "encoder.layer.0.attention.self.key.biases": "model.safetensors",
25
+ "encoder.layer.0.attention.self.key.scales": "model.safetensors",
26
+ "encoder.layer.0.attention.self.key.weight": "model.safetensors",
27
+ "encoder.layer.0.attention.self.query.bias": "model.safetensors",
28
+ "encoder.layer.0.attention.self.query.biases": "model.safetensors",
29
+ "encoder.layer.0.attention.self.query.scales": "model.safetensors",
30
+ "encoder.layer.0.attention.self.query.weight": "model.safetensors",
31
+ "encoder.layer.0.attention.self.value.bias": "model.safetensors",
32
+ "encoder.layer.0.attention.self.value.biases": "model.safetensors",
33
+ "encoder.layer.0.attention.self.value.scales": "model.safetensors",
34
+ "encoder.layer.0.attention.self.value.weight": "model.safetensors",
35
+ "encoder.layer.0.intermediate.dense.bias": "model.safetensors",
36
+ "encoder.layer.0.intermediate.dense.biases": "model.safetensors",
37
+ "encoder.layer.0.intermediate.dense.scales": "model.safetensors",
38
+ "encoder.layer.0.intermediate.dense.weight": "model.safetensors",
39
+ "encoder.layer.0.output.LayerNorm.bias": "model.safetensors",
40
+ "encoder.layer.0.output.LayerNorm.weight": "model.safetensors",
41
+ "encoder.layer.0.output.dense.bias": "model.safetensors",
42
+ "encoder.layer.0.output.dense.biases": "model.safetensors",
43
+ "encoder.layer.0.output.dense.scales": "model.safetensors",
44
+ "encoder.layer.0.output.dense.weight": "model.safetensors",
45
+ "encoder.layer.1.attention.output.LayerNorm.bias": "model.safetensors",
46
+ "encoder.layer.1.attention.output.LayerNorm.weight": "model.safetensors",
47
+ "encoder.layer.1.attention.output.dense.bias": "model.safetensors",
48
+ "encoder.layer.1.attention.output.dense.biases": "model.safetensors",
49
+ "encoder.layer.1.attention.output.dense.scales": "model.safetensors",
50
+ "encoder.layer.1.attention.output.dense.weight": "model.safetensors",
51
+ "encoder.layer.1.attention.self.key.bias": "model.safetensors",
52
+ "encoder.layer.1.attention.self.key.biases": "model.safetensors",
53
+ "encoder.layer.1.attention.self.key.scales": "model.safetensors",
54
+ "encoder.layer.1.attention.self.key.weight": "model.safetensors",
55
+ "encoder.layer.1.attention.self.query.bias": "model.safetensors",
56
+ "encoder.layer.1.attention.self.query.biases": "model.safetensors",
57
+ "encoder.layer.1.attention.self.query.scales": "model.safetensors",
58
+ "encoder.layer.1.attention.self.query.weight": "model.safetensors",
59
+ "encoder.layer.1.attention.self.value.bias": "model.safetensors",
60
+ "encoder.layer.1.attention.self.value.biases": "model.safetensors",
61
+ "encoder.layer.1.attention.self.value.scales": "model.safetensors",
62
+ "encoder.layer.1.attention.self.value.weight": "model.safetensors",
63
+ "encoder.layer.1.intermediate.dense.bias": "model.safetensors",
64
+ "encoder.layer.1.intermediate.dense.biases": "model.safetensors",
65
+ "encoder.layer.1.intermediate.dense.scales": "model.safetensors",
66
+ "encoder.layer.1.intermediate.dense.weight": "model.safetensors",
67
+ "encoder.layer.1.output.LayerNorm.bias": "model.safetensors",
68
+ "encoder.layer.1.output.LayerNorm.weight": "model.safetensors",
69
+ "encoder.layer.1.output.dense.bias": "model.safetensors",
70
+ "encoder.layer.1.output.dense.biases": "model.safetensors",
71
+ "encoder.layer.1.output.dense.scales": "model.safetensors",
72
+ "encoder.layer.1.output.dense.weight": "model.safetensors",
73
+ "encoder.layer.10.attention.output.LayerNorm.bias": "model.safetensors",
74
+ "encoder.layer.10.attention.output.LayerNorm.weight": "model.safetensors",
75
+ "encoder.layer.10.attention.output.dense.bias": "model.safetensors",
76
+ "encoder.layer.10.attention.output.dense.biases": "model.safetensors",
77
+ "encoder.layer.10.attention.output.dense.scales": "model.safetensors",
78
+ "encoder.layer.10.attention.output.dense.weight": "model.safetensors",
79
+ "encoder.layer.10.attention.self.key.bias": "model.safetensors",
80
+ "encoder.layer.10.attention.self.key.biases": "model.safetensors",
81
+ "encoder.layer.10.attention.self.key.scales": "model.safetensors",
82
+ "encoder.layer.10.attention.self.key.weight": "model.safetensors",
83
+ "encoder.layer.10.attention.self.query.bias": "model.safetensors",
84
+ "encoder.layer.10.attention.self.query.biases": "model.safetensors",
85
+ "encoder.layer.10.attention.self.query.scales": "model.safetensors",
86
+ "encoder.layer.10.attention.self.query.weight": "model.safetensors",
87
+ "encoder.layer.10.attention.self.value.bias": "model.safetensors",
88
+ "encoder.layer.10.attention.self.value.biases": "model.safetensors",
89
+ "encoder.layer.10.attention.self.value.scales": "model.safetensors",
90
+ "encoder.layer.10.attention.self.value.weight": "model.safetensors",
91
+ "encoder.layer.10.intermediate.dense.bias": "model.safetensors",
92
+ "encoder.layer.10.intermediate.dense.biases": "model.safetensors",
93
+ "encoder.layer.10.intermediate.dense.scales": "model.safetensors",
94
+ "encoder.layer.10.intermediate.dense.weight": "model.safetensors",
95
+ "encoder.layer.10.output.LayerNorm.bias": "model.safetensors",
96
+ "encoder.layer.10.output.LayerNorm.weight": "model.safetensors",
97
+ "encoder.layer.10.output.dense.bias": "model.safetensors",
98
+ "encoder.layer.10.output.dense.biases": "model.safetensors",
99
+ "encoder.layer.10.output.dense.scales": "model.safetensors",
100
+ "encoder.layer.10.output.dense.weight": "model.safetensors",
101
+ "encoder.layer.11.attention.output.LayerNorm.bias": "model.safetensors",
102
+ "encoder.layer.11.attention.output.LayerNorm.weight": "model.safetensors",
103
+ "encoder.layer.11.attention.output.dense.bias": "model.safetensors",
104
+ "encoder.layer.11.attention.output.dense.biases": "model.safetensors",
105
+ "encoder.layer.11.attention.output.dense.scales": "model.safetensors",
106
+ "encoder.layer.11.attention.output.dense.weight": "model.safetensors",
107
+ "encoder.layer.11.attention.self.key.bias": "model.safetensors",
108
+ "encoder.layer.11.attention.self.key.biases": "model.safetensors",
109
+ "encoder.layer.11.attention.self.key.scales": "model.safetensors",
110
+ "encoder.layer.11.attention.self.key.weight": "model.safetensors",
111
+ "encoder.layer.11.attention.self.query.bias": "model.safetensors",
112
+ "encoder.layer.11.attention.self.query.biases": "model.safetensors",
113
+ "encoder.layer.11.attention.self.query.scales": "model.safetensors",
114
+ "encoder.layer.11.attention.self.query.weight": "model.safetensors",
115
+ "encoder.layer.11.attention.self.value.bias": "model.safetensors",
116
+ "encoder.layer.11.attention.self.value.biases": "model.safetensors",
117
+ "encoder.layer.11.attention.self.value.scales": "model.safetensors",
118
+ "encoder.layer.11.attention.self.value.weight": "model.safetensors",
119
+ "encoder.layer.11.intermediate.dense.bias": "model.safetensors",
120
+ "encoder.layer.11.intermediate.dense.biases": "model.safetensors",
121
+ "encoder.layer.11.intermediate.dense.scales": "model.safetensors",
122
+ "encoder.layer.11.intermediate.dense.weight": "model.safetensors",
123
+ "encoder.layer.11.output.LayerNorm.bias": "model.safetensors",
124
+ "encoder.layer.11.output.LayerNorm.weight": "model.safetensors",
125
+ "encoder.layer.11.output.dense.bias": "model.safetensors",
126
+ "encoder.layer.11.output.dense.biases": "model.safetensors",
127
+ "encoder.layer.11.output.dense.scales": "model.safetensors",
128
+ "encoder.layer.11.output.dense.weight": "model.safetensors",
129
+ "encoder.layer.12.attention.output.LayerNorm.bias": "model.safetensors",
130
+ "encoder.layer.12.attention.output.LayerNorm.weight": "model.safetensors",
131
+ "encoder.layer.12.attention.output.dense.bias": "model.safetensors",
132
+ "encoder.layer.12.attention.output.dense.biases": "model.safetensors",
133
+ "encoder.layer.12.attention.output.dense.scales": "model.safetensors",
134
+ "encoder.layer.12.attention.output.dense.weight": "model.safetensors",
135
+ "encoder.layer.12.attention.self.key.bias": "model.safetensors",
136
+ "encoder.layer.12.attention.self.key.biases": "model.safetensors",
137
+ "encoder.layer.12.attention.self.key.scales": "model.safetensors",
138
+ "encoder.layer.12.attention.self.key.weight": "model.safetensors",
139
+ "encoder.layer.12.attention.self.query.bias": "model.safetensors",
140
+ "encoder.layer.12.attention.self.query.biases": "model.safetensors",
141
+ "encoder.layer.12.attention.self.query.scales": "model.safetensors",
142
+ "encoder.layer.12.attention.self.query.weight": "model.safetensors",
143
+ "encoder.layer.12.attention.self.value.bias": "model.safetensors",
144
+ "encoder.layer.12.attention.self.value.biases": "model.safetensors",
145
+ "encoder.layer.12.attention.self.value.scales": "model.safetensors",
146
+ "encoder.layer.12.attention.self.value.weight": "model.safetensors",
147
+ "encoder.layer.12.intermediate.dense.bias": "model.safetensors",
148
+ "encoder.layer.12.intermediate.dense.biases": "model.safetensors",
149
+ "encoder.layer.12.intermediate.dense.scales": "model.safetensors",
150
+ "encoder.layer.12.intermediate.dense.weight": "model.safetensors",
151
+ "encoder.layer.12.output.LayerNorm.bias": "model.safetensors",
152
+ "encoder.layer.12.output.LayerNorm.weight": "model.safetensors",
153
+ "encoder.layer.12.output.dense.bias": "model.safetensors",
154
+ "encoder.layer.12.output.dense.biases": "model.safetensors",
155
+ "encoder.layer.12.output.dense.scales": "model.safetensors",
156
+ "encoder.layer.12.output.dense.weight": "model.safetensors",
157
+ "encoder.layer.13.attention.output.LayerNorm.bias": "model.safetensors",
158
+ "encoder.layer.13.attention.output.LayerNorm.weight": "model.safetensors",
159
+ "encoder.layer.13.attention.output.dense.bias": "model.safetensors",
160
+ "encoder.layer.13.attention.output.dense.biases": "model.safetensors",
161
+ "encoder.layer.13.attention.output.dense.scales": "model.safetensors",
162
+ "encoder.layer.13.attention.output.dense.weight": "model.safetensors",
163
+ "encoder.layer.13.attention.self.key.bias": "model.safetensors",
164
+ "encoder.layer.13.attention.self.key.biases": "model.safetensors",
165
+ "encoder.layer.13.attention.self.key.scales": "model.safetensors",
166
+ "encoder.layer.13.attention.self.key.weight": "model.safetensors",
167
+ "encoder.layer.13.attention.self.query.bias": "model.safetensors",
168
+ "encoder.layer.13.attention.self.query.biases": "model.safetensors",
169
+ "encoder.layer.13.attention.self.query.scales": "model.safetensors",
170
+ "encoder.layer.13.attention.self.query.weight": "model.safetensors",
171
+ "encoder.layer.13.attention.self.value.bias": "model.safetensors",
172
+ "encoder.layer.13.attention.self.value.biases": "model.safetensors",
173
+ "encoder.layer.13.attention.self.value.scales": "model.safetensors",
174
+ "encoder.layer.13.attention.self.value.weight": "model.safetensors",
175
+ "encoder.layer.13.intermediate.dense.bias": "model.safetensors",
176
+ "encoder.layer.13.intermediate.dense.biases": "model.safetensors",
177
+ "encoder.layer.13.intermediate.dense.scales": "model.safetensors",
178
+ "encoder.layer.13.intermediate.dense.weight": "model.safetensors",
179
+ "encoder.layer.13.output.LayerNorm.bias": "model.safetensors",
180
+ "encoder.layer.13.output.LayerNorm.weight": "model.safetensors",
181
+ "encoder.layer.13.output.dense.bias": "model.safetensors",
182
+ "encoder.layer.13.output.dense.biases": "model.safetensors",
183
+ "encoder.layer.13.output.dense.scales": "model.safetensors",
184
+ "encoder.layer.13.output.dense.weight": "model.safetensors",
185
+ "encoder.layer.14.attention.output.LayerNorm.bias": "model.safetensors",
186
+ "encoder.layer.14.attention.output.LayerNorm.weight": "model.safetensors",
187
+ "encoder.layer.14.attention.output.dense.bias": "model.safetensors",
188
+ "encoder.layer.14.attention.output.dense.biases": "model.safetensors",
189
+ "encoder.layer.14.attention.output.dense.scales": "model.safetensors",
190
+ "encoder.layer.14.attention.output.dense.weight": "model.safetensors",
191
+ "encoder.layer.14.attention.self.key.bias": "model.safetensors",
192
+ "encoder.layer.14.attention.self.key.biases": "model.safetensors",
193
+ "encoder.layer.14.attention.self.key.scales": "model.safetensors",
194
+ "encoder.layer.14.attention.self.key.weight": "model.safetensors",
195
+ "encoder.layer.14.attention.self.query.bias": "model.safetensors",
196
+ "encoder.layer.14.attention.self.query.biases": "model.safetensors",
197
+ "encoder.layer.14.attention.self.query.scales": "model.safetensors",
198
+ "encoder.layer.14.attention.self.query.weight": "model.safetensors",
199
+ "encoder.layer.14.attention.self.value.bias": "model.safetensors",
200
+ "encoder.layer.14.attention.self.value.biases": "model.safetensors",
201
+ "encoder.layer.14.attention.self.value.scales": "model.safetensors",
202
+ "encoder.layer.14.attention.self.value.weight": "model.safetensors",
203
+ "encoder.layer.14.intermediate.dense.bias": "model.safetensors",
204
+ "encoder.layer.14.intermediate.dense.biases": "model.safetensors",
205
+ "encoder.layer.14.intermediate.dense.scales": "model.safetensors",
206
+ "encoder.layer.14.intermediate.dense.weight": "model.safetensors",
207
+ "encoder.layer.14.output.LayerNorm.bias": "model.safetensors",
208
+ "encoder.layer.14.output.LayerNorm.weight": "model.safetensors",
209
+ "encoder.layer.14.output.dense.bias": "model.safetensors",
210
+ "encoder.layer.14.output.dense.biases": "model.safetensors",
211
+ "encoder.layer.14.output.dense.scales": "model.safetensors",
212
+ "encoder.layer.14.output.dense.weight": "model.safetensors",
213
+ "encoder.layer.15.attention.output.LayerNorm.bias": "model.safetensors",
214
+ "encoder.layer.15.attention.output.LayerNorm.weight": "model.safetensors",
215
+ "encoder.layer.15.attention.output.dense.bias": "model.safetensors",
216
+ "encoder.layer.15.attention.output.dense.biases": "model.safetensors",
217
+ "encoder.layer.15.attention.output.dense.scales": "model.safetensors",
218
+ "encoder.layer.15.attention.output.dense.weight": "model.safetensors",
219
+ "encoder.layer.15.attention.self.key.bias": "model.safetensors",
220
+ "encoder.layer.15.attention.self.key.biases": "model.safetensors",
221
+ "encoder.layer.15.attention.self.key.scales": "model.safetensors",
222
+ "encoder.layer.15.attention.self.key.weight": "model.safetensors",
223
+ "encoder.layer.15.attention.self.query.bias": "model.safetensors",
224
+ "encoder.layer.15.attention.self.query.biases": "model.safetensors",
225
+ "encoder.layer.15.attention.self.query.scales": "model.safetensors",
226
+ "encoder.layer.15.attention.self.query.weight": "model.safetensors",
227
+ "encoder.layer.15.attention.self.value.bias": "model.safetensors",
228
+ "encoder.layer.15.attention.self.value.biases": "model.safetensors",
229
+ "encoder.layer.15.attention.self.value.scales": "model.safetensors",
230
+ "encoder.layer.15.attention.self.value.weight": "model.safetensors",
231
+ "encoder.layer.15.intermediate.dense.bias": "model.safetensors",
232
+ "encoder.layer.15.intermediate.dense.biases": "model.safetensors",
233
+ "encoder.layer.15.intermediate.dense.scales": "model.safetensors",
234
+ "encoder.layer.15.intermediate.dense.weight": "model.safetensors",
235
+ "encoder.layer.15.output.LayerNorm.bias": "model.safetensors",
236
+ "encoder.layer.15.output.LayerNorm.weight": "model.safetensors",
237
+ "encoder.layer.15.output.dense.bias": "model.safetensors",
238
+ "encoder.layer.15.output.dense.biases": "model.safetensors",
239
+ "encoder.layer.15.output.dense.scales": "model.safetensors",
240
+ "encoder.layer.15.output.dense.weight": "model.safetensors",
241
+ "encoder.layer.16.attention.output.LayerNorm.bias": "model.safetensors",
242
+ "encoder.layer.16.attention.output.LayerNorm.weight": "model.safetensors",
243
+ "encoder.layer.16.attention.output.dense.bias": "model.safetensors",
244
+ "encoder.layer.16.attention.output.dense.biases": "model.safetensors",
245
+ "encoder.layer.16.attention.output.dense.scales": "model.safetensors",
246
+ "encoder.layer.16.attention.output.dense.weight": "model.safetensors",
247
+ "encoder.layer.16.attention.self.key.bias": "model.safetensors",
248
+ "encoder.layer.16.attention.self.key.biases": "model.safetensors",
249
+ "encoder.layer.16.attention.self.key.scales": "model.safetensors",
250
+ "encoder.layer.16.attention.self.key.weight": "model.safetensors",
251
+ "encoder.layer.16.attention.self.query.bias": "model.safetensors",
252
+ "encoder.layer.16.attention.self.query.biases": "model.safetensors",
253
+ "encoder.layer.16.attention.self.query.scales": "model.safetensors",
254
+ "encoder.layer.16.attention.self.query.weight": "model.safetensors",
255
+ "encoder.layer.16.attention.self.value.bias": "model.safetensors",
256
+ "encoder.layer.16.attention.self.value.biases": "model.safetensors",
257
+ "encoder.layer.16.attention.self.value.scales": "model.safetensors",
258
+ "encoder.layer.16.attention.self.value.weight": "model.safetensors",
259
+ "encoder.layer.16.intermediate.dense.bias": "model.safetensors",
260
+ "encoder.layer.16.intermediate.dense.biases": "model.safetensors",
261
+ "encoder.layer.16.intermediate.dense.scales": "model.safetensors",
262
+ "encoder.layer.16.intermediate.dense.weight": "model.safetensors",
263
+ "encoder.layer.16.output.LayerNorm.bias": "model.safetensors",
264
+ "encoder.layer.16.output.LayerNorm.weight": "model.safetensors",
265
+ "encoder.layer.16.output.dense.bias": "model.safetensors",
266
+ "encoder.layer.16.output.dense.biases": "model.safetensors",
267
+ "encoder.layer.16.output.dense.scales": "model.safetensors",
268
+ "encoder.layer.16.output.dense.weight": "model.safetensors",
269
+ "encoder.layer.17.attention.output.LayerNorm.bias": "model.safetensors",
270
+ "encoder.layer.17.attention.output.LayerNorm.weight": "model.safetensors",
271
+ "encoder.layer.17.attention.output.dense.bias": "model.safetensors",
272
+ "encoder.layer.17.attention.output.dense.biases": "model.safetensors",
273
+ "encoder.layer.17.attention.output.dense.scales": "model.safetensors",
274
+ "encoder.layer.17.attention.output.dense.weight": "model.safetensors",
275
+ "encoder.layer.17.attention.self.key.bias": "model.safetensors",
276
+ "encoder.layer.17.attention.self.key.biases": "model.safetensors",
277
+ "encoder.layer.17.attention.self.key.scales": "model.safetensors",
278
+ "encoder.layer.17.attention.self.key.weight": "model.safetensors",
279
+ "encoder.layer.17.attention.self.query.bias": "model.safetensors",
280
+ "encoder.layer.17.attention.self.query.biases": "model.safetensors",
281
+ "encoder.layer.17.attention.self.query.scales": "model.safetensors",
282
+ "encoder.layer.17.attention.self.query.weight": "model.safetensors",
283
+ "encoder.layer.17.attention.self.value.bias": "model.safetensors",
284
+ "encoder.layer.17.attention.self.value.biases": "model.safetensors",
285
+ "encoder.layer.17.attention.self.value.scales": "model.safetensors",
286
+ "encoder.layer.17.attention.self.value.weight": "model.safetensors",
287
+ "encoder.layer.17.intermediate.dense.bias": "model.safetensors",
288
+ "encoder.layer.17.intermediate.dense.biases": "model.safetensors",
289
+ "encoder.layer.17.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.17.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.17.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.17.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.17.output.dense.bias": "model.safetensors",
+ "encoder.layer.17.output.dense.biases": "model.safetensors",
+ "encoder.layer.17.output.dense.scales": "model.safetensors",
+ "encoder.layer.17.output.dense.weight": "model.safetensors",
+ "encoder.layer.18.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.18.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.18.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.18.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.18.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.18.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.18.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.18.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.18.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.18.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.18.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.18.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.18.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.18.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.18.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.18.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.18.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.18.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.18.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.18.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.18.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.18.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.18.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.18.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.18.output.dense.bias": "model.safetensors",
+ "encoder.layer.18.output.dense.biases": "model.safetensors",
+ "encoder.layer.18.output.dense.scales": "model.safetensors",
+ "encoder.layer.18.output.dense.weight": "model.safetensors",
+ "encoder.layer.19.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.19.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.19.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.19.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.19.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.19.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.19.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.19.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.19.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.19.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.19.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.19.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.19.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.19.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.19.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.19.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.19.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.19.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.19.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.19.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.19.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.19.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.19.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.19.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.19.output.dense.bias": "model.safetensors",
+ "encoder.layer.19.output.dense.biases": "model.safetensors",
+ "encoder.layer.19.output.dense.scales": "model.safetensors",
+ "encoder.layer.19.output.dense.weight": "model.safetensors",
+ "encoder.layer.2.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.2.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.2.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.2.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.2.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.2.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.2.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.2.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.2.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.2.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.2.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.2.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.2.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.2.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.2.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.2.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.2.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.2.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.2.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.2.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.2.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.2.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.2.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.2.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.2.output.dense.bias": "model.safetensors",
+ "encoder.layer.2.output.dense.biases": "model.safetensors",
+ "encoder.layer.2.output.dense.scales": "model.safetensors",
+ "encoder.layer.2.output.dense.weight": "model.safetensors",
+ "encoder.layer.20.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.20.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.20.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.20.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.20.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.20.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.20.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.20.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.20.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.20.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.20.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.20.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.20.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.20.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.20.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.20.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.20.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.20.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.20.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.20.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.20.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.20.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.20.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.20.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.20.output.dense.bias": "model.safetensors",
+ "encoder.layer.20.output.dense.biases": "model.safetensors",
+ "encoder.layer.20.output.dense.scales": "model.safetensors",
+ "encoder.layer.20.output.dense.weight": "model.safetensors",
+ "encoder.layer.21.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.21.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.21.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.21.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.21.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.21.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.21.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.21.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.21.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.21.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.21.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.21.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.21.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.21.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.21.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.21.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.21.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.21.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.21.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.21.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.21.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.21.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.21.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.21.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.21.output.dense.bias": "model.safetensors",
+ "encoder.layer.21.output.dense.biases": "model.safetensors",
+ "encoder.layer.21.output.dense.scales": "model.safetensors",
+ "encoder.layer.21.output.dense.weight": "model.safetensors",
+ "encoder.layer.22.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.22.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.22.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.22.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.22.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.22.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.22.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.22.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.22.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.22.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.22.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.22.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.22.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.22.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.22.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.22.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.22.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.22.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.22.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.22.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.22.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.22.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.22.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.22.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.22.output.dense.bias": "model.safetensors",
+ "encoder.layer.22.output.dense.biases": "model.safetensors",
+ "encoder.layer.22.output.dense.scales": "model.safetensors",
+ "encoder.layer.22.output.dense.weight": "model.safetensors",
+ "encoder.layer.23.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.23.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.23.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.23.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.23.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.23.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.23.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.23.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.23.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.23.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.23.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.23.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.23.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.23.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.23.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.23.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.23.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.23.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.23.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.23.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.23.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.23.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.23.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.23.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.23.output.dense.bias": "model.safetensors",
+ "encoder.layer.23.output.dense.biases": "model.safetensors",
+ "encoder.layer.23.output.dense.scales": "model.safetensors",
+ "encoder.layer.23.output.dense.weight": "model.safetensors",
+ "encoder.layer.3.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.3.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.3.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.3.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.3.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.3.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.3.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.3.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.3.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.3.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.3.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.3.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.3.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.3.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.3.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.3.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.3.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.3.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.3.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.3.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.3.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.3.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.3.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.3.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.3.output.dense.bias": "model.safetensors",
+ "encoder.layer.3.output.dense.biases": "model.safetensors",
+ "encoder.layer.3.output.dense.scales": "model.safetensors",
+ "encoder.layer.3.output.dense.weight": "model.safetensors",
+ "encoder.layer.4.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.4.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.4.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.4.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.4.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.4.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.4.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.4.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.4.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.4.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.4.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.4.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.4.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.4.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.4.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.4.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.4.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.4.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.4.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.4.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.4.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.4.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.4.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.4.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.4.output.dense.bias": "model.safetensors",
+ "encoder.layer.4.output.dense.biases": "model.safetensors",
+ "encoder.layer.4.output.dense.scales": "model.safetensors",
+ "encoder.layer.4.output.dense.weight": "model.safetensors",
+ "encoder.layer.5.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.5.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.5.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.5.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.5.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.5.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.5.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.5.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.5.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.5.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.5.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.5.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.5.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.5.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.5.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.5.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.5.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.5.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.5.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.5.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.5.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.5.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.5.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.5.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.5.output.dense.bias": "model.safetensors",
+ "encoder.layer.5.output.dense.biases": "model.safetensors",
+ "encoder.layer.5.output.dense.scales": "model.safetensors",
+ "encoder.layer.5.output.dense.weight": "model.safetensors",
+ "encoder.layer.6.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.6.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.6.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.6.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.6.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.6.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.6.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.6.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.6.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.6.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.6.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.6.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.6.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.6.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.6.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.6.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.6.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.6.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.6.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.6.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.6.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.6.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.6.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.6.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.6.output.dense.bias": "model.safetensors",
+ "encoder.layer.6.output.dense.biases": "model.safetensors",
+ "encoder.layer.6.output.dense.scales": "model.safetensors",
+ "encoder.layer.6.output.dense.weight": "model.safetensors",
+ "encoder.layer.7.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.7.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.7.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.7.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.7.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.7.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.7.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.7.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.7.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.7.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.7.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.7.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.7.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.7.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.7.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.7.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.7.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.7.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.7.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.7.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.7.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.7.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.7.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.7.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.7.output.dense.bias": "model.safetensors",
+ "encoder.layer.7.output.dense.biases": "model.safetensors",
+ "encoder.layer.7.output.dense.scales": "model.safetensors",
+ "encoder.layer.7.output.dense.weight": "model.safetensors",
+ "encoder.layer.8.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.8.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.8.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.8.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.8.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.8.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.8.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.8.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.8.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.8.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.8.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.8.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.8.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.8.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.8.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.8.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.8.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.8.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.8.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.8.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.8.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.8.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.8.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.8.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.8.output.dense.bias": "model.safetensors",
+ "encoder.layer.8.output.dense.biases": "model.safetensors",
+ "encoder.layer.8.output.dense.scales": "model.safetensors",
+ "encoder.layer.8.output.dense.weight": "model.safetensors",
+ "encoder.layer.9.attention.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.9.attention.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.9.attention.output.dense.bias": "model.safetensors",
+ "encoder.layer.9.attention.output.dense.biases": "model.safetensors",
+ "encoder.layer.9.attention.output.dense.scales": "model.safetensors",
+ "encoder.layer.9.attention.output.dense.weight": "model.safetensors",
+ "encoder.layer.9.attention.self.key.bias": "model.safetensors",
+ "encoder.layer.9.attention.self.key.biases": "model.safetensors",
+ "encoder.layer.9.attention.self.key.scales": "model.safetensors",
+ "encoder.layer.9.attention.self.key.weight": "model.safetensors",
+ "encoder.layer.9.attention.self.query.bias": "model.safetensors",
+ "encoder.layer.9.attention.self.query.biases": "model.safetensors",
+ "encoder.layer.9.attention.self.query.scales": "model.safetensors",
+ "encoder.layer.9.attention.self.query.weight": "model.safetensors",
+ "encoder.layer.9.attention.self.value.bias": "model.safetensors",
+ "encoder.layer.9.attention.self.value.biases": "model.safetensors",
+ "encoder.layer.9.attention.self.value.scales": "model.safetensors",
+ "encoder.layer.9.attention.self.value.weight": "model.safetensors",
+ "encoder.layer.9.intermediate.dense.bias": "model.safetensors",
+ "encoder.layer.9.intermediate.dense.biases": "model.safetensors",
+ "encoder.layer.9.intermediate.dense.scales": "model.safetensors",
+ "encoder.layer.9.intermediate.dense.weight": "model.safetensors",
+ "encoder.layer.9.output.LayerNorm.bias": "model.safetensors",
+ "encoder.layer.9.output.LayerNorm.weight": "model.safetensors",
+ "encoder.layer.9.output.dense.bias": "model.safetensors",
+ "encoder.layer.9.output.dense.biases": "model.safetensors",
+ "encoder.layer.9.output.dense.scales": "model.safetensors",
+ "encoder.layer.9.output.dense.weight": "model.safetensors",
+ "pooler.dense.bias": "model.safetensors",
+ "pooler.dense.biases": "model.safetensors",
+ "pooler.dense.scales": "model.safetensors",
+ "pooler.dense.weight": "model.safetensors"
+ }
+ }
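The map above follows the safetensors index convention: each parameter name points to the shard file that stores it (here a single `model.safetensors`). The `.scales` and `.biases` entries next to each `.weight` are the per-group quantization parameters that MLX saves for quantized linear layers. As a minimal sketch (the inlined excerpt is hypothetical, mirroring a few entries above, not the full index file), grouping such a map by shard looks like:

```python
import json
from collections import defaultdict

# Hypothetical excerpt of a "weight_map" as found in a safetensors
# index file; the names and filename mirror the entries shown above.
index_json = """
{
  "weight_map": {
    "pooler.dense.bias": "model.safetensors",
    "pooler.dense.biases": "model.safetensors",
    "pooler.dense.scales": "model.safetensors",
    "pooler.dense.weight": "model.safetensors"
  }
}
"""

weight_map = json.loads(index_json)["weight_map"]

# Group parameter names by the shard file that stores them.
shards = defaultdict(list)
for name, shard in weight_map.items():
    shards[shard].append(name)

print(sorted(shards))                    # -> ['model.safetensors']
print(len(shards["model.safetensors"]))  # -> 4
```

Loaders use exactly this grouping to open each shard once and read all of its tensors in a single pass.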
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ }
+ ]
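`modules.json` is how sentence-transformers assembles the encode pipeline: modules are applied in `idx` order, so the `Transformer` encoder produces token embeddings and the `Pooling` module (configured under `1_Pooling`) collapses them into one sentence vector. A small sketch of reading that ordering (the JSON is inlined here for illustration; it matches the file above):

```python
import json

# modules.json contents, inlined for illustration.
modules = json.loads("""
[
  {"idx": 0, "name": "0", "path": "",
   "type": "sentence_transformers.models.Transformer"},
  {"idx": 1, "name": "1", "path": "1_Pooling",
   "type": "sentence_transformers.models.Pooling"}
]
""")

# Modules run in idx order: Transformer first, then Pooling.
pipeline = [m["type"].rsplit(".", 1)[-1]
            for m in sorted(modules, key=lambda m: m["idx"])]
print(pipeline)  # -> ['Transformer', 'Pooling']
```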
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 512,
+ "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+ "cls_token": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "mask_token": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,58 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "101": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "102": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "103": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "[CLS]",
+ "do_basic_tokenize": true,
+ "do_lower_case": true,
+ "extra_special_tokens": {},
+ "mask_token": "[MASK]",
+ "model_max_length": 512,
+ "never_split": null,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "strip_accents": null,
+ "tokenize_chinese_chars": true,
+ "tokenizer_class": "BertTokenizer",
+ "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff