bwang0911 committed on
Commit e561a49 · verified · 1 Parent(s): 8e31e80

Update README.md

Files changed (1):
  1. README.md +23 -8
README.md CHANGED
@@ -304,14 +304,29 @@ Additionally, we provide the following embedding models, you can also use them f
 Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
 
 ```
-@misc{jha2024jinacolbertv2generalpurposemultilinguallate,
-      title={Jina-ColBERT-v2: A General-Purpose Multilingual Late Interaction Retriever},
-      author={Rohan Jha and Bo Wang and Michael Günther and Saba Sturua and Mohammad Kalim Akram and Han Xiao},
-      year={2024},
-      eprint={2408.16672},
-      archivePrefix={arXiv},
-      primaryClass={cs.IR},
-      url={https://arxiv.org/abs/2408.16672},
+@inproceedings{xiao-etal-2024-jina,
+    title = "{J}ina-{C}ol{BERT}-v2: A General-Purpose Multilingual Late Interaction Retriever",
+    author = {Jha, Rohan and
+      Wang, Bo and
+      G{\"u}nther, Michael and
+      Mastrapas, Georgios and
+      Sturua, Saba and
+      Mohr, Isabelle and
+      Koukounas, Andreas and
+      Wang, Mohammad Kalim and
+      Wang, Nan and
+      Xiao, Han},
+    editor = {S{\"a}lev{\"a}, Jonne and
+      Owodunni, Abraham},
+    booktitle = "Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024)",
+    month = nov,
+    year = "2024",
+    address = "Miami, Florida, USA",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2024.mrl-1.11/",
+    doi = "10.18653/v1/2024.mrl-1.11",
+    pages = "159--166",
+    abstract = "Multi-vector dense models, such as ColBERT, have proven highly effective in information retrieval. ColBERT's late interaction scoring approximates the joint query-document attention seen in cross-encoders while maintaining inference efficiency closer to traditional dense retrieval models, thanks to its bi-encoder architecture and recent optimizations in indexing and search. In this paper, we introduce a novel architecture and a training framework to support long context window and multilingual retrieval. Leveraging Matryoshka Representation Loss, we further demonstrate that reducing the embedding dimensionality from 128 to 64 has an insignificant impact on the model's retrieval performance and cuts storage requirements by up to 50{\%}. Our new model, Jina-ColBERT-v2, demonstrates strong performance across a range of English and multilingual retrieval tasks,"
 }
 ```
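
The new abstract mentions late interaction scoring and Matryoshka-style dimensionality reduction. A minimal sketch of both ideas, assuming random NumPy arrays stand in for real token embeddings (the `maxsim_score` helper and all shapes here are illustrative, not part of the model's API):

```python
import numpy as np

def maxsim_score(query_emb: np.ndarray, doc_emb: np.ndarray) -> float:
    """ColBERT-style late interaction: for each query token, take the
    maximum cosine similarity over all document tokens, then sum over
    the query tokens."""
    # Normalize rows so dot products become cosine similarities.
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    d = doc_emb / np.linalg.norm(doc_emb, axis=1, keepdims=True)
    sim = q @ d.T  # shape: (num_query_tokens, num_doc_tokens)
    return float(sim.max(axis=1).sum())

# Hypothetical embeddings: 8 query tokens, 40 document tokens, 128 dims.
rng = np.random.default_rng(0)
query = rng.normal(size=(8, 128))
doc = rng.normal(size=(40, 128))

print(maxsim_score(query, doc))  # score over full 128-dim vectors

# Matryoshka-style truncation: keep only the first 64 dimensions
# (the abstract reports a negligible drop in retrieval quality).
print(maxsim_score(query[:, :64], doc[:, :64]))
```

Truncating each token vector to its first 64 dimensions halves per-token storage, which is the up-to-50% saving the abstract refers to.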