louisbrulenaudet committed on
Commit 9327588 · verified · 1 Parent(s): 0c7df31

Update README.md

Files changed (1): README.md (+17 -2)
README.md CHANGED
@@ -349,7 +349,7 @@ language:
 ---
 <img src="assets/thumbnail.webp">
 
-# SentenceTransformer based on intfloat/multilingual-e5-large
+# Lemone-Embed: A Series of Fine-Tuned Embedding Models for French Taxation
 
 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
 
@@ -678,4 +678,19 @@ Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codec
 publisher = "Association for Computational Linguistics",
 url = "https://arxiv.org/abs/1908.10084",
 }
-```
+```
+
+If you use this code in your research, please use the following BibTeX entry.
+
+```BibTeX
+@misc{louisbrulenaudet2024,
+  author = {Louis Brulé Naudet},
+  title = {Lemone-Embed: A Series of Fine-Tuned Embedding Models for French Taxation},
+  year = {2024},
+  howpublished = {\url{https://huggingface.co/datasets/louisbrulenaudet/lemone-embed-l-boost}},
+}
+```
+
+## Feedback
+
+If you have any feedback, please reach out at [[email protected]](mailto:[email protected]).
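The updated card describes a sentence-transformers model that maps text into a 1024-dimensional dense vector space for semantic search and similarity. As a minimal sketch of how such embeddings are typically scored, here is a cosine-similarity example in NumPy; the random vectors and the names `query`, `passages`, and `cosine_similarity` are illustrative stand-ins, not part of the model's API — only the dimensionality (1024) comes from the card.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 1024  # embedding size stated in the model card

# Random vectors standing in for real model outputs:
# one query embedding and three passage embeddings.
query = rng.normal(size=dim)
passages = rng.normal(size=(3, dim))

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Cosine similarity between vector `a` and each row of matrix `b`."""
    a_norm = a / np.linalg.norm(a)
    b_norm = b / np.linalg.norm(b, axis=1, keepdims=True)
    return b_norm @ a_norm

scores = cosine_similarity(query, passages)   # shape (3,), values in [-1, 1]
best = int(np.argmax(scores))                 # index of the most similar passage
```

With the real model, the vectors would instead come from `SentenceTransformer(...).encode(...)`; note that per the base model's documentation, E5-family models expect `query: ` / `passage: ` prefixes on their inputs.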