---
base_model: BAAI/bge-base-en-v1.5
language:
  - fr
library_name: sentence-transformers
license: apache-2.0
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:47560
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
widget:
  - source_sentence: Qui a écouté le roi Asa et envoyé son armée contre les villes d'Israël?
    sentences:
      - Ben-Hadad.
      - Il se prosterna devant le roi, le visage contre terre.
      - Baescha, fils d'Achija.
  - source_sentence: >-
      Quelle est l'importance de distribuer tous ses biens aux pauvres sans
      charité?
    sentences:
      - Adina, fils de Schiza, était le chef des Rubénites.
      - Distribuer tous ses biens aux pauvres sans charité ne sert à rien.
      - L'Éternel.
  - source_sentence: Qui sont les enfants du père d'Étham?
    sentences:
      - Jizreel, Jischma, Jidbasch et leur sœur Hatselelponi.
      - Chaque division comptait vingt-quatre mille hommes.
      - >-
        Hosa était un fils de Merari et il avait quatre fils: Schimri, Hilkija,
        Thebalia, et Zacharie.
  - source_sentence: Combien de temps Nadab, fils de Jéroboam, a-t-il régné sur Israël?
    sentences:
      - >-
        Ils sont des serviteurs par le moyen desquels les frères ont cru, selon
        que le Seigneur l'a donné à chacun.
      - >-
        Sept fils: Jeusch, Benjamin, Éhud, Kenaana, Zéthan, Tarsis et
        Achischachar, enregistrés au nombre de dix-sept mille deux cents.
      - Deux ans.
  - source_sentence: >-
      Quand les Lévites devaient-ils se présenter pour louer et célébrer
      l'Éternel?
    sentences:
      - Chaque matin et chaque soir.
      - Cinq mille talents d'or et dix mille talents d'argent ont été donnés.
      - Il doit demeurer circoncis.
model-index:
  - name: BGE base bible test
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.13359388879019363
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.18795523183513946
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.21389234322259726
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.25102149582519095
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.13359388879019363
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.06265174394504648
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.04277846864451945
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0251021495825191
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.13359388879019363
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.18795523183513946
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.21389234322259726
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.25102149582519095
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.18816833747648484
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.16858798117458645
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.17400088915411802
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.12773139101083675
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.18546811156510926
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.20572037662106946
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.24213892343222598
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.12773139101083675
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.061822703855036416
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.041144075324213894
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0242138923432226
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.12773139101083675
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.18546811156510926
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.20572037662106946
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.24213892343222598
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.18151482198424093
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.1625760305898876
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.16802226648065993
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.12488896784508793
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.17463137324569195
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.19737075857168235
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.23272339669568307
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.12488896784508793
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.058210457748563975
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.03947415171433648
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.023272339669568307
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.12488896784508793
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.17463137324569195
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.19737075857168235
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.23272339669568307
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.17440736005896854
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.156282728049472
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.16141647615447188
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.10943329188132883
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.15686622845976195
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.17853970509859654
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.20838514833895896
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.10943329188132883
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.052288742819920644
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.03570794101971931
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0208385148338959
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.10943329188132883
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.15686622845976195
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.17853970509859654
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.20838514833895896
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.15566336146326976
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.13917031134121227
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.14405644027137798
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.08935867827322792
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.13181737431160065
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.15011547344110854
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.17694084206786284
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.08935867827322792
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.04393912477053354
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.03002309468822171
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.01769408420678629
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.08935867827322792
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.13181737431160065
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.15011547344110854
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.17694084206786284
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.13031373727839585
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.11575599150656894
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.12066444582998255
            name: Cosine Map@100
---

BGE base bible test

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5 on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • json
  • Language: fr
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
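
Because the final Normalize() module L2-normalizes the CLS-pooled vector, cosine similarity between these embeddings reduces to a plain dot product. A minimal sanity check, assuming the model id from the Usage section below:

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Steve77/bge-base-bible-retrieval")
emb = model.encode(["Qui a écouté le roi Asa?"])

# Normalize() guarantees unit-length rows, so this prints values ~1.0
print(np.linalg.norm(emb, axis=1))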

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Steve77/bge-base-bible-retrieval")
# Run inference
sentences = [
    "Quand les Lévites devaient-ils se présenter pour louer et célébrer l'Éternel?",
    'Chaque matin et chaque soir.',
    "Cinq mille talents d'or et dix mille talents d'argent ont été donnés.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
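
Because the model was trained with MatryoshkaLoss over the dimensions 768, 512, 256, 128 and 64 (see Training Details below), embeddings can be truncated to a smaller dimension for cheaper storage and search at a modest accuracy cost, as the Evaluation tables show. A sketch using the truncate_dim argument available in recent sentence-transformers releases:

from sentence_transformers import SentenceTransformer

# truncate_dim keeps only the first 256 embedding dimensions; the
# Matryoshka training above is what keeps truncated vectors useful.
model_256 = SentenceTransformer("Steve77/bge-base-bible-retrieval", truncate_dim=256)
emb = model_256.encode(["Quand les Lévites devaient-ils se présenter?"])
print(emb.shape)
# (1, 256)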

Evaluation

Metrics

Information Retrieval

Metric dim_768 dim_512 dim_256 dim_128 dim_64
cosine_accuracy@1 0.1336 0.1277 0.1249 0.1094 0.0894
cosine_accuracy@3 0.188 0.1855 0.1746 0.1569 0.1318
cosine_accuracy@5 0.2139 0.2057 0.1974 0.1785 0.1501
cosine_accuracy@10 0.251 0.2421 0.2327 0.2084 0.1769
cosine_precision@1 0.1336 0.1277 0.1249 0.1094 0.0894
cosine_precision@3 0.0627 0.0618 0.0582 0.0523 0.0439
cosine_precision@5 0.0428 0.0411 0.0395 0.0357 0.03
cosine_precision@10 0.0251 0.0242 0.0233 0.0208 0.0177
cosine_recall@1 0.1336 0.1277 0.1249 0.1094 0.0894
cosine_recall@3 0.188 0.1855 0.1746 0.1569 0.1318
cosine_recall@5 0.2139 0.2057 0.1974 0.1785 0.1501
cosine_recall@10 0.251 0.2421 0.2327 0.2084 0.1769
cosine_ndcg@10 0.1882 0.1815 0.1744 0.1557 0.1303
cosine_mrr@10 0.1686 0.1626 0.1563 0.1392 0.1158
cosine_map@100 0.174 0.168 0.1614 0.1441 0.1207
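
These figures come from an information-retrieval evaluation run at each Matryoshka dimension. A sketch of how such numbers are typically computed with InformationRetrievalEvaluator; the queries, corpus and relevant_docs mappings below are hypothetical placeholders, not the actual evaluation data:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Steve77/bge-base-bible-retrieval")

# Hypothetical toy data: query/corpus ids mapped to texts, plus the
# set of relevant corpus ids for each query id.
queries = {"q1": "Combien de temps Nadab, fils de Jéroboam, a-t-il régné sur Israël?"}
corpus = {"d1": "Deux ans.", "d2": "Chaque matin et chaque soir."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_768",
    truncate_dim=768,  # repeat with 512/256/128/64 for the other columns
)
print(evaluator(model))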

Training Details

Training Dataset

json

  • Dataset: json
  • Size: 47,560 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string; min: 8 tokens, mean: 21.23 tokens, max: 45 tokens
    • positive: string; min: 3 tokens, mean: 25.14 tokens, max: 110 tokens
  • Samples:
    • anchor: Quels sont les noms des fils de Schobal? → positive: Aljan, Manahath, Ébal, Schephi et Onam
    • anchor: Quels sont les noms des fils de Tsibeon? → positive: Ajja et Ana
    • anchor: Qui est le fils d'Ana? → positive: Dischon
  • Loss: MatryoshkaLoss with these parameters (a construction sketch in code follows the parameter block):
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
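
For reference, a minimal sketch of how this loss configuration is typically constructed in sentence-transformers, mirroring the parameters above:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# The inner ranking loss is applied at every listed dimension, each
# weighted equally (matryoshka_weights of all 1).
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128, 64])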
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
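
A sketch of how these non-default values map onto SentenceTransformerTrainingArguments; the output_dir is a hypothetical placeholder:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="models/bge-base-bible",  # hypothetical path
    num_train_epochs=3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="epoch",
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)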

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.0538 10 12.8804 - - - - -
0.1076 20 12.4714 - - - - -
0.1615 30 11.8263 - - - - -
0.2153 40 11.014 - - - - -
0.2691 50 10.1609 - - - - -
0.3229 60 10.6807 - - - - -
0.3767 70 9.3215 - - - - -
0.4305 80 10.3719 - - - - -
0.4844 90 9.4147 - - - - -
0.5382 100 9.5567 - - - - -
0.5920 110 8.7699 - - - - -
0.6458 120 9.0428 - - - - -
0.6996 130 9.0977 - - - - -
0.7534 140 8.0843 - - - - -
0.8073 150 8.1363 - - - - -
0.8611 160 7.5306 - - - - -
0.9149 170 7.7972 - - - - -
0.9687 180 7.9644 - - - - -
0.9956 185 - 0.1917 0.1879 0.1784 0.1583 0.1268
1.0225 190 7.6124 - - - - -
1.0764 200 6.6315 - - - - -
1.1302 210 7.2313 - - - - -
1.1840 220 6.5394 - - - - -
1.2378 230 6.7843 - - - - -
1.2916 240 6.9276 - - - - -
1.3454 250 7.2281 - - - - -
1.3993 260 6.9158 - - - - -
1.4531 270 6.5158 - - - - -
1.5069 280 6.916 - - - - -
1.5607 290 6.5717 - - - - -
1.6145 300 6.9225 - - - - -
1.6683 310 7.3981 - - - - -
1.7222 320 6.894 - - - - -
1.7760 330 6.0293 - - - - -
1.8298 340 5.9389 - - - - -
1.8836 350 5.959 - - - - -
1.9374 360 6.4268 - - - - -
1.9913 370 6.7366 - - - - -
1.9966 371 - 0.2012 0.1965 0.1862 0.1633 0.1361
2.0451 380 5.7871 - - - - -
2.0989 390 5.7358 - - - - -
2.1527 400 6.0964 - - - - -
2.2065 410 5.8331 - - - - -
2.2603 420 5.6152 - - - - -
2.3142 430 6.5018 - - - - -
2.3680 440 5.9798 - - - - -
2.4218 450 6.0598 - - - - -
2.4756 460 5.8222 - - - - -
2.5294 470 6.303 - - - - -
2.5832 480 5.9648 - - - - -
2.6371 490 6.415 - - - - -
2.6909 500 7.084 - - - - -
2.7447 510 5.692 - - - - -
2.7985 520 5.7706 - - - - -
2.8523 530 5.6943 - - - - -
2.9062 540 5.6817 - - - - -
2.9600 550 6.1265 - - - - -
2.9869 555 - 0.1882 0.1815 0.1744 0.1557 0.1303
  • The saved checkpoint corresponds to the final row (epoch 2.9869, step 555), whose ndcg@10 values match the metrics reported in the Evaluation section above.

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 3.3.1
  • Transformers: 4.45.2
  • PyTorch: 2.5.1
  • Accelerate: 1.2.1
  • Datasets: 2.19.1
  • Tokenizers: 0.20.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}