SentenceTransformer based on sentence-transformers/paraphrase-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/paraphrase-MiniLM-L6-v2 on the en-pt-br, en-es and en-pt datasets. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/paraphrase-MiniLM-L6-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets: en-pt-br, en-es, en-pt
  • Languages: en, es, pt
  • Parameters: 22.7M (F32)

Model Sources

  • Model page: https://huggingface.co/jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
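
Because the final Normalize() module L2-normalizes every embedding, cosine similarity between two embeddings reduces to a plain dot product. A minimal sketch verifying this (the sentence pair is illustrative):

from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
emb = model.encode(["We now call this place home.", "Agora é aqui a nossa casa."])

print(np.linalg.norm(emb, axis=1))  # both ~1.0: embeddings are unit-length
print(float(emb[0] @ emb[1]))       # dot product equals cosine similarity here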

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
# Run inference
sentences = [
    'We now call this place home.',
    'Moramos ali. Agora é aqui a nossa casa.',
    'É mais fácil do que se possa imaginar.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
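
As noted above, the embeddings also support cross-lingual semantic search. A minimal sketch that ranks Portuguese candidates against an English query, reusing the model loaded above (the query and corpus sentences are illustrative):

# Cross-lingual semantic search: rank Portuguese candidates for an English query
query_embedding = model.encode(["Where is our home now?"])
corpus_embeddings = model.encode([
    "Moramos ali. Agora é aqui a nossa casa.",
    "É mais fácil do que se possa imaginar.",
])
scores = model.similarity(query_embedding, corpus_embeddings)  # [1, 2] cosine scores
best = scores.argmax().item()
print(best, scores[0, best].item())  # index and score of the closest candidate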

Evaluation

Metrics

Knowledge Distillation

  • Datasets: en-pt-br, en-es and en-pt
  • Evaluated with MSEEvaluator
Metric         en-pt-br   en-es     en-pt
negative_mse   -4.0617    -4.2473   -4.2555
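
negative_mse is the negated mean squared error between the student's embeddings of the translations and the teacher's embeddings of the English sources (scaled by 100 by MSEEvaluator), so values closer to 0 are better. A minimal sketch of how it is computed, assuming the base model serves as the teacher (the sentence pair is illustrative):

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import MSEEvaluator

student = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
teacher = SentenceTransformer("sentence-transformers/paraphrase-MiniLM-L6-v2")  # assumed teacher

mse_evaluator = MSEEvaluator(
    source_sentences=["We now call this place home."],  # encoded by the teacher
    target_sentences=["Agora é aqui a nossa casa."],    # encoded by the student
    teacher_model=teacher,
    name="en-pt",
)
print(mse_evaluator(student))  # e.g. {'en-pt_negative_mse': ...}; closer to 0 is better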

Translation

Metric             en-pt-br   en-es   en-pt
src2trg_accuracy   0.9859     0.908   0.8951
trg2src_accuracy   0.9808     0.898   0.8824
mean_accuracy      0.9834     0.903   0.8888
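
src2trg_accuracy is the fraction of English sentences whose nearest neighbour (by cosine similarity) among all candidate translations is the correct one; trg2src_accuracy is the reverse direction, and mean_accuracy averages the two. A minimal sketch of the src2trg computation, mirroring sentence-transformers' TranslationEvaluator (the pairs are illustrative):

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
src = model.encode(["Thank you so much, Chris.", "We now call this place home."])
trg = model.encode(["Muito obrigado, Chris.", "Agora é aqui a nossa casa."])

sims = model.similarity(src, trg).numpy()  # [n_src, n_trg] cosine matrix
src2trg = (sims.argmax(axis=1) == np.arange(len(src))).mean()
print(src2trg)  # 1.0 when every source retrieves its own translation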

Semantic Similarity

  • Dataset: sts17-es-en-test

Metric            Value
pearson_cosine    0.7714
spearman_cosine   0.7862
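
These correlations compare the cosine similarity of sentence pairs against human-annotated gold scores. A minimal sketch using EmbeddingSimilarityEvaluator, with illustrative pairs and gold scores normalized to [0, 1]:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
sts_evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["We now call this place home.", "Thank you so much, Chris."],
    sentences2=["Agora é aqui a nossa casa.", "É mais fácil do que se possa imaginar."],
    scores=[0.9, 0.1],  # illustrative gold similarities in [0, 1]
    name="sts17-es-en-test",
)
print(sts_evaluator(model))  # includes pearson_cosine and spearman_cosine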

Training Details

Training Datasets

en-pt-br

  • Dataset: en-pt-br at 0c70bc6
  • Size: 405,807 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 23.98 tokens, max: 128 tokens
    non_english  string  min: 6 tokens, mean: 36.86 tokens, max: 128 tokens
    label        list    size: 384 elements

  • Samples:

    english:     And then there are certain conceptual things that can also benefit from hand calculating, but I think they're relatively small in number.
    non_english: E também existem alguns aspectos conceituais que também podem se beneficiar do cálculo manual, mas eu acho que eles são relativamente poucos.
    label:       [-0.2655501961708069, 0.2715710997581482, 0.13977409899234772, 0.007375418208539486, -0.09395705163478851, ...]

    english:     One thing I often ask about is ancient Greek and how this relates.
    non_english: Uma coisa sobre a qual eu pergunto com frequencia é grego antigo e como ele se relaciona a isto.
    label:       [0.34961527585983276, -0.01806497573852539, 0.06103038787841797, 0.11750973761081696, -0.34720802307128906, ...]

    english:     See, the thing we're doing right now is we're forcing people to learn mathematics.
    non_english: Vejam, o que estamos fazendo agora, é que estamos forçando as pessoas a aprender matemática.
    label:       [0.031645823270082474, -0.1787087768316269, -0.30170342326164246, 0.1304805874824524, -0.29176947474479675, ...]
  • Loss: MSELoss

en-es

  • Dataset: en-es
  • Size: 6,889,042 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 24.04 tokens, max: 128 tokens
    non_english  string  min: 5 tokens, mean: 35.11 tokens, max: 128 tokens
    label        list    size: 384 elements

  • Samples:

    english:     And then there are certain conceptual things that can also benefit from hand calculating, but I think they're relatively small in number.
    non_english: Y luego hay ciertas aspectos conceptuales que pueden beneficiarse del cálculo a mano pero creo que son relativamente pocos.
    label:       [-0.2655501961708069, 0.2715710997581482, 0.13977409899234772, 0.007375418208539486, -0.09395705163478851, ...]

    english:     One thing I often ask about is ancient Greek and how this relates.
    non_english: Algo que pregunto a menudo es sobre el griego antiguo y cómo se relaciona.
    label:       [0.34961527585983276, -0.01806497573852539, 0.06103038787841797, 0.11750973761081696, -0.34720802307128906, ...]

    english:     See, the thing we're doing right now is we're forcing people to learn mathematics.
    non_english: Vean, lo que estamos haciendo ahora es forzar a la gente a aprender matemáticas.
    label:       [0.031645823270082474, -0.1787087768316269, -0.30170342326164246, 0.1304805874824524, -0.29176947474479675, ...]
  • Loss: MSELoss

en-pt

  • Dataset: en-pt
  • Size: 6,636,095 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 23.5 tokens, max: 128 tokens
    non_english  string  min: 5 tokens, mean: 35.23 tokens, max: 128 tokens
    label        list    size: 384 elements

  • Samples:

    english:     And the country that does this first will, in my view, leapfrog others in achieving a new economy even, an improved economy, an improved outlook.
    non_english: E o país que fizer isto primeiro vai, na minha opinião, ultrapassar outros em alcançar uma nova economia até uma economia melhorada, uma visão melhorada.
    label:       [-0.1395619511604309, -0.1703503578901291, 0.21396367251873016, -0.29212212562561035, 0.2718254327774048, ...]

    english:     In fact, I even talk about us moving from what we often call now the "knowledge economy" to what we might call a "computational knowledge economy," where high-level math is integral to what everyone does in the way that knowledge currently is.
    non_english: De facto, eu até falo de mudarmos do que chamamos hoje a economia do conhecimento para o que poderemos chamar a economia do conhecimento computacional, onde a matemática de alto nível está integrada no que toda a gente faz da forma que o conhecimento actualmente está.
    label:       [-0.002996142255142331, -0.34310653805732727, -0.09672430157661438, 0.23709852993488312, -0.013354267925024033, ...]

    english:     We can engage so many more students with this, and they can have a better time doing it.
    non_english: Podemos cativar tantos mais estudantes com isto, e eles podem divertir-se mais a fazê-lo.
    label:       [0.2670706808567047, 0.09549400955438614, -0.17057836055755615, -0.2152799665927887, -0.2832679748535156, ...]
  • Loss: MSELoss
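
With MSELoss, the label column holds the teacher's 384-dimensional embedding of the English sentence, and the student is trained to reproduce that vector for both the English text and its translation (the knowledge-distillation recipe of Reimers & Gurevych, 2020). A minimal sketch of this setup, assuming the base model as teacher and a single illustrative pair:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MSELoss

teacher = SentenceTransformer("sentence-transformers/paraphrase-MiniLM-L6-v2")  # assumed teacher
student = SentenceTransformer("sentence-transformers/paraphrase-MiniLM-L6-v2")

english = ["We now call this place home."]
non_english = ["Agora é aqui a nossa casa."]
labels = teacher.encode(english).tolist()  # 384-dim teacher embeddings

train_dataset = Dataset.from_dict(
    {"english": english, "non_english": non_english, "label": labels}
)
trainer = SentenceTransformerTrainer(
    model=student,
    train_dataset=train_dataset,
    loss=MSELoss(student),  # pulls both text columns toward the teacher embedding
)
trainer.train()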

Evaluation Datasets

en-pt-br

  • Dataset: en-pt-br at 0c70bc6
  • Size: 992 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 992 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 24.37 tokens, max: 128 tokens
    non_english  string  min: 5 tokens, mean: 38.6 tokens, max: 128 tokens
    label        list    size: 384 elements

  • Samples:

    english:     Thank you so much, Chris.
    non_english: Muito obrigado, Chris.
    label:       [-0.1929965764284134, 0.051721055060625076, 0.3780047297477722, -0.20386895537376404, -0.2625442445278168, ...]

    english:     And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful.
    non_english: É realmente uma grande honra ter a oportunidade de estar neste palco pela segunda vez. Estou muito agradecido.
    label:       [0.04667849838733673, 0.16640479862689972, 0.05405835807323456, -0.2507464587688446, -0.5305444002151489, ...]

    english:     I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night.
    non_english: Eu fui muito aplaudido por esta conferência e quero agradecer a todos pelos muitos comentários delicados sobre o que eu tinha a dizer naquela noite.
    label:       [0.04410325363278389, 0.2660813629627228, -0.013608227483928204, 0.08376947790384293, 0.22691071033477783, ...]
  • Loss: MSELoss

en-es

  • Dataset: en-es
  • Size: 9,990 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 24.39 tokens, max: 128 tokens
    non_english  string  min: 4 tokens, mean: 36.38 tokens, max: 128 tokens
    label        list    size: 384 elements

  • Samples:

    english:     Thank you so much, Chris.
    non_english: Muchas gracias Chris.
    label:       [-0.19299663603305817, 0.051721103489398956, 0.37800467014312744, -0.20386885106563568, -0.2625444531440735, ...]

    english:     And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful.
    non_english: Y es en verdad un gran honor tener la oportunidad de venir a este escenario por segunda vez. Estoy extremadamente agradecido.
    label:       [0.04667845368385315, 0.16640479862689972, 0.05405828729271889, -0.25074639916419983, -0.5305443406105042, ...]

    english:     I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night.
    non_english: He quedado conmovido por esta conferencia, y deseo agradecer a todos ustedes sus amables comentarios acerca de lo que tenía que decir la otra noche.
    label:       [0.04410335421562195, 0.2660813629627228, -0.01360794436186552, 0.08376938849687576, 0.22691065073013306, ...]
  • Loss: MSELoss

en-pt

  • Dataset: en-pt
  • Size: 9,992 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    Column       Type    Details
    english      string  min: 4 tokens, mean: 23.82 tokens, max: 128 tokens
    non_english  string  min: 5 tokens, mean: 36.7 tokens, max: 128 tokens
    label        list    size: 384 elements

  • Samples:

    english:     Thank you so much, Chris.
    non_english: Muito obrigado, Chris.
    label:       [-0.19299663603305817, 0.051721103489398956, 0.37800467014312744, -0.20386885106563568, -0.2625444531440735, ...]

    english:     And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful.
    non_english: É realmente uma grande honra ter a oportunidade de pisar este palco pela segunda vez. Estou muito agradecido.
    label:       [0.04667849838733673, 0.16640479862689972, 0.05405835807323456, -0.2507464587688446, -0.5305444002151489, ...]

    english:     I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night.
    non_english: Fiquei muito impressionado com esta conferência e quero agradecer a todos os imensos comentários simpáticos sobre o que eu tinha a dizer naquela noite.
    label:       [0.04410335421562195, 0.2660813629627228, -0.01360794436186552, 0.08376938849687576, 0.22691065073013306, ...]
  • Loss: MSELoss

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • gradient_accumulation_steps: 8
  • num_train_epochs: 6
  • warmup_ratio: 0.15
  • bf16: True
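
With gradient accumulation, each optimizer step effectively processes 128 × 8 = 1,024 sentence pairs per device.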

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 8
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 6
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.15
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss en-pt-br loss en-es loss en-pt loss en-pt-br_negative_mse en-pt-br_mean_accuracy en-es_negative_mse en-es_mean_accuracy sts17-es-en-test_spearman_cosine en-pt_negative_mse en-pt_mean_accuracy
0.0074 100 0.0512 - - - - - - - - - -
0.0147 200 0.0505 - - - - - - - - - -
0.0221 300 0.0496 - - - - - - - - - -
0.0294 400 0.0489 - - - - - - - - - -
0.0368 500 0.0483 - - - - - - - - - -
0.0441 600 0.0479 - - - - - - - - - -
0.0515 700 0.0476 - - - - - - - - - -
0.0588 800 0.0474 - - - - - - - - - -
0.0662 900 0.0471 - - - - - - - - - -
0.0735 1000 0.0469 - - - - - - - - - -
0.0809 1100 0.0467 - - - - - - - - - -
0.0882 1200 0.0464 - - - - - - - - - -
0.0956 1300 0.0461 - - - - - - - - - -
0.1029 1400 0.046 - - - - - - - - - -
0.1103 1500 0.0458 - - - - - - - - - -
0.1176 1600 0.0456 - - - - - - - - - -
0.1250 1700 0.0455 - - - - - - - - - -
0.1323 1800 0.0454 - - - - - - - - - -
0.1397 1900 0.0452 - - - - - - - - - -
0.1470 2000 0.0452 0.0441 0.0454 0.0455 -4.785339 0.5978 -4.9081144 0.5252 0.2460 -4.929552 0.4744
0.1544 2100 0.0449 - - - - - - - - - -
0.1617 2200 0.0449 - - - - - - - - - -
0.1691 2300 0.0448 - - - - - - - - - -
0.1764 2400 0.0447 - - - - - - - - - -
0.1838 2500 0.0446 - - - - - - - - - -
0.1911 2600 0.0445 - - - - - - - - - -
0.1985 2700 0.0443 - - - - - - - - - -
0.2058 2800 0.0443 - - - - - - - - - -
0.2132 2900 0.0442 - - - - - - - - - -
0.2205 3000 0.0441 - - - - - - - - - -
0.2279 3100 0.0441 - - - - - - - - - -
0.2352 3200 0.0439 - - - - - - - - - -
0.2426 3300 0.0439 - - - - - - - - - -
0.2499 3400 0.0439 - - - - - - - - - -
0.2573 3500 0.0438 - - - - - - - - - -
0.2646 3600 0.0437 - - - - - - - - - -
0.2720 3700 0.0436 - - - - - - - - - -
0.2793 3800 0.0435 - - - - - - - - - -
0.2867 3900 0.0436 - - - - - - - - - -
0.2940 4000 0.0435 0.0424 0.0437 0.0438 -4.5627975 0.8054 -4.6922235 0.7096 0.3575 -4.715491 0.6680
0.3014 4100 0.0434 - - - - - - - - - -
0.3087 4200 0.0432 - - - - - - - - - -
0.3161 4300 0.0433 - - - - - - - - - -
0.3234 4400 0.0432 - - - - - - - - - -
0.3308 4500 0.0432 - - - - - - - - - -
0.3381 4600 0.0431 - - - - - - - - - -
0.3455 4700 0.0431 - - - - - - - - - -
0.3528 4800 0.043 - - - - - - - - - -
0.3602 4900 0.043 - - - - - - - - - -
0.3675 5000 0.043 - - - - - - - - - -
0.3749 5100 0.0429 - - - - - - - - - -
0.3822 5200 0.0429 - - - - - - - - - -
0.3896 5300 0.0427 - - - - - - - - - -
0.3969 5400 0.0428 - - - - - - - - - -
0.4043 5500 0.0428 - - - - - - - - - -
0.4116 5600 0.0427 - - - - - - - - - -
0.4190 5700 0.0426 - - - - - - - - - -
0.4263 5800 0.0427 - - - - - - - - - -
0.4337 5900 0.0427 - - - - - - - - - -
0.4410 6000 0.0425 0.0414 0.0428 0.0429 -4.415331 0.8997 -4.566894 0.7969 0.4509 -4.5856175 0.7684
0.4484 6100 0.0425 - - - - - - - - - -
0.4557 6200 0.0425 - - - - - - - - - -
0.4631 6300 0.0426 - - - - - - - - - -
0.4704 6400 0.0424 - - - - - - - - - -
0.4778 6500 0.0424 - - - - - - - - - -
0.4851 6600 0.0424 - - - - - - - - - -
0.4925 6700 0.0423 - - - - - - - - - -
0.4998 6800 0.0425 - - - - - - - - - -
0.5072 6900 0.0423 - - - - - - - - - -
0.5145 7000 0.0422 - - - - - - - - - -
0.5219 7100 0.0423 - - - - - - - - - -
0.5292 7200 0.0422 - - - - - - - - - -
0.5366 7300 0.0422 - - - - - - - - - -
0.5439 7400 0.0422 - - - - - - - - - -
0.5513 7500 0.0422 - - - - - - - - - -
0.5586 7600 0.0421 - - - - - - - - - -
0.5660 7700 0.0421 - - - - - - - - - -
0.5733 7800 0.0421 - - - - - - - - - -
0.5807 7900 0.0421 - - - - - - - - - -
0.5880 8000 0.0421 0.0409 0.0423 0.0424 -4.325846 0.9400 -4.488157 0.8366 0.5334 -4.5032544 0.8170
0.5954 8100 0.0421 - - - - - - - - - -
0.6027 8200 0.042 - - - - - - - - - -
0.6101 8300 0.042 - - - - - - - - - -
0.6174 8400 0.042 - - - - - - - - - -
0.6248 8500 0.0421 - - - - - - - - - -
0.6321 8600 0.0419 - - - - - - - - - -
0.6395 8700 0.042 - - - - - - - - - -
0.6468 8800 0.0419 - - - - - - - - - -
0.6542 8900 0.0419 - - - - - - - - - -
0.6615 9000 0.0419 - - - - - - - - - -
0.6689 9100 0.0418 - - - - - - - - - -
0.6762 9200 0.0418 - - - - - - - - - -
0.6836 9300 0.0418 - - - - - - - - - -
0.6909 9400 0.0418 - - - - - - - - - -
0.6983 9500 0.0418 - - - - - - - - - -
0.7056 9600 0.0417 - - - - - - - - - -
0.7130 9700 0.0417 - - - - - - - - - -
0.7203 9800 0.0417 - - - - - - - - - -
0.7277 9900 0.0417 - - - - - - - - - -
0.7350 10000 0.0417 0.0405 0.0420 0.0420 -4.2650604 0.9526 -4.434938 0.8587 0.6200 -4.4468904 0.8390
0.7424 10100 0.0416 - - - - - - - - - -
0.7497 10200 0.0417 - - - - - - - - - -
0.7571 10300 0.0416 - - - - - - - - - -
0.7644 10400 0.0416 - - - - - - - - - -
0.7718 10500 0.0416 - - - - - - - - - -
0.7791 10600 0.0416 - - - - - - - - - -
0.7865 10700 0.0416 - - - - - - - - - -
0.7938 10800 0.0416 - - - - - - - - - -
0.8012 10900 0.0415 - - - - - - - - - -
0.8085 11000 0.0416 - - - - - - - - - -
0.8159 11100 0.0415 - - - - - - - - - -
0.8232 11200 0.0415 - - - - - - - - - -
0.8306 11300 0.0415 - - - - - - - - - -
0.8380 11400 0.0415 - - - - - - - - - -
0.8453 11500 0.0415 - - - - - - - - - -
0.8527 11600 0.0415 - - - - - - - - - -
0.8600 11700 0.0414 - - - - - - - - - -
0.8674 11800 0.0414 - - - - - - - - - -
0.8747 11900 0.0414 - - - - - - - - - -
0.8821 12000 0.0414 0.0402 0.0417 0.0417 -4.2234926 0.9637 -4.3952775 0.8716 0.6684 -4.404395 0.8526
0.8894 12100 0.0414 - - - - - - - - - -
0.8968 12200 0.0414 - - - - - - - - - -
0.9041 12300 0.0414 - - - - - - - - - -
0.9115 12400 0.0414 - - - - - - - - - -
0.9188 12500 0.0413 - - - - - - - - - -
0.9262 12600 0.0413 - - - - - - - - - -
0.9335 12700 0.0414 - - - - - - - - - -
0.9409 12800 0.0414 - - - - - - - - - -
0.9482 12900 0.0413 - - - - - - - - - -
0.9556 13000 0.0413 - - - - - - - - - -
0.9629 13100 0.0413 - - - - - - - - - -
0.9703 13200 0.0413 - - - - - - - - - -
0.9776 13300 0.0413 - - - - - - - - - -
0.9850 13400 0.0413 - - - - - - - - - -
0.9923 13500 0.0412 - - - - - - - - - -
0.9997 13600 0.0413 - - - - - - - - - -
1.0070 13700 0.0413 - - - - - - - - - -
1.0144 13800 0.0413 - - - - - - - - - -
1.0217 13900 0.0412 - - - - - - - - - -
1.0291 14000 0.0412 0.0400 0.0415 0.0416 -4.194809 0.9682 -4.36698 0.8798 0.6892 -4.37619 0.8621
1.0364 14100 0.0412 - - - - - - - - - -
1.0438 14200 0.0412 - - - - - - - - - -
1.0511 14300 0.0412 - - - - - - - - - -
1.0585 14400 0.0412 - - - - - - - - - -
1.0658 14500 0.0412 - - - - - - - - - -
1.0732 14600 0.0412 - - - - - - - - - -
1.0805 14700 0.0412 - - - - - - - - - -
1.0879 14800 0.0412 - - - - - - - - - -
1.0952 14900 0.0411 - - - - - - - - - -
1.1026 15000 0.0411 - - - - - - - - - -
1.1099 15100 0.0411 - - - - - - - - - -
1.1173 15200 0.0411 - - - - - - - - - -
1.1246 15300 0.0411 - - - - - - - - - -
1.1320 15400 0.0411 - - - - - - - - - -
1.1393 15500 0.041 - - - - - - - - - -
1.1467 15600 0.0412 - - - - - - - - - -
1.1540 15700 0.041 - - - - - - - - - -
1.1614 15800 0.0411 - - - - - - - - - -
1.1687 15900 0.0412 - - - - - - - - - -
1.1761 16000 0.0411 0.0399 0.0414 0.0414 -4.1725326 0.9728 -4.347921 0.8845 0.7072 -4.3567324 0.8679
1.1834 16100 0.0411 - - - - - - - - - -
1.1908 16200 0.0411 - - - - - - - - - -
1.1981 16300 0.041 - - - - - - - - - -
1.2055 16400 0.041 - - - - - - - - - -
1.2128 16500 0.0411 - - - - - - - - - -
1.2202 16600 0.041 - - - - - - - - - -
1.2275 16700 0.0411 - - - - - - - - - -
1.2349 16800 0.041 - - - - - - - - - -
1.2422 16900 0.041 - - - - - - - - - -
1.2496 17000 0.041 - - - - - - - - - -
1.2569 17100 0.041 - - - - - - - - - -
1.2643 17200 0.041 - - - - - - - - - -
1.2716 17300 0.041 - - - - - - - - - -
1.2790 17400 0.041 - - - - - - - - - -
1.2863 17500 0.041 - - - - - - - - - -
1.2937 17600 0.041 - - - - - - - - - -
1.3010 17700 0.041 - - - - - - - - - -
1.3084 17800 0.0409 - - - - - - - - - -
1.3157 17900 0.041 - - - - - - - - - -
1.3231 18000 0.041 0.0398 0.0413 0.0413 -4.156324 0.9733 -4.3334045 0.8877 0.7126 -4.3421884 0.8721
1.3304 18100 0.041 - - - - - - - - - -
1.3378 18200 0.0409 - - - - - - - - - -
1.3451 18300 0.0409 - - - - - - - - - -
1.3525 18400 0.0409 - - - - - - - - - -
1.3598 18500 0.0409 - - - - - - - - - -
1.3672 18600 0.0409 - - - - - - - - - -
1.3745 18700 0.041 - - - - - - - - - -
1.3819 18800 0.0409 - - - - - - - - - -
1.3892 18900 0.0408 - - - - - - - - - -
1.3966 19000 0.0409 - - - - - - - - - -
1.4039 19100 0.0409 - - - - - - - - - -
1.4113 19200 0.0409 - - - - - - - - - -
1.4186 19300 0.0408 - - - - - - - - - -
1.4260 19400 0.0409 - - - - - - - - - -
1.4333 19500 0.0409 - - - - - - - - - -
1.4407 19600 0.0408 - - - - - - - - - -
1.4480 19700 0.0409 - - - - - - - - - -
1.4554 19800 0.0409 - - - - - - - - - -
1.4627 19900 0.0409 - - - - - - - - - -
1.4701 20000 0.0409 0.0397 0.0413 0.0413 -4.1426473 0.9763 -4.321089 0.8898 0.7271 -4.3293304 0.8749
1.4774 20100 0.0408 - - - - - - - - - -
1.4848 20200 0.0409 - - - - - - - - - -
1.4921 20300 0.0409 - - - - - - - - - -
1.4995 20400 0.0409 - - - - - - - - - -
1.5068 20500 0.0409 - - - - - - - - - -
1.5142 20600 0.0408 - - - - - - - - - -
1.5215 20700 0.0409 - - - - - - - - - -
1.5289 20800 0.0408 - - - - - - - - - -
1.5362 20900 0.0409 - - - - - - - - - -
1.5436 21000 0.0409 - - - - - - - - - -
1.5509 21100 0.0409 - - - - - - - - - -
1.5583 21200 0.0408 - - - - - - - - - -
1.5656 21300 0.0408 - - - - - - - - - -
1.5730 21400 0.0408 - - - - - - - - - -
1.5803 21500 0.0408 - - - - - - - - - -
1.5877 21600 0.0409 - - - - - - - - - -
1.5950 21700 0.0409 - - - - - - - - - -
1.6024 21800 0.0408 - - - - - - - - - -
1.6097 21900 0.0409 - - - - - - - - - -
1.6171 22000 0.0408 0.0397 0.0412 0.0412 -4.1324744 0.9768 -4.312636 0.8920 0.7339 -4.320667 0.8767
1.6244 22100 0.0409 - - - - - - - - - -
1.6318 22200 0.0408 - - - - - - - - - -
1.6391 22300 0.0409 - - - - - - - - - -
1.6465 22400 0.0408 - - - - - - - - - -
1.6538 22500 0.0408 - - - - - - - - - -
1.6612 22600 0.0408 - - - - - - - - - -
1.6686 22700 0.0408 - - - - - - - - - -
1.6759 22800 0.0408 - - - - - - - - - -
1.6833 22900 0.0407 - - - - - - - - - -
1.6906 23000 0.0408 - - - - - - - - - -
1.6980 23100 0.0408 - - - - - - - - - -
1.7053 23200 0.0407 - - - - - - - - - -
1.7127 23300 0.0408 - - - - - - - - - -
1.7200 23400 0.0408 - - - - - - - - - -
1.7274 23500 0.0408 - - - - - - - - - -
1.7347 23600 0.0408 - - - - - - - - - -
1.7421 23700 0.0408 - - - - - - - - - -
1.7494 23800 0.0408 - - - - - - - - - -
1.7568 23900 0.0407 - - - - - - - - - -
1.7641 24000 0.0408 0.0396 0.0411 0.0412 -4.1245446 0.9748 -4.304793 0.8929 0.7383 -4.312529 0.8785
1.7715 24100 0.0407 - - - - - - - - - -
1.7788 24200 0.0408 - - - - - - - - - -
1.7862 24300 0.0408 - - - - - - - - - -
1.7935 24400 0.0407 - - - - - - - - - -
1.8009 24500 0.0407 - - - - - - - - - -
1.8082 24600 0.0408 - - - - - - - - - -
1.8156 24700 0.0407 - - - - - - - - - -
1.8229 24800 0.0408 - - - - - - - - - -
1.8303 24900 0.0407 - - - - - - - - - -
1.8376 25000 0.0408 - - - - - - - - - -
1.8450 25100 0.0408 - - - - - - - - - -
1.8523 25200 0.0407 - - - - - - - - - -
1.8597 25300 0.0407 - - - - - - - - - -
1.8670 25400 0.0407 - - - - - - - - - -
1.8744 25500 0.0407 - - - - - - - - - -
1.8817 25600 0.0407 - - - - - - - - - -
1.8891 25700 0.0408 - - - - - - - - - -
1.8964 25800 0.0407 - - - - - - - - - -
1.9038 25900 0.0407 - - - - - - - - - -
1.9111 26000 0.0407 0.0396 0.0411 0.0411 -4.115908 0.9793 -4.2973475 0.8952 0.7492 -4.3051677 0.8805
1.9185 26100 0.0407 - - - - - - - - - -
1.9258 26200 0.0407 - - - - - - - - - -
1.9332 26300 0.0408 - - - - - - - - - -
1.9405 26400 0.0407 - - - - - - - - - -
1.9479 26500 0.0407 - - - - - - - - - -
1.9552 26600 0.0407 - - - - - - - - - -
1.9626 26700 0.0407 - - - - - - - - - -
1.9699 26800 0.0407 - - - - - - - - - -
1.9773 26900 0.0407 - - - - - - - - - -
1.9846 27000 0.0407 - - - - - - - - - -
1.9920 27100 0.0407 - - - - - - - - - -
1.9993 27200 0.0407 - - - - - - - - - -
2.0067 27300 0.0407 - - - - - - - - - -
2.0140 27400 0.0407 - - - - - - - - - -
2.0214 27500 0.0407 - - - - - - - - - -
2.0287 27600 0.0407 - - - - - - - - - -
2.0361 27700 0.0407 - - - - - - - - - -
2.0434 27800 0.0406 - - - - - - - - - -
2.0508 27900 0.0407 - - - - - - - - - -
2.0581 28000 0.0407 0.0395 0.0411 0.0411 -4.1113133 0.9793 -4.291781 0.8963 0.7503 -4.3004823 0.8823
2.0655 28100 0.0407 - - - - - - - - - -
2.0728 28200 0.0407 - - - - - - - - - -
2.0802 28300 0.0407 - - - - - - - - - -
2.0875 28400 0.0407 - - - - - - - - - -
2.0949 28500 0.0406 - - - - - - - - - -
2.1022 28600 0.0406 - - - - - - - - - -
2.1096 28700 0.0407 - - - - - - - - - -
2.1169 28800 0.0406 - - - - - - - - - -
2.1243 28900 0.0406 - - - - - - - - - -
2.1316 29000 0.0407 - - - - - - - - - -
2.1390 29100 0.0406 - - - - - - - - - -
2.1463 29200 0.0407 - - - - - - - - - -
2.1537 29300 0.0405 - - - - - - - - - -
2.1610 29400 0.0406 - - - - - - - - - -
2.1684 29500 0.0407 - - - - - - - - - -
2.1757 29600 0.0407 - - - - - - - - - -
2.1831 29700 0.0406 - - - - - - - - - -
2.1904 29800 0.0407 - - - - - - - - - -
2.1978 29900 0.0406 - - - - - - - - - -
2.2051 30000 0.0406 0.0395 0.0410 0.0410 -4.103692 0.9783 -4.286089 0.8967 0.7587 -4.2947936 0.8821
2.2125 30100 0.0406 - - - - - - - - - -
2.2198 30200 0.0406 - - - - - - - - - -
2.2272 30300 0.0407 - - - - - - - - - -
2.2345 30400 0.0406 - - - - - - - - - -
2.2419 30500 0.0406 - - - - - - - - - -
2.2492 30600 0.0406 - - - - - - - - - -
2.2566 30700 0.0406 - - - - - - - - - -
2.2639 30800 0.0406 - - - - - - - - - -
2.2713 30900 0.0406 - - - - - - - - - -
2.2786 31000 0.0406 - - - - - - - - - -
2.2860 31100 0.0407 - - - - - - - - - -
2.2933 31200 0.0406 - - - - - - - - - -
2.3007 31300 0.0406 - - - - - - - - - -
2.3080 31400 0.0405 - - - - - - - - - -
2.3154 31500 0.0406 - - - - - - - - - -
2.3227 31600 0.0406 - - - - - - - - - -
2.3301 31700 0.0406 - - - - - - - - - -
2.3374 31800 0.0406 - - - - - - - - - -
2.3448 31900 0.0406 - - - - - - - - - -
2.3521 32000 0.0406 0.0394 0.0410 0.0410 -4.0995665 0.9788 -4.282365 0.8976 0.7618 -4.291147 0.8826
2.3595 32100 0.0406 - - - - - - - - - -
2.3668 32200 0.0406 - - - - - - - - - -
2.3742 32300 0.0406 - - - - - - - - - -
2.3815 32400 0.0406 - - - - - - - - - -
2.3889 32500 0.0405 - - - - - - - - - -
2.3962 32600 0.0406 - - - - - - - - - -
2.4036 32700 0.0406 - - - - - - - - - -
2.4109 32800 0.0406 - - - - - - - - - -
2.4183 32900 0.0405 - - - - - - - - - -
2.4256 33000 0.0406 - - - - - - - - - -
2.4330 33100 0.0406 - - - - - - - - - -
2.4403 33200 0.0405 - - - - - - - - - -
2.4477 33300 0.0406 - - - - - - - - - -
2.4550 33400 0.0406 - - - - - - - - - -
2.4624 33500 0.0406 - - - - - - - - - -
2.4697 33600 0.0405 - - - - - - - - - -
2.4771 33700 0.0405 - - - - - - - - - -
2.4844 33800 0.0406 - - - - - - - - - -
2.4918 33900 0.0406 - - - - - - - - - -
2.4992 34000 0.0406 0.0394 0.0410 0.0410 -4.0955496 0.9798 -4.278625 0.8985 0.7684 -4.2872252 0.8832
2.5065 34100 0.0406 - - - - - - - - - -
2.5139 34200 0.0405 - - - - - - - - - -
2.5212 34300 0.0406 - - - - - - - - - -
2.5286 34400 0.0405 - - - - - - - - - -
2.5359 34500 0.0405 - - - - - - - - - -
2.5433 34600 0.0406 - - - - - - - - - -
2.5506 34700 0.0406 - - - - - - - - - -
2.5580 34800 0.0405 - - - - - - - - - -
2.5653 34900 0.0405 - - - - - - - - - -
2.5727 35000 0.0406 - - - - - - - - - -
2.5800 35100 0.0406 - - - - - - - - - -
2.5874 35200 0.0406 - - - - - - - - - -
2.5947 35300 0.0406 - - - - - - - - - -
2.6021 35400 0.0406 - - - - - - - - - -
2.6094 35500 0.0406 - - - - - - - - - -
2.6168 35600 0.0406 - - - - - - - - - -
2.6241 35700 0.0406 - - - - - - - - - -
2.6315 35800 0.0406 - - - - - - - - - -
2.6388 35900 0.0406 - - - - - - - - - -
2.6462 36000 0.0405 0.0394 0.0410 0.0410 -4.091509 0.9808 -4.2755327 0.8986 0.7703 -4.283869 0.8847
2.6535 36100 0.0406 - - - - - - - - - -
2.6609 36200 0.0405 - - - - - - - - - -
2.6682 36300 0.0406 - - - - - - - - - -
2.6756 36400 0.0405 - - - - - - - - - -
2.6829 36500 0.0405 - - - - - - - - - -
2.6903 36600 0.0406 - - - - - - - - - -
2.6976 36700 0.0405 - - - - - - - - - -
2.7050 36800 0.0405 - - - - - - - - - -
2.7123 36900 0.0405 - - - - - - - - - -
2.7197 37000 0.0405 - - - - - - - - - -
2.7270 37100 0.0406 - - - - - - - - - -
2.7344 37200 0.0406 - - - - - - - - - -
2.7417 37300 0.0405 - - - - - - - - - -
2.7491 37400 0.0405 - - - - - - - - - -
2.7564 37500 0.0405 - - - - - - - - - -
2.7638 37600 0.0405 - - - - - - - - - -
2.7711 37700 0.0405 - - - - - - - - - -
2.7785 37800 0.0405 - - - - - - - - - -
2.7858 37900 0.0405 - - - - - - - - - -
2.7932 38000 0.0405 0.0394 0.0409 0.0410 -4.0881968 0.9819 -4.2728844 0.8990 0.7717 -4.281445 0.8851
2.8005 38100 0.0405 - - - - - - - - - -
2.8079 38200 0.0406 - - - - - - - - - -
2.8152 38300 0.0405 - - - - - - - - - -
2.8226 38400 0.0406 - - - - - - - - - -
2.8299 38500 0.0405 - - - - - - - - - -
2.8373 38600 0.0405 - - - - - - - - - -
2.8446 38700 0.0406 - - - - - - - - - -
2.8520 38800 0.0405 - - - - - - - - - -
2.8593 38900 0.0405 - - - - - - - - - -
2.8667 39000 0.0404 - - - - - - - - - -
2.8740 39100 0.0405 - - - - - - - - - -
2.8814 39200 0.0405 - - - - - - - - - -
2.8887 39300 0.0405 - - - - - - - - - -
2.8961 39400 0.0405 - - - - - - - - - -
2.9034 39500 0.0405 - - - - - - - - - -
2.9108 39600 0.0405 - - - - - - - - - -
2.9181 39700 0.0405 - - - - - - - - - -
2.9255 39800 0.0405 - - - - - - - - - -
2.9328 39900 0.0406 - - - - - - - - - -
2.9402 40000 0.0405 0.0393 0.0409 0.0409 -4.0840244 0.9824 -4.2690845 0.9001 0.7744 -4.277667 0.8853
2.9475 40100 0.0405 - - - - - - - - - -
2.9549 40200 0.0405 - - - - - - - - - -
2.9622 40300 0.0405 - - - - - - - - - -
2.9696 40400 0.0405 - - - - - - - - - -
2.9769 40500 0.0405 - - - - - - - - - -
2.9843 40600 0.0405 - - - - - - - - - -
2.9916 40700 0.0405 - - - - - - - - - -
2.9990 40800 0.0405 - - - - - - - - - -
3.0063 40900 0.0405 - - - - - - - - - -
3.0137 41000 0.0405 - - - - - - - - - -
3.0210 41100 0.0405 - - - - - - - - - -
3.0284 41200 0.0405 - - - - - - - - - -
3.0357 41300 0.0405 - - - - - - - - - -
3.0431 41400 0.0404 - - - - - - - - - -
3.0504 41500 0.0405 - - - - - - - - - -
3.0578 41600 0.0405 - - - - - - - - - -
3.0651 41700 0.0405 - - - - - - - - - -
3.0725 41800 0.0405 - - - - - - - - - -
3.0798 41900 0.0406 - - - - - - - - - -
3.0872 42000 0.0405 0.0393 0.0409 0.0409 -4.0831747 0.9819 -4.267492 0.9003 0.7798 -4.2760205 0.8860
3.0945 42100 0.0404 - - - - - - - - - -
3.1019 42200 0.0405 - - - - - - - - - -
3.1092 42300 0.0405 - - - - - - - - - -
3.1166 42400 0.0405 - - - - - - - - - -
3.1239 42500 0.0405 - - - - - - - - - -
3.1313 42600 0.0405 - - - - - - - - - -
3.1386 42700 0.0404 - - - - - - - - - -
3.1460 42800 0.0405 - - - - - - - - - -
3.1533 42900 0.0404 - - - - - - - - - -
3.1607 43000 0.0405 - - - - - - - - - -
3.1680 43100 0.0405 - - - - - - - - - -
3.1754 43200 0.0405 - - - - - - - - - -
3.1827 43300 0.0405 - - - - - - - - - -
3.1901 43400 0.0405 - - - - - - - - - -
3.1974 43500 0.0404 - - - - - - - - - -
3.2048 43600 0.0405 - - - - - - - - - -
3.2121 43700 0.0405 - - - - - - - - - -
3.2195 43800 0.0405 - - - - - - - - - -
3.2268 43900 0.0405 - - - - - - - - - -
3.2342 44000 0.0405 0.0393 0.0409 0.0409 -4.0802045 0.9824 -4.2650237 0.9005 0.7727 -4.2733293 0.8865
3.2415 44100 0.0404 - - - - - - - - - -
3.2489 44200 0.0405 - - - - - - - - - -
3.2562 44300 0.0405 - - - - - - - - - -
3.2636 44400 0.0404 - - - - - - - - - -
3.2709 44500 0.0405 - - - - - - - - - -
3.2783 44600 0.0404 - - - - - - - - - -
3.2856 44700 0.0405 - - - - - - - - - -
3.2930 44800 0.0405 - - - - - - - - - -
3.3003 44900 0.0405 - - - - - - - - - -
3.3077 45000 0.0404 - - - - - - - - - -
3.3150 45100 0.0404 - - - - - - - - - -
3.3224 45200 0.0404 - - - - - - - - - -
3.3297 45300 0.0404 - - - - - - - - - -
3.3371 45400 0.0405 - - - - - - - - - -
3.3445 45500 0.0404 - - - - - - - - - -
3.3518 45600 0.0405 - - - - - - - - - -
3.3592 45700 0.0404 - - - - - - - - - -
3.3665 45800 0.0404 - - - - - - - - - -
3.3739 45900 0.0405 - - - - - - - - - -
3.3812 46000 0.0405 0.0393 0.0409 0.0409 -4.078617 0.9829 -4.2630715 0.9009 0.7804 -4.271867 0.8867
3.3886 46100 0.0403 - - - - - - - - - -
3.3959 46200 0.0404 - - - - - - - - - -
3.4033 46300 0.0405 - - - - - - - - - -
3.4106 46400 0.0404 - - - - - - - - - -
3.4180 46500 0.0404 - - - - - - - - - -
3.4253 46600 0.0404 - - - - - - - - - -
3.4327 46700 0.0405 - - - - - - - - - -
3.4400 46800 0.0404 - - - - - - - - - -
3.4474 46900 0.0404 - - - - - - - - - -
3.4547 47000 0.0404 - - - - - - - - - -
3.4621 47100 0.0405 - - - - - - - - - -
3.4694 47200 0.0404 - - - - - - - - - -
3.4768 47300 0.0404 - - - - - - - - - -
3.4841 47400 0.0405 - - - - - - - - - -
3.4915 47500 0.0404 - - - - - - - - - -
3.4988 47600 0.0405 - - - - - - - - - -
3.5062 47700 0.0405 - - - - - - - - - -
3.5135 47800 0.0404 - - - - - - - - - -
3.5209 47900 0.0404 - - - - - - - - - -
3.5282 48000 0.0404 0.0393 0.0409 0.0409 -4.075687 0.9824 -4.260863 0.9012 0.7801 -4.2693996 0.8870
3.5356 48100 0.0404 - - - - - - - - - -
3.5429 48200 0.0405 - - - - - - - - - -
3.5503 48300 0.0405 - - - - - - - - - -
3.5576 48400 0.0404 - - - - - - - - - -
3.5650 48500 0.0404 - - - - - - - - - -
3.5723 48600 0.0404 - - - - - - - - - -
3.5797 48700 0.0404 - - - - - - - - - -
3.5870 48800 0.0405 - - - - - - - - - -
3.5944 48900 0.0405 - - - - - - - - - -
3.6017 49000 0.0405 - - - - - - - - - -
3.6091 49100 0.0405 - - - - - - - - - -
3.6164 49200 0.0404 - - - - - - - - - -
3.6238 49300 0.0405 - - - - - - - - - -
3.6311 49400 0.0405 - - - - - - - - - -
3.6385 49500 0.0404 - - - - - - - - - -
3.6458 49600 0.0404 - - - - - - - - - -
3.6532 49700 0.0404 - - - - - - - - - -
3.6605 49800 0.0404 - - - - - - - - - -
3.6679 49900 0.0404 - - - - - - - - - -
3.6752 50000 0.0404 0.0393 0.0409 0.0409 -4.074522 0.9824 -4.2595944 0.9015 0.7800 -4.2678404 0.8869
3.6826 50100 0.0404 - - - - - - - - - -
3.6899 50200 0.0404 - - - - - - - - - -
3.6973 50300 0.0404 - - - - - - - - - -
3.7046 50400 0.0404 - - - - - - - - - -
3.7120 50500 0.0404 - - - - - - - - - -
3.7193 50600 0.0404 - - - - - - - - - -
3.7267 50700 0.0404 - - - - - - - - - -
3.7340 50800 0.0405 - - - - - - - - - -
3.7414 50900 0.0404 - - - - - - - - - -
3.7487 51000 0.0404 - - - - - - - - - -
3.7561 51100 0.0404 - - - - - - - - - -
3.7634 51200 0.0404 - - - - - - - - - -
3.7708 51300 0.0404 - - - - - - - - - -
3.7781 51400 0.0404 - - - - - - - - - -
3.7855 51500 0.0404 - - - - - - - - - -
3.7928 51600 0.0404 - - - - - - - - - -
3.8002 51700 0.0404 - - - - - - - - - -
3.8075 51800 0.0405 - - - - - - - - - -
3.8149 51900 0.0404 - - - - - - - - - -
3.8222 52000 0.0405 0.0393 0.0408 0.0409 -4.0725245 0.9829 -4.258272 0.9010 0.7804 -4.2663693 0.8873
3.8296 52100 0.0404 - - - - - - - - - -
3.8369 52200 0.0404 - - - - - - - - - -
3.8443 52300 0.0404 - - - - - - - - - -
3.8516 52400 0.0404 - - - - - - - - - -
3.8590 52500 0.0404 - - - - - - - - - -
3.8663 52600 0.0403 - - - - - - - - - -
3.8737 52700 0.0404 - - - - - - - - - -
3.8810 52800 0.0404 - - - - - - - - - -
3.8884 52900 0.0404 - - - - - - - - - -
3.8957 53000 0.0404 - - - - - - - - - -
3.9031 53100 0.0404 - - - - - - - - - -
3.9104 53200 0.0404 - - - - - - - - - -
3.9178 53300 0.0404 - - - - - - - - - -
3.9251 53400 0.0404 - - - - - - - - - -
3.9325 53500 0.0405 - - - - - - - - - -
3.9398 53600 0.0404 - - - - - - - - - -
3.9472 53700 0.0404 - - - - - - - - - -
3.9545 53800 0.0404 - - - - - - - - - -
3.9619 53900 0.0404 - - - - - - - - - -
3.9692 54000 0.0404 0.0393 0.0408 0.0408 -4.0710397 0.9834 -4.2558665 0.9017 0.7822 -4.264339 0.8871
3.9766 54100 0.0405 - - - - - - - - - -
3.9839 54200 0.0404 - - - - - - - - - -
3.9913 54300 0.0404 - - - - - - - - - -
3.9986 54400 0.0404 - - - - - - - - - -
4.0060 54500 0.0404 - - - - - - - - - -
4.0133 54600 0.0404 - - - - - - - - - -
4.0207 54700 0.0404 - - - - - - - - - -
4.0280 54800 0.0404 - - - - - - - - - -
4.0354 54900 0.0404 - - - - - - - - - -
4.0427 55000 0.0403 - - - - - - - - - -
4.0501 55100 0.0404 - - - - - - - - - -
4.0574 55200 0.0404 - - - - - - - - - -
4.0648 55300 0.0404 - - - - - - - - - -
4.0721 55400 0.0404 - - - - - - - - - -
4.0795 55500 0.0405 - - - - - - - - - -
4.0868 55600 0.0404 - - - - - - - - - -
4.0942 55700 0.0404 - - - - - - - - - -
4.1015 55800 0.0404 - - - - - - - - - -
4.1089 55900 0.0404 - - - - - - - - - -
4.1162 56000 0.0404 0.0393 0.0408 0.0408 -4.0708294 0.9839 -4.255319 0.9020 0.7847 -4.2634797 0.8880
4.1236 56100 0.0404 - - - - - - - - - -
4.1309 56200 0.0404 - - - - - - - - - -
4.1383 56300 0.0403 - - - - - - - - - -
4.1456 56400 0.0404 - - - - - - - - - -
4.1530 56500 0.0403 - - - - - - - - - -
4.1603 56600 0.0404 - - - - - - - - - -
4.1677 56700 0.0405 - - - - - - - - - -
4.1751 56800 0.0404 - - - - - - - - - -
4.1824 56900 0.0404 - - - - - - - - - -
4.1898 57000 0.0404 - - - - - - - - - -
4.1971 57100 0.0403 - - - - - - - - - -
4.2045 57200 0.0404 - - - - - - - - - -
4.2118 57300 0.0404 - - - - - - - - - -
4.2192 57400 0.0404 - - - - - - - - - -
4.2265 57500 0.0404 - - - - - - - - - -
4.2339 57600 0.0404 - - - - - - - - - -
4.2412 57700 0.0404 - - - - - - - - - -
4.2486 57800 0.0404 - - - - - - - - - -
4.2559 57900 0.0404 - - - - - - - - - -
4.2633 58000 0.0403 0.0392 0.0408 0.0408 -4.0691004 0.9829 -4.253834 0.9023 0.7827 -4.2621255 0.8880
4.2706 58100 0.0404 - - - - - - - - - -
4.2780 58200 0.0403 - - - - - - - - - -
4.2853 58300 0.0404 - - - - - - - - - -
4.2927 58400 0.0404 - - - - - - - - - -
4.3000 58500 0.0404 - - - - - - - - - -
4.3074 58600 0.0403 - - - - - - - - - -
4.3147 58700 0.0404 - - - - - - - - - -
4.3221 58800 0.0404 - - - - - - - - - -
4.3294 58900 0.0404 - - - - - - - - - -
4.3368 59000 0.0404 - - - - - - - - - -
4.3441 59100 0.0403 - - - - - - - - - -
4.3515 59200 0.0404 - - - - - - - - - -
4.3588 59300 0.0403 - - - - - - - - - -
4.3662 59400 0.0403 - - - - - - - - - -
4.3735 59500 0.0404 - - - - - - - - - -
4.3809 59600 0.0404 - - - - - - - - - -
4.3882 59700 0.0403 - - - - - - - - - -
4.3956 59800 0.0403 - - - - - - - - - -
4.4029 59900 0.0404 - - - - - - - - - -
4.4103 60000 0.0403 0.0392 0.0408 0.0408 -4.067449 0.9824 -4.252402 0.9026 0.7815 -4.260964 0.8878
4.4176 60100 0.0403 - - - - - - - - - -
4.4250 60200 0.0404 - - - - - - - - - -
4.4323 60300 0.0404 - - - - - - - - - -
4.4397 60400 0.0403 - - - - - - - - - -
4.4470 60500 0.0404 - - - - - - - - - -
4.4544 60600 0.0404 - - - - - - - - - -
4.4617 60700 0.0404 - - - - - - - - - -
4.4691 60800 0.0403 - - - - - - - - - -
4.4764 60900 0.0403 - - - - - - - - - -
4.4838 61000 0.0404 - - - - - - - - - -
4.4911 61100 0.0404 - - - - - - - - - -
4.4985 61200 0.0404 - - - - - - - - - -
4.5058 61300 0.0404 - - - - - - - - - -
4.5132 61400 0.0403 - - - - - - - - - -
4.5205 61500 0.0404 - - - - - - - - - -
4.5279 61600 0.0403 - - - - - - - - - -
4.5352 61700 0.0404 - - - - - - - - - -
4.5426 61800 0.0404 - - - - - - - - - -
4.5499 61900 0.0404 - - - - - - - - - -
4.5573 62000 0.0403 0.0392 0.0408 0.0408 -4.06687 0.9839 -4.2519274 0.9028 0.7839 -4.260144 0.8884
4.5646 62100 0.0403 - - - - - - - - - -
4.5720 62200 0.0404 - - - - - - - - - -
4.5793 62300 0.0403 - - - - - - - - - -
4.5867 62400 0.0404 - - - - - - - - - -
4.5940 62500 0.0404 - - - - - - - - - -
4.6014 62600 0.0404 - - - - - - - - - -
4.6087 62700 0.0404 - - - - - - - - - -
4.6161 62800 0.0404 - - - - - - - - - -
4.6234 62900 0.0404 - - - - - - - - - -
4.6308 63000 0.0404 - - - - - - - - - -
4.6381 63100 0.0404 - - - - - - - - - -
4.6455 63200 0.0404 - - - - - - - - - -
4.6528 63300 0.0404 - - - - - - - - - -
4.6602 63400 0.0404 - - - - - - - - - -
4.6675 63500 0.0404 - - - - - - - - - -
4.6749 63600 0.0404 - - - - - - - - - -
4.6822 63700 0.0403 - - - - - - - - - -
4.6896 63800 0.0404 - - - - - - - - - -
4.6969 63900 0.0403 - - - - - - - - - -
4.7043 64000 0.0403 0.0392 0.0408 0.0408 -4.0657706 0.9829 -4.251195 0.9026 0.7835 -4.2593575 0.8881
4.7116 64100 0.0403 - - - - - - - - - -
4.7190 64200 0.0404 - - - - - - - - - -
4.7263 64300 0.0404 - - - - - - - - - -
4.7337 64400 0.0404 - - - - - - - - - -
4.7410 64500 0.0404 - - - - - - - - - -
4.7484 64600 0.0403 - - - - - - - - - -
4.7557 64700 0.0403 - - - - - - - - - -
4.7631 64800 0.0404 - - - - - - - - - -
4.7704 64900 0.0403 - - - - - - - - - -
4.7778 65000 0.0403 - - - - - - - - - -
4.7851 65100 0.0404 - - - - - - - - - -
4.7925 65200 0.0403 - - - - - - - - - -
4.7998 65300 0.0403 - - - - - - - - - -
4.8072 65400 0.0404 - - - - - - - - - -
4.8145 65500 0.0404 - - - - - - - - - -
4.8219 65600 0.0404 - - - - - - - - - -
4.8292 65700 0.0403 - - - - - - - - - -
4.8366 65800 0.0404 - - - - - - - - - -
4.8439 65900 0.0404 - - - - - - - - - -
4.8513 66000 0.0404 0.0392 0.0408 0.0408 -4.0652847 0.9834 -4.2506175 0.9028 0.7839 -4.2587624 0.8887
4.8586 66100 0.0403 - - - - - - - - - -
4.8660 66200 0.0403 - - - - - - - - - -
4.8733 66300 0.0403 - - - - - - - - - -
4.8807 66400 0.0403 - - - - - - - - - -
4.8880 66500 0.0404 - - - - - - - - - -
4.8954 66600 0.0404 - - - - - - - - - -
4.9027 66700 0.0404 - - - - - - - - - -
4.9101 66800 0.0404 - - - - - - - - - -
4.9174 66900 0.0403 - - - - - - - - - -
4.9248 67000 0.0403 - - - - - - - - - -
4.9321 67100 0.0404 - - - - - - - - - -
4.9395 67200 0.0404 - - - - - - - - - -
4.9468 67300 0.0404 - - - - - - - - - -
4.9542 67400 0.0403 - - - - - - - - - -
4.9615 67500 0.0404 - - - - - - - - - -
4.9689 67600 0.0403 - - - - - - - - - -
4.9762 67700 0.0404 - - - - - - - - - -
4.9836 67800 0.0403 - - - - - - - - - -
4.9909 67900 0.0404 - - - - - - - - - -
4.9983 68000 0.0404 0.0392 0.0408 0.0408 -4.0641384 0.9834 -4.2498612 0.9028 0.7849 -4.258079 0.8888
5.0057 68100 0.0404 - - - - - - - - - -
5.0130 68200 0.0404 - - - - - - - - - -
5.0204 68300 0.0404 - - - - - - - - - -
5.0277 68400 0.0403 - - - - - - - - - -
5.0351 68500 0.0404 - - - - - - - - - -
5.0424 68600 0.0403 - - - - - - - - - -
5.0498 68700 0.0404 - - - - - - - - - -
5.0571 68800 0.0404 - - - - - - - - - -
5.0645 68900 0.0403 - - - - - - - - - -
5.0718 69000 0.0404 - - - - - - - - - -
5.0792 69100 0.0404 - - - - - - - - - -
5.0865 69200 0.0404 - - - - - - - - - -
5.0939 69300 0.0403 - - - - - - - - - -
5.1012 69400 0.0403 - - - - - - - - - -
5.1086 69500 0.0404 - - - - - - - - - -
5.1159 69600 0.0403 - - - - - - - - - -
5.1233 69700 0.0403 - - - - - - - - - -
5.1306 69800 0.0404 - - - - - - - - - -
5.1380 69900 0.0403 - - - - - - - - - -
5.1453 70000 0.0404 0.0392 0.0408 0.0408 -4.0635867 0.9829 -4.2490883 0.9027 0.7853 -4.2572694 0.8888
5.1527 70100 0.0403 - - - - - - - - - -
5.1600 70200 0.0403 - - - - - - - - - -
5.1674 70300 0.0404 - - - - - - - - - -
5.1747 70400 0.0404 - - - - - - - - - -
5.1821 70500 0.0404 - - - - - - - - - -
5.1894 70600 0.0403 - - - - - - - - - -
5.1968 70700 0.0403 - - - - - - - - - -
5.2041 70800 0.0403 - - - - - - - - - -
5.2115 70900 0.0403 - - - - - - - - - -
5.2188 71000 0.0404 - - - - - - - - - -
5.2262 71100 0.0403 - - - - - - - - - -
5.2335 71200 0.0403 - - - - - - - - - -
5.2409 71300 0.0403 - - - - - - - - - -
5.2482 71400 0.0404 - - - - - - - - - -
5.2556 71500 0.0403 - - - - - - - - - -
5.2629 71600 0.0403 - - - - - - - - - -
5.2703 71700 0.0404 - - - - - - - - - -
5.2776 71800 0.0403 - - - - - - - - - -
5.2850 71900 0.0404 - - - - - - - - - -
5.2923 72000 0.0403 0.0392 0.0408 0.0408 -4.0629997 0.9829 -4.248385 0.9029 0.7867 -4.256695 0.8886
5.2997 72100 0.0404 - - - - - - - - - -
5.3070 72200 0.0403 - - - - - - - - - -
5.3144 72300 0.0403 - - - - - - - - - -
5.3217 72400 0.0403 - - - - - - - - - -
5.3291 72500 0.0403 - - - - - - - - - -
5.3364 72600 0.0404 - - - - - - - - - -
5.3438 72700 0.0403 - - - - - - - - - -
5.3511 72800 0.0404 - - - - - - - - - -
5.3585 72900 0.0403 - - - - - - - - - -
5.3658 73000 0.0403 - - - - - - - - - -
5.3732 73100 0.0404 - - - - - - - - - -
5.3805 73200 0.0403 - - - - - - - - - -
5.3879 73300 0.0402 - - - - - - - - - -
5.3952 73400 0.0403 - - - - - - - - - -
5.4026 73500 0.0404 - - - - - - - - - -
5.4099 73600 0.0403 - - - - - - - - - -
5.4173 73700 0.0403 - - - - - - - - - -
5.4246 73800 0.0403 - - - - - - - - - -
5.4320 73900 0.0404 - - - - - - - - - -
5.4393 74000 0.0403 0.0392 0.0408 0.0408 -4.0628495 0.9834 -4.2482247 0.9030 0.7844 -4.256557 0.8888
5.4467 74100 0.0403 - - - - - - - - - -
5.4540 74200 0.0403 - - - - - - - - - -
5.4614 74300 0.0404 - - - - - - - - - -
5.4687 74400 0.0403 - - - - - - - - - -
5.4761 74500 0.0403 - - - - - - - - - -
5.4834 74600 0.0404 - - - - - - - - - -
5.4908 74700 0.0403 - - - - - - - - - -
5.4981 74800 0.0404 - - - - - - - - - -
5.5055 74900 0.0404 - - - - - - - - - -
5.5128 75000 0.0403 - - - - - - - - - -
5.5202 75100 0.0403 - - - - - - - - - -
5.5275 75200 0.0403 - - - - - - - - - -
5.5349 75300 0.0403 - - - - - - - - - -
5.5422 75400 0.0404 - - - - - - - - - -
5.5496 75500 0.0403 - - - - - - - - - -
5.5569 75600 0.0403 - - - - - - - - - -
5.5643 75700 0.0403 - - - - - - - - - -
5.5716 75800 0.0403 - - - - - - - - - -
5.5790 75900 0.0403 - - - - - - - - - -
5.5863 76000 0.0404 0.0392 0.0408 0.0408 -4.062086 0.9829 -4.247803 0.9032 0.7871 -4.2560315 0.8889
5.5937 76100 0.0404 - - - - - - - - - -
5.6010 76200 0.0403 - - - - - - - - - -
5.6084 76300 0.0404 - - - - - - - - - -
5.6157 76400 0.0403 - - - - - - - - - -
5.6231 76500 0.0404 - - - - - - - - - -
5.6304 76600 0.0404 - - - - - - - - - -
5.6378 76700 0.0403 - - - - - - - - - -
5.6451 76800 0.0403 - - - - - - - - - -
5.6525 76900 0.0403 - - - - - - - - - -
5.6598 77000 0.0404 - - - - - - - - - -
5.6672 77100 0.0403 - - - - - - - - - -
5.6745 77200 0.0403 - - - - - - - - - -
5.6819 77300 0.0403 - - - - - - - - - -
5.6892 77400 0.0404 - - - - - - - - - -
5.6966 77500 0.0403 - - - - - - - - - -
5.7039 77600 0.0403 - - - - - - - - - -
5.7113 77700 0.0403 - - - - - - - - - -
5.7186 77800 0.0403 - - - - - - - - - -
5.7260 77900 0.0403 - - - - - - - - - -
5.7333 78000 0.0404 0.0392 0.0408 0.0408 -4.062058 0.9834 -4.247644 0.9029 0.7858 -4.2557683 0.8888
5.7407 78100 0.0404 - - - - - - - - - -
5.7480 78200 0.0403 - - - - - - - - - -
5.7554 78300 0.0403 - - - - - - - - - -
5.7627 78400 0.0404 - - - - - - - - - -
5.7701 78500 0.0403 - - - - - - - - - -
5.7774 78600 0.0403 - - - - - - - - - -
5.7848 78700 0.0404 - - - - - - - - - -
5.7921 78800 0.0403 - - - - - - - - - -
5.7995 78900 0.0403 - - - - - - - - - -
5.8068 79000 0.0404 - - - - - - - - - -
5.8142 79100 0.0403 - - - - - - - - - -
5.8215 79200 0.0404 - - - - - - - - - -
5.8289 79300 0.0403 - - - - - - - - - -
5.8363 79400 0.0404 - - - - - - - - - -
5.8436 79500 0.0403 - - - - - - - - - -
5.8510 79600 0.0404 - - - - - - - - - -
5.8583 79700 0.0403 - - - - - - - - - -
5.8657 79800 0.0403 - - - - - - - - - -
5.8730 79900 0.0403 - - - - - - - - - -
5.8804 80000 0.0403 0.0392 0.0408 0.0408 -4.0617065 0.9834 -4.247319 0.9030 0.7862 -4.255536 0.8888
5.8877 80100 0.0404 - - - - - - - - - -
5.8951 80200 0.0404 - - - - - - - - - -
5.9024 80300 0.0403 - - - - - - - - - -
5.9098 80400 0.0404 - - - - - - - - - -
5.9171 80500 0.0403 - - - - - - - - - -
5.9245 80600 0.0403 - - - - - - - - - -
5.9318 80700 0.0404 - - - - - - - - - -
5.9392 80800 0.0403 - - - - - - - - - -
5.9465 80900 0.0404 - - - - - - - - - -
5.9539 81000 0.0403 - - - - - - - - - -
5.9612 81100 0.0403 - - - - - - - - - -
5.9686 81200 0.0403 - - - - - - - - - -
5.9759 81300 0.0404 - - - - - - - - - -
5.9833 81400 0.0403 - - - - - - - - - -
5.9906 81500 0.0403 - - - - - - - - - -
5.9980 81600 0.0403 - - - - - - - - - -

Framework Versions

  • Python: 3.12.7
  • Sentence Transformers: 3.3.0
  • Transformers: 4.46.2
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.1.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3
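
To reproduce this environment, the listed versions can be pinned at install time (a convenience sketch, not an official requirements file):

pip install sentence-transformers==3.3.0 transformers==4.46.2 torch==2.5.1 accelerate==1.1.1 datasets==3.1.0 tokenizers==0.20.3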

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MSELoss

@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}