---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:11347
- loss:MultipleNegativesRankingLoss
base_model: vinai/phobert-base
widget:
- source_sentence: "Beefsteak 123 la mot dia chi ban banh mi chao, beefsteak cuc ngon\
    \ tai Can Tho ma ban nen mot gan ghe den. Khong gian quan rong rai, sach se, phuc\
    \ vu nhanh nhen, gia ca hop ly. Banh mi chao duong Nguyen Van Troi noi tieng ban\
    \ banh mi thom ngon, chat luong. Banh mi tai day chia ra lam 2 phan: co thit bo\
    \ ma khong thit bo.\n\nQuan Beefsteak 123 la mot dia diem ly tuong cho nhung nguoi\
    \ yeu thich thit bo va cac mon an ngon khac. Quan noi tieng voi su ket hop tuyet\
    \ voi giua thit bo, pate va trung op la. Neu ban muon thu nhung mon khac, quan\
    \ cung co san xuc xich, ca moi, cha lua va xiu mai. Menu cua quan duoc chia thanh\
    \ tung phan da duoc ket hop san de ban de dang lua chon. Vi du nhu bo op la pate\
    \ xuc xich hoac bo op la pate cha lua. Ban cung co the tao ra cac to hop rieng\
    \ cua rieng minh nhu op la ca moi xiu mai.Mot dieu dac biet khi den quan la khi\
    \ ban goi mot phan, ban se duoc tang mien phi mot dia xa lach tron. Day la cach\
    \ hoan hao de ket hop khau vi cua ban voi cac loai rau song tuoi ngon.Voi khong\
    \ gian thoai mai va phuc vu nhanh chong, quan Beefsteak 123 mang den cho ban trai\
    \ nghiem am thuc doc dao va ngon mieng. Hay ghe tham quan de thuong thuc nhung\
    \ mon an tuyet voi nay!\n\nTHONG TIN LIEN HE:\nDia chi: 9B Nguyen Van Troi, Phuong\
    \ Xuan Khanh, Can Tho\nDien thoai: 0907 713 458\nGio mo cua: 06:00 - 14:00\nGia\
    \ tham khao: 20.000d - 40.000d\nFanpage: https://www.facebook.com/Beefsteak-123-143170999350605/\n\
    \n Goi dien"
  sentences:
  - Beefsteak 123 - Nguyen Van Troi
  - Pho Ngon 37
  - Khong tra no hay chi tien ngay Tet
- source_sentence: 'KCC - Pho & Com Ga Xoi Mam la quan an duoc nhieu nguoi yeu thich
    tai so 6 Ton That Thuyet, Nam Tu Liem, Ha Noi. Noi day voi khong gian am cung,
    rat thich hop cho nhung bua an ben ban be, dong nghiep. Day la quan duoc nhieu
    thuc khach danh gia cao ca dich vu lan chat luong do an. Den voi KCC - Pho & Com
    Ga Xoi Mam ngoai pho la mon duoc yeu thich nhat ra, quan con co vo so cac mon
    an hap dan nhu: com rang dui ga xoi mam, com rang dua bo, com rang cai bo, pho
    xao bo, com nong dui ga xoi mam, mi xao bo, com nong cai bo, com nong dua bo.
    Doc va la tu nhung hat com gion rum, cung voi do la huong vi cua nuoc sot dac
    trung va bi truyen ngam sau vao tan ben trong.


    Cac mon nay tuy binh di trong cach che bien nhung mang lai huong vi am thuc manh
    me, du de lam to mo bat cu thuc khach nao khi thuong thuc. KCC - Pho & Com Ga
    Xoi Mam cam ket mang den cho nguoi tieu dung nhung san pham ngon an toan, co loi
    cho suc khoe voi gia rat hop ly. Ban dang o Ton That Thuyet, Ha Noi va dang ban
    khoan khong biet dia chi an pho nao ngon thi hay ghe ngay quan an KCC nhe!


    THONG TIN LIEN HE:  Dia chi:  6 Ton That Thuyet, Nam Tu Liem, Ha Noi Gio mo cua:  06:00
    - 14:00 | 17:30 - 22:00

    Dat mua ngay'
  sentences:
  - Nem Nuong Hai Anh
  - Ca basa kho thom
  - KCC - Pho & Com Ga Xoi Mam
- source_sentence: Banh canh ca loc duoc lam tu bot gao va ca loc. Bot gao sau khi
    duoc can mong thanh soi vua an thi duoc tha vao noi nuoc luoc Ca loc go lay phan
    thit, uop chut gia vi cho dam vi. Phan xuong ca khong bi bo di ma duoc giu lai
    gia nhuyen, loc lay phan nuoc ca roi do vao phan nuoc dung. Mon banh canh ca loc
    ngon nhat la khi an con nong, vua chan vua hup vua xuyt xoa cai vi cay nong. Neu
    an trong ngay dong thi qua tuyet voi roi phai khong nao. Mot to banh canh ca loc
    chi co gia khoang 30.000 dong thoi cac ban nhe.
  sentences:
  - Banh canh ca loc
  - Bun oc, bun oc chan
  - Nha hang Trung Duong Marina
- source_sentence: 'Nguyen lieu:Bap chuoi 1 cai Chanh 1 trai Bot chien gion 75 gr
    Dau an 100 ml Nuoc mam 3 muong canh Bot ngot 1 muong ca phe Tuong ot 1 muong canh
    Duong 1 muong canh Ot bot 1 muong ca pheCach che bien:So che bap chuoi: Dung tay
    tach bap chuoi thanh nhung cong nho, sau do ngam bap chuoi vao trong thau nuoc
    chanh pha loang de giup bap chuoi khong bi tham den. Tiep tuc go bo nhuy trong
    bap chuoi roi rua sach lai voi nuoc.Nhung bot va chien bap chuoi: Bap chuoi sau
    khi tach roi va rua sach ban cho bap chuoi ra to, do vao 75gr bot chien gion,
    dao deu cho bot tham vao bap chuoi. Bac chao len bep cung voi 100ml dau an dun
    soi (luong dau ngap bap chuoi), sau do cho bap chuoi da ao bot vao chien tren
    lua vua khoang 5 - 10 phut cho bap chuoi chin vang deu thi vot ra de rao dau.Lam
    bap chuoi chien nuoc mam: Bac mot cai chao khac cho vao 10ml dau an (tan dung
    luong dau con du khi chien bap chuoi), roi cho vao 3 muong canh nuoc mam, 1 muong
    ca phe bot ngot, 1 muong canh tuong ot, 1 muong canh duong, 1 muong ca phe ot
    bot khuay tan hon hop cho sanh vang lai khoang 3 phut tren lua vua. Cuoi cung
    ban cho bap chuoi da chien vang vao dao deu them 3 phut roi tat bep.Thanh pham:
    Bap chuoi gion rum hoa quyen voi vi man man ngot ngot cua nuoc mam, an kem com
    trang se cuc ki ngon mieng day. Mon an vo cung de lam nay se khien gia dinh ban
    tam tac khen ngon.'
  sentences:
  - Nha Hang Ca Hoi Song Nhi
  - Com nhoi thit hap ot chuong
  - Hoa chuoi chien nuoc mam
- source_sentence: "Noi tieng ve do lau doi va huong vi mon an nay o Ha Noi thi phai\
    \ ke den hang Banh Duc Nong Thanh Tung. Banh o day hap dan o do deo dai cua bot,\
    \ thit nam du day va nem nem vua mieng. Khi phuc vu, mon an nong sot toa ra mui\
    \ huong thom lung tu bot, hanh phi, nuoc mam. Mon banh duc o day duoc chan ngap\
    \ nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu\
    \ hanh kho da phi vang.Mon banh duc o Banh Duc Nong Thanh Tung duoc chan ngap\
    \ nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu\
    \ hanh kho da phi vang. Cach an nay hoi giong voi mon banh gio chan nuoc mam thit\
    \ bam o quan pho chua Lang Son gan cho Ban Co. La mon qua an nhe nhang, vua du\
    \ lung lung bung, co ve dan da nen rat nhieu nguoi them them, nho nho. Banh duc\
    \ nong Ha Noi o day khong bi pha them bot dau xanh nen van giu nguyen duoc huong\
    \ vi dac trung. Dac biet, phan nhan con duoc tron them mot it cu dau xao tren\
    \ ngon lua lon nen giu duoc do ngot gion.THONG TIN LIEN HE:Dia chi: 112 Truong\
    \ Dinh, Quan Hai Ba Trung, Ha NoiGio mo cua: 10:00 - 21:00Dia diem chat luong:\
    \ 4.7/5 (14 danh gia tren Google)\n Chi duong Danh gia Google"
  sentences:
  - Banh Duc
  - Let's Eat Buffet
  - Banh bi do
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

# SentenceTransformer based on vinai/phobert-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [vinai/phobert-base](https://huggingface.co/vinai/phobert-base). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [vinai/phobert-base](https://huggingface.co/vinai/phobert-base) <!-- at revision c1e37c5c86f918761049cef6fa216b4779d0d01d -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: RobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
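For reference, the same two-module stack can be rebuilt by hand with the `sentence_transformers.models` API. This is only a construction sketch: it reproduces the architecture printed above starting from the base checkpoint and does not load the fine-tuned weights (load `trongvox/Phobert-Sentence` directly for those).

```python
from sentence_transformers import SentenceTransformer, models

# Rebuild the Transformer + mean-pooling stack shown above from the base model.
# Note: this yields an *untrained* sibling of this model; it only mirrors the architecture.
word_embedding_model = models.Transformer("vinai/phobert-base", max_seq_length=128)
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),  # 768
    pooling_mode_mean_tokens=True,
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
```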

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("trongvox/Phobert-Sentence")
# Run inference
sentences = [
    'Noi tieng ve do lau doi va huong vi mon an nay o Ha Noi thi phai ke den hang Banh Duc Nong Thanh Tung. Banh o day hap dan o do deo dai cua bot, thit nam du day va nem nem vua mieng. Khi phuc vu, mon an nong sot toa ra mui huong thom lung tu bot, hanh phi, nuoc mam. Mon banh duc o day duoc chan ngap nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu hanh kho da phi vang.Mon banh duc o Banh Duc Nong Thanh Tung duoc chan ngap nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu hanh kho da phi vang. Cach an nay hoi giong voi mon banh gio chan nuoc mam thit bam o quan pho chua Lang Son gan cho Ban Co. La mon qua an nhe nhang, vua du lung lung bung, co ve dan da nen rat nhieu nguoi them them, nho nho. Banh duc nong Ha Noi o day khong bi pha them bot dau xanh nen van giu nguyen duoc huong vi dac trung. Dac biet, phan nhan con duoc tron them mot it cu dau xao tren ngon lua lon nen giu duoc do ngot gion.THONG TIN LIEN HE:Dia chi: 112 Truong Dinh, Quan Hai Ba Trung, Ha NoiGio mo cua: 10:00 - 21:00Dia diem chat luong: 4.7/5 (14 danh gia tren Google)\n Chi duong Danh gia Google',
    'Banh Duc',
    'Banh bi do',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
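A typical downstream use is ranking candidate place names against a free-text description. The snippet below is a minimal retrieval sketch reusing texts from the widget examples above; the description is shortened and purely illustrative.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("trongvox/Phobert-Sentence")

# Rank candidate titles against a description (texts adapted from the widget examples).
description = "Banh canh ca loc duoc lam tu bot gao va ca loc. Mot to chi co gia khoang 30.000 dong."
candidates = ["Banh canh ca loc", "Pho Ngon 37", "Nem Nuong Hai Anh"]

desc_emb = model.encode([description])          # shape (1, 768)
cand_embs = model.encode(candidates)            # shape (3, 768)

scores = model.similarity(desc_emb, cand_embs)  # torch tensor, shape [1, 3]
best = scores[0].argmax().item()
print(candidates[best], scores[0, best].item())
# The matching title "Banh canh ca loc" should score highest.
```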

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 11,347 training samples
* Columns: <code>sentence_0</code> and <code>sentence_1</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence_0                                                                           | sentence_1                                                                       |
  |:--------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                               | string                                                                           |
  | details | <ul><li>min: 73 tokens</li><li>mean: 127.74 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.16 tokens</li><li>max: 24 tokens</li></ul> |
* Samples:
  | sentence_0                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        | sentence_1                            |
  |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------|
  | <code>Mamadeli la mot dia chi giup ban giai quyet con them com ga, mi y chuan vi nhat. Nhan vien tai quan nay kha de chiu va chieu khach. Mot suat com ga ta bao gom mot phan com mem, thit ga ta xe thom phuc va dia nuoc mam gung chan voi sot trung rat dam da.Giua long Sai Gon hoa le lai co huong vi cua mon com ga nuc tieng thi con dieu gi khien ban ban khoan ma khong thuong thuc nhi. Thuc don phong phu, gia ca phai chang voi huong vi mon an hoan hao dung vi hap dan la li do giup quan thu hut duoc dong dao khach hang ghe toi thuong xuyen.<br><br>Ngoai ra, voi cach trinh bay mon an day bat mat va mau sac chac chan cac thuc khach khi den day se khong the roi mat khoi mon an dau. Team thich song ao tung chao nghe toi day chac hao huc lam vi do an vua ngon, vua co hinh de song ao chat luong.Va khien ai cung thom them ghen ti khi ban co co hoi duoc thu va trai nghiem o Mamadeli do.  Neu ban muon tan huong tai nha thi hay yen tam, Mamadeli hien tai da co mat tren cac app giao hang, cac ban co the theo doi...</code>    | <code>Mamadeli - Com ga & Mi y</code> |
  | <code>Nguyen lieu:Thit heo xay 300 gr Toi bam 2 muong ca phe Hanh tim bam 2 muong ca phe Gung bam 1 muong ca phe Nuoc mam 1/2 muong canh Nuoc tuong 1 muong canh Bot nang 2 muong canh Giam an 2 muong canh Tuong ca 3 muong canh Dau an 2 muong canh Duong 4 muong canh Muoi 1/4 muong canhCach che bien Thit vien kho chua ngotUop thitBan uop thit voi 2 muong ca phe toi bam, 2 muong ca phe hanh tim, 1 muong ca phe gung bam, 1/4 muong ca phe muoi, 1/2 muong canh nuoc mam, 1 muong canh nuoc tuong, 2 muong canh bot nang.Sau do, ban tron deu de cac gia vi ngam vao nhau va uop khoang 15 phut.<br>Vo vien va chien thitBan vo thit thanh tung vien vua an.Ban dun nong 2 muong canh dau an o lua vua. Khi dau soi, ban cho thit vao va chien vang deu 2 mat.<br>Kho thitBan cho vao chao 4 muong canh duong, 2 muong canh giam an, 3 muong canh tuong ca va 4 muong canh nuoc loc roi dao deu.Ban rim phan nuoc sot voi thit vien 15 phut sau do tat bep va cho ra dia.<br>Thanh phamThit vien mem, thom, vua an cung voi nuoc sot chua chu...</code> | <code>Thit vien kho chua ngot</code>  |
  | <code>Nguyen lieu:1kg oc1 cu gungHanh khoToi, otSa teNuoc mam, bot ngot, duong...Cach lam:Oc giac khi mua ve, ban cung dem rua sach, roi ngam voi nuoc vo gao co cat them voilat ot trong 3 tieng de oc nhanh nha chat ban ra.Gung ban dem cao vo rua sach, bam nho.Hanh kho, toi boc sach vo. Hanh kho ban thai lat mong, con toi thi bam nhuyen.Ot tuoi rua sach, thai lat.Sau khi ngam xong, ban dem oc giac luoc voi nuoc co cho them vai lat gung hoac sa dap dap. Khi oc chin, ban lay thit oc ra cat lat va de ra dia. Dat chao len bep, cho dau an vao, khi dau soi ban cho hanh kho va toi vao phi thom. Tiep den, ban cho vao 3 muong sa te, ot cat lat, dao deu tay. Dao khoang 5 phut, ban cho oc vao deu roi nem nem voi nuoc mam, duong, bot ngot sao cho vua khau vi. Xao oc khoang 10 phut nua thi tat bep.Vay la hoan thanh mon an roi, gio day ban chi can cho mon an ra dia va cho them vai soi rau ram len tren la xong!</code>                                                                                                               | <code>Oc giac xao sa te</code>        |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
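
As a rough guide to how such a model is trained, the sketch below pairs this loss with the `SentenceTransformerTrainer`. The two-row dataset is a placeholder standing in for the 11,347 (description, title) pairs; the column names and hyperparameters mirror the ones listed in this card, but the script is illustrative, not the exact training code.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Placeholder pairs standing in for the real 11,347-row dataset.
train_dataset = Dataset.from_dict({
    "sentence_0": [
        "Banh canh ca loc duoc lam tu bot gao va ca loc...",
        "KCC - Pho & Com Ga Xoi Mam la quan an duoc nhieu nguoi yeu thich...",
    ],
    "sentence_1": ["Banh canh ca loc", "KCC - Pho & Com Ga Xoi Mam"],
})

model = SentenceTransformer("vinai/phobert-base")  # mean pooling is added automatically
model.max_seq_length = 128                         # match the 128-token limit listed above
loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cos_sim is the default similarity_fct

args = SentenceTransformerTrainingArguments(
    output_dir="phobert-sentence",   # hypothetical output path
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```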

### Training Hyperparameters
#### Non-Default Hyperparameters

- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `multi_dataset_batch_sampler`: round_robin

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>

### Training Logs
| Epoch  | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.7042 | 500  | 0.9125        |
| 1.4085 | 1000 | 0.2277        |
| 2.1127 | 1500 | 0.1527        |
| 2.8169 | 2000 | 0.1009        |


### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu121
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
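
To approximate this environment, the versions above can be pinned directly (a sketch; the matching PyTorch/CUDA wheel, e.g. the `+cu121` build, depends on your platform):

```bash
pip install "sentence-transformers==3.3.1" "transformers==4.47.1" \
    "accelerate==1.2.1" "datasets==3.2.0" "tokenizers==0.21.0" \
    "torch==2.5.1"
```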

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->