trongvox committed
Commit 1cd3c77 · verified · 1 Parent(s): 287913b

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
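The config above enables only `pooling_mode_mean_tokens`: the sentence embedding is the mean of the 768-dimensional token embeddings, counting only non-padding positions from the attention mask. A minimal numpy sketch of that operation (the function name is illustrative, not the library's internals):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) tokens only."""
    mask = attention_mask[:, None].astype(token_embeddings.dtype)  # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)                 # (dim,)
    count = max(mask.sum(), 1e-9)                                  # guard against all-pad input
    return summed / count

# Two real tokens and one padding token: the padded row is ignored.
tokens = np.array([[1.0, 1.0], [3.0, 3.0], [100.0, 100.0]])
mask = np.array([1, 1, 0])
print(mean_pool(tokens, mask))  # [2. 2.]
```

With `pooling_mode_cls_token` and the other modes set to `false`, this mask-weighted mean is the only reduction applied on top of the transformer output.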
README.md ADDED
@@ -0,0 +1,434 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:11347
+ - loss:MultipleNegativesRankingLoss
+ base_model: vinai/phobert-base
+ widget:
+ - source_sentence: "Beefsteak 123 la mot dia chi ban banh mi chao, beefsteak cuc ngon\
+     \ tai Can Tho ma ban nen mot gan ghe den. Khong gian quan rong rai, sach se, phuc\
+     \ vu nhanh nhen, gia ca hop ly. Banh mi chao duong Nguyen Van Troi noi tieng ban\
+     \ banh mi thom ngon, chat luong. Banh mi tai day chia ra lam 2 phan: co thit bo\
+     \ ma khong thit bo.\n\nQuan Beefsteak 123 la mot dia diem ly tuong cho nhung nguoi\
+     \ yeu thich thit bo va cac mon an ngon khac. Quan noi tieng voi su ket hop tuyet\
+     \ voi giua thit bo, pate va trung op la. Neu ban muon thu nhung mon khac, quan\
+     \ cung co san xuc xich, ca moi, cha lua va xiu mai. Menu cua quan duoc chia thanh\
+     \ tung phan da duoc ket hop san de ban de dang lua chon. Vi du nhu bo op la pate\
+     \ xuc xich hoac bo op la pate cha lua. Ban cung co the tao ra cac to hop rieng\
+     \ cua rieng minh nhu op la ca moi xiu mai.Mot dieu dac biet khi den quan la khi\
+     \ ban goi mot phan, ban se duoc tang mien phi mot dia xa lach tron. Day la cach\
+     \ hoan hao de ket hop khau vi cua ban voi cac loai rau song tuoi ngon.Voi khong\
+     \ gian thoai mai va phuc vu nhanh chong, quan Beefsteak 123 mang den cho ban trai\
+     \ nghiem am thuc doc dao va ngon mieng. Hay ghe tham quan de thuong thuc nhung\
+     \ mon an tuyet voi nay!\n\nTHONG TIN LIEN HE:\nDia chi: 9B Nguyen Van Troi, Phuong\
+     \ Xuan Khanh, Can Tho\nDien thoai: 0907 713 458\nGio mo cua: 06:00 - 14:00\nGia\
+     \ tham khao: 20.000d - 40.000d\nFanpage: https://www.facebook.com/Beefsteak-123-143170999350605/\n\
+     \n Goi dien"
+   sentences:
+   - Beefsteak 123 - Nguyen Van Troi
+   - Pho Ngon 37
+   - Khong tra no hay chi tien ngay Tet
+ - source_sentence: 'KCC - Pho & Com Ga Xoi Mam la quan an duoc nhieu nguoi yeu thich
+     tai so 6 Ton That Thuyet, Nam Tu Liem, Ha Noi. Noi day voi khong gian am cung,
+     rat thich hop cho nhung bua an ben ban be, dong nghiep. Day la quan duoc nhieu
+     thuc khach danh gia cao ca dich vu lan chat luong do an. Den voi KCC - Pho & Com
+     Ga Xoi Mam ngoai pho la mon duoc yeu thich nhat ra, quan con co vo so cac mon
+     an hap dan nhu: com rang dui ga xoi mam, com rang dua bo, com rang cai bo, pho
+     xao bo, com nong dui ga xoi mam, mi xao bo, com nong cai bo, com nong dua bo.
+     Doc va la tu nhung hat com gion rum, cung voi do la huong vi cua nuoc sot dac
+     trung va bi truyen ngam sau vao tan ben trong.
+
+
+     Cac mon nay tuy binh di trong cach che bien nhung mang lai huong vi am thuc manh
+     me, du de lam to mo bat cu thuc khach nao khi thuong thuc. KCC - Pho & Com Ga
+     Xoi Mam cam ket mang den cho nguoi tieu dung nhung san pham ngon an toan, co loi
+     cho suc khoe voi gia rat hop ly. Ban dang o Ton That Thuyet, Ha Noi va dang ban
+     khoan khong biet dia chi an pho nao ngon thi hay ghe ngay quan an KCC nhe!
+
+
+     THONG TIN LIEN HE: Dia chi: 6 Ton That Thuyet, Nam Tu Liem, Ha Noi Gio mo cua: 06:00
+     - 14:00 | 17:30 - 22:00
+
+     Dat mua ngay'
+   sentences:
+   - Nem Nuong Hai Anh
+   - Ca basa kho thom
+   - KCC - Pho & Com Ga Xoi Mam
+ - source_sentence: Banh canh ca loc duoc lam tu bot gao va ca loc. Bot gao sau khi
+     duoc can mong thanh soi vua an thi duoc tha vao noi nuoc luoc Ca loc go lay phan
+     thit, uop chut gia vi cho dam vi. Phan xuong ca khong bi bo di ma duoc giu lai
+     gia nhuyen, loc lay phan nuoc ca roi do vao phan nuoc dung. Mon banh canh ca loc
+     ngon nhat la khi an con nong, vua chan vua hup vua xuyt xoa cai vi cay nong. Neu
+     an trong ngay dong thi qua tuyet voi roi phai khong nao. Mot to banh canh ca loc
+     chi co gia khoang 30.000 dong thoi cac ban nhe.
+   sentences:
+   - Banh canh ca loc
+   - Bun oc, bun oc chan
+   - Nha hang Trung Duong Marina
+ - source_sentence: 'Nguyen lieu:Bap chuoi 1 cai Chanh 1 trai Bot chien gion 75 gr
+     Dau an 100 ml Nuoc mam 3 muong canh Bot ngot 1 muong ca phe Tuong ot 1 muong canh
+     Duong 1 muong canh Ot bot 1 muong ca pheCach che bien:So che bap chuoi: Dung tay
+     tach bap chuoi thanh nhung cong nho, sau do ngam bap chuoi vao trong thau nuoc
+     chanh pha loang de giup bap chuoi khong bi tham den. Tiep tuc go bo nhuy trong
+     bap chuoi roi rua sach lai voi nuoc.Nhung bot va chien bap chuoi: Bap chuoi sau
+     khi tach roi va rua sach ban cho bap chuoi ra to, do vao 75gr bot chien gion,
+     dao deu cho bot tham vao bap chuoi. Bac chao len bep cung voi 100ml dau an dun
+     soi (luong dau ngap bap chuoi), sau do cho bap chuoi da ao bot vao chien tren
+     lua vua khoang 5 - 10 phut cho bap chuoi chin vang deu thi vot ra de rao dau.Lam
+     bap chuoi chien nuoc mam: Bac mot cai chao khac cho vao 10ml dau an (tan dung
+     luong dau con du khi chien bap chuoi), roi cho vao 3 muong canh nuoc mam, 1 muong
+     ca phe bot ngot, 1 muong canh tuong ot, 1 muong canh duong, 1 muong ca phe ot
+     bot khuay tan hon hop cho sanh vang lai khoang 3 phut tren lua vua. Cuoi cung
+     ban cho bap chuoi da chien vang vao dao deu them 3 phut roi tat bep.Thanh pham:
+     Bap chuoi gion rum hoa quyen voi vi man man ngot ngot cua nuoc mam, an kem com
+     trang se cuc ki ngon mieng day. Mon an vo cung de lam nay se khien gia dinh ban
+     tam tac khen ngon.'
+   sentences:
+   - Nha Hang Ca Hoi Song Nhi
+   - Com nhoi thit hap ot chuong
+   - Hoa chuoi chien nuoc mam
+ - source_sentence: "Noi tieng ve do lau doi va huong vi mon an nay o Ha Noi thi phai\
+     \ ke den hang Banh Duc Nong Thanh Tung. Banh o day hap dan o do deo dai cua bot,\
+     \ thit nam du day va nem nem vua mieng. Khi phuc vu, mon an nong sot toa ra mui\
+     \ huong thom lung tu bot, hanh phi, nuoc mam. Mon banh duc o day duoc chan ngap\
+     \ nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu\
+     \ hanh kho da phi vang.Mon banh duc o Banh Duc Nong Thanh Tung duoc chan ngap\
+     \ nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu\
+     \ hanh kho da phi vang. Cach an nay hoi giong voi mon banh gio chan nuoc mam thit\
+     \ bam o quan pho chua Lang Son gan cho Ban Co. La mon qua an nhe nhang, vua du\
+     \ lung lung bung, co ve dan da nen rat nhieu nguoi them them, nho nho. Banh duc\
+     \ nong Ha Noi o day khong bi pha them bot dau xanh nen van giu nguyen duoc huong\
+     \ vi dac trung. Dac biet, phan nhan con duoc tron them mot it cu dau xao tren\
+     \ ngon lua lon nen giu duoc do ngot gion.THONG TIN LIEN HE:Dia chi: 112 Truong\
+     \ Dinh, Quan Hai Ba Trung, Ha NoiGio mo cua: 10:00 - 21:00Dia diem chat luong:\
+     \ 4.7/5 (14 danh gia tren Google)\n Chi duong Danh gia Google"
+   sentences:
+   - Banh Duc
+   - Let's Eat Buffet
+   - Banh bi do
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ ---
+
+ # SentenceTransformer based on vinai/phobert-base
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [vinai/phobert-base](https://huggingface.co/vinai/phobert-base). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [vinai/phobert-base](https://huggingface.co/vinai/phobert-base) <!-- at revision c1e37c5c86f918761049cef6fa216b4779d0d01d -->
+ - **Maximum Sequence Length:** 128 tokens
+ - **Output Dimensionality:** 768 dimensions
+ - **Similarity Function:** Cosine Similarity
+ <!-- - **Training Dataset:** Unknown -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: RobertaModel
+   (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("trongvox/Phobert-Sentence")
+ # Run inference
+ sentences = [
+     'Noi tieng ve do lau doi va huong vi mon an nay o Ha Noi thi phai ke den hang Banh Duc Nong Thanh Tung. Banh o day hap dan o do deo dai cua bot, thit nam du day va nem nem vua mieng. Khi phuc vu, mon an nong sot toa ra mui huong thom lung tu bot, hanh phi, nuoc mam. Mon banh duc o day duoc chan ngap nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu hanh kho da phi vang.Mon banh duc o Banh Duc Nong Thanh Tung duoc chan ngap nuoc mam pha loang vi ngot, hoi man man, co thit bam voi nam meo va rat nhieu hanh kho da phi vang. Cach an nay hoi giong voi mon banh gio chan nuoc mam thit bam o quan pho chua Lang Son gan cho Ban Co. La mon qua an nhe nhang, vua du lung lung bung, co ve dan da nen rat nhieu nguoi them them, nho nho. Banh duc nong Ha Noi o day khong bi pha them bot dau xanh nen van giu nguyen duoc huong vi dac trung. Dac biet, phan nhan con duoc tron them mot it cu dau xao tren ngon lua lon nen giu duoc do ngot gion.THONG TIN LIEN HE:Dia chi: 112 Truong Dinh, Quan Hai Ba Trung, Ha NoiGio mo cua: 10:00 - 21:00Dia diem chat luong: 4.7/5 (14 danh gia tren Google)\n Chi duong Danh gia Google',
+     'Banh Duc',
+     'Banh bi do',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 768]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
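The card's similarity function is cosine similarity, so `model.similarity` above returns a matrix of pairwise cosine scores over the encoded sentences. A library-independent numpy sketch of that computation (the function name is illustrative):

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between row vectors: normalize rows,
    then every dot product is a cosine score in [-1, 1]."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normed @ normed.T

vecs = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sims = cosine_similarity_matrix(vecs)
print(sims.shape)  # (3, 3)
```

Each vector has similarity 1.0 with itself (the diagonal), which is why ranking candidates against a query uses the off-diagonal entries.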
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 11,347 training samples
+ * Columns: <code>sentence_0</code> and <code>sentence_1</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | sentence_0                                                                           | sentence_1                                                                       |
+   |:--------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
+   | type    | string                                                                                | string                                                                            |
+   | details | <ul><li>min: 73 tokens</li><li>mean: 127.74 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.16 tokens</li><li>max: 24 tokens</li></ul> |
+ * Samples:
+   | sentence_0 | sentence_1 |
+   |:-----------|:-----------|
+   | <code>Mamadeli la mot dia chi giup ban giai quyet con them com ga, mi y chuan vi nhat. Nhan vien tai quan nay kha de chiu va chieu khach. Mot suat com ga ta bao gom mot phan com mem, thit ga ta xe thom phuc va dia nuoc mam gung chan voi sot trung rat dam da.Giua long Sai Gon hoa le lai co huong vi cua mon com ga nuc tieng thi con dieu gi khien ban ban khoan ma khong thuong thuc nhi. Thuc don phong phu, gia ca phai chang voi huong vi mon an hoan hao dung vi hap dan la li do giup quan thu hut duoc dong dao khach hang ghe toi thuong xuyen.<br><br>Ngoai ra, voi cach trinh bay mon an day bat mat va mau sac chac chan cac thuc khach khi den day se khong the roi mat khoi mon an dau. Team thich song ao tung chao nghe toi day chac hao huc lam vi do an vua ngon, vua co hinh de song ao chat luong.Va khien ai cung thom them ghen ti khi ban co co hoi duoc thu va trai nghiem o Mamadeli do. Neu ban muon tan huong tai nha thi hay yen tam, Mamadeli hien tai da co mat tren cac app giao hang, cac ban co the theo doi...</code> | <code>Mamadeli - Com ga & Mi y</code> |
+   | <code>Nguyen lieu:Thit heo xay 300 gr Toi bam 2 muong ca phe Hanh tim bam 2 muong ca phe Gung bam 1 muong ca phe Nuoc mam 1/2 muong canh Nuoc tuong 1 muong canh Bot nang 2 muong canh Giam an 2 muong canh Tuong ca 3 muong canh Dau an 2 muong canh Duong 4 muong canh Muoi 1/4 muong canhCach che bien Thit vien kho chua ngotUop thitBan uop thit voi 2 muong ca phe toi bam, 2 muong ca phe hanh tim, 1 muong ca phe gung bam, 1/4 muong ca phe muoi, 1/2 muong canh nuoc mam, 1 muong canh nuoc tuong, 2 muong canh bot nang.Sau do, ban tron deu de cac gia vi ngam vao nhau va uop khoang 15 phut.<br>Vo vien va chien thitBan vo thit thanh tung vien vua an.Ban dun nong 2 muong canh dau an o lua vua. Khi dau soi, ban cho thit vao va chien vang deu 2 mat.<br>Kho thitBan cho vao chao 4 muong canh duong, 2 muong canh giam an, 3 muong canh tuong ca va 4 muong canh nuoc loc roi dao deu.Ban rim phan nuoc sot voi thit vien 15 phut sau do tat bep va cho ra dia.<br>Thanh phamThit vien mem, thom, vua an cung voi nuoc sot chua chu...</code> | <code>Thit vien kho chua ngot</code> |
+   | <code>Nguyen lieu:1kg oc1 cu gungHanh khoToi, otSa teNuoc mam, bot ngot, duong...Cach lam:Oc giac khi mua ve, ban cung dem rua sach, roi ngam voi nuoc vo gao co cat them voilat ot trong 3 tieng de oc nhanh nha chat ban ra.Gung ban dem cao vo rua sach, bam nho.Hanh kho, toi boc sach vo. Hanh kho ban thai lat mong, con toi thi bam nhuyen.Ot tuoi rua sach, thai lat.Sau khi ngam xong, ban dem oc giac luoc voi nuoc co cho them vai lat gung hoac sa dap dap. Khi oc chin, ban lay thit oc ra cat lat va de ra dia. Dat chao len bep, cho dau an vao, khi dau soi ban cho hanh kho va toi vao phi thom. Tiep den, ban cho vao 3 muong sa te, ot cat lat, dao deu tay. Dao khoang 5 phut, ban cho oc vao deu roi nem nem voi nuoc mam, duong, bot ngot sao cho vua khau vi. Xao oc khoang 10 phut nua thi tat bep.Vay la hoan thanh mon an roi, gio day ban chi can cho mon an ra dia va cho them vai soi rau ram len tren la xong!</code> | <code>Oc giac xao sa te</code> |
+ * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "cos_sim"
+   }
+   ```
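MultipleNegativesRankingLoss treats every other positive in the batch as a negative: each anchor's cosine similarities against all in-batch positives are scaled and passed through a softmax cross-entropy whose target is the anchor's own positive (the diagonal). A hedged numpy sketch of that objective with the `scale: 20.0` and `cos_sim` settings above (the function name is illustrative; the real implementation lives in `sentence_transformers.losses`):

```python
import numpy as np

def multiple_negatives_ranking_loss(anchors, positives, scale=20.0):
    """Cross-entropy over scaled cosine similarities; matched pairs sit on the diagonal."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)                    # (batch, batch) similarity matrix
    scores = scores - scores.max(axis=1, keepdims=True)   # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

# Orthogonal matched pairs are trivially ranked, so the loss is near zero.
loss = multiple_negatives_ranking_loss(np.eye(3), np.eye(3))
print(round(loss, 4))  # 0.0
```

Because the (sentence_0, sentence_1) pairs here are description/name pairs, a batch of 16 gives each anchor 15 free in-batch negatives, which is why this loss works without explicitly mined negatives.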
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `multi_dataset_batch_sampler`: round_robin
+
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: no
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 5e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1
+ - `num_train_epochs`: 3
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.0
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: False
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `dispatch_batches`: None
+ - `split_batches`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: batch_sampler
+ - `multi_dataset_batch_sampler`: round_robin
+
+ </details>
+
+ ### Training Logs
+ | Epoch  | Step | Training Loss |
+ |:------:|:----:|:-------------:|
+ | 0.7042 | 500  | 0.9125        |
+ | 1.4085 | 1000 | 0.2277        |
+ | 2.1127 | 1500 | 0.1527        |
+ | 2.8169 | 2000 | 0.1009        |
+
+
+ ### Framework Versions
+ - Python: 3.10.12
+ - Sentence Transformers: 3.3.1
+ - Transformers: 4.47.1
+ - PyTorch: 2.5.1+cu121
+ - Accelerate: 1.2.1
+ - Datasets: 3.2.0
+ - Tokenizers: 0.21.0
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### MultipleNegativesRankingLoss
+ ```bibtex
+ @misc{henderson2017efficient,
+     title={Efficient Natural Language Response Suggestion for Smart Reply},
+     author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+     year={2017},
+     eprint={1705.00652},
+     archivePrefix={arXiv},
+     primaryClass={cs.CL}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
added_tokens.json ADDED
@@ -0,0 +1,3 @@
+ {
+   "<mask>": 64000
+ }
bpe.codes ADDED
The diff for this file is too large to render. See raw diff
 
config.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "_name_or_path": "vinai/phobert-base",
+   "architectures": [
+     "RobertaModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "eos_token_id": 2,
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 258,
+   "model_type": "roberta",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "tokenizer_class": "PhobertTokenizer",
+   "torch_dtype": "float32",
+   "transformers_version": "4.47.1",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 64001
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "__version__": {
+     "sentence_transformers": "3.3.1",
+     "transformers": "4.47.1",
+     "pytorch": "2.5.1+cu121"
+   },
+   "prompts": {},
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9f22fd487285b748a742dac4d801f7c66fc27bdb9c4e3c785525136e05b097b4
+ size 540015464
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 128,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "bos_token": "<s>",
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": "<mask>",
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "unk_token": "<unk>"
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "64000": {
+       "content": "<mask>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": false,
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "extra_special_tokens": {},
+   "mask_token": "<mask>",
+   "model_max_length": 128,
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "tokenizer_class": "PhobertTokenizer",
+   "unk_token": "<unk>"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff