kobkrit committed
Commit 41a2bad · verified · 1 Parent(s): 45bf2cb

Update README.md

Files changed (1)
  1. README.md +64 -0
README.md CHANGED
@@ -130,6 +130,70 @@ Prompt format is based on ChatML.
 
 ## How to use
 
+
+### Free API Service (hosted by Siam.AI and Float16.cloud)
+
+#### Siam.AI
+```bash
+curl https://api.aieat.or.th/v1/completions \
+  -H "Content-Type: application/json" \
+  -H "Authorization: Bearer dummy" \
+  -d '{
+    "model": ".",
+    "prompt": "<|im_start|>system\nคุณคือผู้ช่วยตอบคำถามที่ฉลาดและซื่อสัตย์<|im_end|>\n<|im_start|>user\nกรุงเทพมหานครคืออะไร<|im_end|>\n<|im_start|>assistant\n",
+    "max_tokens": 512,
+    "temperature": 0.7,
+    "top_p": 0.8,
+    "top_k": 40,
+    "stop": ["<|im_end|>"]
+  }'
+```
+
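The same Siam.AI completions endpoint can also be called from Python. Below is a minimal sketch using the `requests` library; the URL, headers, and sampling parameters are copied from the curl example above, and the response is assumed to follow the OpenAI completions schema (`choices[0]["text"]`).

```python
import requests

# Siam.AI free API endpoint and headers, taken from the curl example above.
url = "https://api.aieat.or.th/v1/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer dummy",
}

# Same ChatML-formatted prompt and sampling parameters as the curl example.
payload = {
    "model": ".",
    "prompt": (
        "<|im_start|>system\nคุณคือผู้ช่วยตอบคำถามที่ฉลาดและซื่อสัตย์<|im_end|>\n"
        "<|im_start|>user\nกรุงเทพมหานครคืออะไร<|im_end|>\n<|im_start|>assistant\n"
    ),
    "max_tokens": 512,
    "temperature": 0.7,
    "top_p": 0.8,
    "top_k": 40,
    "stop": ["<|im_end|>"],
}

response = requests.post(url, headers=headers, json=payload, timeout=60)
response.raise_for_status()
# Assumes an OpenAI-style completions response body.
print(response.json()["choices"][0]["text"])
```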
+
+#### Float16
+```bash
+curl -X POST https://api.float16.cloud/dedicate/78y8fJLuzE/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -H "Authorization: Bearer float16-AG0F8yNce5s1DiXm1ujcNrTaZquEdaikLwhZBRhyZQNeS7Dv0X" \
+  -d '{
+    "model": "openthaigpt/openthaigpt1.5-7b-instruct",
+    "messages": [
+      {
+        "role": "system",
+        "content": "คุณคือผู้ช่วยตอบคำถามที่ฉลาดและซื่อสัตย์"
+      },
+      {
+        "role": "user",
+        "content": "สวัสดี"
+      }
+    ]
+  }'
+```
+
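Since the Float16 endpoint exposes the standard `/v1/chat/completions` route, it should also be reachable with the official `openai` Python client (v1.x). This is a minimal sketch that assumes the server is fully OpenAI-compatible; the base URL, API key, model name, and messages are taken from the curl example above.

```python
from openai import OpenAI

# Point the OpenAI client (openai>=1.0) at the Float16-hosted endpoint.
client = OpenAI(
    base_url="https://api.float16.cloud/dedicate/78y8fJLuzE/v1",
    api_key="float16-AG0F8yNce5s1DiXm1ujcNrTaZquEdaikLwhZBRhyZQNeS7Dv0X",
)

response = client.chat.completions.create(
    model="openthaigpt/openthaigpt1.5-7b-instruct",
    messages=[
        {"role": "system", "content": "คุณคือผู้ช่วยตอบคำถามที่ฉลาดและซื่อสัตย์"},
        {"role": "user", "content": "สวัสดี"},
    ],
)
print(response.choices[0].message.content)
```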
+
+### OpenAI Client Library (hosted by vLLM; see below)
+```python
+import openai
+
+# Configure the OpenAI client (legacy openai<1.0 interface) to use the local vLLM server
+openai.api_base = "http://127.0.0.1:8000/v1"
+openai.api_key = "dummy"  # vLLM doesn't require a real API key
+
+prompt = "<|im_start|>system\nคุณคือผู้ช่วยตอบคำถามที่ฉลาดและซื่อสัตย์<|im_end|>\n<|im_start|>user\nกรุงเทพมหานครคืออะไร<|im_end|>\n<|im_start|>assistant\n"
+
+try:
+    response = openai.Completion.create(
+        model=".",  # the model name as served by vLLM
+        prompt=prompt,
+        max_tokens=512,
+        temperature=0.7,
+        top_p=0.8,
+        top_k=40,
+        stop=["<|im_end|>"]
+    )
+    print("Generated Text:", response.choices[0].text)
+except Exception as e:
+    print("Error:", str(e))
+```
+
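The snippet above uses the legacy pre-1.0 `openai` interface (`openai.Completion.create`), which was removed in `openai>=1.0`. With the current client, the same request against the local vLLM server would look roughly like the sketch below; `top_k` is not a standard OpenAI completions parameter, so it is passed via `extra_body` on the assumption that vLLM's OpenAI-compatible server accepts extra sampling fields.

```python
from openai import OpenAI

# openai>=1.0 equivalent of the legacy example above, pointed at the local vLLM server.
client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="dummy")

prompt = "<|im_start|>system\nคุณคือผู้ช่วยตอบคำถามที่ฉลาดและซื่อสัตย์<|im_end|>\n<|im_start|>user\nกรุงเทพมหานครคืออะไร<|im_end|>\n<|im_start|>assistant\n"

response = client.completions.create(
    model=".",                   # model name as served by vLLM
    prompt=prompt,
    max_tokens=512,
    temperature=0.7,
    top_p=0.8,
    stop=["<|im_end|>"],
    extra_body={"top_k": 40},    # non-standard sampling parameter forwarded to vLLM
)
print("Generated Text:", response.choices[0].text)
```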
 ### Huggingface
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer