|
---
license: cc-by-nc-sa-4.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
base_model: kyujinpy/Sakura-SOLAR-Instruct
model_creator: KyujinHan
model_name: Sakura Solar Instruct
datasets:
- argilla/distilabel-math-preference-dpo
tags:
- exl2
---

# Sakura-SOLAR-Instruct

- Model creator: [KyujinHan](https://huggingface.co/kyujinpy)
- Original model: [Sakura-SOLAR-Instruct](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct)

## EXL2 Quants

You can use [TheBloke's GPTQ quants](https://huggingface.co/TheBloke/Sakura-SOLAR-Instruct-GPTQ) for 4-bit or lower. I'm providing higher-bitrate EXL2 quants so the ExLlamaV2 loader can still be used. Feel free to leave suggestions for other quants.

- [5.0bpw (main)](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/main)
- [6.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/6.0bpw)
- [7.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/7.0bpw)
- [8.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/8.0bpw)

Zipped quantizations (if you want to download a single file):

- [5.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/5.0bpw-zip)
- [6.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/6.0bpw-zip)
- [7.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/7.0bpw-zip)
- [8.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/8.0bpw-zip)
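As an aside, a branch can also be fetched programmatically. A minimal sketch (my own, not part of the card) mapping a bits-per-weight choice to its branch name, with the actual `huggingface_hub` download call shown as a comment since it needs network access:

```python
from typing import Dict

# Branch names mirror the lists above; the 5.0bpw quant lives on "main".
REPO_ID = "hgloow/Sakura-SOLAR-Instruct-EXL2"
BRANCHES: Dict[float, str] = {5.0: "main", 6.0: "6.0bpw", 7.0: "7.0bpw", 8.0: "8.0bpw"}


def branch_for(bpw: float) -> str:
    """Return the repo branch (revision) holding the given quant."""
    return BRANCHES[bpw]


# To actually fetch a branch (requires `pip install huggingface_hub` and network):
#   from huggingface_hub import snapshot_download
#   snapshot_download(repo_id=REPO_ID, revision=branch_for(6.0))
```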

## Calibration Dataset

Training dataset of the Sakura-SOLAR-Instruct child models:
[argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo)

## Memory Usage

Use [TheBloke's 4bit-32g quants](https://huggingface.co/TheBloke/Sakura-SOLAR-Instruct-GPTQ/tree/gptq-4bit-32g-actorder_True) (7.4 GB VRAM usage) if you have an 8 GB card.

Measured using ExLlamaV2 at 4096 max_seq_len with [Oobabooga's Text Generation WebUI](https://github.com/oobabooga/text-generation-webui/tree/main).

| Branch | BPW | VRAM Usage | Description |
| ------ | --- | ---------- | ----------- |
| [5.0bpw (main)](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/main) | 5.0 | 7.7 GB | For >=10 GB VRAM cards |
| [6.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/6.0bpw) | 6.0 | 9.0 GB | For >=10 GB VRAM cards with idle VRAM usage at or below 500 MB (headroom for other things) |
| [7.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/7.0bpw) | 7.0 | 10.2 GB | For >=11 GB VRAM cards with idle VRAM usage at or below 500 MB (headroom for other things) |
| [8.0bpw](https://huggingface.co/hgloow/Sakura-SOLAR-Instruct-EXL2/tree/8.0bpw) | 8.0 | 11.3 GB | For >=12 GB VRAM cards with idle VRAM usage at or below 500 MB (headroom for other things) |
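A small helper (hypothetical, not part of the model card) that applies the measurements above: pick the highest-BPW branch whose measured usage plus ~0.5 GB of headroom fits your card, falling back to the GPTQ quants when nothing fits:

```python
from typing import Optional

# (branch, bpw, measured VRAM usage in GB) from the table above
BRANCH_VRAM_GB = [
    ("main", 5.0, 7.7),
    ("6.0bpw", 6.0, 9.0),
    ("7.0bpw", 7.0, 10.2),
    ("8.0bpw", 8.0, 11.3),
]


def recommend_branch(vram_gb: float, headroom_gb: float = 0.5) -> Optional[str]:
    """Return the highest-BPW branch that fits, or None (use GPTQ instead)."""
    best = None
    for branch, _bpw, usage_gb in BRANCH_VRAM_GB:
        if usage_gb + headroom_gb <= vram_gb:
            best = branch
    return best
```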

## Prompt template: Orca-Hashes

Courtesy of [TheBloke](https://huggingface.co/TheBloke)

```
### System:
{system_message}

### User:
{prompt}

### Assistant:

```
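The template above can be assembled programmatically. A minimal sketch (the function name is mine, not from the card):

```python
def build_orca_hashes_prompt(system_message: str, prompt: str) -> str:
    """Fill the Orca-Hashes template shown above."""
    return (
        f"### System:\n{system_message}\n\n"
        f"### User:\n{prompt}\n\n"
        "### Assistant:\n"
    )


text = build_orca_hashes_prompt("You are a helpful assistant.", "Hello!")
```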

### If you use Oobabooga's Chat tab
From my testing, the "Orca-Mini" template (or any of the Orca templates) produced the best results. Feel free to leave a suggestion if you know better.

# Original Info

# **Sakura-SOLAR-Instruct**
<img src='./sakura.png' width=512>

| Model | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Sakura-SOLRCA-Instruct-DPO | 74.05 | 71.16 | 88.49 | 66.17 | 72.10 | 82.95 | 63.46 |
| Sakura-SOLAR-Instruct-DPO-v2 | 74.14 | 70.90 | 88.41 | 66.48 | 71.86 | 83.43 | 63.76 |
| [kyujinpy/Sakura-SOLAR-Instruct](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct) | 74.40 | 70.99 | 88.42 | 66.33 | 71.79 | 83.66 | 65.20 |

> Rank 1, 2023.12.27 11:50 PM