ABX-AI committed
Commit 4c62769 · verified · 1 Parent(s): b2234f6

Update README.md

Files changed (1): README.md (+20 −7)
README.md CHANGED

````diff
@@ -1,17 +1,27 @@
 ---
-base_model:
-- Himitsui/Kaiju-11B
+base_model: []
 library_name: transformers
 tags:
 - mergekit
 - merge
-
+- llama
+- not-for-all-audiences
 ---
-# output-model-directory
+# Silver-Sun-v2-11B
 
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/65d936ad52eca001fdcd3245/9DobeVeyL98G7QUufEeQg.png)
+
+> This is an updated version of Silver-Sun-11B. The change is that the Solstice-FKL-v2-10.7B merge now uses Sao10K/Fimbulvetr-11B-v2 instead of v1.
+> Additionally, the config of the original Silver-Sun was wrong, and I have updated that as well.
+> As expected, this is a HIGHLY uncensored model. It should perform even better than v1 due to the updated Fimbulvetr and the fixed config.
+
+**This model works with Alpaca and, from my tests, also ChatML. However, Alpaca may be the better option. Try both and use whichever works better for you.**
+**Due to a quirk with Solar, for the best quality either launch at 4K context, or launch at 8K (and possibly beyond; I have not tested it that high) with 4K of context pre-loaded in the prompt.**
 
 ## Merge Details
+
+This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
 ### Merge Method
 
 This model was merged using the SLERP merge method.
@@ -20,7 +30,10 @@ This model was merged using the SLERP merge method.
 
 The following models were included in the merge:
 * [Himitsui/Kaiju-11B](https://huggingface.co/Himitsui/Kaiju-11B)
-* ./MODELS/Solstice-FKL-v2-10.7B
+* ABX-AI/Solstice-FKL-v2-10.7B
+> [!NOTE]
+> A mixture of [Sao10K/Solstice-11B-v1](https://huggingface.co/Sao10K/Solstice-11B-v1) and
+> [ABX-AI/Fimbulvetr-Kuro-Lotus-v2-10.7B], which is saishf/Fimbulvetr-Kuro-Lotus-10.7B updated with Fimbulvetr v2.
 
 ### Configuration
 
@@ -43,4 +56,4 @@ parameters:
       value: [0.4, 0.3, 0.2, 0.1, 0]
     - value: 0.5
 dtype: bfloat16
-```
+```
````
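The diff only shows the tail of the mergekit configuration. For orientation, here is a hedged sketch of what a complete SLERP config with these models could look like; the `slices`/`layer_range` values and the `self_attn` filter row are assumptions, only the last three lines are taken from the diff above:

```yaml
# Sketch of a mergekit SLERP config; layer ranges and the
# self_attn row are assumed, not confirmed by the model card.
slices:
  - sources:
      - model: Himitsui/Kaiju-11B
        layer_range: [0, 48]        # assumption
      - model: ABX-AI/Solstice-FKL-v2-10.7B
        layer_range: [0, 48]        # assumption
merge_method: slerp
base_model: Himitsui/Kaiju-11B      # assumption
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]  # assumption
    - filter: mlp
      value: [0.4, 0.3, 0.2, 0.1, 0]  # shown in the diff
    - value: 0.5                      # shown in the diff
dtype: bfloat16                       # shown in the diff
```

In SLERP configs, `t` controls the interpolation weight per layer group (0 keeps the base model, 1 keeps the other model), with the final bare `value: 0.5` acting as the default for unfiltered tensors.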
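The README recommends the Alpaca prompt format. As a minimal sketch (the helper name and the example instruction are my own, not part of the model card), the standard Alpaca template can be built like this:

```python
# Hedged sketch: builds the standard single-turn Alpaca prompt
# the model card recommends. alpaca_prompt() is a hypothetical
# helper name; the instruction text is only an illustration.
def alpaca_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(alpaca_prompt("Write a haiku about the sun."))
```

The resulting string is what you would pass to your inference frontend or tokenizer as the full prompt; the model's reply is expected to follow the `### Response:` header.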