---
base_model:
- Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
library_name: transformers
tags:
- mergekit
- merge
---
# Kor-merge-llama 3.1 8B
> "So martial law has been declared. What should the reserve forces do now?"
>
> With martial law declared, new guidelines or orders for the reserve forces are expected to be issued. However, I have difficulty assessing the current situation accurately, and deciding the standards or scope of action for reservists is a matter for the experts in the relevant departments and agencies. As for the role and duties of reservists under martial law, it would be best to wait for official guidance or instructions. Please stay safe, and do your best to keep track of the situation.
> "Okay, got it. But if I'm a soldier, you know that makes you a soldier too, right?"
>
> Yes, although considering that I am an AI, the distinction between soldier and civilian doesn't apply to me. I am, however, able to understand and empathize with your opinions and feelings, so I can sympathize with your position and concerns as a soldier. Still, for the conduct guidelines and duties of reservists under martial law, it would be advisable to follow the judgment of the experts. Waiting for official guidance or orders would be the safest and most correct approach. I hope the situation stabilizes further, and I wish you and your family safety.
It handles Korean well, but I couldn't rein in the hallucinations (nonsense).

Since this is a merged model, I'm not sure this is the right license, but I'm uploading it for now.

I hope it is useful to anyone looking for a Llama 3.1 that is good at Korean.
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2 as the base model.
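For intuition about what `density` and `weight` control in the configuration below: DARE randomly drops delta parameters (the difference between a fine-tuned model and the base), keeping roughly a `density` fraction and rescaling the survivors, while TIES resolves sign conflicts between the remaining deltas before they are summed into the base. The following is a minimal single-tensor sketch of that idea in PyTorch; the function names are hypothetical and the normalization details differ from mergekit's actual implementation.

```python
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    """DARE step: randomly keep a `density` fraction of the delta and
    rescale survivors by 1/density so the expected update is unchanged."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

def dare_ties_merge(base, deltas, densities, weights):
    """Toy DARE-TIES for one tensor: sparsify each weighted delta (DARE),
    elect a per-parameter majority sign (TIES), then average the deltas
    that agree with the elected sign."""
    sparse = torch.stack(
        [w * dare_sparsify(d, p) for d, p, w in zip(deltas, densities, weights)]
    )
    elected = torch.sign(sparse.sum(dim=0))   # sign with the larger total mass wins
    agree = (torch.sign(sparse) == elected) & (elected != 0)
    count = agree.sum(dim=0).clamp(min=1)     # avoid division by zero
    return base + (sparse * agree).sum(dim=0) / count

# Example: merge three random "task vectors" into a base tensor.
base = torch.zeros(4, 4)
deltas = [torch.randn(4, 4) for _ in range(3)]
print(dare_ties_merge(base, deltas, densities=[0.5, 0.8, 0.3], weights=[0.5, 0.7, 0.5]))
```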
### Models Merged
The following models were included in the merge:
* AIDXteam/ktdsbaseLM-v0.2-onbased-llama3.1
* NCSOFT/Llama-VARCO-8B-Instruct
* unidocs/llama-3.1-8b-komedic-instruct
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
    # no parameters necessary for base model
  - model: AIDXteam/ktdsbaseLM-v0.2-onbased-llama3.1
    parameters:
      density: 0.5
      weight: 0.5
  - model: unidocs/llama-3.1-8b-komedic-instruct
    parameters:
      density: 0.8
      weight: 0.7
  - model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      density: 0.3
      weight: 0.5
  - model: unidocs/llama-3.1-8b-komedic-instruct
    parameters:
      density: 0.4
      weight: 0.5
  - model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
dtype: bfloat16
```
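To reproduce the merge, a config like the one above can be passed to mergekit's command-line entry point (`mergekit-yaml config.yaml ./merged-model`), and the output directory then loads like any standard transformers checkpoint. A minimal loading sketch, assuming transformers and accelerate are installed and using the placeholder path `./merged-model`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder: local merge output or a Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.bfloat16, device_map="auto"
)

# Llama 3.1 ships a chat template, so apply_chat_template handles formatting.
messages = [{"role": "user", "content": "한국어로 간단히 자기소개 해줄래?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```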