tags:
  - mixture of experts
---

# Custom MOE with Mergekit

Base model: `mistralai/Mistral-7B-Instruct-v0.2`, fused with the following experts (see the configuration sketch after the list):

- HuggingFaceH4/zephyr-7b-beta
- mistralai/Mistral-7B-Instruct-v0.2
- teknium/OpenHermes-2.5-Mistral-7B
- meta-math/MetaMath-Mistral-7B
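A merge of this shape is typically described by a small mergekit MoE config that names the base model and each expert. The sketch below is illustrative only: the gate mode, routing prompts, and file names are assumptions, not the exact recipe used for this model.

```python
# Illustrative sketch of a mergekit MoE config for a merge like this one.
# The gate mode, routing prompts, and paths are assumptions, not the recipe
# actually used to build this model.
from pathlib import Path

MOE_CONFIG = """\
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: HuggingFaceH4/zephyr-7b-beta
    positive_prompts: ["helpful general chat"]
  - source_model: mistralai/Mistral-7B-Instruct-v0.2
    positive_prompts: ["follow the given instructions"]
  - source_model: teknium/OpenHermes-2.5-Mistral-7B
    positive_prompts: ["step-by-step reasoning"]
  - source_model: meta-math/MetaMath-Mistral-7B
    positive_prompts: ["solve this math problem"]
"""

# Write the config, then build the merged model from the command line with:
#   mergekit-moe moe-config.yml ./custom-moe
Path("moe-config.yml").write_text(MOE_CONFIG)
```

The `positive_prompts` bias the router toward each expert for matching inputs, and `gate_mode: hidden` initializes the router from hidden-state representations of those prompts.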
This merge can be downloaded to improve the base Mistral model's ability to reason across mathematical and other objectives.
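For example, the merged model can be loaded with `transformers` like any other causal LM. The repository id below is a placeholder, and loading in `bfloat16` with `device_map="auto"` assumes `accelerate` is installed and enough memory is available for the fused experts.

```python
# Minimal usage sketch. "your-username/custom-moe-mergekit" is a placeholder
# repo id; substitute the actual Hugging Face repository for this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/custom-moe-mergekit"  # placeholder, not the real id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # the fused experts are large; avoid float32
    device_map="auto",            # requires `accelerate`
)

# Mistral-Instruct-style prompt format
prompt = "[INST] A train travels 120 km in 1.5 hours. What is its average speed? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```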