Text Generation
Transformers
GGUF
Safetensors
PyTorch
mistral
quantized
2-bit
3-bit
4-bit precision
5-bit
6-bit
8-bit precision
GGUF
Safetensors
text-generation-inference
Merge
7b
mistralai/Mistral-7B-Instruct-v0.2
HuggingFaceH4/zephyr-7b-beta
Generated from Trainer
en
dataset:HuggingFaceH4/ultrachat_200k
dataset:HuggingFaceH4/ultrafeedback_binarized
arxiv:2305.18290
arxiv:2310.16944
Eval Results
Inference Endpoints
conversational
MaziyarPanahi
committed on
Update README.md
README.md
CHANGED
````diff
@@ -107,7 +107,7 @@ pip3 install huggingface-hub
 
 Then you can download any individual model file to the current directory, at high speed, with a command like this:
 
 ```shell
-huggingface-cli download
+huggingface-cli download MaziyarPanahi/zephyr-7b-beta-Mistral-7B-Instruct-v0.2-GGUF zephyr-7b-beta-Mistral-7B-Instruct-v0.2-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
 ```
 </details>
 <details>
@@ -130,14 +130,13 @@ pip3 install hf_transfer
 
 And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
 
 ```shell
-HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download
+HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download MaziyarPanahi/zephyr-7b-beta-Mistral-7B-Instruct-v0.2-GGUF zephyr-7b-beta-Mistral-7B-Instruct-v0.2-GGUF.Q4_K_M.gguf --local-dir . --local-dir-use-symlinks False
 ```
 
 Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
 </details>
-<!-- README_GGUF.md-how-to-download end -->
 
-
+
 ## Example `llama.cpp` command
 
 Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
````
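The download added in this change can also be scripted. Below is a minimal Python sketch (the helper names are mine, not part of the README) that assembles the exact `huggingface-cli` invocation from the new lines and runs it with the `hf_transfer` acceleration variable set:

```python
import os
import shutil
import subprocess

# Repo and file names taken from the commands in the diff above.
REPO_ID = "MaziyarPanahi/zephyr-7b-beta-Mistral-7B-Instruct-v0.2-GGUF"
FILENAME = "zephyr-7b-beta-Mistral-7B-Instruct-v0.2-GGUF.Q4_K_M.gguf"


def build_download_cmd(local_dir: str = ".") -> list[str]:
    """Assemble the huggingface-cli command shown in the diff."""
    return [
        "huggingface-cli", "download", REPO_ID, FILENAME,
        "--local-dir", local_dir,
        "--local-dir-use-symlinks", "False",
    ]


def download(local_dir: str = ".") -> None:
    """Run the download with accelerated transfers enabled."""
    if shutil.which("huggingface-cli") is None:
        raise RuntimeError(
            "huggingface-cli not found; run `pip3 install huggingface-hub` first"
        )
    # Equivalent of `HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download ...`
    env = dict(os.environ, HF_HUB_ENABLE_HF_TRANSFER="1")
    subprocess.run(build_download_cmd(local_dir), check=True, env=env)
```

On Windows this sidesteps the `set HF_HUB_ENABLE_HF_TRANSFER=1` step mentioned above, since the variable is passed directly in the subprocess environment.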