gordicaleksa committed:
Add docker advanced instruction to README (#792)

README.md
docker compose up -d
```

<details>

<summary>Docker advanced</summary>

A more powerful Docker command to run would be this:

```bash
docker run --gpus '"all"' --rm -it --name axolotl --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 --mount type=volume,src=axolotl,target=/workspace/axolotl -v ${HOME}/.cache/huggingface:/root/.cache/huggingface winglian/axolotl:main-py3.10-cu118-2.0.1
```
It additionally:

* Prevents memory issues when running e.g. deepspeed (otherwise you could hit a SIGBUS/signal 7 error) through the `--ipc` and `--ulimit` args.
* Persists the downloaded HF data (models etc.) and your modifications to the axolotl code through the `--mount`/`-v` args.
* Makes it easier to refer to the container, via the `--name` argument, in VS Code (`Dev Containers: Attach to Running Container...`) or in your terminal.
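Assuming you have attached a shell to the running container (the check commands below are illustrative, not from the axolotl docs), the effect of the `--ipc` and `--ulimit` flags can be verified:

```shell
# Attach to the container named by --name:
#   docker exec -it axolotl bash
# Then, inside the container:
df -h /dev/shm   # with --ipc=host this reflects the host's shared memory, not Docker's small default
ulimit -l        # should print "unlimited" when --ulimit memlock=-1 took effect
```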

[More information on the NVIDIA website](https://docs.nvidia.com/deeplearning/frameworks/user-guide/index.html#setincshmem)

</details>
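If you prefer `docker compose`, the same options from the `docker run` command above can be expressed in a compose file. A minimal, untested sketch (the service and volume names are assumptions, not the repo's actual `docker-compose.yml`):

```yaml
services:
  axolotl:
    image: winglian/axolotl:main-py3.10-cu118-2.0.1
    ipc: host                 # --ipc=host
    ulimits:
      memlock: -1             # --ulimit memlock=-1
      stack: 67108864         # --ulimit stack=67108864
    volumes:
      - axolotl:/workspace/axolotl                            # --mount type=volume,...
      - ${HOME}/.cache/huggingface:/root/.cache/huggingface   # -v ${HOME}/.cache/huggingface:...
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]   # --gpus '"all"'
volumes:
  axolotl:
```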

#### Conda/Pip venv
1. Install python >=**3.9**