Latest commit: Phi2 multipack (#1173) · 814aee6 (unverified)
| Directory | Last commit |
| --- | --- |
| cerebras | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| code-llama | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| falcon | Falcon embeddings (#1149) [skip docker] |
| gptj | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| jeopardy-bot | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| llama-2 | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| mamba | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| mistral | Fine-Tuning Mistral-7b for Real-World Chatbot Applications Using Axolotl (Lora used) (#1155) |
| mpt-7b | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| openllama-3b | Add shifted sparse attention (#973) [skip-ci] |
| phi | Phi2 multipack (#1173) |
| pythia-12b | Feat(wandb): Refactor to be more flexible (#767) |
| pythia | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| qwen | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| redpajama | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| replit-3b | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| tiny-llama | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| xgen-7b | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
| yi-34B-chat | set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci] |
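Most entries above share the same latest commit (#1122), which switches the example YAMLs to `bf16: auto` and disables `fp16` whenever bf16 is in effect. A minimal sketch of the precision keys that change is below; the exact surrounding keys differ per example, and this fragment is an illustration, not a copy of any specific file.

```yaml
# Precision settings touched by #1122 (illustrative fragment only).
bf16: auto   # enable bfloat16 automatically when the hardware supports it
fp16: false  # fp16 is set to false when bf16 is used, per #1122
```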