hermes42/Mixtral-8x22B-v0.1-GGUF
Likes: 12 · GGUF · 5 languages · Mixture of Experts · Inference Endpoints
License: apache-2.0
Files and versions

Latest commit: d093648 (verified) · hermes42 · 10 months ago
"Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-g with huggingface_hub"
1 contributor · History: 24 commits
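The file listing below can also be retrieved programmatically. A minimal sketch using huggingface_hub (the same library named in the upload commit messages); the repo id is taken from the page header, everything else is illustrative:

```python
# Sketch: enumerate this repository's files with huggingface_hub.
from huggingface_hub import HfApi

api = HfApi()
files = api.list_repo_files("hermes42/Mixtral-8x22B-v0.1-GGUF")
for name in sorted(files):
    print(name)  # .gitattributes, README.md, Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-a, ...
```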
| File | Size | Last commit | Committed |
|---|---|---|---|
| .gitattributes | 3.15 kB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-g with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-a | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-a with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-b | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-b with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-c | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-c with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-d | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-d with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-e | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-e with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-f | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-f with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-g | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-g with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-a | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-a with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-b | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-b with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-c | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-c with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-d | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-d with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-e | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-e with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-f | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-f with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-g | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-g with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-h | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-h with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-i | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-i with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-j | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-j with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-k | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-k with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-l | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-l with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-m | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-m with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-n | 5.37 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-n with huggingface_hub | 10 months ago |
| Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-o | 5.32 GB | Upload Mixtral-8x22B-v0.1.Q4_K_S.gguf-part-o with huggingface_hub | 10 months ago |
| README.md | 3.62 kB | Update README.md | 10 months ago |

All entries are flagged Safe by the Hub's file scan; the .gguf part files are tracked with Git LFS.
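Each quantization is uploaded as a series of roughly 5.37 GB parts (the seven Q4_K_M parts listed above add up to about 37.6 GB), so the parts have to be downloaded and joined into a single .gguf file before use. The exact join procedure should be documented in this repo's README.md; a common convention for `-part-*` splits is plain byte concatenation, and that is the assumption behind this sketch:

```python
# Minimal sketch, assuming the parts only need byte-wise concatenation
# (check README.md in this repo for the author's actual instructions).
from pathlib import Path
from huggingface_hub import snapshot_download

repo_id = "hermes42/Mixtral-8x22B-v0.1-GGUF"
pattern = "Mixtral-8x22B-v0.1.Q4_K_M.gguf-part-*"

# Download only the Q4_K_M parts (seven files of ~5.37 GB each).
local_dir = snapshot_download(repo_id=repo_id, allow_patterns=[pattern])

# Join the parts in alphabetical order: part-a, part-b, ..., part-g.
out_path = Path("Mixtral-8x22B-v0.1.Q4_K_M.gguf")
with out_path.open("wb") as out:
    for part in sorted(Path(local_dir).glob(pattern)):
        with part.open("rb") as src:
            while chunk := src.read(16 * 1024 * 1024):  # copy in 16 MiB chunks
                out.write(chunk)

print(f"Wrote {out_path} ({out_path.stat().st_size / 1e9:.1f} GB)")
```

The joined file can then be loaded by a GGUF-aware runtime such as llama.cpp; plan for disk space covering both the downloaded parts and the merged file, roughly double the size of the chosen quantization.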