RichardErkhov / sail_-_Sailor-1.8B-gguf

Tags: GGUF · Inference Endpoints · conversational
Paper: arXiv 2404.03608
Files and versions (branch: main)

1 contributor · History: 24 commits
Latest commit: 328ca68 (verified), RichardErkhov, "uploaded readme", 8 months ago
| File | Size | LFS |
| --- | --- | --- |
| .gitattributes | 2.82 kB | – |
| README.md | 10.6 kB | – |
| Sailor-1.8B.IQ3_M.gguf | 985 MB | ✓ |
| Sailor-1.8B.IQ3_S.gguf | 954 MB | ✓ |
| Sailor-1.8B.IQ3_XS.gguf | 925 MB | ✓ |
| Sailor-1.8B.IQ4_NL.gguf | 1.13 GB | ✓ |
| Sailor-1.8B.IQ4_XS.gguf | 1.09 GB | ✓ |
| Sailor-1.8B.Q2_K.gguf | 847 MB | ✓ |
| Sailor-1.8B.Q3_K.gguf | 1.02 GB | ✓ |
| Sailor-1.8B.Q3_K_L.gguf | 1.06 GB | ✓ |
| Sailor-1.8B.Q3_K_M.gguf | 1.02 GB | ✓ |
| Sailor-1.8B.Q3_K_S.gguf | 954 MB | ✓ |
| Sailor-1.8B.Q4_0.gguf | 1.12 GB | ✓ |
| Sailor-1.8B.Q4_1.gguf | 1.22 GB | ✓ |
| Sailor-1.8B.Q4_K.gguf | 1.22 GB | ✓ |
| Sailor-1.8B.Q4_K_M.gguf | 1.22 GB | ✓ |
| Sailor-1.8B.Q4_K_S.gguf | 1.16 GB | ✓ |
| Sailor-1.8B.Q5_0.gguf | 1.31 GB | ✓ |
| Sailor-1.8B.Q5_1.gguf | 1.41 GB | ✓ |
| Sailor-1.8B.Q5_K.gguf | 1.38 GB | ✓ |
| Sailor-1.8B.Q5_K_M.gguf | 1.38 GB | ✓ |
| Sailor-1.8B.Q5_K_S.gguf | 1.33 GB | ✓ |
| Sailor-1.8B.Q6_K.gguf | 1.58 GB | ✓ |
| Sailor-1.8B.Q8_0.gguf | 1.96 GB | ✓ |

All files are flagged "Safe" by the Hub's file scan, and the `.gguf` model files are stored via Git LFS. Every file was uploaded about 8 months ago with the commit message "uploaded model" (or "uploaded readme" for README.md).
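To pick one quantization from this listing and run it locally, a minimal sketch is shown below. It assumes the `huggingface_hub` and `llama-cpp-python` packages are installed; the repo id and filename come from the table above, while the context size, prompt, and token limit are illustrative choices only, not settings recommended by the repository.

```python
# Minimal sketch (assumes `pip install huggingface_hub llama-cpp-python`).
# Repo id and filename are taken from the file listing above; everything
# else (n_ctx, prompt, max_tokens) is an illustrative assumption.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quantization, e.g. the ~1.22 GB Q4_K_M file, into the local HF cache.
model_path = hf_hub_download(
    repo_id="RichardErkhov/sail_-_Sailor-1.8B-gguf",
    filename="Sailor-1.8B.Q4_K_M.gguf",
)

# Load the GGUF file with the llama.cpp bindings.
llm = Llama(model_path=model_path, n_ctx=2048)

# The repo is tagged "conversational", so use the chat-completion helper.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Halo, apa kabar?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

Recent versions of `llama-cpp-python` also offer `Llama.from_pretrained(repo_id=..., filename=...)`, which combines the download and load steps; the two-step form above is shown only to make the Hub download explicit.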