Smaller version for Home User GPUs
#2 by apcameron - opened
Are you planning to release a smaller version of V3 that could run on a 24GB GPU?
a 70b model would be pretty good
that's something most home users can't run haha
maybe a 32B MoE model
+1
Would love to see DeepSeek-V3-Lite
A 16B or 27B version would be just wonderful to have
+1 here
16B would be fine, just like DeepSeek-V2 Lite
a 70b model would be pretty good
that's something most home users can't run haha
maybe a 32B MoE model
I was saying a 70B model made up of 500M to 1500M parameter parts/smaller models; that would be roughly 140 down to ~47 parts. It would only use around 235-350 GB of VRAM, but you could quantize it to FP4 and it would probably use about half that.
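As a rough sanity check on the quantization claim, here is a back-of-envelope sketch of weight memory at different precisions (this only counts the weights themselves, ignoring activations, KV cache, and framework overhead; the function name and the 1 GB = 1e9 bytes convention are my own assumptions, not from the thread):

```python
# Back-of-envelope: VRAM needed just to hold a model's weights,
# ignoring activations, KV cache, and runtime overhead.

def weight_vram_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight memory in GB (using 1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# Halving the bits per parameter halves the weight footprint,
# which is why FP4 uses about half the memory of FP8.
for bits, label in [(16, "fp16"), (8, "fp8"), (4, "fp4")]:
    print(f"70B @ {label}: ~{weight_vram_gb(70, bits):.0f} GB")
# 70B @ fp16: ~140 GB
# 70B @ fp8: ~70 GB
# 70B @ fp4: ~35 GB
```

By this estimate the weights of a dense 70B model at FP16 are around 140 GB, so a 24 GB card would need a much smaller model or aggressive quantization, which is consistent with the requests for a 16B-32B variant above.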