Smaller Version Request #5
by oguzhanercan - opened
Hi, thanks for your great work. This is the best VL model I have ever tried, but it is almost impossible to deploy. Are you planning to distill it into smaller models, as DeepSeek did?
Completely agree! MiniMaxAI/MiniMax-VL-01 is hands down the best VL model right now, but deployment is a real challenge. A distilled version, like what DeepSeek did, could make it way more practical while keeping its incredible VL capabilities. If not distillation, at least a well-optimized quantized version could help. Hoping the team considers this!
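In the meantime, on-the-fly 4-bit quantization via `bitsandbytes` is worth a try for anyone with enough GPU memory. The sketch below is untested and assumes the model's custom code path is compatible with `transformers` quantized loading, which may not be the case for this architecture:

```python
# Untested sketch: load MiniMax-VL-01 in 4-bit with bitsandbytes.
# Assumes the model's remote code supports quantization_config;
# memory requirements are still large even at 4-bit.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",           # NormalFloat4, usually better than fp4
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "MiniMaxAI/MiniMax-VL-01",
    quantization_config=bnb_config,
    device_map="auto",        # shard across available GPUs
    trust_remote_code=True,   # the repo ships custom modeling code
)
```

Even if this loads, quality at 4-bit is unverified, which is why an official quantized or distilled release would be far preferable.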