FP8 LLMs for vLLM (collection): accurate FP8 quantized models by Neural Magic, ready for use with vLLM. 44 items. Updated Oct 17, 2024.

notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-10-v0.1: Text Generation. Updated Apr 11, 2024.

notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30: Text Generation. Updated Mar 14, 2024.