MetaMathOctopus-MAPO-DPO-7B-i1-GGUF / MetaMathOctopus-MAPO-DPO-7B.i1-Q3_K_S.gguf

Commit History

uploaded from rich1
08b9f8d
verified

mradermacher committed on