MetaMathOctopus-MAPO-DPO-7B-GGUF / MetaMathOctopus-MAPO-DPO-7B.Q3_K_M.gguf

Commit History

uploaded from rich1
0d8e767
verified

mradermacher committed on