---
license: apache-2.0
---
I'm sorry, I don't have a real README yet. This model was trained on a subset of orca-best and on heavily cleaned (but extremely degenerate) reverse proxy data logs. It uses the Vicuna prompt format and is based on Mistral-7b. I don't want to waste anyone's time until I'm positive I have something that can blow everyone's mind.
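
Since the card mentions the Vicuna prompt format, here is a minimal sketch of how a single-turn prompt could be assembled, assuming the standard Vicuna v1.1 template; the exact system prompt used during fine-tuning is not documented here, so treat the wording below as an assumption.

```python
# Minimal sketch of the Vicuna v1.1 prompt template (assumed; the exact
# system prompt used during training is not stated in this card).

SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_vicuna_prompt(user_message: str, system: str = SYSTEM) -> str:
    """Format a single-turn prompt in Vicuna style: system text, then
    'USER: ...' followed by 'ASSISTANT:' where the model continues."""
    return f"{system} USER: {user_message} ASSISTANT:"

if __name__ == "__main__":
    print(build_vicuna_prompt("Summarize the Apache-2.0 license in one sentence."))
```

For multi-turn use, each prior exchange would simply be appended as further `USER:` / `ASSISTANT:` pairs before the final `ASSISTANT:` marker.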