Recommended context length?
#6
by
Tester100
Just a simple question. llama.cpp shows n_ctx_train as 32768 — does this mean this model natively supports 32k? Or is it better to go with 8k, like with Kunoichi? Thanks!
8k. I updated the model at some point, but many of the quants and older downloads will still show the old number.
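For what it's worth, llama.cpp lets you cap the context at load time with `-c`/`--ctx-size`, regardless of what `n_ctx_train` in the GGUF metadata says. A minimal sketch (model filename is a placeholder):

```shell
# Run with an 8k context window instead of the value advertised in the metadata
./llama-cli -m model.gguf -c 8192 -p "Hello"

# llama-server accepts the same flag
./llama-server -m model.gguf -c 8192
```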