Context length
#3
by adeebDkheel · opened
Hello,
meta-llama/Llama-3.1-8B-Instruct supports a 128K context, but I see 'meetkai/functionary-small-v3.2' is listed at 16K.
How can I run 'meetkai/functionary-small-v3.2' with a 128K context, and why isn't its length the same as the base model's?
Thanks,
Although we fine-tuned with a 16k context length, the original model was trained with a 128k context length, so you can still use the full 128k context.
If you are using vLLM, you can set: --max-model-len 131072
If you are using SGLang, you can set: --context-length 131072
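For reference, the flags above plug into each framework's standard server launch command. A minimal sketch, assuming vLLM's `vllm serve` entrypoint and SGLang's `sglang.launch_server` module (adjust host/port and other options as needed for your setup):

```shell
# vLLM: serve the model with a 128k context window
vllm serve meetkai/functionary-small-v3.2 \
    --max-model-len 131072

# SGLang: serve the same model with a 128k context window
python -m sglang.launch_server \
    --model-path meetkai/functionary-small-v3.2 \
    --context-length 131072
```

Note that a 128k context requires substantially more KV-cache memory than 16k, so you may need to lower these values (or use a larger GPU) if the server fails to allocate memory at startup.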
khaimai changed discussion status to closed