Context Length

#9 by PSM24 - opened

What is the context length?

It looks like 32k from the config.

InternLM org

1M, which is the same as InternLM2.5.

Thanks - does anything need changing in the config to allow it to use the extended context? Any other guidance? (I'm using TabbyAPI/exllamav2 but presumably anything that needs doing is applicable to others)

InternLM org
edited 1 day ago

Usage for long-context input is the same as for short context. You don't need to change any code: just input longer texts directly, and the model will extrapolate automatically.
If you're unsure about the input length, you can use `len(tokenizer.encode(input))` to get the number of tokens in the input text.
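A minimal sketch of that token-count check. The whitespace tokenizer here is an illustrative stand-in so the snippet runs anywhere; in practice you would load the model's own tokenizer (e.g. with `AutoTokenizer.from_pretrained(..., trust_remote_code=True)` from `transformers`) and call its `encode` method the same way.

```python
MAX_CONTEXT = 1_000_000  # 1M tokens, per the reply above

def token_count(tokenizer, text: str) -> int:
    # len(tokenizer.encode(input)) as described in the thread
    return len(tokenizer.encode(text))

class WhitespaceTokenizer:
    """Illustrative stand-in; a real model ships its own tokenizer."""
    def encode(self, text: str):
        return text.split()

tok = WhitespaceTokenizer()
text = "a long prompt " * 4
n = token_count(tok, text)
print(n, n <= MAX_CONTEXT)  # token count and whether it fits in context
```

The same check works with any tokenizer object that exposes `encode`, so it applies unchanged once you swap in the model's real tokenizer.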
