Max Tokens

#1
by rmontoya-agi - opened

Hi, I'm testing the 72B model, and one of my prompts triggers the following error:

This model's maximum context length is 4096 tokens. However, you requested 4208 tokens (208 in the messages, 4000 in the completion). Please reduce the length of the messages or completion.

Is there a way to increase the max length?
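The arithmetic behind the error: the prompt (208 tokens) plus the requested completion (4000 tokens) must fit inside the model's 4096-token context window. A minimal sketch of clamping the completion budget to whatever the prompt leaves free (the helper name and values are illustrative, taken from the error message above):

```python
def max_completion_tokens(context_limit: int, prompt_tokens: int) -> int:
    """Largest completion request that still fits in the context window."""
    return max(context_limit - prompt_tokens, 0)

# Values from the error message: 4096-token window, 208 tokens in the messages.
budget = max_completion_tokens(4096, 208)
print(budget)  # prints 3888
```

So requesting `max_tokens=3888` (or less) instead of 4000 would avoid the error; raising the window itself depends on the model's configured context length, not the request.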
