Maximum context length

Trying to use a personal Pickaxe with only 2 knowledge base documents.

My question, which was relatively basic, was the following:

Here is a job offer:

“160 words”

Please help draft a short email of intention for the proposal.

The model then returns:

“This model’s maximum context length is 8192 tokens. However, your messages resulted in 11540 tokens (11021 in the messages, 519 in the functions). Please reduce the length of the messages or functions.”

I switched the model from GPT-4 to GPT-4o and it worked, but I’d like to understand why it wasn’t working with GPT-4, which I thought was the superior model.

Different models have different context window sizes. GPT-4 has roughly an 8,000-token context window, while GPT-4o has 128,000 tokens, which is much larger.
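A quick sketch of the arithmetic behind the error message above. The character-per-token ratio used here is only a rough rule of thumb (roughly 4 characters per token for English text); for exact counts you would use the model's actual tokenizer, e.g. OpenAI's tiktoken library.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # For precise counts, encode the text with the model's real
    # tokenizer (e.g. the tiktoken library) instead.
    return max(1, len(text) // 4)

# Context window sizes mentioned in the thread.
GPT4_CONTEXT = 8_192      # classic GPT-4
GPT4O_CONTEXT = 128_000   # GPT-4o

# Token count reported by the error message.
prompt_tokens = 11_540

print(prompt_tokens <= GPT4_CONTEXT)   # False: overflows GPT-4
print(prompt_tokens <= GPT4O_CONTEXT)  # True: fits easily in GPT-4o
```

This is why simply switching the model made the request work: the same 11,540-token prompt (messages plus function definitions) exceeds GPT-4's window but uses under a tenth of GPT-4o's.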


I have a similar problem where I am trying to update the knowledge base with an Excel sheet (around 1,000 rows and 50 columns). The tool seems to be not reading the whole sheet, only snippets from it. Does this have to do with the context size? Or do large Excel files just not get read well?

Tried across multiple LLMs - still very unreliable results.
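One thing that might help, assuming the knowledge base only retrieves snippets rather than the whole sheet: splitting the spreadsheet into smaller files so each piece fits comfortably in the context window. A minimal sketch of the row-chunking idea (the `chunk_rows` helper is hypothetical, not part of any tool):

```python
def chunk_rows(rows, chunk_size):
    """Split a list of rows into fixed-size chunks, so each chunk can
    be uploaded as a separate, smaller document."""
    return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

# A sheet of ~1,000 rows split into 100-row pieces gives 10 chunks.
rows = [f"row-{i}" for i in range(1000)]
chunks = chunk_rows(rows, 100)
print(len(chunks))     # 10
print(len(chunks[0]))  # 100
```

In practice you would export each chunk to its own CSV (for example with pandas) before uploading, but whether that fixes the snippet behaviour depends on how the tool indexes documents.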