Hi admins,
Some users are reporting the “unexpected token” error when they try to use the pickaxe.
Please help me fix it.
Thanks!
This may have been related to OpenAI going down yesterday. Please let me know if it continues to occur.
Hi Mike,
We still have users complaining about this issue over time.
Could you please help me? Thanks
Can you send me a link to the tool? I will look into it.
Yes. Here it is: https://beta.pickaxeproject.com/axe?id=PhD_Level_Scientific_Writing_Assistant_YO12C
Mike, did you have a chance to fix it?
Other users are complaining about the same issue.
Here we have a pickaxe example: https://beta.pickaxeproject.com/axe?id=PhD_Level_Scientific_Writing_Assistant_YO12C
I could not reproduce the error myself when I used the tool. But looking at your settings, I think it’s because you have the token limit at around 127,500 of the 128,000-token maximum, so you really maxed it out. That means there’s no room for error, and sometimes token counting gets funky.
I adjusted it and it should work now. Please let me know if you continue to get any errors.
Hi Mike,
Thanks. What is a safe token limit? I think I’ll need to adjust my other pickaxes too.
Thanks!
Our token counting system is good, but not perfect, as it’s hard to always tokenize accurately. So it is possible for us to count a request at 100 tokens, send it to OpenAI, and have them count it at 101 tokens. The reasons for this are quite complicated.
You’re using a model with a 128,000-token context window. It’s quite unlikely you really need all of that context. I would leave a couple thousand tokens as a buffer, just to be safe.
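To illustrate the idea, here is a minimal sketch of picking a safe token limit. The numbers and the rough character-based estimator are illustrative assumptions, not Pickaxe’s actual implementation; real tokenizers can count slightly differently from OpenAI’s, which is exactly why the buffer helps.

```python
# Illustrative sketch only: choosing a token limit with headroom,
# since two tokenizers may disagree by a few tokens on the same text.

CONTEXT_WINDOW = 128_000   # model's maximum context, in tokens
SAFETY_BUFFER = 2_000      # headroom for token-counting discrepancies


def safe_token_limit(context_window: int, buffer: int = SAFETY_BUFFER) -> int:
    """Return a limit that leaves room for off-by-a-few counting."""
    return context_window - buffer


def rough_token_estimate(text: str) -> int:
    """Very rough stand-in for a tokenizer: ~4 characters per token."""
    return max(1, len(text) // 4)


limit = safe_token_limit(CONTEXT_WINDOW)  # 126,000 with the defaults
prompt = "Some example prompt text..."
if rough_token_estimate(prompt) <= limit:
    print(f"OK to send (limit {limit} tokens)")
```

The point is simply that the configured limit should sit a couple thousand tokens below the model’s hard maximum, so a small counting mismatch never pushes a request over the edge.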