So… my settings for the knowledgebase have been arbitrarily changed (without my knowledge and without warning) and the maximum amount that can be pulled has been dramatically reduced! From rough calculations, 2,500 tokens (the new maximum) is around 10,000 characters, or roughly 6 to 7 pages — or, as Pickaxe describes it, a short essay! I currently need to ingest pace charts that are at least ten pages long before any meaningful knowledge comes in. As a result, my studio no longer produces reliable or correct responses.
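For what it's worth, the rough calculation above can be sketched with the common ~4-characters-per-token heuristic for English text (actual tokenization varies by model; the characters-per-page figure below is my own assumption, not Pickaxe's):

```python
# Back-of-envelope check of the new 2,500-token cap.
CHARS_PER_TOKEN = 4     # rough average for English prose (assumption)
CHARS_PER_PAGE = 1500   # assumed characters on a dense page

def tokens_to_chars(tokens: int) -> int:
    """Approximate character count for a given token budget."""
    return tokens * CHARS_PER_TOKEN

def tokens_to_pages(tokens: int) -> float:
    """Approximate page count for a given token budget."""
    return tokens_to_chars(tokens) / CHARS_PER_PAGE

if __name__ == "__main__":
    cap = 2500
    print(f"{cap} tokens ~ {tokens_to_chars(cap)} chars "
          f"~ {tokens_to_pages(cap):.1f} pages")
```

Under those assumptions, 2,500 tokens works out to about 10,000 characters, or under 7 pages — nowhere near a ten-page pace chart.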
Upon checking a test prompt in the backend, the retrieval hits the token limit before it ever reaches the pace charts, so they don't get used at all.
My studio was extremely reliable until this change… tuning the pull and the relevance took me WEEKS… that has now gone out the window, and today it told a user that their training should include six 400m reps at 36 seconds per rep. I'm sure you won't be surprised to learn that the world record here is 43.03 seconds — the training pace Pickaxe offered an U17W athlete was seven seconds faster than the world record!
This massively undermines my studio and raises the obvious question… what's the point in using Pickaxe if we can't pull sensible amounts of knowledge as context?
A secondary question here is why aren’t you announcing or forewarning these critical changes?
Notes: I appreciate the maximum here was always influenced by the context-window limit of the GPT being used… I use OpenAI o3-mini and so should have around 190k tokens to play with.