If you’ve converted a GPT into a Pickaxe so you can embed it on your website or monetize it directly, the most likely reason your AI responses are getting cut off is that the maximum output length is set too low. We let you customize all of these token lengths, and sometimes the defaults are not high enough for your use case.
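If you’re curious what’s happening under the hood, this is the same idea as the max output token parameter on the underlying model APIs: the model stops generating the moment it hits the cap, even mid-sentence. Here’s a rough sketch using the OpenAI Python SDK (this is just an illustration of the concept, not Pickaxe’s actual code; the model name and limit are placeholders):

```python
# Illustration only: a low max output token cap is what produces cut-off answers.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Write a detailed product description."}],
    max_tokens=100,  # a cap this low will truncate longer answers
)

choice = response.choices[0]
print(choice.message.content)
if choice.finish_reason == "length":
    # "length" means the model hit the max output token limit before finishing
    print("Response was truncated — raise the max output token limit.")
```

Raising the cap in your Pickaxe’s settings is the equivalent of passing a larger limit here.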
You can adjust this setting in the builder: go to the “Configure” tab and look for “Max Output Length” under the token length settings, then raise the value.
I’ve included a screenshot of the setting below. If you need more context, here is another help forum post on how to increase the maximum response length of AI models.