Pickaxe is slow?

Pickaxe is amazing. It really is.

However, I’m a hardcore ChatGPT user, and whenever I use anything made with Pickaxe I can’t get over how long it takes to produce a response by today’s AI standards.
Six months ago this wouldn’t have been a problem, but expectations have changed.

It doesn’t help that Pickaxe sometimes shows me the three dots for about 3 seconds before displaying the entire paragraph at once, while ChatGPT starts displaying the first word of its response about half a second in.

Will any future updates allow us to display the first word of a response as soon as it’s available?


We’re always working to improve core site performance, so you’ll keep seeing improvements in this department.

It’s hard to perfectly match OpenAI’s speed. They are a bit bigger than us 🙂

Thanks for responding.
You guys have built an excellent tool.

Looking forward to the speed improvements. Even just displaying words as they’re ready, instead of full paragraphs, would be an improvement, and some open-source tools already do this.

Keep up the good work.

And in the examples using OpenAI models, it makes sense that their own UI is going to be faster than the API plus whatever is done in a third-party UI on top of it.

Even if the architecture didn’t naturally lend itself to those results, it would be trivial for OpenAI to ensure ChatGPT was always faster than any GPT-4o API solution.

For the record, we do have text streaming, which makes answers appear character by character. Even if there’s a delay before the text starts streaming, it should always stream rather than appear in bulk.
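To illustrate what streaming looks like on the API side, here’s a minimal sketch (not Pickaxe’s actual code) using the OpenAI Python SDK: the response arrives as chunks, and each fragment can be rendered as soon as it lands instead of waiting for the whole answer.

```python
# Minimal sketch of token streaming with the OpenAI API (assumes the
# openai Python package v1+ and an OPENAI_API_KEY in the environment).
# This is not Pickaxe's implementation, just the general pattern.
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Explain token streaming in one paragraph."}],
    stream=True,  # ask the API to send partial chunks as they are generated
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        # Print each fragment immediately, the same way a chat UI would
        # append it to the message bubble instead of waiting for the end.
        print(delta, end="", flush=True)
print()
```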

But maybe you’re referring to Actions? If the bubble you’re talking about is the one in the screenshot below, that’s the loading state for Actions, which are calls to third-party APIs and software to fetch information.

[Screenshot: Actions loading state]

Which model do you use? I use GPT-4o mini and that thing flies!