My bot is responding slowly. What is the fastest LLM I should be using for quicker responses?
We just switched our backend over from AWS to Google Cloud. Things are a little slow while we're dialing in the settings. Will be better in a day or two!
THANK you for the update!
If you’re curious, here is a generation latency chart Nathaniel posted showing average response time. We’re getting things back to normal, ~2 second response times.
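If anyone wants to sanity-check latency from their own region, here is a minimal Python sketch that times a single request. The endpoint URL and payload are placeholders, not the product's actual API, so swap in whatever endpoint you're calling.

```python
# Minimal sketch: time a single chat request from your own machine/region.
# ENDPOINT and payload are hypothetical placeholders, not the real API.
import time
import requests  # assumes the requests library is installed

ENDPOINT = "https://api.example.com/v1/chat"  # placeholder URL
payload = {"prompt": "Hello", "max_tokens": 50}

start = time.perf_counter()
response = requests.post(ENDPOINT, json=payload, timeout=60)
elapsed = time.perf_counter() - start

# Prints the HTTP status and round-trip latency in seconds.
print(f"HTTP {response.status_code}, latency: {elapsed:.2f}s")
```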
10-15 seconds is the norm for me (and makes any pickaxes I create completely useless).
What’s going on? Is this due to me being located in Europe? I’m beginning to lose hope.
Same here (user in France)… Any hope we can get a boost in performance?
Yes, we’re working on it!
Is there a timeline on this? I’ve paid for a year of Pro and am unable to properly launch or market any products due to the continued performance issues.
I really hope they fix this and make it fast. It’s such a good product but the speed kills it.
Yes, the redesigned version is much faster.
When will this be released?
It’s already being rolled out on a limited basis to select testers and will be released to everyone by the end of the month. I know the recent slowness of the main site has been frustrating. We’ve re-architected the entire system to be more performant.