User memory leaking

User memory for my bot is being applied to new users, so when they log in or register the system thinks they are me (for example, it uses the name I last used while testing the bot with my own user account).

EDIT: Should all bots in a studio have User memory? Would that help? I'm a little unsure how the user memory is handled…

Rgds
Terry


Are they in the same building (i.e. on the same IP address)?

Thanks for the reply. No this isn’t the case. Clearly something else is going on.

I found that because I used ice breakers (to save my time when testing and building), the bot was reinforced into treating that as the route most people should go down once it went live. So I found the training and memory to be an issue as well.

What was happening for you?

In my ice breaker I pre-selected a particular hotel to save me asking the question while I was building and testing. Then it assumed that customers always wanted to go to this hotel.

I’m not sure I follow completely. What was the issue? The same ice breaker was being automatically selected for all users? Or was it something to do with user memory?

No.
Because I used the same ice breaker over and over, one that named a specific hotel, the LLM seemed to learn that this was the hotel everyone would be asking about. So it would offer the answer to the ice breaker at the start, before anything was asked.

@admin_mike I have the same problem. I did a test with the user memory and now it is being applied to everyone else using the studio (different IP addresses, and on 5G rather than wifi).

I need help removing the memory as the pickaxe / studio becomes unusable otherwise.

Help please…

Pickaxe is a tech layer which will use the model you selected, so that is where this would need deleting, and I am not sure that is easy, if it is possible at all. Perhaps retraining it, or expressly asking it each time not to learn, may work, but it all comes down to the black boxes of the LLMs and the nature of AI.

In that case I need to transfer the ownership to another e-mail address and delete the current owner and hopefully the memory with it. @admin_mike can we do this?

If LLMs worked like one super large ‘pot’, with everyone’s data feeding the model from all over, then we shouldn’t be seeing this issue here. So there is some ‘local’ model or memory which would be tied to the API call and the user. Pickaxe will have an API key that it assigns to each model you select from the dropdown, so Claude will have a key and OpenAI will have a key. You could create your own key, select that, and see if that clears your issue. But the issue is likely to come back if the same kind of use is repeated.
I am supposing a lot here. But the API keys are how Pickaxe speaks to the LLM APIs like OpenAI’s.
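
If it helps, this is roughly what a call under your own key looks like, a minimal sketch assuming the standard OpenAI Python SDK (I don't know exactly what Pickaxe sends, and the model name and system prompt here are just placeholders). Each request only carries whatever you put in `messages`, so a fresh key on its own starts with a clean slate:

```python
# Minimal sketch of a direct Chat Completions call with your own OpenAI key.
# Nothing persists between calls unless you pass earlier messages back in.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # your own key from the OpenAI dashboard

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a hotel booking assistant."},
        {"role": "user", "content": "Which hotels do you cover?"},
    ],
)
print(response.choices[0].message.content)
```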

I built an autoblogger for WordPress using the API in a different project, so I had my own key already.