I’m noticing that more and more of the platforms we use to build our GPTs are giving us access to review user chat logs! It’s only the smaller players; OpenAI isn’t doing this for a reason. It’s a violation of global privacy laws.
My question is: who is held liable for this? I don’t care to see the user chat logs! But if I build the GPT, am I liable for breaking the law, or is the developer of the platform we’re building on liable?
This is an important question that needs to be addressed, because if liability falls on the owner of the GPT, then I don’t want to build my GPTs on any platform that could cause me legal issues.
Does anyone know? Do we have any lawyers in here?
I’m thinking it may fall back on the person developing the GPT; otherwise, why would the creators of these platforms build in such a feature?
I think I read that here at Pickaxe we can view user logs. Is that true? I need to know, because this is a serious concern: I am building a medical record reviewer GPT that a client asked for, and no one should be able to see those chats!
Access to user chat logs should be granted by the user, and users have to be told that the developer can read everything they chat about with the GPT so that they can opt in or out. But having this happen in the background, without users knowing they are being monitored, is a disaster waiting to happen, especially with sensitive information. And it’s only the small players doing this, which should be a big red flag.
Who gets sued? That IS the question!
“Global privacy laws” is vague. I too would like to see an option to opt out of access to my customers’ data, but it should be an option, because for other tools I’ve created this information is essential.
The larger question seems to be how to ensure the end user is aware of whatever privacy does or doesn’t exist between them, the chatbot, and the developers of said bot.
Hi @nataiverse
Happy to answer your question. Users of tools on our site are informed by the privacy policy and our terms (which they must agree to in order to use the service) that their chat logs can be monitored by tool builders.
Tool builders are likewise informed by the same privacy policy and terms that it is their obligation to inform users of this if tools are used in an embedded environment, or otherwise outside an environment that Pickaxe directly and completely controls and brands.
I would take a look at section 5 of the privacy policy and sections 6 and 7 of the terms of service.
Hope that helps! Not breaking any “international privacy laws” today 
Okay, thanks for the quick response. I will have to take a closer look at those. But we all know that people don’t read those policies in detail, so I won’t be able to use this platform for the medical records GPT I was building.
Is there a way for us, the developers, to opt out of seeing any user conversations? I do not want to read any user conversations. I don’t have the desire or time for it.
For good measure, I would prefer not to have access to their conversations with any GPT I build at all. Then I don’t have to worry about it, and neither does the user. It would be safer and better for everyone. This kind of monitoring gives AI a bad rap.
Please and thank you.
I am also considering building a tool with Pickaxe and do not want to have the ability to review any chat logs or chat history that users of my chats or forms would be inputting. I do not believe they would use the tool knowing its creator can review anything they discuss with the chatbot. Is there no opt-out option for this?
I’m with you on this. I don’t want to see the chats; that way there’s no liability for any privacy breaches, since we can’t see them. But as of now, there is no button to opt out. I also wish there was.
If it had just this one feature, Pickaxe would be everything I want. I’m struggling to find the right option as a no-code builder.
With our new designs, we will be adding a toggle that lets you turn viewing responses on or off on a studio-by-studio level.
Excellent! Thanks for the update
@daddish I understand your concern. I have a similar concern for one tool I’m working on.
To solve it, I decided to create a new company policy for my business that outlines the specific extenuating circumstances under which my team or I may open user chat records (most of them relating to local/federal law compliance).
This way:
- my team has a solid policy in place that clarifies the boundaries and addresses any ethical concerns they might have.
- potential privacy breaches are addressed up front.
- we can apply it to each similar app that does not (by design) require us to analyze user messaging history.
Once the policy is ready, I’ll be adding it to the privacy policy of the Pickaxe app.
Not a lawyer. Not legal advice
Any update on when we’ll be able to toggle this off?
Where is this toggle in the new design? Thanks!
Within any Studio, go to Settings. Then look for History. Here’s a screenshot.