8
netikras
21h

Heads-up: ChatGPT is leaking context between chats. Each chat is NOT carried out in an isolated, pristine context; it can refer to your previous conversations in a new one.

e.g.: https://chatgpt.com/share/e/...

I did NOT mention anything about any project, client, fintech, or certification in this conversation. However, I mentioned all of those things in a previous conversation.

Contexts leaked into each other. IDK if my newer conversation will have any effect on answers to follow-up questions in older conversations, though.

No, learning from my conversations is NOT enabled.

But if ChatGPT still remembers our conversations despite my explicit preference NOT to do that, it makes me wonder what else they are collecting/doing behind the scenes...

Comments
  • 2
    The link says this: "Conversation inaccessible or not found. You may need to switch accounts or request access if this conversation exists."

    See, I've been screaming AI is a shit technology that got shoved down our throats since day 1. Now you're waking up.

    Good. Let the hate flow through you.
  • 1
    @SidTheITGuy are you logged in to chatgpt?
  • 1
    It's totally possible that they are doing it on purpose and just failed to hide it properly.

    I'm sure the official explanation will be "oops, that is not intended and will be fixed", as always.
  • 1
    screenshot (reupload).

    That's the beginning of my conversation. Nothing happened in this context before that.
  • 2
    @Lensflare turns out it's a 'memory' feature of chatgpt.
  • 2
    Do you have this turned on?
  • 1
    Off the cuff reaction:

    I'm thinking that it's so easy to test whether ChatGPT has memory of past questions that it's hard to imagine this is simply a mistake - and it's also unlikely to be a blatant breach of their promise.

    I imagine that most likely the user has misunderstood the ChatGPT settings or what they promise in their terms.

    The user may THINK they are using a mode with zero memory - but if you actually read up on the terms it will state (either in plain text or in some intentionally confusing wording) that this is not the case.

    I may be wrong. But memory of past convos is one of the first things everyone tests when using ChatGPT, so it's unlikely such a glaring issue could exist. (A sketch of that kind of test is at the end of the thread.)
  • 1
    Wow, I just wanted to show that if you create your own custom GPT (a paid ChatGPT feature, very cool), go to Configure, scroll all the way down, and expand the collapsed section, there is a checkbox enabled by default saying 'Use my blabla for research purposes' or something that implies that. I see it's gone now. I wonder: does that mean we're automatically opted in now, or do they no longer train on data from custom GPTs?

    Learning from custom GPTs is quite bad, because you use them for specific personal stuff - or, if you're me, just data about all of you guys :P
  • 1
    @SidTheITGuy not my AI, my AI explains step by step how to create bombs and certain substances. It's a good AI :)
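
For reference, here is a minimal sketch of the kind of isolation test discussed in the thread, written against the OpenAI API rather than the ChatGPT app. It assumes the official `openai` Python package, an `OPENAI_API_KEY` in the environment, and a placeholder model name. The API is stateless unless you resend prior messages yourself, so it does not touch the ChatGPT product's account-level memory feature - it only shows the general shape of a plant-a-fact, ask-in-a-fresh-context check.

```python
# Minimal sketch of a conversation-isolation check.
# Assumptions: the official `openai` Python package, an OPENAI_API_KEY in the
# environment, and a placeholder model name. The API is stateless unless you
# resend history yourself, so this does NOT exercise the ChatGPT app's
# account-level memory; it only illustrates the shape of the test.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask(prompt: str) -> str:
    """Send a single-message 'conversation' with no shared history."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


# "Conversation" 1: plant a fact.
print(ask("My project codename is BLUEFIN. Please just acknowledge it."))

# "Conversation" 2: a fresh request; nothing is carried over by this script.
# With no memory layer in play, the model has no way to know the codename.
print(ask("What is my project codename?"))
```

In the ChatGPT web app itself, the equivalent check is simply opening a brand-new chat and asking about something you only mentioned in an older one, which is what the original post describes.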