ChatGPT sometimes generates answers from previous context?
It often happens to me that when I use ChatGPT, it writes replies to things from earlier in the conversation.
For example: I asked it which furniture is important, and it gave me an answer. Twenty or thirty messages about other topics later, when I asked "Are hats popular?", I got an answer about important furniture.
The longer the chat, the more confused ChatGPT becomes. Over time it focuses only on the most recent messages and forgets what came at the beginning. If you point this out, or the available space shrinks even further, it loses the thread entirely.
On the one hand, this depends on the free memory on your computer, so competing memory hogs like Firefox make it worse. On the other hand, the AI has to chew through the whole conversation again and again, and the number of tokens is limited. It also seems that the history is constantly being compressed.
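A toy sketch of how such a rolling context window behaves (the function names and the rough 4-characters-per-token estimate are illustrative assumptions, not OpenAI's actual implementation):

```python
# Toy model of a rolling context window: when the token budget is
# exceeded, the oldest messages are dropped first, so the model
# "forgets" the beginning of the conversation.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token (an assumption)."""
    return max(1, len(text) // 4)

def trim_context(messages: list[str], budget: int) -> list[str]:
    """Keep only the most recent messages that fit within the token budget."""
    kept = []
    used = 0
    for msg in reversed(messages):      # walk from newest to oldest
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break                       # everything older is forgotten
        kept.append(msg)
        used += cost
    return list(reversed(kept))         # restore chronological order

chat = [
    "Which furniture is important?",    # the oldest message...
    "A sturdy table and a good chair.",
    "Tell me about curtains.",
    "Are hats popular?",
]
# With a tiny budget, only the newest messages survive:
print(trim_context(chat, budget=10))    # the furniture question has fallen out
```

This is why pointing the model back at the beginning of a long chat often fails: those messages may simply no longer be inside the window it reads.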
As a precaution, copy the memory contents somewhere safe. Then you can delete the memory and enter a corrected, compressed version.
The forgetting of the beginning is something I noticed a long time ago. If you ask ChatGPT what it still remembers from the beginning, you get something that doesn't make sense, phrased as if it still knew everything. That really leads to confusion, because you can't fill the gaps again when you don't know what ChatGPT actually still remembers from the beginning.
Yeah, that's the compression I mentioned above. Your conversation gets squeezed into a pigeonhole.
Create a new chat for every new conversation in ChatGPT; that solves the problem.
If that's not enough, go to Settings > Personalization > Delete memories. That should solve the problem for good 🙂
Does ChatGPT ignore the previous chat history, or does it also include the complete chat history in its responses? Or just the memories?
The chats are separate from each other. The memories are only used when it seems relevant, but you can point ChatGPT to them at the beginning.
Within the same chat, nothing is deliberately forgotten; it comes down to tokens. ChatGPT "thinks" in its own representation, so you can't count in letters. That representation is not English but its own, one without the ambiguities that would otherwise have to be resolved from context again and again (e.g., which meaning of the German word "Kiefer", jaw or pine, is meant this time?). It tries to remember things by generalizing them so they fit. On top of that, answers often feel more tailored than they are, because ChatGPT works with the tricks of a fortune teller.
I meant whether ChatGPT bases every answer on everything in the same chat. I once read something about a maximum of 8,000 characters, and that it then forgets things once that limit is exceeded in a chat. Though apparently 1 letter is not exactly 1 character there.
Yes, ChatGPT has a memory that it constantly updates for you.