Bug Report
Models lose track of the context and reset
Description
When documents are attached to a model, it no longer retains conversation context well; sometimes after as little as one message it seems to reset and treats the next message as though it were the very first. This happens within roughly 20-30 messages no matter how the settings are tweaked.
Bug Summary:
Attaching documents to a model causes it to lose the thread of the conversation.
Steps to Reproduce:
Upload several documents to Open WebUI, attach them directly to a model, then simply talk to the model.
Expected Behavior:
The attached documents add knowledge, so the model gives more informed responses while maintaining response quality and context.
Actual Behavior:
Attached documents often noticeably degrade response quality, and within 1-30 messages the model seems to reset, treating the next message as the very first, with no way to get it to take earlier messages into account again.
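For what it's worth, this behavior is consistent with a fixed context window simply evicting the earliest chat messages once the attached document chunks plus the conversation history exceed the window size. Here is a minimal sketch of that suspected mechanism; the token counts, function name, and trim rule are illustrative assumptions, not Open WebUI's actual code:

```python
# Illustrative sketch of the suspected failure mode: a fixed-size context
# window reserves space for the attached documents (similar to num_keep)
# and fills the remainder with the most recent chat history. Once the
# documents consume most of the window, earlier messages fall out entirely,
# which would look exactly like the model "resetting".

def build_context(doc_tokens, history, num_ctx=2048, num_keep=1024):
    """Return the indices of history messages that still fit in the window.

    doc_tokens: tokens reserved for attached documents (capped at num_keep)
    history: list of per-message token counts, oldest first
    """
    budget = num_ctx - min(doc_tokens, num_keep)
    kept = []
    # Walk the history newest-first, keeping messages until the budget runs out.
    for i in range(len(history) - 1, -1, -1):
        if history[i] > budget:
            break
        budget -= history[i]
        kept.append(i)
    return sorted(kept)

# 20 messages of ~150 tokens each; documents reserve 1024 of a 2048 window.
history = [150] * 20
kept = build_context(1024, history)
print(kept)          # only the most recent handful of messages survive
print(0 in kept)     # the very first message has been evicted
```

Under these made-up numbers only the last six messages survive, and raising `num_ctx` or lowering the document reservation delays (but does not prevent) the eviction, matching what I observe when tweaking the settings.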
Environment
Open WebUI: 0.3.5
Ollama: 0.1.46
Operating System: Docker running on Ubuntu
Browser (if applicable): I have tested LibreWolf, IceCat, and Safari; Open WebUI behaves the same in all of them, so the problem is not browser-related.
Reproduction Details
Confirmation:
I have read and followed all the instructions provided in the README.md.
I am on the latest version of both Open WebUI and Ollama.
I have included the browser console logs.
I have included the Docker container logs.
Installation Method
I am using the CUDA version of Open WebUI in Docker. Everything runs on Proxmox: Docker with Open WebUI runs in an LXC container (essentially direct host performance), and Ollama runs on a Windows 10 virtual machine with a vGPU profile to make the most of the VRAM across many different applications.
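For reference, a compose snippet roughly matching this setup. The image tag and the Ollama URL are assumptions based on the description above (the `<ollama-vm-ip>` placeholder stands in for the Windows VM's address); adjust to your host:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:cuda
    ports:
      - "3000:8080"
    environment:
      # Ollama runs on a separate Windows 10 VM, not in this container
      - OLLAMA_BASE_URL=http://<ollama-vm-ip>:11434
    volumes:
      - open-webui:/app/backend/data
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
volumes:
  open-webui:
```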
Additional Information
It seems fairly easy to reproduce: I have uploaded many documents, chat messages show "all documents" under them 5-7 times, and within a few responses the model resets and treats the next message as the first interaction again.
I have tried increasing the num_keep variable and the context length, which helps a lot and delays the reset, but once the model resets and forgets everything there seems to be no going back. (So the easiest way to reproduce this is to set those variables very low.)
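If it helps anyone reproduce this, the two settings mentioned above can be pinned in an Ollama Modelfile (`num_ctx` is the context window size and `num_keep` is how many initial tokens are retained when the context overflows). The base model and values here are just examples:

```
# Example Modelfile -- base model and values are placeholders
FROM llama3
PARAMETER num_ctx 8192
PARAMETER num_keep 256
```

Then build it with `ollama create my-model -f Modelfile` and attach the documents to that model. Setting both parameters very low should make the reset happen within a handful of messages.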