enh: save response before switching chat #2647

Open

0x7CFE opened this issue May 29, 2024 · 4 comments
Comments

0x7CFE commented May 29, 2024

Bug Report

Description

Bug Summary:
Response lost if chat is switched during generation

Steps to Reproduce:

  1. Ask something in the chat window
  2. LLM starts generating the response
  3. Switch to another topic by selecting one of the previous chats
  4. Return to the original topic
  5. Realize that you can't see what is going on: the CPU is busy, but nothing is shown in the UI, there is no ⏹️ (stop) button, and there is no way to abort the generation.
  6. Only after the LLM finishes generating is the whole reply shown, all at once.

Expected Behavior:
I expect the generation to continue in the background and its progress to be visible when I return to the chat where the question was asked. I should also have an option to abort the generation before it finishes.

Actual Behavior:
The reply being generated is invisible, and there is no way to abort it.
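
For illustration, here is a rough sketch of the kind of behavior I have in mind (hypothetical names and endpoint, not the actual Open WebUI code): keep each in-flight generation keyed by chat ID together with its AbortController, so switching chats neither hides the partial reply nor removes the ability to stop it.

```ts
// Sketch only: hypothetical store and endpoint, not the real Open WebUI code.
// Idea: key each in-flight generation by chat ID so navigating away does not
// orphan the stream, and keep its AbortController so ⏹️ still works later.

type PendingGeneration = {
  controller: AbortController; // lets the UI abort from any screen
  buffer: string;              // partial response accumulated so far
};

const pending = new Map<string, PendingGeneration>();

async function generate(chatId: string, prompt: string): Promise<void> {
  const controller = new AbortController();
  const entry: PendingGeneration = { controller, buffer: '' };
  pending.set(chatId, entry);

  // Hypothetical streaming endpoint; the real API may differ.
  const res = await fetch(`/api/chat/${chatId}/completions`, {
    method: 'POST',
    body: JSON.stringify({ prompt, stream: true }),
    signal: controller.signal,
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      entry.buffer += decoder.decode(value, { stream: true });
      // A UI subscribed to `pending` can re-render the partial reply for
      // chatId even after the user switches away and comes back.
    }
  } finally {
    pending.delete(chatId); // persist entry.buffer to the chat history here
  }
}

// Called by the stop button, regardless of which chat is currently open.
function abortGeneration(chatId: string): void {
  pending.get(chatId)?.controller.abort();
}
```

With something along these lines, returning to the chat only needs to re-render `pending.get(chatId)?.buffer`, and the stop button can call `abortGeneration(chatId)` from anywhere.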

Environment

  • Open WebUI Version: v0.1.123

  • Ollama (if applicable): 0.1.39

  • Operating System: Ubuntu 22.04

  • Browser (if applicable): Firefox 125.0

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

tjbck changed the title from "Response lost if chat is switched during generation" to "enh: save response before switching chat" on May 29, 2024
hershal commented Jun 3, 2024

I'm seeing the same issue. If I navigate away from the chat before the LLM finishes generating text (e.g., by clicking on another chat), then my latest input and the response are lost.

skobkin commented Jun 4, 2024

I'd say that it isn't a bug, but a good feature request.

0x7CFE commented Jun 5, 2024

Well, bug or not, it does not pass the principle-of-least-surprise test. The user ends up in a situation where the pending generation seems to be lost, the CPU is hot, and there is no button to abort the generation.

TFWol commented Jul 8, 2024

What should be done is to add an entry to the sidebar as soon as the New Chat button is pressed, as in the sketch below.
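
Roughly (hypothetical names, just to illustrate the idea): create and register the chat the moment the button is clicked, so it already appears in the sidebar and any pending generation has a stable ID to attach its streamed response to.

```ts
// Sketch with hypothetical names: persist the chat as soon as "New Chat"
// is pressed so it shows up in the sidebar right away.

interface ChatSummary { id: string; title: string }

const sidebar: ChatSummary[] = []; // stand-in for the real sidebar store

function onNewChatClick(): string {
  const chat: ChatSummary = { id: crypto.randomUUID(), title: 'New Chat' };
  sidebar.unshift(chat); // entry is visible before any message is sent
  return chat.id;        // later used to key the pending generation
}
```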
