bug: Resubmission for chat response freezes but resources still in use with a locked up UI #3135
Comments
Dev note from Llama 3 70B 🤖: The error log points to the
Related #3107
@Kingbadger3d should be fixed on latest dev, please let us know if the issue persists!
@tjbck Cheers Bud. Keep up the great work.
Discussed in #3080
Originally posted by Kingbadger3d June 12, 2024
Bug Report
Description
Bug Summary:
Frozen LLM responses with CLI error
Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Expected Behavior:
The response should continue as normal.
Actual Behavior:
After 4-5 lines of response text, the response stops, but the CPU, GPU, and memory are still being utilized. The CLI displays the following error:
ERROR: Exception in ASGI application Traceback (most recent call last): [Long traceback provided] TimeoutError
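The pattern described above, where a few tokens render and then the stream freezes while the backend keeps consuming resources, is typical of an awaited stream read exceeding a timeout. The sketch below is illustrative only (it is not Open WebUI's actual code, and `slow_token_stream` is a hypothetical stand-in for a stalling backend); it shows how an `asyncio.TimeoutError` can surface mid-stream after partial output has already been delivered:

```python
import asyncio

async def slow_token_stream():
    # Hypothetical backend: yields a few tokens quickly, then stalls,
    # mimicking a model that stops responding mid-generation.
    for token in ["Hello", ",", " world"]:
        yield token
    await asyncio.sleep(10)  # backend hangs here while still "busy"
    yield "!"

async def read_with_timeout(timeout: float = 0.1):
    tokens = []

    async def consume():
        async for tok in slow_token_stream():
            tokens.append(tok)

    try:
        await asyncio.wait_for(consume(), timeout)
    except asyncio.TimeoutError:
        # The UI has already rendered the first tokens, then the
        # stream freezes and the server logs a TimeoutError.
        return tokens, True
    return tokens, False

tokens, timed_out = asyncio.run(read_with_timeout())
print(tokens, timed_out)  # ['Hello', ',', ' world'] True
```

In a real ASGI deployment the same pattern would appear as a traceback in the server log while the client-side chat appears frozen, which matches the behavior reported here.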
Environment
I should also add that I have older versions of Open WebUI (0.1.x) that work as expected with the very latest Ollama and do not exhibit this issue.
Reproduction Details
Confirmation:
Logs and Screenshots
Terminal console Logs:
Installation Method
pip
on Windows 11
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!