
This issue was moved to a discussion. You can continue the conversation there.


Chat response freezes but resources still in use and locked up interface in GUI with error code #3062

Closed
Kingbadger3d opened this issue Jun 12, 2024 · 3 comments

Comments

@Kingbadger3d

Bug Report

Description

Bug Summary:
Frozen LLM responses in the chat, with an error printed in the CLI.

Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]

Expected Behavior:
The response should continue streaming to completion as normal.

Actual Behavior:
The response text stops after 4-5 lines, but CPU, GPU, and memory remain in full use, and the CLI emits the error included below.

Environment

  • Open WebUI Version: [e.g., 0.1.120]

  • Ollama (if applicable): [e.g., 0.1.30, 0.1.32-rc1]

  • Operating System: Windows 11

  • Browser: Chrome, Firefox

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.

Logs and Screenshots

Browser Console Logs:
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 435, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
with collapse_excgroups():
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
await func()
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
return await super().stream_response(send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
raise app_exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
await self.app(scope, receive_or_disconnect, send_no_error)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
with collapse_excgroups():
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
await func()
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
return await super().stream_response(send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
raise app_exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
await self.app(scope, receive_or_disconnect, send_no_error)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\cors.py", line 93, in __call__
await self.simple_response(scope, receive, send, request_headers=headers)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\cors.py", line 148, in simple_response
await self.app(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
with collapse_excgroups():
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
await func()
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
return await super().stream_response(send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
raise app_exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
await self.app(scope, receive_or_disconnect, send_no_error)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
with collapse_excgroups():
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
await func()
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
return await super().stream_response(send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
raise app_exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
await self.app(scope, receive_or_disconnect, send_no_error)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\routing.py", line 756, in __call__
await self.middleware_stack(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\routing.py", line 776, in app
await route.handle(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\routing.py", line 485, in handle
await self.app(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
with collapse_excgroups():
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
raise exc
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
await func()
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 50, in __anext__
rv = await self.read_func()
^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 317, in readline
return await self.readuntil()
^^^^^^^^^^^^^^^^^^^^^^
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 351, in readuntil
await self._wait("readuntil")
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 311, in _wait
with self._timer:
File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\helpers.py", line 735, in __exit__
raise asyncio.TimeoutError from None
TimeoutError

Installation Method

Pip on Windows.

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@gibru

gibru commented Jun 12, 2024

I didn't check the CLI output myself, but I also experience frozen chats (reported here). I've switched to working directly with the Ollama CLI for the time being.

@Kingbadger3d
Author

Yeah, I've now seen a few threads about what look like similar issues. It's definitely not Ollama: I also have an old version (1.27 or something) installed alongside the latest Ollama and it works fine; only the new builds have the problem.

@silentoplayz
Collaborator

silentoplayz commented Jun 12, 2024

Environment

Open WebUI Version: [e.g., 0.1.120]
Ollama (if applicable): [e.g., 0.1.30, 0.1.32-rc1]
Operating System: [Windows 11]
Chrome, Firefox

It appears that both your Open WebUI and Ollama installations are outdated. This may be causing the issue at hand. Please update them accordingly to their latest versions from their respective GitHub release pages.

Open WebUI - v0.3.3 (latest)
Ollama - v0.1.43 (latest)

@open-webui open-webui locked and limited conversation to collaborators Jun 12, 2024
@tjbck tjbck converted this issue into discussion #3080 Jun 12, 2024

