bug: Resubmission for chat response freezes but resources still in use with a locked up UI #3135

Closed

silentoplayz opened this issue Jun 13, 2024 · Discussed in #3080 · 4 comments

silentoplayz (Collaborator) commented Jun 13, 2024

Discussed in #3080

Originally posted by Kingbadger3d June 12, 2024

Bug Report

Description

Bug Summary:
Frozen LLM responses with CLI error

Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]

Expected Behavior:
The response should continue as normal.

Actual Behavior:
After 4-5 lines of response text, the response stops, but CPU, GPU, and memory remain in use. The CLI reports the following error:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
[Long traceback provided]
TimeoutError

Environment

  • Open WebUI Version: v0.3.3 (latest)
  • Ollama (if applicable): v0.1.43 (latest). I've just done a git pull, and it's still doing the same.
  • Operating System: Windows 11
  • Browser: Chrome/Firefox

I should also add that I have an older Open WebUI 0.1.x install that works as expected with the very latest Ollama and does not show this issue.

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.

Logs and Screenshots

Terminal console Logs:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 435, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
    with collapse_excgroups():
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
    return await super().stream_response(send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
    raise app_exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
    with collapse_excgroups():
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
    return await super().stream_response(send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
    raise app_exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\cors.py", line 148, in simple_response
    await self.app(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
    with collapse_excgroups():
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
    return await super().stream_response(send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
    raise app_exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
    with collapse_excgroups():
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 217, in stream_response
    return await super().stream_response(send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 181, in body_stream
    raise app_exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\routing.py", line 485, in handle
    await self.app(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\middleware\base.py", line 189, in __call__
    with collapse_excgroups():
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\_utils.py", line 93, in collapse_excgroups
    raise exc
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 50, in __anext__
    rv = await self.read_func()
         ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 317, in readline
    return await self.readuntil()
           ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 351, in readuntil
    await self._wait("readuntil")
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\streams.py", line 311, in _wait
    with self._timer:
  File "D:\Git_AI\openwebui\Miniconda3\envs\openwebui\Lib\site-packages\aiohttp\helpers.py", line 735, in __exit__
    raise asyncio.TimeoutError from None
TimeoutError

Installation Method

pip on Windows 11

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

silentoplayz (Collaborator, Author) commented

Dev note from Llama 3 70B 🤖:
The error log indicates that an asyncio.TimeoutError was raised, which suggests a timeout occurred while waiting for a response or a read operation. This timeout is likely what causes the response text to stop after 4-5 lines while CPU, GPU, and memory usage remain high.

The error log points to the aiohttp library, specifically the readuntil method, which is used to read data from a stream until a delimiter is encountered. It seems that this method is timing out, causing the error to propagate up the call stack.
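
To illustrate the failure mode, here is a minimal sketch (not Open WebUI's actual code) of an aiohttp streaming read where a per-read (sock_read) timeout raises asyncio.TimeoutError between chunks, matching the readline/readuntil frames in the traceback above. The local Ollama URL, the model name, and the 5-second value are assumptions for the example only.

    import asyncio
    import aiohttp

    async def stream_chat() -> None:
        # sock_read=5 means each read from the socket must finish within 5 s;
        # a model that pauses longer than that between tokens trips it.
        timeout = aiohttp.ClientTimeout(total=None, sock_read=5)
        payload = {
            "model": "llama3",
            "stream": True,
            "messages": [{"role": "user", "content": "Hello"}],
        }
        async with aiohttp.ClientSession(timeout=timeout) as session:
            async with session.post("http://localhost:11434/api/chat", json=payload) as resp:
                try:
                    # resp.content is an aiohttp StreamReader; iterating it calls
                    # readline -> readuntil, the same frames seen in the traceback.
                    async for line in resp.content:
                        print(line.decode(), end="")
                except asyncio.TimeoutError:
                    # The same TimeoutError that propagates up through Starlette's
                    # streaming middleware in the log above.
                    print("\nread timed out between streamed chunks")

    asyncio.run(stream_chat())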

tjbck (Contributor) commented Jun 13, 2024

Related #3107

tjbck (Contributor) commented Jun 14, 2024

@Kingbadger3d should be fixed on latest dev, please let us know if the issue persists!
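
For context, a sketch of the kind of change that avoids this class of timeout (an assumption for illustration, not necessarily the actual fix on dev): create the upstream aiohttp session used for streaming without a total or per-read deadline, so long pauses between streamed chunks no longer raise asyncio.TimeoutError.

    import aiohttp

    def make_streaming_session() -> aiohttp.ClientSession:
        # Hypothetical helper, for illustration only.
        # total=None and sock_read=None remove the overall and per-read deadlines;
        # sock_connect is kept so an unreachable backend still fails fast.
        timeout = aiohttp.ClientTimeout(total=None, sock_connect=30, sock_read=None)
        return aiohttp.ClientSession(timeout=timeout)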

Kingbadger3d commented

@tjbck Cheers Bud. Keep up the great work.
