
AIOHTTP_CLIENT_TIMEOUT does not work #3258

Closed
3 of 4 tasks
uninstall-your-browser opened this issue Jun 18, 2024 · 12 comments

uninstall-your-browser commented Jun 18, 2024

Bug Report

Setting AIOHTTP_CLIENT_TIMEOUT has no effect, so longer responses from a larger model can never complete with this software on my GPU.

Description

I am running Ollama on the host and Open WebUI from a Docker Compose file, where I set AIOHTTP_CLIENT_TIMEOUT like this:

    environment:
      - 'OLLAMA_BASE_URL=http://localhost:11434'
      - 'AIOHTTP_CLIENT_TIMEOUT=1'

Bug Summary:
Setting AIOHTTP_CLIENT_TIMEOUT has no effect. Setting it to 1 does not make requests fail after 1 second, and setting it to 99999999 still times out after 5 minutes.
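
For context, my understanding is that this variable is supposed to bound aiohttp's total request time. A minimal sketch of that wiring (my guess, not Open WebUI's actual code; the URL and payload below are placeholders):

import asyncio
import os

import aiohttp

# Guess at the intended wiring (not Open WebUI's actual code): the env var
# feeds aiohttp's *total* timeout, which covers the whole streamed response.
AIOHTTP_CLIENT_TIMEOUT = int(os.environ.get("AIOHTTP_CLIENT_TIMEOUT", "300"))

async def chat(url: str, payload: dict) -> None:
    timeout = aiohttp.ClientTimeout(total=AIOHTTP_CLIENT_TIMEOUT)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        async with session.post(url, json=payload) as resp:
            async for line in resp.content:  # the timeout can fire here, mid-stream
                print(line.decode(), end="")

# asyncio.run(chat("http://localhost:11434/api/chat",
#                  {"model": "llama3", "messages": [{"role": "user", "content": "hi"}]}))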

Steps to Reproduce:

  1. Set AIOHTTP_CLIENT_TIMEOUT to anything
  2. Any request to Ollama still times out after 5 minutes

Expected Behavior:
Requests time out after the number of seconds specified by AIOHTTP_CLIENT_TIMEOUT.

Actual Behavior:
AIOHTTP_CLIENT_TIMEOUT is ignored and requests always time out after 5 minutes.

Environment

  • Open WebUI Version: 0.3.5

  • Ollama (if applicable): 0.1.44

  • Operating System: Linux

  • Browser (if applicable): Firefox

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:

open-webui  | INFO:     127.0.0.1:41340 - "POST /ollama/api/chat HTTP/1.1" 200 OK
open-webui  | ERROR:    Exception in ASGI application
open-webui  | Traceback (most recent call last):
open-webui  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
open-webui  |     result = await app(  # type: ignore[func-returns-value]
open-webui  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
open-webui  |     return await self.app(scope, receive, send)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
open-webui  |     await super().__call__(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
open-webui  |     await self.middleware_stack(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
open-webui  |     await self.app(scope, receive, _send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
open-webui  |     with collapse_excgroups():
open-webui  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
open-webui  |     self.gen.throw(typ, value, traceback)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
open-webui  |     await func()
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 217, in stream_response
open-webui  |     return await super().stream_response(send)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
open-webui  |     async for chunk in self.body_iterator:
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 181, in body_stream
open-webui  |     raise app_exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
open-webui  |     await self.app(scope, receive_or_disconnect, send_no_error)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
open-webui  |     with collapse_excgroups():
open-webui  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
open-webui  |     self.gen.throw(typ, value, traceback)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
open-webui  |     await func()
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 217, in stream_response
open-webui  |     return await super().stream_response(send)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
open-webui  |     async for chunk in self.body_iterator:
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 181, in body_stream
open-webui  |     raise app_exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
open-webui  |     await self.app(scope, receive_or_disconnect, send_no_error)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
open-webui  |     await self.simple_response(scope, receive, send, request_headers=headers)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 148, in simple_response
open-webui  |     await self.app(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
open-webui  |     with collapse_excgroups():
open-webui  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
open-webui  |     self.gen.throw(typ, value, traceback)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
open-webui  |     await func()
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 217, in stream_response
open-webui  |     return await super().stream_response(send)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
open-webui  |     async for chunk in self.body_iterator:
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 181, in body_stream
open-webui  |     raise app_exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
open-webui  |     await self.app(scope, receive_or_disconnect, send_no_error)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
open-webui  |     with collapse_excgroups():
open-webui  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
open-webui  |     self.gen.throw(typ, value, traceback)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
open-webui  |     await func()
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 217, in stream_response
open-webui  |     return await super().stream_response(send)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
open-webui  |     async for chunk in self.body_iterator:
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 181, in body_stream
open-webui  |     raise app_exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
open-webui  |     await self.app(scope, receive_or_disconnect, send_no_error)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
open-webui  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
open-webui  |     await app(scope, receive, sender)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
open-webui  |     await self.middleware_stack(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
open-webui  |     await route.handle(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 485, in handle
open-webui  |     await self.app(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
open-webui  |     await super().__call__(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
open-webui  |     await self.middleware_stack(scope, receive, send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
open-webui  |     await self.app(scope, receive, _send)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
open-webui  |     with collapse_excgroups():
open-webui  |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
open-webui  |     self.gen.throw(typ, value, traceback)
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
open-webui  |     raise exc
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
open-webui  |     await func()
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
open-webui  |     async for chunk in self.body_iterator:
open-webui  |   File "/usr/local/lib/python3.11/site-packages/aiohttp/streams.py", line 50, in __anext__
open-webui  |     rv = await self.read_func()
open-webui  |          ^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/aiohttp/streams.py", line 317, in readline
open-webui  |     return await self.readuntil()
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/aiohttp/streams.py", line 351, in readuntil
open-webui  |     await self._wait("readuntil")
open-webui  |   File "/usr/local/lib/python3.11/site-packages/aiohttp/streams.py", line 311, in _wait
open-webui  |     with self._timer:
open-webui  |   File "/usr/local/lib/python3.11/site-packages/aiohttp/helpers.py", line 735, in __exit__
open-webui  |     raise asyncio.TimeoutError from None
open-webui  | TimeoutError
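
The traceback bottoms out in aiohttp's stream reader (streams.py -> helpers.py raising asyncio.TimeoutError), i.e. the client's total timeout fires mid-stream even while data is still arriving. For reference, a standalone script that reproduces the same terminal frames (illustrative only; httpbin.org/drip is just a convenient slow-streaming test endpoint, unrelated to Open WebUI):

import asyncio

import aiohttp

async def main() -> None:
    # A 1-second total budget against a response that streams for ~5 seconds:
    # the total timeout keeps ticking while chunks arrive, so it fires mid-stream.
    timeout = aiohttp.ClientTimeout(total=1)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        async with session.get("https://httpbin.org/drip?duration=5&numbytes=5") as resp:
            async for chunk in resp.content:
                print(chunk)

asyncio.run(main())  # raises TimeoutError from the same aiohttp frames as above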

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

Docker compose with host ollama


uninstall-your-browser commented Jun 18, 2024

It would also be nice if the default were higher than 5 minutes (e.g. 20 minutes); please consider that some people do not have high-end hardware.

@TheTerrasque

Which docker image are you using? And did you redeploy the compose file after changing it? (docker compose up)


uninstall-your-browser commented Jun 18, 2024

Using ghcr.io/open-webui/open-webui:main
I did this:

docker compose down
docker compose pull
docker compose up


TheTerrasque commented Jun 18, 2024

Hmm.. using ghcr.io/open-webui/open-webui:v0.3.5 (which should be the same image) with AIOHTTP_CLIENT_TIMEOUT set, I can generate responses longer than 5 minutes on my system.

> It would also be nice if the default were higher than 5 minutes (e.g. 20 minutes)

I did put it at 15 minutes in my PR, but it was changed back to the default (5 min) before being accepted.

Edit: Could you post your docker-compose file?

@uninstall-your-browser

services:
  open-webui:
    build:
      context: .
      args:
        OLLAMA_BASE_URL: '/ollama'
      dockerfile: Dockerfile
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    environment:
      - 'OLLAMA_BASE_URL=http://localhost:${OLLAMA_PORT-11434}'
      - 'WEBUI_SECRET_KEY='
      - 'PORT=${WEBUI_PORT-8080}'
      - 'AIOHTTP_CLIENT_TIMEOUT=99999999'
    network_mode: host
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
volumes:
  ollama: {}
  open-webui: {}

@TheTerrasque

Hmm.. is the keep-alive still on the default setting?

If it is, can you try changing it? In Open WebUI, go to Settings -> General -> Advanced Settings -> Keep Alive and set it to something like 30m. I don't think this should kick in while it's generating, but it's worth checking.
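
For reference, that UI setting maps onto the keep_alive field of Ollama's API requests, which controls how long the model stays loaded after a request. A rough sketch of that kind of request (llama3 is a stand-in model name; Open WebUI builds the real request internally):

import json
import urllib.request

payload = {
    "model": "llama3",  # stand-in model name
    "messages": [{"role": "user", "content": "hello"}],
    "keep_alive": "30m",  # keep the model in memory for 30 minutes afterwards
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["message"]["content"])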

Also, do you know if Ollama spent more than 5 minutes loading the model? Did it start responding before it stopped?


uninstall-your-browser commented Jun 18, 2024

> Also, do you know if Ollama spent more than 5 minutes loading the model? Did it start responding before it stopped?

It starts generating before it stops

> If it is, can you try changing it? In Open WebUI, go to Settings -> General -> Advanced Settings -> Keep Alive and set it to something like 30m.

Does not fix it

@TheTerrasque

Since you have a build section in the docker compose file, is the image built from the local folder, or do you use the image from the registry? And are you using Ollama embedded in the image or a separate external install?


uninstall-your-browser commented Jun 18, 2024

> Docker compose with host ollama

Ollama is running outside of Docker.

I mostly copied from the example docker-compose file:
https://github.com/open-webui/open-webui/blob/main/docker-compose.yaml


TheTerrasque commented Jun 18, 2024

Could you try this compose file?

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:v0.3.5
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    environment:
      - 'OLLAMA_BASE_URL=http://localhost:${OLLAMA_PORT-11434}'
      - 'WEBUI_SECRET_KEY='
      - 'PORT=${WEBUI_PORT-8080}'
      - 'AIOHTTP_CLIENT_TIMEOUT=1200'
    network_mode: host
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
volumes:
  open-webui: {}

and if it still doesn't work, can you give the output of

docker compose exec open-webui /bin/bash -c "export"
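
If the variable shows up there but you want to confirm what the backend would actually parse, a throwaway check run inside the container would also work (hypothetical helper, not shipped with Open WebUI):

# throwaway_check.py -- hypothetical helper, not part of Open WebUI
import os

raw = os.environ.get("AIOHTTP_CLIENT_TIMEOUT")
print("AIOHTTP_CLIENT_TIMEOUT =", raw)
if raw is not None:
    print("would be parsed as", int(raw), "seconds")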

@uninstall-your-browser

Hmm, it seems to be working now


TheTerrasque commented Jun 18, 2024

Glad to hear! I wonder what the issue was.

I suspect it was either using an old image somehow, or the image had old code in it. Both of those seem weird, though: main should have the newest code, and pull should have fetched the newest image.

Edit: And since it's working, maybe you can close the bug report :)
