Chat with Ollama behind a company http server not possible #3037

Closed · mlier opened this issue Jun 11, 2024 · 0 comments

mlier commented Jun 11, 2024

Bug Report

Description

Bug Summary:
I am on the company network trying to use Open WebUI.
The HTTP proxy is defined and in use.
Ollama works well.
Open WebUI is running and the UI works well: in particular, I can see my model list and choose a model.

My problem occurs when I want to chat.
No answer is ever written.
There is an error: Uncaught (in promise) ReferenceError: TextDecoderStream is not defined (Chat.svelte:698:18)

TextDecoderStream is not found.
Is there a problem with the installation of the JavaScript libraries, caused by the HTTP proxy server during the installation process?
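
For reference, TextDecoderStream is a built-in browser API, not a JavaScript library installed with Open WebUI, so a quick check is to run typeof TextDecoderStream in the browser console: it prints "undefined" when the browser itself does not provide it. The sketch below is a minimal fallback, assuming only that TextDecoder and TransformStream are available (both are in Firefox 102); it is an illustration, not code from Open WebUI:

if (typeof TextDecoderStream === 'undefined') {
    // Fallback sketch: wrap a TextDecoder in a TransformStream so that
    // binary chunks going in come out as decoded text chunks.
    globalThis.TextDecoderStream = class {
        constructor(label = 'utf-8', options = {}) {
            const decoder = new TextDecoder(label, options);
            const transform = new TransformStream({
                // Decode each chunk, carrying partial multi-byte sequences over.
                transform(chunk, controller) {
                    const text = decoder.decode(chunk, { stream: true });
                    if (text) controller.enqueue(text);
                },
                // Emit whatever the decoder still has buffered at end of stream.
                flush(controller) {
                    const text = decoder.decode();
                    if (text) controller.enqueue(text);
                },
            });
            this.readable = transform.readable;
            this.writable = transform.writable;
        }
    };
}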

The same Open WebUI (same version) works very well on a computer without a proxy server.

Environment

  • Open WebUI Version: v0.3.2
  • Ollama (if applicable): 0.1.38
  • Operating System: Ubuntu 20.04
  • Browser (if applicable): Firefox 102.15.1esr

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the backend logs.

Logs and Screenshots

Browser Console Logs:

Uncaught (in promise) ReferenceError: TextDecoderStream is not defined   Chat.svelte:698:18
    Wt Chat.svelte:698
    At Chat.svelte:497
    At Chat.svelte:441
    Nt Chat.svelte:362
    xe MessageInput.svelte:706
    St dom.js:361
    m MessageInput.svelte:987
    m MessageInput.svelte:445
    _t Component.js:44
    m Chat.svelte:1271
    m Chat.svelte:1204
    _t Component.js:44
    m Help.svelte:40
    _t Component.js:44
    m root.svelte:54
    m root.svelte:49
    m +layout.svelte:196
    p +layout.svelte:193
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    si +layout.svelte:179
    F utils.js:41
    _t Component.js:47
    ut scheduler.js:99
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    fr +layout.svelte:120
    F utils.js:41
    _t Component.js:47
    ut scheduler.js:99
    Ot Component.js:164
    he root.svelte:23
    Pe client.js:304
    ce client.js:1131
    re client.js:242
    goto client.js:1391
    Nt start.js:24
    <anonymous> (index):79
    promise callback* (index):78
[Chat.svelte:698:18](http://mycompany.fr:8080/src/lib/components/chat/Chat.svelte)
usage
Object { models: (1) […] }
[+layout.svelte:89:13](http://mycompany.fr:8080/src/routes/+layout.svelte)
usage
Object { models: (1) […] }
[+layout.svelte:89:13](http://mycompany.fr:8080/src/routes/+layout.svelte)
usage
Object { models: (1) […] }
[+layout.svelte:89:13](http://mycompany.fr:8080/src/routes/+layout.svelte)

Open WebUI Backend Console Logs:

INFO:     10.91.246.200:62691 - "POST /api/v1/chats/new HTTP/1.1" 200 OK
INFO:     10.91.246.200:62691 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO:apps.ollama.main:url: http://localhost:11434
{'model': 'mixtral:8x7b-instruct-v0.1-q5_K_M', 'messages': [{'role': 'user', 'content': 'coucou'}], 'options': {}}
INFO:     10.91.246.200:62691 - "POST /ollama/api/chat HTTP/1.1" 200 OK
Received "usage" event from 7reXE6ajIOwqyGLnAAAF: {'action': 'chat', 'model': 'mixtral:8x7b-instruct-v0.1-q5_K_M', 'chat_id': '2b70caca-94ae-4988-ab2d-797e4a667491'}
Models in use: ['mixtral:8x7b-instruct-v0.1-q5_K_M']
remove_after_timeout 7reXE6ajIOwqyGLnAAAF mixtral:8x7b-instruct-v0.1-q5_K_M
Received "usage" event from 7reXE6ajIOwqyGLnAAAF: {'action': 'chat', 'model': 'mixtral:8x7b-instruct-v0.1-q5_K_M', 'chat_id': '2b70caca-94ae-4988-ab2d-797e4a667491'}
Models in use: ['mixtral:8x7b-instruct-v0.1-q5_K_M']
remove_after_timeout 7reXE6ajIOwqyGLnAAAF mixtral:8x7b-instruct-v0.1-q5_K_M
Received "usage" event from 7reXE6ajIOwqyGLnAAAF: {'action': 'chat', 'model': 'mixtral:8x7b-instruct-v0.1-q5_K_M', 'chat_id': '2b70caca-94ae-4988-ab2d-797e4a667491'}
Models in use: ['mixtral:8x7b-instruct-v0.1-q5_K_M']
remove_after_timeout 7reXE6ajIOwqyGLnAAAF mixtral:8x7b-instruct-v0.1-q5_K_M
Received "usage" event from 7reXE6ajIOwqyGLnAAAF: {'action': 'chat', 'model': 'mixtral:8x7b-instruct-v0.1-q5_K_M', 'chat_id': '2b70caca-94ae-4988-ab2d-797e4a667491'}

(the same three lines repeat indefinitely)
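
Note that the backend log above shows POST /ollama/api/chat returning 200 OK, so the failure is entirely client-side: the exception is thrown while decoding the streamed response. The code at Chat.svelte:698 presumably follows the standard streaming-fetch pattern, roughly like this sketch (a reconstruction for illustration, not the actual source; payload is a hypothetical request body):

const res = await fetch('/ollama/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
});

// This is the step that throws "ReferenceError: TextDecoderStream is not
// defined" when the browser lacks the API.
const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();

while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // value is a decoded text chunk of the streamed chat answer
}

Also worth noting: TextDecoderStream only shipped in Firefox 105, so the Firefox 102.15.1esr listed under Environment would not define it regardless of the proxy configuration.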

Installation Method

pip install open-webui in a conda venv

@open-webui open-webui locked and limited conversation to collaborators Jun 11, 2024
@tjbck tjbck converted this issue into discussion #3039 Jun 11, 2024

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
