
LM studio connection failed #3571

Closed
TheophileCAE opened this issue Jul 1, 2024 · 0 comments

TheophileCAE commented Jul 1, 2024

Bug Report

Description

Bug Summary:
When trying to retrieve models from LM Studio for inference, I get an "OpenAI: Network Problem" error in red at the top of the screen.

Steps to Reproduce:

  • I installed Open WebUI with Docker using the command: docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  • I started the Ollama server and the LM Studio server.
  • The Ollama server was connected automatically and worked.
  • In the admin settings I set the OpenAI-compatible base URL to the LM Studio server URL http://localhost:5001/v1 and the key to `lm-studio`.
  • I also tried no key and the value `none`.
  • I saved the settings and got the error (a reachability check from inside the container is sketched after this list).
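One thing worth checking here (a sketch, not claimed as the cause): inside the container, localhost refers to the container itself, not the Windows host where LM Studio is listening, while Docker Desktop exposes the host as host.docker.internal. Assuming the container name open-webui from the command above, reachability can be tested from inside the container like this:

# Probe the LM Studio endpoint from inside the Open WebUI container.
# python3 is used because the image runs a Python backend; curl may not be installed.
docker exec open-webui python3 -c "import urllib.request; print(urllib.request.urlopen('http://host.docker.internal:5001/v1/models', timeout=5).read())"

If this prints the model list while http://localhost:5001/v1 is unreachable from inside the container, the URL configured in the admin settings would need to point at the host rather than at localhost.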

Expected Behavior:
The list of models downloaded in LM Studio should appear, and a connection request should show up in the LM Studio server log.

Actual Behavior:
[Screenshot, 2024-07-01: "OpenAI: Network Problem" error banner]

Environment

  • Open WebUI Version: v0.3.6

  • Ollama (if applicable): 0.1.44

  • Operating System: Windows 11 Pro

  • Browser (if applicable): Firefox 127.0.2 (64-bit), Opera

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Browser console log screenshots]

Docker Container Logs:
[Docker container log screenshot]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama

I also tried installing the OpenAI-only Docker image and got the same result:
docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
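For completeness, a variant of this command that wires the OpenAI-compatible backend to LM Studio at container start (a sketch, assuming Open WebUI's OPENAI_API_BASE_URL and OPENAI_API_KEY environment variables and Docker Desktop's host.docker.internal alias for the Windows host):

# Point Open WebUI at the LM Studio server running on the Docker host.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:5001/v1 \
  -e OPENAI_API_KEY=lm-studio \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main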

Additional Information

I tried a GET request in Postman to make sure the issue wasn't on the LM Studio server side, and it worked:
[Postman screenshot showing a successful response]
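For reference, a curl equivalent of that Postman check from the host (assuming the request targets the standard OpenAI-compatible /v1/models route that the LM Studio local server exposes):

# List the models LM Studio is serving. The Bearer header is typically not
# required by LM Studio's local server, but it matches the key used above.
curl http://localhost:5001/v1/models -H "Authorization: Bearer lm-studio"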

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@open-webui open-webui locked and limited conversation to collaborators Jul 1, 2024
@tjbck tjbck converted this issue into discussion #3573 Jul 1, 2024

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
