Bug Summary:
If the OpenAI API endpoint is configured to point to a local OpenAI-like server, the model list is not populated if an API key is not configured.
Steps to Reproduce:
1. Start open-webui normally.
2. Configure the OpenAI API endpoint to point to the OpenAI-like server, using either OPENAI_API_BASE_URL or Admin Panel -> Connections -> OpenAI API.
3. Do not configure an API key.
Expected Behavior:
The models list should be populated whenever a request to the models endpoint can succeed, regardless of whether an API key is provided. At the very least, an error should be displayed if an API key is not configured.
Actual Behavior:
Fetching the OpenAI models list fails silently when no API key is configured: an authenticated request to /api/models returns an empty list, and the models are likewise unavailable in the UI.
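To make the suspected behavior concrete, here is a minimal Python sketch contrasting it with the expected behavior. This is not Open WebUI's actual code; all function names are hypothetical, and the assumption is that the backend skips the fetch entirely when no key is set rather than simply omitting the Authorization header:

```python
# Hypothetical sketch (NOT Open WebUI's actual code) of the suspected bug:
# the models fetch is skipped entirely when no API key is configured,
# instead of being sent without an Authorization header.

def fetch_models_buggy(base_url, api_key, http_get):
    # Suspected behavior: bail out before making any request at all,
    # producing a silently empty model list.
    if not api_key:
        return []
    headers = {"Authorization": f"Bearer {api_key}"}
    return http_get(f"{base_url}/models", headers)

def fetch_models_expected(base_url, api_key, http_get):
    # Expected behavior: always attempt the request; attach the
    # Authorization header only when a key is actually configured.
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    return http_get(f"{base_url}/models", headers)

# Stand-in for a local OpenAI-like server (e.g. LM Studio) whose models
# endpoint does not require authentication.
def fake_local_server(url, headers):
    return [{"id": "local-model"}]

print(fetch_models_buggy("http://localhost:1234/v1", "", fake_local_server))
# → []
print(fetch_models_expected("http://localhost:1234/v1", "", fake_local_server))
# → [{'id': 'local-model'}]
```

With a key configured, both variants behave identically; the difference only appears against unauthenticated local servers.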
Environment
Open WebUI Version: v0.3.5
Ollama (if applicable): (not applicable)
Operating System: Windows 10
Browser (if applicable): (not applicable)
Reproduction Details
Confirmation:
I have read and followed all the instructions provided in the README.md.
I am on the latest version of both Open WebUI and Ollama.
I have included the browser console logs.
I have included the Docker container logs.
Logs and Screenshots
Browser Console Logs:
(not applicable)
Docker Container Logs:
INFO:apps.openai.main:get_all_models() is printed in the console, indicating that an attempt is made to request the models list, but the request is never actually performed when the API key is missing (verified via the LM Studio logs). The request does succeed when the Check Connection button is pressed in the Admin Panel, but the resulting list of models is not retained.
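Taken together, the log line and the LM Studio logs suggest two separable symptoms: the background fetch is skipped when no key is set, and the Check Connection probe fetches the list successfully but does not retain it. A minimal sketch of the second symptom (ModelCache and every name here are hypothetical, not Open WebUI's code):

```python
# Hypothetical sketch of the "Check Connection" symptom: the connectivity
# probe fetches the model list successfully but only uses it as a health
# check, so the cached list the UI reads from stays empty.

class ModelCache:
    def __init__(self):
        self.models = []  # what the UI would display

    def check_connection(self, fetch):
        # The probe succeeds and receives a non-empty model list...
        models = fetch()
        # ...but the result is discarded rather than stored in
        # self.models, so the UI still shows no models.
        return len(models) > 0

cache = ModelCache()
ok = cache.check_connection(lambda: [{"id": "local-model"}])
print(ok, cache.models)  # → True []
```

If this matches the real code path, retaining the fetched list in the probe (or triggering a normal refresh after a successful probe) would make Check Connection a workaround for the missing-key case.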
Screenshots (if applicable):
(not applicable)
Installation Method
Manual installation (pip install)
Additional Information
(follow up if needed)
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!