
Enh: call outlet hook from the backend #3237

Open
frederikschubert opened this issue Jun 17, 2024 · 1 comment


@frederikschubert

Is your feature request related to a problem? Please describe.
We are using Open WebUI as a general solution to manage access to LLMs and RAG applications in our company. Besides the Open WebUI web application, we are using the Continue plugin with the following configuration:

"models": [
    {
      "model": "gpt-4o",
      "title": "GPT-4o",
      "apiKey": "sk-...,
      "completionOptions": {},
      "apiBase": "https://openwebui/openai",
      "provider": "openai",
      "requestOptions": {
        "headers": {
            "Content-Type": "application/json"
        }
      }
    },
...

Additionally, we are using pipelines such as the Langfuse filter to track usage. When requests come in via the Continue plugin, only the inlet hook is called. This leaves the response text empty in the trace, since it is only set in the outlet filter.
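For reference, a filter pipeline exposes both hooks roughly as in the minimal sketch below (the names and signatures follow the Pipelines filter examples; the class and valve details here are illustrative, not the Langfuse filter itself):

```python
from typing import List, Optional
from pydantic import BaseModel


class Pipeline:
    class Valves(BaseModel):
        # Which model pipelines this filter applies to ("*" = all).
        pipelines: List[str] = ["*"]
        priority: int = 0

    def __init__(self):
        self.type = "filter"
        self.name = "Usage Tracking Filter"
        self.valves = self.Valves()

    async def inlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Runs before the request is forwarded to the model.
        # This is the only hook that currently fires for proxied API requests.
        print(f"inlet: model={body.get('model')}")
        return body

    async def outlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Runs after the model response is available; this is where the
        # response text would be recorded, but it is not invoked for
        # requests coming through the API proxy endpoints.
        print(f"outlet: {len(body.get('messages', []))} messages")
        return body
```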

Describe the solution you'd like
Both the inlet and the outlet filter of the registered pipelines should be called when Open WebUI is used as an API proxy.
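For illustration, a request like the following (using the apiBase from the Continue config above; the exact endpoint path and key handling are assumptions) currently triggers only the inlet of the registered filters, while the expectation is that the outlet would also run on the returned completion:

```python
from openai import OpenAI

# Open WebUI's OpenAI-compatible proxy endpoint, the same base URL as in
# the Continue config above; the API key is an Open WebUI API key.
client = OpenAI(base_url="https://openwebui/openai", api_key="sk-...")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello via the API proxy"}],
)

# With the requested change, registered filter pipelines would see this
# response in their outlet hook, just like chats started from the web UI.
print(response.choices[0].message.content)
```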

Describe alternatives you've considered
The Langfuse filter could be integrated into a general LiteLLM setup instead, but I think that applying the inlet and outlet hooks of pipelines uniformly to all chat interactions is a useful feature in general.

tjbck changed the title from "Enable Pipelines for OpenAI API Proxy" to "Enh: call outlet hook from the backend" on Jun 17, 2024
@Ronan035

Ronan035 commented Sep 30, 2024

Same issue with version 0.3.30 and the current Open WebUI Pipelines Docker image.
I use Continue with Open WebUI as a proxy to an external Ollama instance. The outlet hooks are applied for chats in the Open WebUI interface, but not for calls made by Continue through the /ollama/v1 endpoint; only the inlets are taken into account.
(Open WebUI is a fantastic tool! Huge huge thanks for your work!)
