Enh: call outlet hook from the backend #3237
tjbck changed the title from "Enable Pipelines for OpenAI API Proxy" to "Enh: call outlet hook from the backend" on Jun 17, 2024
Same issue with version 0.3.30 and the current WebUI Pipelines docker image.
Is your feature request related to a problem? Please describe.
We are using Open WebUI as a general solution to manage access to LLMs and RAG applications in our company. Besides the Open WebUI web application, we are using the continue plugin with the following configuration:
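The original configuration was not captured in this report. For illustration only, a continue `config.json` pointing at an Open WebUI deployment might look roughly like this (the URL, model name, and API key are placeholders, and the exact `apiBase` path depends on your deployment):

```json
{
  "models": [
    {
      "title": "Open WebUI proxy",
      "provider": "openai",
      "model": "llama3",
      "apiBase": "https://open-webui.example.com/api",
      "apiKey": "sk-..."
    }
  ]
}
```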
Additionally, we are using pipelines such as the langfuse filter to track usage. When called via the continue plugin, only the `inlet` hook is invoked. This leaves the response text empty, as it is set in the `outlet` filter.

Describe the solution you'd like
The `inlet` as well as the `outlet` filter of the registered pipelines should be called when using Open WebUI as an API proxy.

Describe alternatives you've considered
The langfuse filter could be integrated into a general litellm setup, but I think that applying the inlets and outlets of pipelines uniformly to all chat interactions is a good feature in general.
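For context, the inlet/outlet hook pair described above follows roughly this shape. This is a simplified, synchronous sketch, not the actual langfuse filter: real Pipelines filter classes may be async and take additional parameters, and the `traces` list here is a hypothetical stand-in for sending events to langfuse.

```python
from typing import Optional


class Filter:
    """Sketch of a Pipelines-style filter with inlet/outlet hooks."""

    def __init__(self):
        # Hypothetical in-memory trace store standing in for langfuse.
        self.traces = []

    def inlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Runs before the request reaches the model: record the prompt.
        self.traces.append({"event": "inlet", "messages": body.get("messages", [])})
        return body

    def outlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Runs after the model responds: record the completion.
        # If the backend never calls outlet (the behavior reported here),
        # this trace entry is missing and the tracked response stays empty.
        self.traces.append({"event": "outlet", "messages": body.get("messages", [])})
        return body
```

The issue is that when Open WebUI is used as an API proxy, the backend invokes only the first of these two hooks, so any state the filter records in `outlet` is never captured.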