Support AIs other than OpenAI, like Mistral #82
Comments
Can I work on this?
Yes!
This is a game changer, I guess. Especially if we can do it with Llama 3, don't you think? I wonder why you chose Mistral; I think we could also add Llama via Groq!
Is this still open? Can you please share more details about which AIs/LLMs you want to implement?
Mistral/Llama 3. It can really be any model.
We added support for Anthropic/Bedrock a few months ago. Next would be supporting open-source models. The Vercel ai package we use makes it pretty easy to do. The main consideration is whether the model supports function calling.
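For context, here is a rough sketch of what provider-agnostic function calling looks like with the Vercel AI SDK. It is not code from this repo: the getWeather tool, the model id, and the exact option names (which vary across SDK versions) are placeholders.

```ts
// Hypothetical sketch: tool/function calling through the Vercel AI SDK.
// Package and option names follow the AI SDK v3/v4-style API and may differ
// from the version this project pins.
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

async function main() {
  const result = await generateText({
    // Any provider whose models support tool calls could be plugged in here.
    model: openai('gpt-4o'),
    prompt: 'What is the weather in Berlin?',
    tools: {
      // Placeholder tool, purely illustrative.
      getWeather: tool({
        description: 'Look up the current weather for a city',
        parameters: z.object({ city: z.string() }),
        execute: async ({ city }) => ({ city, tempC: 21 }),
      }),
    },
  });

  console.log(result.text);
}

main();
```

Models without tool-call support would need a different path around this, which is the compatibility concern mentioned above.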
We use the ai package: https://sdk.vercel.ai/docs. It's easy to switch between LLMs with it; here's a playground, for example: https://sdk.vercel.ai/. There may be features that don't work across all LLMs, like function calling, so we'd have to handle those differently.
If we could support Mistral that would be very cool and help us move in the direction of a 100% self-hosted version.
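To make the "switch between LLMs" point concrete, here is a hedged sketch of how a provider could be selected at runtime behind a single call site with the ai package. The LLM_PROVIDER environment variable and the model ids are assumptions for illustration, not something this project defines.

```ts
// Hypothetical sketch: choosing a provider at runtime with the Vercel AI SDK.
// The LLM_PROVIDER env var and model ids below are illustrative only.
import { generateText, type LanguageModel } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { mistral } from '@ai-sdk/mistral';

// Pick a model based on configuration; everything downstream stays the same.
function pickModel(): LanguageModel {
  switch (process.env.LLM_PROVIDER) {
    case 'anthropic':
      return anthropic('claude-3-5-sonnet-latest');
    case 'mistral':
      return mistral('mistral-large-latest');
    default:
      return openai('gpt-4o');
  }
}

async function main() {
  const { text } = await generateText({
    model: pickModel(),
    prompt: 'Summarize why provider-agnostic model selection matters.',
  });
  console.log(text);
}

main();
```

A self-hosted setup would follow the same pattern, swapping in a provider that points at a locally served Mistral/Llama model.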