0.1.193
Support ollama OpenAI API-compatibility, i.e., the ollama LLM server now mimics the OpenAI API, so any code that worked with OpenAI LLMs will now work with a simple change of `api_base`.
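For context, that "simple change of `api_base`" looks roughly like this when using the `openai` Python package directly (a minimal sketch, assuming ollama's default OpenAI-compatible endpoint at `http://localhost:11434/v1` and an already-pulled mistral model; note the v1 client names the parameter `base_url`, and the API key is a dummy value that ollama ignores):

```python
from openai import OpenAI

# Point the standard OpenAI client at the local ollama server
# (assumed default endpoint; the API key is a placeholder).
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
resp = client.chat.completions.create(
    model="mistral:7b-instruct-v0.2-q8_0",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```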
Langroid takes care of setting the `api_base` behind the scenes when you specify the local LLM via `chat_model="ollama/mistral"`, e.g.:
```python
import langroid.language_models as lm
import langroid as lr

# The "ollama/" prefix tells Langroid to route requests to the local
# ollama server's OpenAI-compatible endpoint.
llm_config = lm.OpenAIGPTConfig(
    chat_model="ollama/mistral:7b-instruct-v0.2-q8_0",
    chat_context_length=16_000,  # adjust based on the model
)
agent = lr.ChatAgent(lr.ChatAgentConfig(llm=llm_config))
...
```
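A single turn with this agent then works as usual (a minimal sketch; the prompt is illustrative, and `llm_response` returns a `ChatDocument` whose `content` holds the model's reply):

```python
response = agent.llm_response("What is the capital of France?")
print(response.content)
```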
See more in the tutorial on Local LLM Setup with Langroid.