It combines "Ollama" with a playful twist on "bot" and "automatica," giving it a dynamic and tech-savvy vibe. If you're looking for something unique, this could stand out and convey that it's a powerful and engaging framework.
This has been tested against Open WebUI's Ollama API and should also work against a plain Ollama API without Open WebUI.
- Multiple Telegram bots against multiple Ollama models.
- Multiple chat clients against any of these Telegram bots.
- Each bot can be configured with its own model and Ollama endpoint.
- Start a new conversation, with an optional new chat prompt
- Whitelist of Telegram chat IDs to prevent unauthorized access.
- Separate admin whitelist of Telegram chat IDs to prevent unauthorized access to admin features.
- List, Save, Load, and Delete conversations
- Skip the Ollama API entirely and just echo back the user's message (for testing).
- Optional Prompt for each chat message
- Flexible hosting options: run the executable locally, or host via Docker and docker-compose.
- All configuration is done via a single file, `appsettings.json`.
- Optionally mount a volume for the chat history via Docker settings.
- List models in the chat
- Hot swap a model for a bot in a chat
- Ask for debug information
- Support for Slack - needs deeper testing from the community (Slack is not my thing)
- Support for Discord - DMs or in channels with mentions.
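Everything in the list above is driven by `appsettings.json`. Purely as an illustration, a bot's configuration might look something like the sketch below; every key name here is hypothetical, and the shipped `appsettings.sample.json.txt` is the authoritative reference for the real settings.

```json
{
  "Bots": [
    {
      "BotToken": "<telegram-bot-token>",
      "Model": "llama3",
      "OllamaEndpoint": "http://localhost:11434",
      "AllowedChatIds": [123456789],
      "AdminChatIds": [123456789],
      "EchoOnly": false
    }
  ]
}
```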
Copy the file `appsettings.sample.json.txt` to `appsettings.json`, then edit it to include your bot token and other settings.
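From a shell in the project directory, the copy step is simply:

```shell
# copy the shipped sample settings to the live settings file the bot reads
cp appsettings.sample.json.txt appsettings.json
```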
Run the following command:

```shell
docker run -d -v ./appsettings.json:/app/appsettings.dev.json --restart unless-stopped robchartier/ollabotica
```
All settings are documented in the `appsettings.sample.json.txt` file.
```yaml
version: '3'
services:
  service1:
    image: robchartier/ollabotica:latest
    container_name: telegram-bot-1
    restart: unless-stopped
    volumes:
      - ./appsettings.json:/app/appsettings.json
      - ./chats:/app/chats
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
```
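Assuming the compose file above is saved as `docker-compose.yml` next to your `appsettings.json`, the bot can be started and inspected with the standard Compose commands:

```shell
# start the bot in detached mode
docker compose up -d
# follow the bot's logs (service name matches the compose file)
docker compose logs -f service1
```

On older installs that ship the standalone binary, use `docker-compose up -d` instead.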