
vnc-lm

11-01-2024: Added API support for OpenRouter, Mistral, and Cohere
10-27-2024: Added prompt refining

Introduction

vnc-lm is a Discord bot with Ollama, OpenRouter, Mistral, and Cohere API integration.

Load and manage language models through local or hosted API endpoints. Configure parameters, branch conversations, and refine prompts to optimize responses.

Demo: web scraping
Demo: model pulling with Ollama

Features

Model Management

Load models using the /model command. The bot sends notifications upon successful model loading. Local models can be removed with the remove parameter. Download new models by sending a model tag link in Discord.

https://ollama.com/library/llama3.2:1b-instruct-q8_0
https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/blob/main/Llama-3.2-1B-Instruct-Q8_0.gguf

🚧 Model downloading and removal are disabled by default and can be enabled in the .env.

Configure model behavior by adjusting the num_ctx (context length), system_prompt (base instructions), and temperature (response randomness) parameters.
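These parameters map onto the request body of Ollama's chat endpoint. A minimal sketch of how such a request might be assembled (field names follow Ollama's public API; the bot's internal code may differ):

```typescript
// Shape of an Ollama /api/chat request: num_ctx and temperature go
// under "options", while the system prompt is a leading message
// with role "system".
interface ChatRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  options: { num_ctx: number; temperature: number };
}

function buildChatRequest(
  model: string,
  userPrompt: string,
  systemPrompt = "You are a helpful assistant.",
  numCtx = 2048,      // context length in tokens
  temperature = 0.4   // response randomness
): ChatRequest {
  return {
    model,
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userPrompt },
    ],
    options: { num_ctx: numCtx, temperature },
  };
}
```

The defaults above mirror the NUM_CTX and TEMPERATURE values from the .env.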

QoL Improvements

Messages longer than 1500 characters are automatically paginated during generation. Message streaming is available with Ollama. Other APIs handle responses quickly without streaming. The context window accepts text files, web links, and screenshot attachments. Deploy using Docker for simplified setup.
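The pagination step can be sketched as a simple chunker (a hypothetical helper for illustration; the real bot paginates incrementally while streaming):

```typescript
// Split generated text into pages no longer than the character limit
// (CHARACTER_LIMIT, default 1500), so each page fits a single
// Discord embed.
function paginate(text: string, limit = 1500): string[] {
  const pages: string[] = [];
  for (let i = 0; i < text.length; i += limit) {
    pages.push(text.slice(i, i + limit));
  }
  return pages.length > 0 ? pages : [""];
}
```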

Switch conversations by selecting Rejoin Conversation from the context menu. Branch conversations from any message. Messages are cached and organized in bot_cache.json. The entrypoint.sh script maintains conversation history across Docker container restarts.
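Branching can be modeled as copying the cached history up to the selected message (illustrative only; the actual bot_cache.json schema may differ):

```typescript
interface CachedMessage {
  id: string;
  role: "user" | "assistant";
  content: string;
}

// Branch a conversation: keep history up to (and including) the chosen
// message, so a new model or new parameters can continue from there
// without disturbing the original thread.
function branchFrom(history: CachedMessage[], messageId: string): CachedMessage[] {
  const idx = history.findIndex((m) => m.id === messageId);
  if (idx === -1) throw new Error(`message ${messageId} not in history`);
  return history.slice(0, idx + 1);
}
```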

💡 Send "stop" to end message generation early.

Edit your last prompt to refine the model's response. The bot generates a new response using your edited prompt, replacing the previous output.

Requirements

Docker: A platform for building, sharing, and running containerized applications. It simplifies setup so you can focus on the bot itself.

Supported APIs

Ollama: Get up and running with Llama 3.2, Mistral, Gemma 2, and other large language models.

The hosted APIs below offer varying degrees of free access to state-of-the-art, high-parameter-count language models:

OpenRouter: A unified interface for LLMs. Find the best models and prices for your prompts. Includes the latest state-of-the-art models from OpenAI, Anthropic, Google, and Meta.

Mistral: Mistral AI is a research lab building the best open source models in the world. La Plateforme enables developers and enterprises to build new products and applications, powered by Mistral’s open source and commercial LLMs.

Cohere: The Cohere platform builds natural language processing and generation into your product with a few lines of code. Our large language models can solve a broad spectrum of natural language use cases, including classification, semantic search, paraphrasing, summarization, and content generation.

Environment Configuration

git clone https://github.com/jake83741/vnc-lm.git
cd vnc-lm

Rename .env.example to .env.

Configure the fields below in the .env:

TOKEN: Discord bot token from the Discord Developer Portal. Set required bot permissions.
OLLAMAURL: Ollama server URL. See API documentation. For Docker: http://host.docker.internal:11434
NUM_CTX: Context window size. Default: 2048
TEMPERATURE: Response randomness. Default: 0.4
KEEP_ALIVE: Model retention time in memory. Default: 45m
CHARACTER_LIMIT: Page embed character limit. Default: 1500
API_RESPONSE_UPDATE_FREQUENCY: Number of API response chunks received before the Discord message is updated. Lower values update more often and can trigger Discord rate limiting. Default: 10
ADMIN: Discord user ID for model management permissions
REQUIRE_MENTION: Toggle bot mention requirement. Default: false
OPENROUTER: OpenRouter API key from OpenRouter Dashboard
OPENROUTER_MODELS: Comma-separated OpenRouter model list
MISTRAL_API_KEY: Mistral API key from Mistral Dashboard
MISTRAL_MODELS: Comma-separated Mistral model list
COHERE_API_KEY: Cohere API key from Cohere Dashboard
COHERE_MODELS: Comma-separated Cohere model list
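Once configured, the .env might look like this (all values are placeholders; the model lists are illustrative):

```env
TOKEN=your-discord-bot-token
OLLAMAURL=http://host.docker.internal:11434
NUM_CTX=2048
TEMPERATURE=0.4
KEEP_ALIVE=45m
CHARACTER_LIMIT=1500
API_RESPONSE_UPDATE_FREQUENCY=10
ADMIN=your-discord-user-id
REQUIRE_MENTION=false
OPENROUTER=your-openrouter-api-key
OPENROUTER_MODELS=model-one,model-two
MISTRAL_API_KEY=your-mistral-api-key
MISTRAL_MODELS=model-one,model-two
COHERE_API_KEY=your-cohere-api-key
COHERE_MODELS=model-one,model-two
```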

🚧 Never share API keys.

Docker Installation (Preferred)

docker compose up --build

💡 Send /help for instructions on how to use the bot.

Manual Installation


npm install
npm run build
npm start

Usage

Use /model to load, configure, and remove models. Quickly adjust model behavior using the optional parameters num_ctx, system_prompt, and temperature. Note that num_ctx only works with local Ollama models.

Refine prompts to modify model responses. Each refinement generates a new response that overwrites the previous one. Multiple refinements are supported. The latest prompt version is saved in bot_cache.json.
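The refine-and-overwrite behavior amounts to rewinding the cached history to the last user prompt (a sketch assuming history is a flat message list; not the bot's actual code):

```typescript
interface CachedMessage {
  role: "user" | "assistant";
  content: string;
}

// Replace the most recent user prompt and drop the response that
// followed it; the caller then regenerates from the edited history,
// so the new response overwrites the previous one.
function refinePrompt(history: CachedMessage[], edited: string): CachedMessage[] {
  const idx = history.map((m) => m.role).lastIndexOf("user");
  if (idx === -1) throw new Error("no user prompt to refine");
  return [...history.slice(0, idx), { role: "user", content: edited }];
}
```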

Access Rejoin Conversation in Discord's context menu to resume from any message. Hop between conversations while maintaining context. Create new conversation branches as needed. Continue conversations using different models and parameter settings.

Tree Diagram

.
├── LICENSE
├── README.md
├── docker-compose.yaml
├── dockerfile
├── .env.example
├── package.json
├── screenshots
├── src
│   ├── api-connections
│   │   ├── config
│   │   │   └── models.ts
│   │   ├── factory.ts
│   │   ├── index.ts
│   │   ├── interfaces
│   │   │   ├── base-client.ts
│   │   │   └── model-manager.ts
│   │   ├── models.ts
│   │   └── provider
│   │       ├── cohere
│   │       │   └── client.ts
│   │       ├── mistral
│   │       │   └── client.ts
│   │       ├── ollama
│   │       │   └── client.ts
│   │       └── openrouter
│   │           └── client.ts
│   ├── bot.ts
│   ├── commands
│   │   ├── command-registry.ts
│   │   ├── help-command.ts
│   │   ├── model-command.ts
│   │   ├── optional-params
│   │   │   └── remove.ts
│   │   └── rejoin-conversation.ts
│   ├── managers
│   │   ├── cache
│   │   │   ├── entrypoint.sh
│   │   │   ├── index.ts
│   │   │   ├── manager.ts
│   │   │   └── store.ts
│   │   ├── generation
│   │   │   ├── chunk.ts
│   │   │   ├── create.ts
│   │   │   └── preprocessing.ts
│   │   ├── message
│   │   │   └── manager.ts
│   │   └── pages
│   │       └── manager.ts
│   ├── services
│   │   ├── ocr.ts
│   │   └── scraper.ts
│   └── utilities
│       ├── constants.ts
│       ├── index.ts
│       ├── settings.ts
│       └── types.ts
└── tsconfig.json

Dependencies


  1. Axios: Promise-based HTTP client for the browser and Node.js.
  2. Discord.js: A powerful JavaScript library for interacting with the Discord API.
  3. dotenv: Loads environment variables from .env for Node.js projects.
  4. tesseract.js: A JavaScript library that gets words in almost any language out of images.
  5. jsdom: A JavaScript implementation of various web standards, for use with Node.js.
  6. readability: A standalone version of the Readability library.
  7. cohere-ai: The Cohere TypeScript SDK.

Notes


  1. Set higher num_ctx values when using attachments containing large amounts of text.
  2. Text extraction from screenshots uses OCR; multi-modal model support is not yet implemented.

License

This project is licensed under the MIT License.
