Here are 32 public repositories matching this topic:
Use your locally running AI models to assist you in your web browsing (TypeScript, updated Nov 4, 2024)
A generalized information-seeking agent system built on Large Language Models (LLMs) (Python, updated Jun 19, 2024)
[ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization (Python, updated Aug 13, 2024)
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization (Python, updated Aug 13, 2024)
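KV cache quantization methods like the one above compress the attention cache by storing low-bit integer codes plus a scale. The sketch below is not KVQuant's actual dense-and-sparse algorithm; it is plain per-tensor uniform 4-bit quantization, shown only to illustrate the quantize/dequantize round trip such methods build on.

```python
# Toy uniform 4-bit quantization of a "KV cache" vector.
# NOT KVQuant's algorithm; just the basic round trip for intuition.

def quantize_4bit(values):
    """Map floats to integer codes in [0, 15] using a per-tensor scale and zero point."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 15 if hi != lo else 1.0
    codes = [round((v - lo) / scale) for v in values]
    return codes, scale, lo

def dequantize(codes, scale, zero_point):
    """Reconstruct approximate floats from the integer codes."""
    return [c * scale + zero_point for c in codes]

kv = [0.12, -0.5, 0.33, 0.9, -0.07, 0.48]  # pretend cache values
q, scale, zp = quantize_4bit(kv)
recon = dequantize(q, scale, zp)
max_err = max(abs(a - b) for a, b in zip(kv, recon))
print(q)        # all codes fall in 0..15
print(max_err)  # rounding error is bounded by scale / 2
```

The reconstruction error of this scheme is at most half the quantization step, which is why per-channel or dense-and-sparse refinements (as in the paper) matter for outlier-heavy activations.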
Run MemGPT and AutoGen together with a local LLM (Python, updated Nov 2, 2023)
A nifty little library for working with Ollama in Elixir (Elixir, updated Aug 12, 2024)
OpenLocalUI: a native desktop app for Windows, macOS, and Linux. Easily run Large Language Models locally with no complex setup required. Inspired by OpenWebUI's simplicity for LLM use. (Dart, updated Oct 13, 2024)
Chat with your PDF using a local LLM via the Ollama client (TypeScript, updated Oct 17, 2024)
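Several of these projects talk to a locally running Ollama server over its HTTP API. A minimal sketch of that pattern, assuming Ollama is listening on its default port 11434 and a model such as "llama3" has already been pulled (the model name and prompt here are placeholders):

```python
# Query a local Ollama server via its /api/generate endpoint.
# Assumes Ollama is running on localhost:11434 with "llama3" pulled.
import json
import urllib.request

def build_request(prompt, model="llama3", host="http://localhost:11434"):
    """Build an urllib Request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt):
    """Send the prompt and return the model's text (requires a running server)."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]

# ask("Summarize the attached document chunk: ...")  # needs Ollama running
```

With `"stream": False` the server returns one JSON object whose `response` field holds the full completion; with streaming enabled it instead emits one JSON object per token.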
MVP of an idea using multiple local LLMs to simulate and play D&D (Python, updated Nov 3, 2024)
A local chatbot for managing docs (Python, updated Oct 27, 2024)
Unofficial entropix implementation for Gemma2, Llama, Qwen2, and Mistral (Python, updated Oct 22, 2024)
Humanlike AI Chat is a terminal-based LLM UI designed to study how to bypass AI text detection (Python, updated Mar 15, 2024)
Alacritty + Fish + Zellij + Starship + Neovim + i3 + Supermaven + Ollama 🦙 = 🚀 (Shell, updated Oct 30, 2024)
Run GGUF LLM models in the latest version of TextGen-webui (Jupyter Notebook, updated Oct 11, 2024)
This started out as a POC for chatting over my documents, but has turned into a whole framework for using LLMs (Python, updated Aug 10, 2023)
A library that manages tools (functions) for local LLMs (Python, updated Jun 7, 2024)
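A tool-management library like the one above typically does two things: advertise registered functions to the model inside the prompt, and dispatch the model's JSON tool calls back to real Python functions. The names and shapes below are hypothetical, not that library's actual API; this is only a sketch of the pattern.

```python
# Hypothetical sketch of a tool registry for local LLMs: register plain
# functions with a description, expose a manifest for the prompt, and
# dispatch a model-emitted {"name": ..., "arguments": ...} call.
import json

TOOLS = {}

def tool(description):
    """Decorator: register a function plus metadata for the LLM prompt."""
    def register(fn):
        TOOLS[fn.__name__] = {"fn": fn, "description": description}
        return fn
    return register

@tool("Add two numbers.")
def add(a, b):
    return a + b

def tool_manifest():
    """Text block describing available tools, for inclusion in the prompt."""
    return "\n".join(f"- {name}: {t['description']}" for name, t in TOOLS.items())

def dispatch(call_json):
    """Execute a tool call the model emitted as a JSON string."""
    call = json.loads(call_json)
    entry = TOOLS[call["name"]]
    return entry["fn"](**call["arguments"])

print(dispatch('{"name": "add", "arguments": {"a": 2, "b": 3}}'))  # prints 5
```

Keeping dispatch keyed by function name means the model only ever sees the manifest text, never the Python objects, which is what makes this workable with purely local, text-in/text-out models.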
A multi-agent security framework that uses multiple LLMs to analyze and generate comprehensive security briefs (Python, updated May 17, 2024)
Read your local files and answer your queries (Python, updated Sep 3, 2024)
A German-language workshop where you learn how to build RAG pipelines with LangChain (Python, updated Oct 14, 2024)
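The retrieval step at the heart of a RAG pipeline, stripped of frameworks, is: embed the document chunks, score them against the query, and stuff the best match into the LLM prompt. Real pipelines (e.g. with LangChain) use learned embeddings; this toy version uses bag-of-words cosine similarity purely to show the shape of the computation.

```python
# Minimal RAG retrieval: pick the chunk most similar to the query and
# build a grounded prompt from it. Toy bag-of-words "embeddings" only.
import math
from collections import Counter

def embed(text):
    """Toy embedding: word-count vector (real pipelines use a model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, chunks):
    """Return the chunk with the highest similarity to the query."""
    q = embed(query)
    return max(chunks, key=lambda c: cosine(q, embed(c)))

chunks = [
    "Ollama serves local models over an HTTP API.",
    "Retrieval-augmented generation grounds answers in your documents.",
]
best = retrieve("how does retrieval augmented generation work", chunks)
prompt = f"Answer using this context:\n{best}\n\nQuestion: ..."
```

Swapping `embed` for a real embedding model and `max` for a top-k vector-store lookup turns this sketch into the usual production pipeline.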
Local AI Open Orca For Dummies: a user-friendly guide to running Large Language Models locally. Simplify your AI journey with easy-to-follow instructions and minimal setup. Perfect for developers tired of complex processes! (Python, updated Mar 5, 2024)