gguf
Here are 98 public repositories matching this topic...
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
Updated Nov 4, 2024 - Dart
LLM Agent Framework in ComfyUI. Includes Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; provides access to Feishu and Discord; and adapts to any LLM with an OpenAI/Gemini-like interface, such as o1, Ollama, Grok, Qwen, GLM, DeepSeek, Moonshot, and Doubao. Also supports local LLMs, VLMs, and GGUF models such as Llama 3.2, plus Neo4j knowledge-graph linkage, GraphRAG / RAG, and HTML-to-image rendering.
Updated Nov 23, 2024 - Python
Run AI models locally on your machine with Node.js bindings for llama.cpp, and enforce a JSON schema on the model output at the generation level (a usage sketch follows this list).
Updated Oct 31, 2024 - TypeScript
Practical Llama 3 inference in Java
Updated Nov 14, 2024 - Java
An open source DevOps tool for packaging and versioning AI/ML models, datasets, code, and configuration into an OCI artifact.
Updated Nov 20, 2024 - Go
Go library for embedded vector search and semantic embeddings using llama.cpp
Updated Oct 28, 2024 - Go
Search for anything using Google, DuckDuckGo, or phind.com. Includes AI models, can transcribe YouTube videos, generates temporary email addresses and phone numbers, and has TTS support, webai (terminal GPT and Open Interpreter), and offline LLMs.
Updated Nov 21, 2024 - Python
Making offline AI models accessible to all types of edge devices.
Updated Feb 12, 2024 - Dart
Gradio-based tool to run open-source LLMs directly from Hugging Face.
Updated Jun 27, 2024 - Python
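The Node.js bindings entry above mentions constraining model output to a JSON schema at generation time. Below is a minimal sketch of what that might look like, assuming node-llama-cpp's v3-style API (getLlama, createGrammarForJsonSchema, LlamaChatSession); the model path and schema are placeholders, and the method names should be checked against the library's current documentation.

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Load a local GGUF model (path is a placeholder).
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "./models/llama-3.2-3b-instruct.Q4_K_M.gguf"
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Build a grammar from a JSON schema so generation is constrained to valid output.
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        name: {type: "string"},
        quantization: {type: "string"}
    }
});

// The grammar forces the model to emit JSON that matches the schema.
const answer = await session.prompt(
    "Describe one GGUF model you know as JSON.",
    {grammar}
);
console.log(grammar.parse(answer)); // Parsed object matching the schema.
```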