AI Observability & Evaluation
Laminar - open-source all-in-one platform for engineering AI products. Create a data flywheel for your AI app. Traces, Evals, Datasets, Labels. YC S24.
Open source platform for AI Engineering: OpenTelemetry-native LLM Observability, GPU Monitoring, Guardrails, Evaluations, Prompt Management, Vault, Playground. 🚀💻 Integrates with 50 LLM Providers, VectorDBs, Agent Frameworks and GPUs.
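Several of the tools listed here describe themselves as OpenTelemetry-native. As a minimal sketch of what that means in practice, the snippet below wraps an LLM call in an OTel span and attaches prompt/response metadata as attributes, using only the standard opentelemetry-sdk; the span and attribute names are illustrative assumptions, not the conventions of any specific platform above.

```python
# Minimal OpenTelemetry tracing sketch for an LLM call (illustrative only).
# Span/attribute names are assumptions, not any listed platform's conventions.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Export spans to stdout; a real setup would point an OTLP exporter at a backend.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("llm-demo")

def chat(prompt: str) -> str:
    with tracer.start_as_current_span("llm.chat") as span:
        span.set_attribute("llm.prompt", prompt)
        response = "stub response"  # stand-in for a real model call
        span.set_attribute("llm.response", response)
        return response

if __name__ == "__main__":
    chat("What can I see in Lisbon in one day?")
```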
Fiddler Auditor is a tool to evaluate language models.
A comprehensive solution for monitoring your AI models in production
A Python library to send data to Arize AI!
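As a rough illustration of what "sending data to Arize" looks like, the sketch below batch-logs a few predictions with the pandas client. Module paths, Schema fields, and Client arguments are recalled from the SDK's documentation and may differ across versions; treat the exact names as assumptions.

```python
# Rough sketch of batch-logging predictions to Arize with the pandas client.
# Module paths and argument names are assumptions and may vary by SDK version.
import pandas as pd
from arize.pandas.logger import Client
from arize.utils.types import Environments, ModelTypes, Schema

df = pd.DataFrame({
    "prediction_id": ["a1", "a2"],
    "prediction": ["fraud", "not_fraud"],
    "actual": ["fraud", "not_fraud"],
})

client = Client(space_key="YOUR_SPACE_KEY", api_key="YOUR_API_KEY")  # placeholder credentials

response = client.log(
    dataframe=df,
    model_id="fraud-detector",  # hypothetical model name
    model_version="v1",
    model_type=ModelTypes.SCORE_CATEGORICAL,
    environment=Environments.PRODUCTION,
    schema=Schema(
        prediction_id_column_name="prediction_id",
        prediction_label_column_name="prediction",
        actual_label_column_name="actual",
    ),
)
print(response.status_code)
```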
A report generator library for the ML models deployed on the Fiddler AI Observability platform
Java client to interact with the Arize API
This repo hosts a chatbot that runs in a Docker container to demo Okahu AI Observability Cloud
This repo hosts a chatbot that runs in GitHub Codespaces to demo Okahu AI Observability Cloud with OpenAI
Search for a holiday and get destination advice from an LLM. Observability by Dynatrace.
Example projects for Arthur Model Monitoring Platform
Official Python library for monitoring LLM applications with Doku
The Modelmetry Python SDK allows developers to easily integrate Modelmetry’s advanced guardrails and monitoring capabilities into their LLM-powered applications.
The Modelmetry JS/TS SDK allows developers to easily integrate Modelmetry’s advanced guardrails and monitoring capabilities into their LLM-powered applications.
Official Node.js library for monitoring LLM applications with Doku