Awesome Knowledge Distillation
Updated Nov 27, 2024
Go from images to inference with no labeling (use foundation models to train supervised models).
🚀 PyTorch implementation of "Progressive Distillation for Fast Sampling of Diffusion Models" (v-diffusion)
Mechanistically interpretable neurosymbolic AI (Nature Comput Sci 2024): losslessly compressing neural networks into computer code and discovering new algorithms that generalize out-of-distribution and outperform human-designed algorithms.
Matching Guided Distillation (ECCV 2020)
Our open-source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188).
The Codebase for Causal Distillation for Language Models (NAACL '22)
A framework for knowledge distillation using TensorRT inference on the teacher network.
Repository for the publication "AutoGraph: Predicting Lane Graphs from Traffic"
The Codebase for Causal Distillation for Task-Specific Models
Awesome Deep Model Compression
Use AWS Rekognition to train custom models that you own.
Autodistill Google Cloud Vision module for use in training a custom, fine-tuned model.
Use LLaMA to label data for use in training a fine-tuned LLM.
Model distillation of CNNs for classification of Seafood Images in PyTorch (see the distillation-loss sketch after this list).
[Master's Thesis] Research project at the Data Analytics Lab in collaboration with Daedalean AI. The thesis was submitted to both ETH Zürich and Imperial College London.
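Many of the PyTorch distillation projects listed above (for example, the CNN seafood-classification entry) build on the same response-based recipe: the student is trained on a weighted sum of the ordinary cross-entropy loss against ground-truth labels and a temperature-softened KL divergence against the teacher's logits. Below is a minimal sketch of one such training step; the `teacher`, `student`, `optimizer`, temperature, and weighting are illustrative assumptions, not taken from any specific repository in this list.

```python
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, images, labels, optimizer,
                      temperature=4.0, alpha=0.5):
    """One response-based knowledge-distillation update (hypothetical setup)."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)   # soft targets from the frozen teacher

    student_logits = student(images)

    # Hard-label loss: ordinary cross-entropy against the ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Soft-label loss: KL divergence between temperature-softened distributions,
    # scaled by T^2 (as in Hinton et al., 2015) to keep gradient magnitudes comparable.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    loss = alpha * ce_loss + (1.0 - alpha) * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Feature-based approaches such as Matching Guided Distillation or MiniLMv2 replace or augment the logit term with losses on intermediate representations (feature maps or self-attention relations), but the overall training loop has the same shape.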