A treasure chest for visual classification and recognition powered by PaddlePaddle
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Collection of AWESOME vision-language models for vision tasks
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
This is a collection of our NAS and Vision Transformer work.
Pytorch implementation of various Knowledge Distillation (KD) methods.
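As a common denominator of the KD toolkits listed here, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015). It is not taken from any specific repository above; the temperature and weighting values are illustrative assumptions.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-target knowledge distillation loss (values of T/alpha are illustrative)."""
    # Soften both distributions with temperature T and match them via KL divergence.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so the soft-target gradients keep a comparable magnitude
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```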
OpenMMLab Model Compression Toolbox and Benchmark.
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
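To make the Attention Transfer idea concrete, here is a minimal sketch of its activation-based attention loss: channel-wise squared activations are pooled into a spatial attention map, L2-normalized, and matched between student and teacher. The interpolation step for mismatched spatial sizes is an added assumption, not necessarily part of the original recipe.

```python
import torch.nn.functional as F

def attention_map(feat, eps=1e-6):
    # (B, C, H, W) -> (B, H*W): mean of squared activations over channels,
    # flattened and L2-normalized per sample.
    a = feat.pow(2).mean(dim=1).flatten(1)
    return a / (a.norm(p=2, dim=1, keepdim=True) + eps)

def at_loss(student_feat, teacher_feat):
    # Assumes matching spatial sizes; resize the student map if they differ.
    if student_feat.shape[2:] != teacher_feat.shape[2:]:
        student_feat = F.interpolate(student_feat, size=teacher_feat.shape[2:],
                                     mode="bilinear", align_corners=False)
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```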
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
A complete PyTorch image classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
A curated list for Efficient Large Language Models
Efficient computing methods developed by Huawei Noah's Ark Lab
Collection of recent methods on (deep) neural network compression and acceleration.
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
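For context on the paper above: Decoupled Knowledge Distillation rewrites the classical soft-target KD loss as a target-class term (TCKD) plus a non-target-class term (NCKD) whose weight is coupled to the teacher's confidence on the true class, roughly KD = TCKD + (1 − p_teacher,target) · NCKD, and then decouples the two weights into independent hyperparameters, DKD = α · TCKD + β · NCKD.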