🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
🧑‍🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
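The transformer implementations above all build on the same core operation, scaled dot-product attention from "Attention Is All You Need". A minimal NumPy sketch (shapes and variable names are illustrative, not taken from any of the listed repos):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, in plain NumPy."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights                   # output and attention map

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, head dim d_k = 8
K = rng.normal(size=(6, 8))  # 6 key positions
V = rng.normal(size=(6, 8))
out, attn = scaled_dot_product_attention(Q, K, V)
```

Each of the 4 output rows is a convex combination of the 6 value rows, weighted by query-key similarity; multi-head attention just runs several of these in parallel on projected inputs.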
OpenMMLab Detection Toolbox and Benchmark
A high-throughput and memory-efficient inference and serving engine for LLMs
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
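The "trainable like a GPT, runnable like an RNN" claim rests on the model admitting two equivalent computational forms. RWKV's actual time-mixing formula is more involved; as an illustrative sketch, here is the same idea for plain causal linear attention, computed once in parallel (GPT-style) and once as a constant-memory recurrence (RNN-style):

```python
import numpy as np

def parallel_form(Q, K, V):
    # Transformer-style: all positions at once, with a causal mask.
    T = Q.shape[0]
    scores = Q @ K.T                 # (T, T) unnormalized scores
    mask = np.tril(np.ones((T, T)))  # attend only to past and current positions
    return (scores * mask) @ V

def recurrent_form(Q, K, V):
    # RNN-style: a single running state, constant memory per step.
    S = np.zeros((K.shape[1], V.shape[1]))  # running sum of k_t v_t^T
    out = np.zeros_like(V)
    for t in range(Q.shape[0]):
        S += np.outer(K[t], V[t])
        out[t] = Q[t] @ S
    return out

rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(5, 3)) for _ in range(3))
```

Both forms produce identical outputs: the parallel one is efficient on GPUs during training, while the recurrent one gives O(1) state per generated token at inference time.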
pix2tex: Using a ViT to convert images of equations into LaTeX code.
Faster Whisper transcription with CTranslate2
Easy-to-use Speech Toolkit including Self-Supervised Learning model, SOTA/Streaming ASR with punctuation, Streaming TTS with text frontend, Speaker Verification System, End-to-End Speech Translation and Keyword Spotting. Won NAACL2022 Best Demo Award.
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
Large Language Model Text Generation Inference
Easy-to-use image segmentation library with an awesome pre-trained model zoo, supporting a wide range of practical tasks in Semantic Segmentation, Interactive Segmentation, Panoptic Segmentation, Image Matting, 3D Segmentation, etc.
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
OpenMMLab Semantic Segmentation Toolbox and Benchmark.
Trax — Deep Learning with Clear Code and Speed
Code for the paper "Jukebox: A Generative Model for Music"
Chinese version of GPT2 training code, using BERT tokenizer.
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
A framework for few-shot evaluation of language models.
PyTorch implementation of Google AI's 2018 BERT.