Stars
Code of Pyramidal Flow Matching for Efficient Video Generative Modeling
Official Implementation of "Lumina-mGPT: Illuminate Flexible Photorealistic Text-to-Image Generation with Multimodal Generative Pretraining"
[ICLR 2024] Fine-tuning LLaMA to follow instructions within 1 hour and 1.2M parameters
Official inference repo for FLUX.1 models
MINT-1T: A one trillion token multimodal interleaved dataset.
Toolset for creating animated React components built on Theatre.js and Framer Motion.
PyTorch native quantization and sparsity for training and inference
Code for "Adam-mini: Use Fewer Learning Rates To Gain More" (https://arxiv.org/abs/2406.16793)
[ECCV 2024] official code for "Long-CLIP: Unlocking the Long-Text Capability of CLIP"
Qwen2.5 is the large language model series developed by Qwen team, Alibaba Cloud.
[CVPR 2024, Oral] Attention Calibration for Disentangled Text-to-Image Personalization
InstantID: Zero-shot Identity-Preserving Generation in Seconds 🔥
Analyzing and Improving the Training Dynamics of Diffusion Models (EDM2)
Project Page for "LISA: Reasoning Segmentation via Large Language Model"
[ECCV 2024] Official implementation of "MotionLCM: Real-time Controllable Motion Generation via Latent Consistency Model"
DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model
Universal Tensor Operations in Einstein-Inspired Notation for Python.
Lumina-T2X is a unified framework for Text to Any Modality Generation
[ECCV 2024] Official inference code for the papers "Glyph-ByT5: A Customized Text Encoder for Accurate Visual Text Rendering" and "Glyph-ByT5-v2: A Strong Aesthetic Baseline for Accurate Mu…
Official repository for LightSeq: Sequence Level Parallelism for Distributed Training of Long Context Transformers
Memory optimization and training recipes to extrapolate language models' context length to 1 million tokens, with minimal hardware.
USP: Unified (a.k.a. Hybrid, 2D) Sequence-Parallel Attention for Long-Context Transformer Training and Inference