@thu-ml

TSAIL group

Tsinghua Statistical Artificial Intelligence & Learning Group

Pinned

  1. TurboDiffusion Public

    TurboDiffusion: 100–200× Acceleration for Video Diffusion Models

    Python · 3.3k stars · 222 forks

  2. unidiffuser Public

    Code and models for the paper "One Transformer Fits All Distributions in Multi-Modal Diffusion"

    Python · 1.5k stars · 92 forks

  3. SageAttention Public

    [ICLR2025, ICML2025, NeurIPS2025 Spotlight] Quantized attention that achieves a 2–5× speedup over FlashAttention without degrading end-to-end metrics across language, image, and video models.

    CUDA · 3.1k stars · 321 forks
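The SageAttention description centers on one idea: run the attention score computation with low-bit (INT8) quantization of Q and K while keeping end-to-end accuracy. The sketch below is a rough NumPy illustration of that general idea, not SageAttention's actual algorithm (the library implements fused CUDA kernels with finer-grained scaling); all function names and the per-tensor scaling scheme here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def quantize_int8(x):
    # Symmetric per-tensor quantization to int8: return values and scale.
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_attention(q, k, v):
    # Quantize Q and K, accumulate Q @ K^T in int32, then dequantize
    # before the (still floating-point) softmax and value aggregation.
    q8, sq = quantize_int8(q)
    k8, sk = quantize_int8(k)
    scores = (q8.astype(np.int32) @ k8.astype(np.int32).T).astype(np.float64)
    scores = scores * (sq * sk) / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((8, 64)) for _ in range(3))
ref = softmax(q @ k.T / np.sqrt(64)) @ v   # full-precision reference
out = int8_attention(q, k, v)
err = np.abs(out - ref).max()              # small quantization error
```

On hardware with fast INT8 tensor cores, the int32-accumulated matmul is the step that yields the speedup; the quantization error stays small because softmax is insensitive to tiny score perturbations.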

  4. prolificdreamer Public

    ProlificDreamer: High-Fidelity and Diverse Text-to-3D Generation with Variational Score Distillation (NeurIPS 2023 Spotlight)

    Python · 1.6k stars · 47 forks

  5. ares Public

    A Python library for adversarial machine learning, focused on benchmarking adversarial robustness.

    Python · 523 stars · 93 forks

  6. tianshou Public

    An elegant PyTorch deep reinforcement learning library.

    Python · 9.7k stars · 1.2k forks

Repositories

Showing 10 of 87 repositories
  • embodied-data-toolkit Public

    A toolkit for processing raw embodied data into standardized formats and converting between embodied dataset schemas.

    Python · 6 stars · 1 fork · 0 open issues · 0 open PRs · Updated Jan 21, 2026
  • RoboticsDiffusionTransformer Public

    RDT-1B: a Diffusion Foundation Model for Bimanual Manipulation

    Python · 1,598 stars · MIT · 150 forks · 36 open issues · 3 open PRs · Updated Jan 21, 2026
  • TurboDiffusion Public

    TurboDiffusion: 100–200× Acceleration for Video Diffusion Models

    Python · 3,258 stars · Apache-2.0 · 222 forks · 53 open issues · 5 open PRs · Updated Jan 18, 2026
  • SLA Public

    SLA: Beyond Sparsity in Diffusion Transformers via Fine-Tunable Sparse–Linear Attention

    Python · 250 stars · Apache-2.0 · 15 forks · 7 open issues · 0 open PRs · Updated Jan 17, 2026
  • SageAttention Public

    [ICLR2025, ICML2025, NeurIPS2025 Spotlight] Quantized attention that achieves a 2–5× speedup over FlashAttention without degrading end-to-end metrics across language, image, and video models.

    CUDA · 3,083 stars · Apache-2.0 · 321 forks · 149 open issues · 15 open PRs · Updated Jan 17, 2026
  • MLA-Trust Public

    A toolbox for benchmarking the trustworthiness of multimodal LLM agents across truthfulness, controllability, safety, and privacy dimensions, through 34 interactive tasks.

    Python · 62 stars · MIT · 4 forks · 2 open issues · 0 open PRs · Updated Jan 9, 2026
  • Motus Public

    Official code of Motus: A Unified Latent Action World Model

    Python · 586 stars · Apache-2.0 · 11 forks · 13 open issues · 0 open PRs · Updated Jan 5, 2026
  • SpargeAttn Public

    [ICML2025] SpargeAttention: a training-free sparse attention method that accelerates inference for any model.

    CUDA · 915 stars · Apache-2.0 · 82 forks · 57 open issues · 3 open PRs · Updated Dec 31, 2025
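The SpargeAttention entry names a training-free sparse-attention idea: skip key blocks that a cheap predictor deems irrelevant, without any retraining. Below is a toy NumPy illustration of block-sparse attention with mean-pooled block selection; the block size, pooling scheme, and top-k rule are illustrative assumptions, not the repository's actual algorithm.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def block_sparse_attention(q, k, v, block=16, keep=2):
    # Each query block attends only to the `keep` key blocks whose
    # mean-pooled sketches score highest (training-free selection).
    n, d = q.shape
    nb = n // block
    q_blk = q.reshape(nb, block, d).mean(axis=1)   # cheap block sketches
    k_blk = k.reshape(nb, block, d).mean(axis=1)
    topk = np.argsort(-(q_blk @ k_blk.T), axis=1)[:, :keep]
    out = np.empty_like(q)
    for i in range(nb):
        # Gather the selected key/value rows for this query block.
        sel = np.concatenate([np.arange(j * block, (j + 1) * block) for j in topk[i]])
        qi = q[i * block:(i + 1) * block]
        out[i * block:(i + 1) * block] = softmax(qi @ k[sel].T / np.sqrt(d)) @ v[sel]
    return out

rng = np.random.default_rng(1)
q, k, v = (rng.standard_normal((64, 32)) for _ in range(3))
dense = softmax(q @ k.T / np.sqrt(32)) @ v
# Keeping all 4 blocks reproduces dense attention (sanity check);
# a smaller `keep` trades accuracy for skipped work.
full = block_sparse_attention(q, k, v, block=16, keep=4)
sparse = block_sparse_attention(q, k, v, block=16, keep=2)
```

With `keep` equal to the number of key blocks the result matches dense attention exactly (softmax is permutation-equivariant over keys), which makes the speed/accuracy trade-off of smaller `keep` easy to measure.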
  • vidar-robotwin Public

    RoboTwin evaluation code for vidar.

    Python · 4 stars · MIT · 0 forks · 1 open issue · 0 open PRs · Updated Dec 22, 2025
  • vidar Public

    Official repo for vidar and vidarc: video foundation models for robotics.

    Python · 35 stars · 1 fork · 1 open issue · 0 open PRs · Updated Dec 22, 2025