Pinned

  1. flash-linear-attention (Public)

    🚀 Efficient implementations of state-of-the-art linear attention models in Torch and Triton (a sketch of the core idea follows this list)

    Python · 2.3k stars · 149 forks

  2. flame (Public)

    🔥 A minimal training framework for scaling FLA models

    Python · 101 stars · 16 forks

  3. native-sparse-attention (Public)

    🐳 Efficient Triton implementations for "Native Sparse Attention: Hardware-Aligned and Natively Trainable Sparse Attention"

    Python · 624 stars · 30 forks
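The one-line description in item 1 is easier to grasp with the underlying idea spelled out. Below is a minimal, illustrative PyTorch sketch of causal linear attention, the model family these repositories implement: softmax(QKᵀ)V is replaced by a feature-mapped product φ(Q)(φ(K)ᵀV), which a causal model can maintain as a small running state in O(L) time. This is not flash-linear-attention's API or its Triton kernels; the elu+1 feature map, the shapes, and the function name are assumptions made for illustration.

```python
# Illustrative sketch only -- not flash-linear-attention's implementation.
# The elu+1 feature map is one common choice, assumed here for concreteness.
import torch
import torch.nn.functional as F

def causal_linear_attention(q, k, v):
    """q, k, v: (batch, heads, seq_len, head_dim).

    Softmax attention costs O(L^2); linear attention replaces
    softmax(QK^T) with phi(Q) phi(K)^T for a positive feature map phi,
    so the causal prefix sum becomes a running (d x d) state in O(L).
    """
    phi = lambda x: F.elu(x) + 1          # positive feature map (assumed)
    q, k = phi(q), phi(k)
    b, h, L, d = q.shape
    state = q.new_zeros(b, h, d, v.shape[-1])  # running sum of k_t v_t^T
    norm = q.new_zeros(b, h, d)                # running sum of k_t
    out = torch.empty_like(v)
    for t in range(L):  # recurrent O(L) form; real kernels compute this in chunks
        state = state + k[:, :, t, :, None] * v[:, :, t, None, :]
        norm = norm + k[:, :, t]
        num = torch.einsum('bhd,bhde->bhe', q[:, :, t], state)
        den = torch.einsum('bhd,bhd->bh', q[:, :, t], norm).clamp(min=1e-6)
        out[:, :, t] = num / den[..., None]
    return out

if __name__ == "__main__":
    q, k, v = (torch.randn(2, 4, 128, 64) for _ in range(3))
    print(causal_linear_attention(q, k, v).shape)  # torch.Size([2, 4, 128, 64])
```

The per-step loop makes the O(L) recurrence explicit; the repositories above exist precisely because this naive loop is slow on GPUs, so they fuse the chunked form into Triton kernels.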

Repositories

Showing 8 of 8 repositories
  • flash-linear-attention (Public)

    🚀 Efficient implementations of state-of-the-art linear attention models in Torch and Triton

    Python · 2,276 stars · MIT license · 149 forks · 35 issues · 3 PRs · Updated Apr 17, 2025
  • fla-zoo (Public)

    Flash-Linear-Attention models beyond language

    Python · 11 stars · 1 fork · 0 issues · 0 PRs · Updated Apr 15, 2025
  • flame (Public)

    🔥 A minimal training framework for scaling FLA models

    Python · 101 stars · MIT license · 16 forks · 2 issues · 0 PRs · Updated Apr 12, 2025
  • fla-rl (Public)

    A minimal RL framework for scaling FLA models in long-horizon reasoning and agentic scenarios.

    4 stars · MIT license · 0 forks · 0 issues · 0 PRs · Updated Apr 1, 2025
  • ThunderKittens (Public, forked from HazyResearch/ThunderKittens)

    Tile primitives for speedy kernels

    Cuda · 2 stars · MIT license · 134 forks · 0 issues · 0 PRs · Updated Mar 27, 2025
  • native-sparse-attention (Public)

    🐳 Efficient Triton implementations for "Native Sparse Attention: Hardware-Aligned and Natively Trainable Sparse Attention"

    Python · 624 stars · MIT license · 30 forks · 7 issues · 0 PRs · Updated Mar 19, 2025
  • flash-hybrid-attention (Public)

    7 stars · 0 forks · 0 issues · 0 PRs · Updated Mar 5, 2025
  • flash-bidirectional-linear-attention (Public)

    Triton implementation of bidirectional (non-causal) linear attention (see the sketch after this list)

    Python · 46 stars · MIT license · 1 fork · 1 issue · 0 PRs · Updated Feb 4, 2025
