Install failing on Ubuntu 22.04 #87

Closed
destin-v opened this issue Mar 14, 2025 · 1 comment

@destin-v

When I run the install instructions, the build fails at pip install flash-attn --no-build-isolation. This is a fresh install using Miniconda.

# Clone this repo
git clone [email protected]:thu-ml/RoboticsDiffusionTransformer.git
cd RoboticsDiffusionTransformer

# Create a Conda environment
conda create -n rdt python=3.10.0
conda activate rdt

# Install PyTorch
# Look up https://pytorch.org/get-started/previous-versions/ with your CUDA version for the correct command
pip install torch==2.1.0 torchvision==0.16.0  --index-url https://download.pytorch.org/whl/cu121

# Install packaging
pip install packaging==24.0

# Install flash-attn
pip install flash-attn --no-build-isolation   # FAILS HERE

# Install other prerequisites
pip install -r requirements.txt

The build fails with the following output (excerpt):

miniconda3/envs/rdt/lib/python3.10/site-packages/torch/include/ATen/core/Generator.h:167:75: note: in passing argument 1 of ‘T* at::get_generator_or_default(const c10::optional<at::Generator>&, const at::Generator&) [with T = at::CUDAGeneratorImpl]’
        167 | static inline T* get_generator_or_default(const c10::optional<Generator>& gen, const Generator& default_gen) {
            |                                           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~
      error: command '/usr/bin/g++' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash-attn)
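
For anyone hitting the same error: this compiler failure typically shows up when flash-attn has no prebuilt wheel for the torch/CUDA combination and compiles its CUDA extension locally against a toolkit that is missing or doesn't match the PyTorch wheel (cu121 here). Before retrying, it may be worth checking what the environment actually provides; a minimal sketch, assuming the rdt environment from above is active:

# Check whether a CUDA compiler is visible in the environment;
# no output from `which` usually means no toolkit is installed.
which nvcc && nvcc --version

# Check which CUDA version the installed PyTorch wheel was built
# against (should print 12.1 for the cu121 wheel above).
python -c "import torch; print(torch.version.cuda, torch.cuda.is_available())"

If nvcc is missing, or reports a different major.minor version than torch.version.cuda, the extension build compiles against mismatched headers, which would explain the g++ failure above.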
@destin-v (Author)

Solved it: Conda needs its own installation of the CUDA toolkit in order for the flash-attn build to work:

conda activate rdt
conda install cuda -c nvidia
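
If installing an unpinned cuda package pulls in a toolkit newer than the torch wheel, pinning the version may be safer. A variant sketch (the cuda-toolkit=12.1 pin matching the cu121 wheel above is my assumption, not something verified in this thread):

conda activate rdt
# Pin the toolkit to the same CUDA version as the torch cu121 wheel
# installed earlier, to avoid header/compiler version skew.
conda install -c nvidia cuda-toolkit=12.1

# Verify the compiler is picked up, then retry the failing step.
nvcc --version
pip install flash-attn --no-build-isolation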
