
[Help] Environment setup error: installing flash-attn fails, and it does not support the 910B. Is the Ascend 910B currently supported? Is there another way? #16

Open
Kaimar666 opened this issue Mar 5, 2025 · 2 comments

Comments

@Kaimar666

Error message:
flash_attn-2.7.4.post1.tar.gz (6.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 66.7 MB/s eta 0:00:00
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [6 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<string>", line 34, in <module>
File "/tmp/pip-install-_oh5xiov/flash-attn_f996f45be23845c5b7293c3958f895c0/setup.py", line 22, in <module>
import torch
ModuleNotFoundError: No module named 'torch'
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
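Regarding this specific metadata error (separate from the 910B support question): pip builds flash-attn in an isolated environment that cannot see the already-installed torch 2.1.0, so `setup.py` fails at `import torch`. A commonly suggested workaround, sketched below, is to disable build isolation; note that even then, flash-attn's kernels are CUDA-only, so the compile step is still expected to fail on an Ascend host without CUDA:

```shell
# torch 2.1.0 is already installed (see the pip list below), but pip's
# isolated build environment cannot see it. --no-build-isolation builds
# flash-attn in the current environment instead, so "import torch" works.
pip install flash-attn --no-build-isolation

# Caveat: this only fixes the "No module named 'torch'" metadata error.
# flash-attn compiles CUDA kernels, so the build will still fail on an
# Ascend 910B machine that has no CUDA toolchain.
```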


Package Version


accelerate 1.0.1
aiofiles 23.2.1
aiohappyeyeballs 2.4.3
aiohttp 3.10.10
aiosignal 1.3.1
annotated-types 0.7.0
anyio 4.6.2.post1
arrow 1.2.3
ascendebug 0.1.0
asttokens 2.4.1
async-timeout 4.0.3
attrs 24.2.0
auto-tune 0.1.0
av 13.1.0
binaryornot 0.4.4
certifi 2024.8.30
cffi 1.15.1.post20240308173724
chardet 5.2.0
charset-normalizer 3.4.0
click 8.1.7
comm 0.2.2
configparser 5.2.0
contourpy 1.3.0
cookiecutter 2.6.0
cryptography 3.4.7
cycler 0.12.1
dataflow 0.0.1
datasets 3.1.0
debugpy 1.8.7
decorator 5.1.1
deepspeed 0.15.4
dill 0.3.8
docstring_parser 0.16
einops 0.8.0
ephemeral-port-reserve 1.1.4
esdk-obs-python 3.21.4
exceptiongroup 1.2.2
executing 2.1.0
fastapi 0.115.4
ffmpy 0.4.0
filelock 3.16.1
fire 0.7.0
fonttools 4.54.1
frozenlist 1.5.0
fsspec 2024.9.0
gradio 4.44.1
gradio_client 1.3.0
h11 0.14.0
hccl 0.1.0
hccl-parser 0.1
hjson 3.1.0
httpcore 1.0.6
httpx 0.27.2
huaweicloudsdkcore 3.1.8
huaweicloudsdkcsms 3.1.8
huggingface-hub 0.26.2
idna 3.10
importlib-metadata 4.4.0
importlib_resources 6.4.5
ipykernel 6.29.5
ipython 8.29.0
jedi 0.19.1
Jinja2 3.1.4
jupyter_client 8.6.3
jupyter_core 5.7.2
kiwisolver 1.4.7
llamafactory 0.9.1
llm-datadist 0.0.1
llm-engine 0.0.1
lxml 5.1.0
ma-cli 1.2.3
markdown-it-py 3.0.0
MarkupSafe 2.1.5
matplotlib 3.9.2
matplotlib-inline 0.1.7
mdurl 0.1.2
mock 4.0.3
modelarts 1.4.20
mpmath 1.3.0
msadvisor 1.0.0
msgpack 1.1.0
multidict 6.1.0
multiprocess 0.70.16
nest-asyncio 1.6.0
networkx 3.4.2
ninja 1.11.1.3
npu-bridge 1.15.0
npu-device 0.1
numpy 1.26.4
op-compile-tool 0.1.0
op-gen 0.1
op-test-frame 0.1
opc-tool 0.1.0
orjson 3.10.11
packaging 24.1
pandas 2.2.3
parso 0.8.4
peft 0.12.0
pexpect 4.9.0
pillow 10.4.0
pip 24.2
platformdirs 4.3.6
prettytable 3.5.0
prompt_toolkit 3.0.48
propcache 0.2.0
protobuf 5.28.3
psutil 6.1.0
ptyprocess 0.7.0
pure_eval 0.2.3
py-cpuinfo 9.0.0
pyarrow 18.0.0
pycparser 2.21
pydantic 2.9.2
pydantic_core 2.23.4
pydub 0.25.1
Pygments 2.18.0
pyparsing 3.2.0
python-dateutil 2.9.0.post0
python-multipart 0.0.17
python-slugify 8.0.4
pytz 2024.2
PyYAML 6.0.2
pyzmq 26.2.0
regex 2024.9.11
requests 2.32.3
requests-futures 1.0.0
requests-toolbelt 0.10.1
rich 13.9.4
ring-flash-attn 0.1.4
ruff 0.7.2
safetensors 0.4.5
schedule-search 0.0.1
scipy 1.14.1
semantic-version 2.10.0
sentencepiece 0.2.0
setuptools 75.1.0
shellingham 1.5.4
shtab 1.7.1
simplejson 3.17.0
six 1.16.0
sniffio 1.3.1
sse-starlette 2.1.3
stack-data 0.6.3
starlette 0.41.2
sympy 1.13.3
tabulate 0.9.0
te 0.4.0
tenacity 8.1.0
termcolor 2.5.0
text-unidecode 1.3
tiktoken 0.8.0
tokenizers 0.19.1
tomlkit 0.12.0
torch 2.1.0
torch-npu 2.1.0.post3
tornado 6.4.1
tqdm 4.67.0
traitlets 5.14.3
transformers 4.43.2
trl 0.9.6
typer 0.12.5
typing_extensions 4.12.2
tyro 0.8.14
tzdata 2024.2
urllib3 2.2.3
uvicorn 0.32.0
wcwidth 0.2.13
websockets 12.0
wheel 0.44.0
xxhash 3.5.0
yarl 1.17.1
zipp 3.7.0

@Eisenhower

Same question here.

@HaoshengZou
Collaborator

This is most likely because the Ascend 910B does not natively support flash attention, and our current sequence-parallel implementation is built on flash attention.
@Eisenhower @Kaimar666 We are not very familiar with the Ascend 910B. Do you know whether it supports DeepSpeed? If it does, then it also supports DeepSpeed Ulysses, and we have been planning to add sequence parallelism implemented via DeepSpeed Ulysses.
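As a starting point for the DeepSpeed question, here is a minimal sketch (assuming the standard `torch_npu` adapter API, which registers an `npu` device with PyTorch on import) that checks what the 910B environment actually provides:

```python
import importlib.util


def has_module(name: str) -> bool:
    """Return True if the module is importable, without importing it."""
    return importlib.util.find_spec(name) is not None


# Report which of the relevant packages are present in this environment.
for mod in ("torch", "torch_npu", "deepspeed"):
    print(f"{mod}: {'found' if has_module(mod) else 'missing'}")

# If torch_npu is present, importing it registers the NPU device with
# torch, after which torch.npu.is_available() reports device status.
if has_module("torch_npu"):
    import torch
    import torch_npu  # noqa: F401  (side effect: registers 'npu' device)

    print("NPU available:", torch.npu.is_available())
```

If `deepspeed` imports and the NPU is visible, that would suggest DeepSpeed Ulysses is worth trying on this machine (the pip list above already shows deepspeed 0.15.4 and torch-npu 2.1.0.post3 installed).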
