can this one be executed without flashattention? #13

Open
4eJIoBek1 opened this issue Jun 28, 2024 · 3 comments

Comments

@4eJIoBek1

flash-attention is implemented only for Ampere and newer GPUs, and it also prevents the code from running on CPU, which is not good.
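
For anyone looking for a workaround: the snippet below is a minimal sketch (not this repo's actual code) of the usual pattern for making flash-attn optional. It guards the import and falls back to PyTorch's built-in `F.scaled_dot_product_attention`, which runs on CPU and on pre-Ampere GPUs; the `attention` helper and its tensor layout are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

try:
    # flash-attn only builds/runs on Ampere (SM 8.0) and newer GPUs
    from flash_attn import flash_attn_func
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False

def attention(q, k, v, causal=False):
    """q, k, v: (batch, seq_len, num_heads, head_dim). Hypothetical helper."""
    if HAS_FLASH_ATTN and q.is_cuda and q.dtype in (torch.float16, torch.bfloat16):
        # flash_attn_func takes (batch, seq_len, num_heads, head_dim) directly
        return flash_attn_func(q, k, v, causal=causal)
    # Fallback: PyTorch's fused SDPA expects (batch, num_heads, seq_len, head_dim)
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v, is_causal=causal)
    return out.transpose(1, 2)
```

With a guard like this, flash-attn becomes a pure speed optimization rather than a hard dependency, and the same checkpoint can run on CPU or older GPUs.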

@oliverban

I have a 3090, so flash-attn should work according to https://pypi.org/project/flash-attn/, but I still can't install it on my Windows machine. Any luck?

@jaykup1

jaykup1 commented Jul 3, 2024

> I have a 3090, so flash-attn should work according to https://pypi.org/project/flash-attn/, but I still can't install it on my Windows machine. Any luck?

#3 (comment)

@oliverban

> I have a 3090, so flash-attn should work according to https://pypi.org/project/flash-attn/, but I still can't install it on my Windows machine. Any luck?
>
> #3 (comment)

Thank you!
