MI50 Support #29
Comments
Hi @YehowshuaScaled. I think it would be better to ask the CK team whether they are going to support the MI50. It won't be an issue if they have FA kernels running on the MI50.

RuntimeError: DeviceGroupedMultiheadAttentionForward_Xdl_CShuffle_V2<256, 128, 128, 32, 8, 8, 128, 128, 32, 2, Default, ASpecDefault, B0SpecDefault, B1SpecDefault, CSpecDefault, MaskUpperTriangleFromTopLeft> does not support this problem

This error is actually raised from the CK (Composable Kernel) backend.
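For context (an inference from the kernel name, not something confirmed in this thread): the "Xdl" in the kernel name refers to CK's XDL/MFMA matrix-core code path, which gfx908 (MI100) and later CDNA GPUs have but gfx906 (MI50/MI60) lacks. A minimal pre-flight check sketch, assuming a ROCm build of PyTorch recent enough to expose gcnArchName:

import torch

# Archs with MFMA matrix cores, which CK's "Xdl" kernels target.
# This list is an assumption; adjust it for your CK/ROCm version.
MFMA_ARCHS = {"gfx908", "gfx90a", "gfx940", "gfx941", "gfx942"}

def ck_xdl_supported(device_index: int = 0) -> bool:
    if not torch.cuda.is_available():
        return False
    # On ROCm builds, gcnArchName looks like "gfx906:sramecc+:xnack-".
    arch = torch.cuda.get_device_properties(device_index).gcnArchName
    return arch.split(":")[0] in MFMA_ARCHS

print(ck_xdl_supported())  # False on an MI50 (gfx906), True on an MI100 (gfx908)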
I noticed this line in setup.py.
Did you solve this?
Hello @YehowshuaScaled, did you find a solution for it? I have 2x MI60 cards.
Hi @jayz0123, how hard is it to implement FA kernels for the MI60? Can you point me to the relevant scripts and documentation for making the changes? What knowledge is required to port FA2 to the MI60? Does it depend only on Composable Kernel repo support?
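Not an answer to the porting question, but a workaround sketch while CK support is unresolved: probe at runtime whether the installed flash-attn build actually runs on the local GPU, and fall back to plain PyTorch attention otherwise. flash_attn_func is the package's public API; it takes tensors laid out as (batch, seqlen, nheads, headdim).

import torch
import torch.nn.functional as F
from flash_attn import flash_attn_func

def attention(q, k, v, causal=True):
    # q, k, v: (batch, seqlen, nheads, headdim), fp16/bf16 tensors on the GPU.
    try:
        return flash_attn_func(q, k, v, causal=causal)
    except RuntimeError:
        # The CK kernel rejected the problem (e.g. on gfx906). Fall back to
        # PyTorch SDPA, which uses a (batch, nheads, seqlen, headdim) layout.
        out = F.scaled_dot_product_attention(
            q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2),
            is_causal=causal,
        )
        return out.transpose(1, 2)

The fallback computes mathematically equivalent attention, just without flash-attention's memory savings, so long sequences that only fit with FA will still run out of memory.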
I was able to build flash-attention for ROCm for both my MI100 and MI50 cards, but only got flash attention working on the MI100 (very impressive performance, I might add).
Running flash attention on the MI50 produced the following error:
RuntimeError: DeviceGroupedMultiheadAttentionForward_Xdl_CShuffle_V2<256, 128, 128, 32, 8, 8, 128, 128, 32, 2, Default, ASpecDefault, B0SpecDefault, B1SpecDefault, CSpecDefault, MaskUpperTriangleFromTopLeft> does not support this problem
How hard would it be to port FA to the MI50? I'm happy to pay/hire for support on this, as I have a rather large stockpile of MI50s.
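A minimal repro sketch for the failure mode above, assuming the ROCm flash-attention build described in this thread (the shapes are illustrative):

import torch
from flash_attn import flash_attn_func

q = torch.randn(2, 1024, 16, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# On an MI100 (gfx908) this returns normally; on an MI50 (gfx906) it raises
# the RuntimeError quoted above from the CK backend.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 16, 64])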