Add back skipped test cases
poyenc committed Dec 28, 2024

This commit was created on GitHub.com and signed with GitHub’s verified signature.
1 parent e471acc commit f547b58
Showing 1 changed file: tests/test_flash_attn_ck_fa3.py (0 additions, 6 deletions)
@@ -80,12 +80,6 @@ def pad_rearrange_dropout_mask_hts_to_bhss(S_dmask, cu_seqlens_q, seqlen_q_round
 def test_flash_attn_output(
     seqlen_q, seqlen_k, d, dropout_p, causal, local, alibi, deterministic, mha_type, dtype, kvpacked
 ):
-    if d == 64 and causal and dtype is torch.bfloat16:
-        pytest.skip("hd=64,dtype=bf16 with causal mask not supported")
-
-    if d == 128 and causal:
-        pytest.skip("hd=128 with causal mask not supported")
-
     device = "cuda"
     # set seed
     torch.random.manual_seed(0)
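For context on the pattern being deleted: `pytest.skip` called inside a parametrized test marks only that one parameter combination as skipped at runtime. The guard logic this commit removes can be sketched as a standalone predicate; the helper name `should_skip` and the string dtype tags ("fp16"/"bf16") below are hypothetical illustrations, not identifiers from the repository (the real test compares against `torch.bfloat16`):

```python
import pytest

# Hypothetical helper mirroring the two guards the commit deletes:
# parameter combinations the CK FA3 backend previously could not run.
def should_skip(d, causal, dtype):
    """Return a skip reason for an unsupported case, or None if supported."""
    if d == 64 and causal and dtype == "bf16":
        return "hd=64,dtype=bf16 with causal mask not supported"
    if d == 128 and causal:
        return "hd=128 with causal mask not supported"
    return None

@pytest.mark.parametrize("d", [64, 128])
@pytest.mark.parametrize("causal", [False, True])
@pytest.mark.parametrize("dtype", ["fp16", "bf16"])
def test_flash_attn_output_sketch(d, causal, dtype):
    reason = should_skip(d, causal, dtype)
    if reason is not None:
        pytest.skip(reason)  # this combination ends as "skipped", not "failed"
    # ... the actual attention output checks would run here ...
```

Deleting the two `if`/`pytest.skip` blocks, as the commit does, means those combinations execute again and must now pass for the suite to stay green.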
