
Commit

[fix] fix flash attn
duanjunwen committed Nov 14, 2024
1 parent a259651 commit 2154aaf
Showing 1 changed file with 2 additions and 1 deletion.
colossalai/shardformer/layer/attn.py (3 changes: 2 additions & 1 deletion)
@@ -644,7 +644,8 @@ def forward(
         max_seqlen_half = max_seqlen // 2
 
         misc_kwargs = {
-            "window_size": (-1, -1),
+            "window_size_left": -1,
+            "window_size_right": -1,
             "alibi_slopes": None,
             "softmax_scale": q.shape[-1] ** -0.5 if softmax_scale is None else softmax_scale,
             "dropout_p": dropout_p,
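The change tracks an upstream flash-attn API update: newer releases of the kernel take the sliding-window limits as two scalar arguments (window_size_left / window_size_right, where -1 means unbounded on that side) instead of a single window_size tuple. Below is a minimal sketch of a version-compatible way to build these kwargs; the helper name and the 2.7.0 version cutoff are assumptions for illustration, not part of this commit:

    # Hypothetical helper (not from this commit): build the window-size kwargs
    # matching the installed flash-attn API. Assumes the tuple-valued
    # "window_size" argument was split into two scalars around flash-attn 2.7.0.
    from packaging import version

    import flash_attn


    def window_size_kwargs(left: int = -1, right: int = -1) -> dict:
        """Return the window-size kwargs for the installed flash-attn version."""
        if version.parse(flash_attn.__version__) >= version.parse("2.7.0"):
            # Newer API: two scalar arguments; -1 disables the window on that side.
            return {"window_size_left": left, "window_size_right": right}
        # Older API: a single (left, right) tuple.
        return {"window_size": (left, right)}


    # Usage, mirroring the misc_kwargs dict in the diff above:
    misc_kwargs = {
        **window_size_kwargs(),
        "alibi_slopes": None,
        "dropout_p": 0.0,
    }

Pinning the branch on the installed package version keeps one code path working across both APIs instead of hard-coding the renamed keys.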
