Commit 8baaa74

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
1 parent 7adf804 commit 8baaa74

File tree

1 file changed: +1 −2 lines changed


monai/torch.patch

Lines changed: 1 addition & 2 deletions
@@ -3,8 +3,7 @@
  @@ -150,6 +150,7 @@
  ), "is_causal and attn_mask cannot be set at the same time"
  assert not enable_gqa, "conversion of scaled_dot_product_attention not implemented if enable_gqa is True"
 -
 +
  +    scale = symbolic_helper._maybe_get_const(scale, "f")
  if symbolic_helper._is_none(scale):
  scale = _attention_scale(g, query)
 -
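The patched hunk calls `symbolic_helper._maybe_get_const(scale, "f")` before the `_is_none(scale)` check, so a `scale` that arrives wrapped as a constant graph value is unwrapped to a plain float, and only a genuinely absent scale falls back to the computed `_attention_scale`. A minimal sketch of that unwrap-then-default pattern, using hypothetical stand-ins (`maybe_get_const`, `is_none`, `resolve_scale`) rather than torch.onnx's actual internals:

```python
# Sketch of the constant-unwrapping pattern from the patch above.
# `maybe_get_const` and `is_none` are simplified, hypothetical stand-ins
# for torch.onnx.symbolic_helper._maybe_get_const / _is_none.

def maybe_get_const(value, desc="f"):
    """Return the Python constant wrapped in `value` if it models a
    Constant graph node; otherwise return `value` unchanged."""
    if isinstance(value, dict) and value.get("kind") == "Constant":
        return value["value"]
    return value

def is_none(value):
    """A graph-level None is modeled here as the Python None."""
    return value is None

def resolve_scale(scale, default_scale):
    # Unwrap a constant scale ("f" = float) BEFORE the None check,
    # mirroring the line the patch inserts.
    scale = maybe_get_const(scale, "f")
    if is_none(scale):
        # No explicit scale given: fall back to the computed
        # attention scale (1/sqrt(head_dim) in the real exporter).
        scale = default_scale
    return scale

# A constant node is unwrapped to its float payload:
print(resolve_scale({"kind": "Constant", "value": 0.125}, 1.0))  # 0.125
# An absent scale falls back to the default:
print(resolve_scale(None, 1.0))  # 1.0
```

Without the unwrap step, a constant-wrapped scale would pass the None check but remain a graph value rather than a float, which is the kind of mismatch the inserted line guards against.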
