Commit ea687ad

Update monai/networks/blocks/selfattention.py

Authored by Lucas-rbnt and ericspod
Co-authored-by: Eric Kerfoot <[email protected]>
Signed-off-by: Lucas Robinet <[email protected]>

1 parent 2d8086e · commit ea687ad

File tree

1 file changed: 2 additions, 1 deletion


monai/networks/blocks/selfattention.py

Lines changed: 2 additions & 1 deletion
@@ -194,7 +194,8 @@ def forward(self, x, attn_mask: torch.Tensor | None = None):
         att_mat = self.rel_positional_embedding(x, att_mat, q)
 
         if self.causal:
-            assert attn_mask is None, "Causal attention does not support attention masks."
+            if attn_mask is not None:
+                raise ValueError("Causal attention does not support attention masks.")
             att_mat = att_mat.masked_fill(self.causal_mask[:, :, : x.shape[-2], : x.shape[-2]] == 0, float("-inf"))
 
         if attn_mask is not None:
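The motivation for this change is that `assert` statements are stripped when Python runs with the `-O` flag, so the guard would silently disappear in optimized runs; raising `ValueError` keeps the check unconditionally. A minimal sketch of the pattern (a hypothetical class, not MONAI's actual `SABlock`):

```python
class CausalSelfAttentionSketch:
    """Hypothetical minimal block illustrating the guard pattern from the diff.

    This is not MONAI's implementation; it only shows why an explicit
    ``raise ValueError`` is preferable to ``assert`` for input validation.
    """

    def __init__(self, causal: bool = True):
        self.causal = causal

    def forward(self, x, attn_mask=None):
        if self.causal and attn_mask is not None:
            # An `assert` here would be removed under `python -O`;
            # raising ValueError keeps the check in optimized runs too.
            raise ValueError("Causal attention does not support attention masks.")
        # Attention computation elided; the input is returned unchanged
        # purely to keep the sketch runnable.
        return x
```

Callers passing an explicit mask to a causal block now get a `ValueError` they can catch, rather than an `AssertionError` that may never fire.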
