
Commit 7a87d45

selfattention block: Remove the fc linear layer if it is not used
1 parent: 8dcb9dc

1 file changed (+2, -1)

monai/networks/blocks/selfattention.py

Lines changed: 2 additions & 1 deletion
@@ -106,7 +106,8 @@ def __init__(
 
         self.num_heads = num_heads
         self.hidden_input_size = hidden_input_size if hidden_input_size else hidden_size
-        self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        if include_fc:
+            self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
 
         self.qkv: Union[nn.Linear, nn.Identity]
         self.to_q: Union[nn.Linear, nn.Identity]

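For context, here is a minimal, hypothetical sketch of the pattern this commit introduces: the fc output projection is only instantiated when include_fc is true, so an unused layer never allocates parameters or appears in the state dict. The forward pass, the qkv shapes, and the inner_dim == hidden_size simplification below are illustrative assumptions made for a self-contained example, not MONAI's actual SABlock implementation.

import torch
import torch.nn as nn
from typing import Optional

class SABlockSketch(nn.Module):
    """Hypothetical sketch of the gating pattern; not MONAI's actual SABlock."""

    def __init__(self, hidden_size: int, num_heads: int, include_fc: bool = True,
                 hidden_input_size: Optional[int] = None):
        super().__init__()
        assert hidden_size % num_heads == 0
        self.num_heads = num_heads
        self.hidden_input_size = hidden_input_size if hidden_input_size else hidden_size
        self.inner_dim = hidden_size  # simplifying assumption for this sketch
        self.include_fc = include_fc
        # The commit's change: create the fc output projection only when it is used.
        if include_fc:
            self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
        self.qkv = nn.Linear(self.hidden_input_size, self.inner_dim * 3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, _ = x.shape
        h, d = self.num_heads, self.inner_dim // self.num_heads
        # Split qkv and reshape to (batch, heads, tokens, head_dim).
        q, k, v = (t.view(b, n, h, d).transpose(1, 2)
                   for t in self.qkv(x).chunk(3, dim=-1))
        out = torch.nn.functional.scaled_dot_product_attention(q, k, v)
        out = out.transpose(1, 2).reshape(b, n, self.inner_dim)
        # Apply the projection only when it exists; otherwise pass through.
        return self.out_proj(out) if self.include_fc else out

One caveat the diff itself implies: with include_fc=False the module has no out_proj attribute at all, so any code path that touches self.out_proj unconditionally must be gated on include_fc as well.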