
Something wrong with code in attention.py #1

@harudaee

Description

Excuse me, sir.

I read your file 'attention.py' and found something that looks wrong. On line 62 there is 'return x + (x * Mf)'.

According to Eq. (2) in the paper BAM: Bottleneck Attention Module, it should be 'return x * Mf', because the file already defines 'Mf = 1 + self.sigmoid(Mc * Ms)'. Since 'x * (1 + sigmoid(...))' expands to 'x + x * sigmoid(...)', multiplying by 'Mf' already includes the residual term; adding 'x' again yields '2x + x * sigmoid(...)' and double-counts the input.
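A minimal sketch checking this claim numerically, assuming `Mc` and `Ms` are broadcastable channel and spatial attention logits (NumPy stand-ins here, not the repo's actual module; the shapes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the tensors in attention.py:
x = rng.standard_normal((2, 4, 8, 8))   # input feature map F
Mc = rng.standard_normal((2, 4, 1, 1))  # channel attention (broadcast over H, W)
Ms = rng.standard_normal((2, 1, 8, 8))  # spatial attention (broadcast over C)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# As defined in attention.py:
Mf = 1 + sigmoid(Mc * Ms)

# Eq. (2) of the BAM paper: F' = F + F * M(F)
paper = x + x * sigmoid(Mc * Ms)

# 'return x * Mf' matches Eq. (2), since x * (1 + s) == x + x * s
assert np.allclose(x * Mf, paper)

# 'return x + (x * Mf)' instead gives 2x + x * s, double-counting x
buggy = x + (x * Mf)
assert np.allclose(buggy, 2 * x + x * sigmoid(Mc * Ms))
```

So the refinement the paper describes is already baked into `Mf`, and the extra `x +` in the return statement adds the input a second time.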
