
About ABMIL #34

Open
InfinityBox opened this issue Mar 8, 2022 · 2 comments

@InfinityBox

I have read ABMIL's code (https://github.com/AMLab-Amsterdam/AttentionDeepMIL); the input size of its model is (1, bag_length, 1, width, height). Because the images in that dataset are very small, memory problems may not occur. But with WSIs, bag_length, width, and height are very large, so how can ABMIL be applied to WSIs for the comparison?
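
For scale, a rough back-of-the-envelope sketch (all numbers are assumptions, not measurements) of why that raw-patch input shape blows up on WSIs:

    # Rough memory estimate for a single bag shaped (1, bag_length, 1, width, height),
    # using assumed values; a WSI tiled at high magnification easily reaches this size.
    bag_length = 10_000            # patches per slide (assumed)
    width = height = 224           # patch size in pixels (assumed)
    bytes_per_float32 = 4
    raw_bag_bytes = bag_length * 1 * width * height * bytes_per_float32
    print(f"~{raw_bag_bytes / 1e9:.1f} GB per grayscale bag")   # ~2.0 GB; roughly 3x more for RGB

Holding that whole tensor, plus the convolutional activations computed on top of it, in GPU memory for a single forward pass is what makes the raw-patch formulation impractical for WSIs.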

@binli123
Owner

binli123 commented Mar 8, 2022

You only need to swap in the ABMIL aggregation head, which takes precomputed feature vectors as inputs.
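
For illustration, a minimal sketch of such an aggregation head (this is not the repository's code; the feature dimension of 512 and hidden size of 128 are assumptions, and the attention pooling follows the ABMIL formulation):

    # ABMIL-style attention aggregation head operating on precomputed patch embeddings,
    # so raw WSI patches never have to fit in GPU memory at once.
    import torch
    import torch.nn as nn

    class ABMILHead(nn.Module):
        def __init__(self, feat_dim=512, hidden_dim=128, num_classes=1):
            super().__init__()
            # Attention network: scores each instance embedding with a scalar.
            self.attention = nn.Sequential(
                nn.Linear(feat_dim, hidden_dim),
                nn.Tanh(),
                nn.Linear(hidden_dim, 1),
            )
            # Bag-level classifier on the attention-pooled embedding.
            self.classifier = nn.Linear(feat_dim, num_classes)

        def forward(self, feats):
            # feats: (num_patches, feat_dim) precomputed embeddings for one WSI bag.
            a = self.attention(feats)                 # (num_patches, 1) attention scores
            a = torch.softmax(a, dim=0)               # weights over instances sum to 1
            bag_embedding = (a * feats).sum(dim=0)    # (feat_dim,) weighted average
            logits = self.classifier(bag_embedding)   # (num_classes,) bag prediction
            return logits, a

The bag then enters training as a (num_patches, feat_dim) tensor of embeddings computed offline, so memory scales with the feature dimension rather than the patch resolution.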

@InfinityBox
Author

Thank you for your reply.
Are these two lines of code the aggregation head that I need to swap out, replacing them with an input layer suited to precomputed feature vectors?
https://github.com/AMLab-Amsterdam/AttentionDeepMIL/blob/master/model.py#L41

        H = self.feature_extractor_part1(x)   # conv feature extractor applied to raw patch images
        H = H.view(-1, 50 * 4 * 4)            # flatten conv maps into one vector per instance
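
A hedged sketch of what that replacement could look like (the names feats and feat_dim, the 512-dimensional embedding, and the 500-dimensional hidden size are assumptions, not a confirmed change to the repository):

    # The conv feature extractor is dropped and the bag arrives as precomputed
    # embeddings of shape (num_patches, feat_dim), computed offline from the WSI patches.
    import torch
    import torch.nn as nn

    feat_dim = 512
    input_layer = nn.Sequential(nn.Linear(feat_dim, 500), nn.ReLU())

    def embed_bag(feats):
        # Replaces: H = self.feature_extractor_part1(x); H = H.view(-1, 50 * 4 * 4)
        H = input_layer(feats)          # (num_patches, 500), ready for the attention module
        return H

    bag = torch.randn(2000, feat_dim)   # e.g. 2000 patch embeddings for one slide
    H = embed_bag(bag)                  # the rest of the forward pass stays unchanged

The attention and classifier parts of the forward pass can stay as they are, since they already operate on per-instance vectors H.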
