about the training code #6
@AZZMM
Thank you for the quick response, looking forward to the updates.
Excuse me, I've implemented the training code and I am confused about the details of the inhibition loss. I learned from the paper that the loss is based on the attention map in the frozen cross-attention layer, which has a given shape and which I sum along one of its dimensions. My implementation is like this: Looking forward to your reply, thank you!
```python
pre_attn = fuser_info['pre_attn']  # (BPN, heads, HW, 77)
# supplement_mask is the mask of the BG
supplement_mask_inter = F.interpolate(supplement_mask, (H, W), mode=args.inter_mode)
pre_attn_mean = (pre_attn * supplement_mask_inter).sum(dim=-1) /
```

@AZZMM You can refer to this code to implement the inhibition loss. If you still have questions, you can ask me here.
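The snippet above is truncated at the division. A self-contained sketch of how such a masked-attention inhibition loss could be completed is shown below; the denominator, the mask reshaping, the bilinear interpolation mode, and the final mean reduction are my assumptions, not the authors' code:

```python
import torch
import torch.nn.functional as F

def inhibition_loss(pre_attn, supplement_mask, H, W):
    """Hypothetical masked-attention inhibition loss (a sketch, not the repo's code).

    pre_attn:        (BPN, heads, H*W, 77) frozen cross-attention maps
    supplement_mask: (BPN, 1, H0, W0) binary background (BG) mask
    """
    # Resize the background mask to the attention-map resolution (mode is assumed).
    mask = F.interpolate(supplement_mask, (H, W), mode='bilinear')
    mask = mask.flatten(2).permute(0, 2, 1)  # (BPN, H*W, 1)
    mask = mask.unsqueeze(1)                 # (BPN, 1, H*W, 1), broadcasts over heads
    # Average attention over the 77 text tokens on background pixels only,
    # then reduce to a scalar penalty (mean reduction is an assumption).
    attn_mean = (pre_attn * mask).sum(dim=-1) / pre_attn.shape[-1]
    return attn_mean.mean()
```

Minimizing this term discourages the frozen cross-attention layers from attending to background regions.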
Thank you very much for your quick reply; the code is really helpful to me.
@AZZMM In the training, we don't need negative prompts. BPN means Batch * Phase_num, where Phase_num covers {global prompt, instance1_desc, instance2_desc, ..., instanceN_desc}. We use the first two 16*16 attention maps for calculating the inhibition loss.
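To illustrate the BPN layout described above, here is a minimal sketch of how per-image prompt embeddings could be stacked; the function name, shapes, and embedding dimension are my assumptions for illustration:

```python
import torch

def build_phase_batch(global_tokens, instance_tokens):
    """Stack one image's prompt embeddings into a (Phase_num, 77, dim) tensor.

    global_tokens:   (77, dim) embedding of the global prompt
    instance_tokens: list of N (77, dim) embeddings, one per instance description
    Phase_num = 1 + N  (the global prompt plus N instance descriptions)
    """
    return torch.stack([global_tokens] + instance_tokens, dim=0)

# For a batch of B images, concatenating the per-image stacks gives a
# (B * Phase_num, 77, dim) tensor, matching the BPN leading dimension
# of the cross-attention maps.
```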
Thank you very much for your help! I'll try it. |
Hello! @limuloo There is one detail I am not very sure about: the cross-attention layers without MIGC. Do they use vanilla cross attention or the naive fuser during training?
@AZZMM vanilla cross attention |
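For reference, "vanilla cross attention" here means the standard scaled dot-product formulation; the sketch below is generic, not the repository's exact implementation:

```python
import torch

def vanilla_cross_attention(q, k, v):
    """Standard scaled dot-product cross attention (generic sketch).

    q:    (B, heads, HW, d)  image queries
    k, v: (B, heads, 77, d)  text keys/values
    """
    scale = q.shape[-1] ** -0.5
    # Attention over the 77 text tokens for every image position.
    attn = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1)  # (B, heads, HW, 77)
    return attn @ v  # (B, heads, HW, d)
```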
I see, thank you!
Hi, may I ask when will the MIGC++ be released? |
@WUyinwei-hah We have already finished writing the MIGC++ paper and will submit it in the next few days. After that, we will start working on open-sourcing MIGC++.
Looking forward to it! |
Hello, thanks for the excellent work! May I ask when the training code will be released? Thank you!