[ ReadMe: Object Detection Experiments ] #4

Open
IemProg opened this issue Jun 23, 2024 · 1 comment
IemProg commented Jun 23, 2024

Hi @bjzhb666,

Thanks a lot for releasing the code.

I have been closely studying your implementation and have a few questions that I hope you can help clarify:

  1. Difference between train_own_forget_cl.py and train_own_forget.py:
    I noticed that there are two seemingly similar scripts in the repository: train_own_forget_cl.py and train_own_forget.py. Could you please elaborate on the specific differences between these two files? It would be helpful to understand their distinct purposes and when each script should be used.

  2. Reproducing Object Detection Results with DETR:
    I am particularly interested in reproducing the object detection results you achieved using DETR. Could you provide more detailed instructions or a guide on how to set up and run the code for this task?

  3. Loss function:
    Why do you freeze the loss function parameters?

Thank you once again for your impressive work and for any assistance you can provide.

@bjzhb666 (Owner) commented:
Thanks for your interest in our work.

  1. They are essentially the same. train_own_forget.py is for single-step forgetting and train_own_forget_cl.py is for continual forgetting, but you can also use train_own_forget_cl.py to conduct single-step forgetting. We use train_own_forget.py to run additional ablation studies on single-step forgetting; if you are not interested in those details, just use train_own_forget_cl.py.
  2. We have not released the code for object detection yet (it still needs to be cleaned up), but the procedure is similar: we start from the official checkpoint provided by Deformable DETR, modify the network structure with the LoRA module, and add our GS loss, Knowledge Retention Loss, and Selective Forgetting Loss. A rough sketch of these steps is given below.
  3. The "loss function parameters" are actually the FFN module (the classifier); the name comes from our code base, Face Transformer, and is somewhat confusing. Please refer to our paper (e.g., Sec. 6.1) for more details on why we freeze the FFN layers.
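Since the object-detection code has not been released, the following is only a minimal sketch of the procedure described in point 2, assembled from the description above: wrap a frozen detector's attention projections with LoRA adapters, keep the FFN/classifier frozen, and combine the forgetting and retention terms with a group-sparsity regularizer over the LoRA groups (my reading of the GS loss). The names LoRALinear, add_lora, group_sparse_reg, total_loss, and the target_substrings patterns are illustrative assumptions, not the repository's actual API, and the matching patterns would need to be adapted to Deformable DETR's actual module names.

```python
# Hypothetical sketch; not the authors' released code.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen nn.Linear plus a trainable low-rank update: W x + (alpha / r) * B (A x)."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pre-trained weights stay frozen
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))  # B = 0 -> no change at init
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


def add_lora(model: nn.Module,
             target_substrings=("q_proj", "k_proj", "v_proj", "out_proj")) -> nn.Module:
    """Swap matching nn.Linear layers for LoRA-wrapped ones, then freeze everything else."""
    # Collect replacements first so modules are not mutated while iterating.
    to_replace = [(parent, name, child)
                  for parent in model.modules()
                  for name, child in parent.named_children()
                  if isinstance(child, nn.Linear)
                  and any(s in name for s in target_substrings)]
    for parent, name, child in to_replace:
        setattr(parent, name, LoRALinear(child))
    # Only the LoRA matrices remain trainable; the FFN / classifier
    # (the "loss function parameters" in the code base's naming) stay frozen.
    for name, p in model.named_parameters():
        p.requires_grad = "lora_" in name
    return model


def group_sparse_reg(model: nn.Module) -> torch.Tensor:
    """Group-sparsity penalty with one group per LoRA-wrapped layer (assumed form of the GS loss)."""
    norms = [torch.cat([m.lora_A.flatten(), m.lora_B.flatten()]).norm()
             for m in model.modules() if isinstance(m, LoRALinear)]
    return torch.stack(norms).sum()


def total_loss(model: nn.Module,
               forget_term: torch.Tensor,
               retain_term: torch.Tensor,
               beta: float = 0.1,
               gamma: float = 0.01) -> torch.Tensor:
    """Selective Forgetting + Knowledge Retention + GS regularization (placeholder weights)."""
    return forget_term + beta * retain_term + gamma * group_sparse_reg(model)


# Hypothetical usage: wrap a pre-trained Deformable DETR checkpoint and train only the adapters.
# model = build_deformable_detr(...)   # obtained from the official Deformable DETR release
# model = add_lora(model)
# loss = total_loss(model, selective_forgetting_loss(...), knowledge_retention_loss(...))
```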

@bjzhb666 self-assigned this on Jun 24, 2024
@bjzhb666 added the question (Further information is requested) label on Jun 24, 2024