
Why is the encoder not frozen in the training config? #30

Open

Parisnewsy opened this issue Mar 28, 2024 · 1 comment

Comments

@Parisnewsy

Hi OpenLRM Team,
I can't find any mention of freezing the encoder in the LRM paper. Why did you choose to set the encoder freeze option to false?

@ZexinHe
Collaborator

ZexinHe commented Apr 2, 2024

Hi,

We set encoder freeze to False by default, which means the encoder is trainable.

We tried both a frozen encoder and a trainable encoder in earlier experiments and found that a trainable encoder reaches a lower loss and somewhat better performance.
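For readers wondering what this flag does in practice, here is a minimal PyTorch sketch (not the actual OpenLRM code; `apply_encoder_freeze` is a hypothetical helper) of how an `encoder_freeze` config option typically maps onto parameter updates:

```python
# Illustrative sketch, not OpenLRM's implementation: how an `encoder_freeze`
# flag usually toggles gradient updates on the image encoder.
import torch.nn as nn


def apply_encoder_freeze(encoder: nn.Module, freeze: bool) -> None:
    """Toggle gradient updates for the image encoder.

    freeze=True  -> encoder weights stay fixed during training.
    freeze=False -> encoder weights receive gradients (the default discussed above).
    """
    for param in encoder.parameters():
        param.requires_grad = not freeze
    # A frozen encoder is usually also kept in eval mode so that
    # normalization/dropout statistics are not updated.
    if freeze:
        encoder.eval()


if __name__ == "__main__":
    # Stand-in encoder; OpenLRM uses a pretrained ViT image encoder here.
    encoder = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Flatten())
    apply_encoder_freeze(encoder, freeze=False)  # encoder_freeze: false
    trainable = sum(p.numel() for p in encoder.parameters() if p.requires_grad)
    print(f"trainable encoder params: {trainable}")
```

With `freeze=False`, the optimizer fine-tunes the pretrained encoder along with the rest of the model, which is the setting the maintainer reports giving better loss and performance.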
