Question regarding the learning rate update strategy #30

@zhitao-guo

Description

Thank you for your beautiful work~

I noticed that the learning rate is updated after every batch by default, which differs from the more common strategy of updating it once per epoch:

```python
self.accum_iter = 1

if data_iter_step % accum_iter == 0:
```

Could you please explain the reasoning behind this approach? If possible, could you also share some references or links on this topic? Looking forward to your reply~
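For reference, the pattern described above is commonly paired with a per-iteration warmup + cosine schedule, where the scheduler receives a *fractional* epoch value (e.g. `epoch + data_iter_step / len(data_loader)`) so the learning rate changes smoothly within each epoch rather than stepping once per epoch. The sketch below is my own illustration of that idea, not code from this repository; all names and hyperparameters are hypothetical.

```python
import math

# Hypothetical hyperparameters for illustration only (not from this repo).
BASE_LR = 1e-3
MIN_LR = 1e-6
WARMUP_EPOCHS = 5
TOTAL_EPOCHS = 100

def adjust_learning_rate(param_groups, epoch_fraction):
    """Set the lr for every param group based on a fractional epoch.

    `epoch_fraction` is typically epoch + data_iter_step / len(data_loader),
    so calling this every batch yields a smooth schedule instead of a
    per-epoch staircase.
    """
    if epoch_fraction < WARMUP_EPOCHS:
        # Linear warmup from 0 up to BASE_LR.
        lr = BASE_LR * epoch_fraction / WARMUP_EPOCHS
    else:
        # Cosine decay from BASE_LR down to MIN_LR over the remaining epochs.
        progress = (epoch_fraction - WARMUP_EPOCHS) / (TOTAL_EPOCHS - WARMUP_EPOCHS)
        lr = MIN_LR + (BASE_LR - MIN_LR) * 0.5 * (1.0 + math.cos(math.pi * progress))
    for group in param_groups:
        group["lr"] = lr
    return lr
```

Under gradient accumulation, the `data_iter_step % accum_iter == 0` guard simply ensures the schedule advances once per effective optimizer step; with `accum_iter = 1` that reduces to updating every batch.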
