
Chapter 6 "Dynamic Learning Rate Adjustment": error in the example code #108

Open

zjlab-BioGene opened this issue Jan 11, 2025 · 0 comments

zjlab-BioGene commented Jan 11, 2025

# Choose an optimizer
optimizer = torch.optim.Adam(...)
# Choose one or more of the dynamic learning-rate adjustment methods mentioned above
scheduler1 = torch.optim.lr_scheduler....
scheduler2 = torch.optim.lr_scheduler....
...
schedulern = torch.optim.lr_scheduler....
# Train
for epoch in range(100):
    train(...)
    validate(...)
    optimizer.step()
    # The learning rate should only be adjusted after the optimizer has updated the parameters
# The scheduler update is performed at the end of each epoch
scheduler1.step()
...
schedulern.step()

Placing scheduler.step() outside the training loop causes the learning rate to be updated at the wrong time, which hurts convergence speed and final model performance. If scheduler.step() is placed outside the loop (i.e., after the loop finishes), the learning rate is updated only once for the entire run, which is usually not the intent. scheduler.step() should be called after each optimizer.step() parameter update so that the learning rate is adjusted correctly:

for epoch in range(100):
    train(...)
    validate(...)
    optimizer.step()   # update the parameters first
    scheduler.step()   # then adjust the learning rate, once per epoch
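
For reference, a minimal runnable sketch of the corrected pattern. The toy model, random data, and the choice of StepLR below are illustrative assumptions, not the tutorial's actual code:

import torch
import torch.nn as nn

# Toy model and data standing in for the tutorial's train(...) / validate(...)
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
# StepLR is one of the schedulers covered in the chapter (chosen here only as
# an example): it multiplies the learning rate by gamma every step_size epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(100):
    inputs, targets = torch.randn(32, 10), torch.randn(32, 1)  # fake batch
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()    # update the parameters first
    scheduler.step()    # then step the scheduler, still inside the epoch loop

In practice optimizer.step() usually runs once per batch inside train(...), while scheduler.step() stays at once per epoch; the point of the fix is only that scheduler.step() must sit inside the epoch loop, after the optimizer update.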