
Commit 098c2c8

Align fsdp_sft_trainer warmup_steps_ratio -> lr_warmup_steps_ratio

1 parent f4c7285, commit 098c2c8

File tree: 5 files changed (+1067, -1064 lines)


docs/examples/config.rst (2 additions, 2 deletions)

@@ -618,7 +618,7 @@ Optim
     optimizer_impl: torch.optim
     lr: 1e-5
     weight_decay: 0.01
-    warmup_steps_ratio: 0.1
+    lr_warmup_steps_ratio: 0.1
     clip_grad: 1.0
     lr_scheduler: cosine
     override_optimizer_config: null
@@ -627,7 +627,7 @@ Optim
 - ``optimizer_impl``: Module path to import optimizer from (e.g., ``"torch.optim"``, ``"torchao.optim"``, ``"bitsandbytes.optim"``).
 - ``optim.lr``: Learning rate for the optimizer.
 - ``optim.weight_decay``: Weight decay for the optimizer.
-- ``optim.warmup_steps_ratio``: Ratio of warmup steps to total training steps.
+- ``optim.lr_warmup_steps_ratio``: Ratio of warmup steps to total training steps.
 - ``optim.clip_grad``: Gradient clipping value.
 - ``optim.lr_scheduler``: Learning rate scheduler type. Options:
