
Commit 32b9318

added comment to launch and config. removed output to null.
Parent: ca0ed7c

File tree: 2 files changed (+10 −13 lines)


templates/src/llm/finetune_distributed/config.yaml

Lines changed: 9 additions & 13 deletions
@@ -2,19 +2,6 @@ defaults:
   - _global
   - _self_
 
-hydra:
-  job:
-    name: llm_finetune_distributed
-  searchpath:
-    - pkg://configs
-  launcher:
-    tasks_per_node: ${compute.gpus_per_node}
-    setup:
-      - 'export CUDA_VISIBLE_DEVICES=$SLURM_LOCALID'
-
-paths:
-  out_dir: null
-
 trainer:
   seed: 42
   model:
@@ -64,3 +51,12 @@ trainer:
     fsdp_min_num_params: 1000000
   logging:
     report_to: []
+
+hydra:
+  job:
+    name: llm_finetune_distributed
+  searchpath:
+    - pkg://configs # Include configs from the configs package in the searchpath
+  launcher:
+    setup:
+      - 'export CUDA_VISIBLE_DEVICES=$SLURM_LOCALID'
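One detail worth noting in the moved block: tasks_per_node: ${compute.gpus_per_node} is dropped, while the launcher setup command exporting CUDA_VISIBLE_DEVICES=$SLURM_LOCALID is kept, so each Slurm task still sees only the GPU matching its node-local rank. A minimal Python sketch of the same pinning idea (illustrative only, not code from this repo):

    import os

    # Sketch of what the setup command achieves: pin each Slurm task to the
    # GPU matching its node-local rank, so "cuda:0" inside the process maps
    # to a distinct physical device per task.
    local_id = os.environ.get("SLURM_LOCALID", "0")
    os.environ["CUDA_VISIBLE_DEVICES"] = local_id  # e.g. task 3 sees only GPU 3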

templates/src/llm/finetune_distributed/launch.py

Lines changed: 1 addition & 0 deletions
@@ -34,6 +34,7 @@ def main(cfg: DictConfig):
     OmegaConf.resolve(hydra_config)
     OmegaConf.save(hydra_config, save_path)
 
+    # Run the trainer with the run config
     trainer = FinetuneDistributedTrainer()
     return trainer(cfg)
 
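For context on the two OmegaConf calls just above the new comment: OmegaConf.resolve replaces interpolations in place, and OmegaConf.save writes the result to disk. A self-contained sketch with invented values, standing in for hydra_config above:

    from omegaconf import OmegaConf

    # Toy config with an interpolation, like those resolved in main().
    cfg = OmegaConf.create({"trainer": {"seed": 42}, "run_seed": "${trainer.seed}"})
    OmegaConf.resolve(cfg)  # run_seed now holds the literal value 42
    OmegaConf.save(cfg, "resolved_config.yaml")  # values written fully resolved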
