2 files changed: +10 −13

templates/src/llm/finetune_distributed

@@ -2,19 +2,6 @@ defaults:
   - _global
   - _self_
 
-hydra:
-  job:
-    name: llm_finetune_distributed
-  searchpath:
-    - pkg://configs
-  launcher:
-    tasks_per_node: ${compute.gpus_per_node}
-    setup:
-      - 'export CUDA_VISIBLE_DEVICES=$SLURM_LOCALID'
-
-paths:
-  out_dir: null
-
 trainer:
   seed: 42
   model:
@@ -64,3 +51,12 @@ trainer:
     fsdp_min_num_params: 1000000
   logging:
     report_to: []
+
+hydra:
+  job:
+    name: llm_finetune_distributed
+  searchpath:
+    - pkg://configs  # Include configs from the configs package in the searchpath
+  launcher:
+    setup:
+      - 'export CUDA_VISIBLE_DEVICES=$SLURM_LOCALID'
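The launcher's `setup` line pins each SLURM task to a single GPU: `SLURM_LOCALID` is the task's local rank on the node, so exporting it as `CUDA_VISIBLE_DEVICES` makes each task see only "its" device. A minimal stdlib-only sketch of that behavior (the `SLURM_LOCALID` value is faked here, since SLURM normally sets it):

```python
import os

# SLURM sets SLURM_LOCALID per task; we fake local rank 1 for illustration.
os.environ["SLURM_LOCALID"] = "1"

# Equivalent of the launcher setup line:
#   export CUDA_VISIBLE_DEVICES=$SLURM_LOCALID
os.environ["CUDA_VISIBLE_DEVICES"] = os.environ["SLURM_LOCALID"]

# This task now sees only GPU 1.
print(os.environ["CUDA_VISIBLE_DEVICES"])  # → 1
```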
@@ -34,6 +34,7 @@ def main(cfg: DictConfig):
     OmegaConf.resolve(hydra_config)
     OmegaConf.save(hydra_config, save_path)
 
+    # Run the trainer with the run config
     trainer = FinetuneDistributedTrainer()
     return trainer(cfg)
 