Commit

make format
matsen committed Jul 9, 2024
1 parent 191ccef commit 7b737a7
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion netam/framework.py
@@ -422,7 +422,9 @@ def reset_optimization(self, learning_rate=None):
             learning_rate = self.learning_rate
 
         # copied from # https://github.com/karpathy/nanoGPT/blob/9755682b981a45507f6eb9b11eadef8cb83cebd5/model.py#L264
-        param_dict = {pn: p for pn, p in self.model.named_parameters() if p.requires_grad}
+        param_dict = {
+            pn: p for pn, p in self.model.named_parameters() if p.requires_grad
+        }
         # Do not apply weight decay to 1D parameters (biases and layernorm weights).
         decay_params = [p for p in param_dict.values() if p.dim() >= 2]
         nodecay_params = [p for p in param_dict.values() if p.dim() < 2]
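The diff itself only re-wraps the param_dict comprehension; the surrounding code splits parameters into decay and no-decay groups in the style of the linked nanoGPT file. Below is a minimal sketch of how such groups are typically handed to an AdamW optimizer. The build_optimizer helper, the weight_decay value, and the learning-rate default are illustrative assumptions, not part of netam/framework.py.

import torch

def build_optimizer(model, learning_rate=1e-3, weight_decay=0.1):
    # Sketch only: names and values here are assumptions, not taken from this diff.
    param_dict = {
        pn: p for pn, p in model.named_parameters() if p.requires_grad
    }
    # Do not apply weight decay to 1D parameters (biases and layernorm weights).
    decay_params = [p for p in param_dict.values() if p.dim() >= 2]
    nodecay_params = [p for p in param_dict.values() if p.dim() < 2]
    optim_groups = [
        {"params": decay_params, "weight_decay": weight_decay},
        {"params": nodecay_params, "weight_decay": 0.0},
    ]
    return torch.optim.AdamW(optim_groups, lr=learning_rate)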
