
Conversation

dependabot bot commented on behalf of github Sep 11, 2021

Bumps pytorch-lightning from 1.4.3 to 1.4.6.

Release notes

Sourced from pytorch-lightning's releases.

Standard weekly patch release

[1.4.6] - 2021-09-10

  • Fixed an issue with export to ONNX format when a model has multiple inputs (#8800); a usage sketch follows this list
  • Removed deprecation warnings being called for on_{task}_dataloader (#9279)
  • Fixed save/load/resume from checkpoint for DeepSpeed Plugin (#8397, #8644, #8627)
  • Fixed EarlyStopping running on train epoch end when check_val_every_n_epoch>1 is set (#9156)
  • Fixed an issue with logger outputs not being finalized correctly after prediction runs (#8333)
  • Fixed the Apex and DeepSpeed plugin closure running after the on_before_optimizer_step hook (#9288)
  • Fixed the Native AMP plugin closure not running with manual optimization (#9288)
  • Fixed bug where data-loading functions were not getting the correct running stage passed (#8858)
  • Fixed intra-epoch evaluation outputs staying in memory when the respective *_epoch_end hook wasn't overridden (#9261)
  • Fixed error handling in DDP process reconciliation when _sync_dir was not initialized (#9267)
  • Fixed PyTorch Profiler not enabled for manual optimization (#9316)
  • Fixed inspection of other args when a container is specified in save_hyperparameters (#9125)
  • Fixed signature of Timer.on_train_epoch_end and StochasticWeightAveraging.on_train_epoch_end to prevent unwanted deprecation warnings (#9347)
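
As an illustration of the multi-input case that #8800 addresses, here is a minimal, hypothetical sketch (the model, shapes, and file name are invented for the example):

```python
import torch
import pytorch_lightning as pl


class TwoInputModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 2)

    def forward(self, x, y):
        # forward() takes two inputs; exporting such a model to ONNX
        # is the case fixed in 1.4.6.
        return self.layer(torch.cat([x, y], dim=1))


model = TwoInputModel()
# Passing a tuple maps each element to one forward() argument.
model.to_onnx("two_input.onnx", input_sample=(torch.randn(1, 4), torch.randn(1, 4)))
```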

Contributors

@ananthsub @awaelchli @Borda @four4fish @justusschock @kaushikb11 @s-rog @SeanNaren @tangbinh @tchaton @xerus

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Standard weekly patch release

[1.4.5] - 2021-08-31

  • Fixed reduction using self.log(sync_dist=True, reduce_fx={mean,max}) (#9142); see the sketch after this list
  • Fixed not setting a default value for max_epochs if max_time was specified on the Trainer constructor (#9072)
  • Fixed CometLogger so that it no longer modifies the metrics in place; it now creates a copy of the metrics before performing any operations (#9150)
  • Fixed DDP "CUDA error: initialization error" due to a copy instead of deepcopy on ResultCollection (#9239)
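
For reference, a hypothetical sketch of the logging call that #9142 concerns (the module and metric name are invented); sync_dist=True synchronizes the value across processes and reduce_fx selects the reduction:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        loss = self.layer(batch).mean()
        # Synchronized reduction with a non-default reduce_fx was the
        # combination fixed by #9142.
        self.log("train_loss_max", loss, sync_dist=True, reduce_fx="max")
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```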

Contributors

@ananthsub @bamblebam @carmocca @daniellepintz @ethanwharris @kaushikb11 @sohamtiwari3120 @tchaton

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Standard weekly patch release

[1.4.4] - 2021-08-24

  • Fixed a bug in the binary search mode of auto batch size scaling where an exception was raised if the first trainer run resulted in OOM (#8954); a sketch follows this list
  • Fixed a bug where logging with log_gpu_memory='min_max' was not working (#9013)
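
To make the batch-size scaling fix concrete, a minimal, hypothetical sketch (model and data are invented): in "binsearch" mode the tuner grows the batch size until it hits OOM and then binary-searches downward, and #8954 fixed the case where the very first run already went OOM.

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


class LitModel(pl.LightningModule):
    def __init__(self, batch_size=16):
        super().__init__()
        self.batch_size = batch_size  # the tuner scales this attribute
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        (x,) = batch
        return self.layer(x).mean()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

    def train_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(256, 4)), batch_size=self.batch_size)


trainer = pl.Trainer(auto_scale_batch_size="binsearch", max_epochs=1)
trainer.tune(LitModel())
```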

Contributors

@SkafteNicki @eladsegal

If we forgot someone due to not matching commit email with GitHub account, let us know :]

... (truncated)

Changelog

Sourced from pytorch-lightning's changelog.

[1.4.6] - 2021-09-07

  • Fixed an issue with export to ONNX format when a model has multiple inputs (#8800)
  • Removed deprecation warnings being called for on_{task}_dataloader (#9279)
  • Fixed save/load/resume from checkpoint for DeepSpeed Plugin (#8397, #8644, #8627)
  • Fixed EarlyStopping running on train epoch end when check_val_every_n_epoch>1 is set (#9156)
  • Fixed an issue with logger outputs not being finalized correctly after prediction runs (#8333)
  • Fixed the Apex and DeepSpeed plugin closure running after the on_before_optimizer_step hook (#9288)
  • Fixed the Native AMP plugin closure not running with manual optimization (#9288)
  • Fixed bug where data-loading functions were not getting the correct running stage passed (#8858)
  • Fixed intra-epoch evaluation outputs staying in memory when the respective *_epoch_end hook wasn't overridden (#9261)
  • Fixed error handling in DDP process reconciliation when _sync_dir was not initialized (#9267)
  • Fixed PyTorch Profiler not enabled for manual optimization (#9316)
  • Fixed inspection of other args when a container is specified in save_hyperparameters (#9125); a sketch follows this list
  • Fixed signature of Timer.on_train_epoch_end and StochasticWeightAveraging.on_train_epoch_end to prevent unwanted deprecation warnings (#9347)
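
As a rough illustration of the save_hyperparameters case in #9125 (the argument names are invented), passing a container such as a dict:

```python
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, learning_rate=0.01, hidden_dim=32):
        super().__init__()
        # Passing a container (a dict here); #9125 fixed how the other
        # constructor arguments are inspected in this situation.
        self.save_hyperparameters({"learning_rate": learning_rate, "hidden_dim": hidden_dim})


model = LitModel()
print(model.hparams.learning_rate)  # 0.01
```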

[1.4.5] - 2021-08-31

  • Fixed reduction using self.log(sync_dist=True, reduce_fx={mean,max}) (#9142)
  • Fixed not setting a default value for max_epochs if max_time was specified on the Trainer constructor (#9072); see the sketch after this list
  • Fixed CometLogger so that it no longer modifies the metrics in place; it now creates a copy of the metrics before performing any operations (#9150)
  • Fixed DDP "CUDA error: initialization error" due to a copy instead of deepcopy on ResultCollection (#9239)
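
A hypothetical sketch of the max_time configuration that #9072 concerns (the time budget is invented):

```python
import pytorch_lightning as pl

# max_time can be a dict, a "DD:HH:MM:SS" string, or a datetime.timedelta.
# With #9072, leaving max_epochs unset alongside max_time still yields a
# sensible max_epochs default.
trainer = pl.Trainer(max_time={"hours": 1, "minutes": 30})
```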

[1.4.4] - 2021-08-24

  • Fixed a bug in the binary search mode of auto batch size scaling where an exception was raised if the first trainer run resulted in OOM (#8954)
  • Fixed a bug where logging with log_gpu_memory='min_max' was not working (#9013); see the sketch after this list
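
And a hypothetical sketch of the logging option that #9013 concerns (the GPU count is invented):

```python
import pytorch_lightning as pl

# log_gpu_memory="min_max" logs only the minimum and maximum GPU memory
# utilization across devices; #9013 fixed this mode producing no logs.
trainer = pl.Trainer(gpus=1, log_gpu_memory="min_max")
```
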
Commits
  • 00c6640 1.4.6 release
  • 3e6df2f Remove todo, ensure we only check rank 0 for deepspeed warning (#9311)
  • f2c5f5b Clear reference to training loss at the end of train step (#9336)
  • a5ad966 [bugfix] Resolve PyTorch Profiling for Manual Optimization (#9316)
  • 4cf6be9 Fix inspection of unspecified args for container hparams (#9125)
  • b034fe8 Update hooks.py
  • 1abf889 Remove deprecation warnings being called for on_{task}_dataloader (#9279)
  • 404dafc Fix TPU cleaning job (#9301)
  • f3e24f0 Disable {save,check}_on_train_epoch_end with check_val_every_n_epoch>1 (#...
  • 62bb93f Move tracking epoch end outputs logic to the EvaluationEpochLoop (#9261)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [pytorch-lightning](https://github.com/PyTorchLightning/pytorch-lightning) from 1.4.3 to 1.4.6.
- [Release notes](https://github.com/PyTorchLightning/pytorch-lightning/releases)
- [Changelog](https://github.com/PyTorchLightning/pytorch-lightning/blob/1.4.6/CHANGELOG.md)
- [Commits](https://github.com/PyTorchLightning/pytorch-lightning/compare/1.4.3...1.4.6)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot bot added the dependencies label Sep 11, 2021

dependabot bot commented on behalf of github Sep 18, 2021

Superseded by #63.

dependabot bot closed this Sep 18, 2021
dependabot bot deleted the dependabot/pip/python/requirements/tune/pytorch-lightning-1.4.6 branch September 18, 2021 07:07