
Conversation

dependabot bot commented on behalf of github on Sep 14, 2022

Bumps pytorch-lightning from 1.6.5 to 1.7.6.

Release notes

Sourced from pytorch-lightning's releases.

PyTorch Lightning 1.7.6: Standard patch release

[1.7.6] - 2022-09-13

Changed

  • Improved the error messaging when Trainer.method(model, x_dataloader=None) is called and no module-method implementations are available (#14614)

Fixed

  • Reset the dataloaders on OOM failure in batch size finder to use the last successful batch size (#14372)
  • Fixed the batch size finder so it keeps downscaling the batch size when not even a single trial batch size has succeeded with mode="power" (#14372)
  • Fixed an issue where logging a tensor with self.log triggered a PyTorch user warning about cloning tensors (#14599); a sketch follows this list
  • Fixed compatibility when torch.distributed is not available (#14454)
  • Fixed torchscript error with ensembles of LightningModules (#14657)
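
For context on the self.log fix above, here is a minimal sketch of the logging pattern that #14599 addresses. The module, data, and hyperparameters are hypothetical illustrations, not code from this PR:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self, lr=0.01):
        super().__init__()
        self.lr = lr
        self.layer = nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        # Logging a live tensor: before 1.7.6 this could emit a PyTorch
        # UserWarning about cloning tensors (#14599).
        self.log("train_loss", loss)
        return loss

    def train_dataloader(self):
        # Synthetic data, purely for illustration.
        x, y = torch.randn(64, 4), torch.randn(64, 1)
        return DataLoader(TensorDataset(x, y), batch_size=8)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.lr)
```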

Contributors

@akihironitta @awaelchli @Borda @carmocca @dependabot @krshrimali @mauvilsa @pierocor @rohitgr7 @wangraying

If we missed anyone because a commit email didn't match a GitHub account, let us know :)

PyTorch Lightning 1.7.5: Standard patch release

[1.7.5] - 2022-09-06

Fixed

  • Squeezed tensor values when logging with LightningModule.log (#14489)
  • Fixed WandbLogger.save_dir not being set after creation (#14326)
  • Fixed Trainer.estimated_stepping_batches when the maximum number of epochs is not set (#14317); a sketch follows this list
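
A minimal sketch of the Trainer.estimated_stepping_batches usage that #14317 repairs when max_epochs is unset (e.g. Trainer(max_steps=100)); the optimizer and OneCycleLR scheduler are illustrative assumptions, and the training hooks are omitted for brevity:

```python
import torch
from torch import nn
import pytorch_lightning as pl

class LitScheduled(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 1)

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.01)
        # estimated_stepping_batches previously failed when max_epochs
        # was not set; fixed by #14317.
        scheduler = torch.optim.lr_scheduler.OneCycleLR(
            optimizer,
            max_lr=0.1,
            total_steps=self.trainer.estimated_stepping_batches,
        )
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }
```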

Contributors

@carmocca @dependabot @robertomest @rohitgr7 @tshu-w

If we missed anyone because a commit email didn't match a GitHub account, let us know :)

PyTorch Lightning 1.7.4: Standard patch release

[1.7.4] - 2022-08-31

Added

  • Added an environment variable PL_DISABLE_FORK that can be used to disable all forking in the Trainer (#14319)
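
A minimal sketch of opting in to this variable; that the value "1" enables it is an assumption about how the flag is read:

```python
import os

# Disable all forking in the Trainer (added in 1.7.4 via #14319).
# Set this before the Trainer is constructed or any workers start.
os.environ["PL_DISABLE_FORK"] = "1"
```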

Fixed

  • Fixed LightningDataModule hparams parsing (#12806)
  • Reset epoch progress with batch size scaler (#13846)
  • Fixed restoring the trainer after using lr_find() so that the correct LR schedule is used for the actual training (#14113); see the sketch after this list
  • Fixed incorrect values after transferring data to an MPS device (#14368)
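
A sketch of the lr_find() flow that #14113 repairs, reusing the hypothetical LitModel from the earlier sketch:

```python
import pytorch_lightning as pl

model = LitModel()  # hypothetical module from the earlier sketch
trainer = pl.Trainer(auto_lr_find=True, max_epochs=2)

# tune() runs the learning-rate finder and writes the suggested value
# back to model.lr; #14113 restores the trainer state afterwards, so
# fit() trains with the discovered rate instead of tuner leftovers.
trainer.tune(model)
trainer.fit(model)
```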

... (truncated)

Commits
  • 6bd71be Reset dataloaders on failure in tuner (#14372)
  • 582b8cc Better error message when dataloader and datamodule is None (V2) (#14637)
  • 72f82eb Avoid warning when cloning tensor in self.log (#14599)
  • 10e09c6 Update changelog and version for 1.7.6
  • 305033e Removed from_argparse_args tests in test_cli.py (#14597)
  • f7acd4e Update psutil requirement from <=5.9.1 to <5.9.3 in /requirements (#14665)
  • 75f930a Remove skipping logic in PL CI (#14565)
  • 4645587 Run CircleCI with the HEAD sha, not the base (#14625)
  • 3f30de1 Set running_torchscript recursively (#14657)
  • 76063ad Remove deprecated test_tube dependency from environment.yml (#14617)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [pytorch-lightning](https://github.com/Lightning-AI/lightning) from 1.6.5 to 1.7.6.
- [Release notes](https://github.com/Lightning-AI/lightning/releases)
- [Commits](https://github.com/Lightning-AI/pytorch-lightning/compare/1.6.5...1.7.6)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot bot added the dependencies label on Sep 14, 2022
dependabot bot commented on behalf of github on Sep 22, 2022

Superseded by #8.

dependabot bot closed this on Sep 22, 2022
dependabot bot deleted the dependabot/pip/pytorch-lightning-1.7.6 branch on September 22, 2022 at 23:40