Currently, when training a sweep agent, we start the agent with a hyperparameter config, which automatically logs all of the hyperparameters to W&B. We also initialize the pytorch-lightning `WandbLogger`, which under the hood attempts to log the config a second time. See wandb/wandb#2641 for details.
Ideally, we would solve this by not updating the config twice; however, this may be out of our control, since we rely on wandb and pytorch-lightning for those behaviors.
A workaround is to suppress the W&B warning message. So far, `warnings.filterwarnings` does not work; we should investigate this further.
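One possible reason `warnings.filterwarnings` has no effect is that wandb emits its messages through Python's `logging` module rather than the `warnings` module. A minimal sketch of filtering on the `wandb` logger instead is below; the matched substring is an assumption and would need to be replaced with the actual text of the duplicate-config warning:

```python
import logging


class SuppressDuplicateConfigWarning(logging.Filter):
    """Drop log records that contain a given substring.

    Intended to be attached to the "wandb" logger to hide the warning
    produced when WandbLogger re-logs hyperparameters already set by
    the sweep agent. The substring below is a placeholder, not the
    verified wording of the real warning.
    """

    def __init__(self, needle: str = "wandb.config"):
        super().__init__()
        self.needle = needle

    def filter(self, record: logging.LogRecord) -> bool:
        # Return False to drop the record, True to keep it.
        return self.needle not in record.getMessage()


# Attach the filter to the logger wandb presumably uses.
logging.getLogger("wandb").addFilter(SuppressDuplicateConfigWarning())
```

If this works, it could be done once at the top of the training script, before `wandb.init` or `WandbLogger` are created.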