Running Custom validation loop causes bug when calling self.log #16608
vic-ene asked this question in code help: RL / MetaLearning
Hello!
I am training a model where gradients are computed in both the training and the validation step, using different dataloaders.
Info: I have added `with torch.set_grad_enabled(True):` inside `validation_step` to make this work.
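For reference, a minimal sketch of that `validation_step` (the model and the loss here are placeholders, not my real ones):

```python
import torch
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)  # placeholder model

    def validation_step(self, batch, batch_idx):
        # Lightning runs validation without gradients by default;
        # re-enable them because this model also needs grads here.
        with torch.set_grad_enabled(True):
            loss = self.layer(batch).pow(2).mean()  # placeholder loss
        self.log("val_bpd", loss, on_step=True, on_epoch=True,
                 prog_bar=True, logger=True, sync_dist=True)
        return loss
```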
However, I want the option of doing several passes over the validation dataloader in a row each epoch. I am therefore following this documentation to implement my own EvaluationLoop, but I get an error when running it with the trainer.
The class I created for the EvaluationLoop is along the following lines.
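A minimal sketch, assuming the PyTorch Lightning 1.x Loop API (`MultiPassEvaluationLoop` and `num_passes` are illustrative names, not necessarily the originals):

```python
from pytorch_lightning.loops import EvaluationLoop


class MultiPassEvaluationLoop(EvaluationLoop):
    """Runs the whole validation pass `num_passes` times in a row each epoch."""

    def __init__(self, num_passes: int = 2) -> None:
        super().__init__()
        self.num_passes = num_passes

    def run(self, *args, **kwargs):
        outputs = None
        for _ in range(self.num_passes):
            # Each pass delegates to the stock evaluation loop.
            outputs = super().run(*args, **kwargs)
        return outputs
```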
I assign it as such:
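Again a sketch; the `trainer.fit_loop.epoch_loop.val_loop` path is where the validation loop that runs during `fit()` lives in PL 1.x, and may differ across versions:

```python
import pytorch_lightning as pl

trainer = pl.Trainer(max_epochs=10)  # my real Trainer arguments go here

val_loop = MultiPassEvaluationLoop(num_passes=3)
val_loop.trainer = trainer  # custom loops need a reference to the trainer
trainer.fit_loop.epoch_loop.val_loop = val_loop

trainer.fit(model)  # model and dataloaders as in my usual setup
```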
The error message is:

```
File "/pytorch_lightning/core/module.py", line 390, in log
    raise MisconfigurationException(
lightning_lite.utilities.exceptions.MisconfigurationException: You are trying to `self.log()` but it is not managed by the `Trainer` control flow
```

I do use `self.log("val_bpd", loss, on_step=True, on_epoch=True, prog_bar=True, logger=True, sync_dist=True)` in `validation_step()`.
The code runs normally if I do not replace the val loop.