Is there any way to disable prefetching of batches? #9660
Replies: 3 comments 4 replies
-
Not an expert in Reinforcement Learning, but is this something that can be configured inside the native PyTorch DataLoader using ...
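For reference, the native DataLoader does expose its own worker-side prefetching knob, `prefetch_factor`, which only takes effect when `num_workers > 0`. This is a minimal illustration of that setting; as the reply below notes, it is separate from the one-batch prefetch Lightning itself performs.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical dataset, just to have something to load.
dataset = TensorDataset(torch.randn(100, 3), torch.randint(0, 2, (100,)))

loader = DataLoader(
    dataset,
    batch_size=8,
    num_workers=2,      # background worker processes
    prefetch_factor=2,  # each worker keeps up to 2 batches ready ahead of time
)
```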
-
Dear @blacknfuzzy, Yes, PyTorch Lightning is responsible for the pre-fetching. PyTorch Lightning currently pre-fetches the next batch in order to know whether the current batch is the last one and to conclude the epoch properly. Right now, the only workaround would be for you to replace the DataFetcher and keep track of the last batch yourself. It might be simpler if you could point us to some code, so we can brainstorm on the best way forward. Best,
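As a simplified sketch of why a fetcher looks ahead (this is an illustration, not Lightning's actual DataFetcher), prefetching one item is the usual way to yield each batch together with a flag telling whether it is the last one:

```python
# Simplified sketch (not Lightning's actual DataFetcher): prefetch one item
# so each batch can be yielded together with an "is_last" flag.
def with_last_flag(iterable):
    it = iter(iterable)
    try:
        current = next(it)        # prefetch the first batch
    except StopIteration:
        return
    for nxt in it:
        yield current, False      # at least one more batch is coming
        current = nxt
    yield current, True           # the prefetched batch was the last one

for batch, is_last in with_last_flag(range(3)):
    print(batch, is_last)         # 0 False / 1 False / 2 True
```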
-
By defining a dataset like the following:

class Dataset:
    def __len__(self):
        # Report an arbitrarily large length to the loader.
        return 65535

    def __iter__(self):
        return self

    def __next__(self):
        # Placeholder: a real implementation would return the next sample here.
        pass
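If the intent is an open-ended stream of experiences, as is common in RL, one variant is to subclass torch.utils.data.IterableDataset so that the DataLoader consumes the object as a stream. This is purely illustrative; the class name and the dummy transitions below are made up, not part of the original suggestion.

```python
import torch
from torch.utils.data import DataLoader, IterableDataset

class ExperienceStream(IterableDataset):
    """Illustrative open-ended stream of (state, reward) transitions."""
    def __iter__(self):
        while True:
            # A real implementation would step the environment here.
            yield torch.zeros(4), 0.0

loader = DataLoader(ExperienceStream(), batch_size=32)
states, rewards = next(iter(loader))  # states: (32, 4), rewards: (32,)
```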
-
I noticed that one batch always gets prefetched. Since I am using pytorch-lightning for Reinforcement Learning, this is not desired behaviour.
Is there any way to have the trainer not do this?