Extension of PR#105 #112
base: master
Conversation
update to head
Fixed some small mistakes with the dimensions.
```python
if not isinstance(dims, DiscreteParameter):
    if len(dims) > 2:
        dims = list(combinations(dims, 2))
    else:
        dims = (dims,)
    dims = DiscreteParameter(dims)
```
Missing correct handling for `type(dims) == int`. This will throw an error on line 45.
Co-authored-by: Paul <[email protected]>
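One way to handle the `int` case would be to wrap a bare integer into a tuple before `len()` is called on it. This is only a sketch; `DiscreteParameter` is stubbed here for illustration and the real class in rising behaves differently.

```python
from itertools import combinations


class DiscreteParameter:
    """Minimal stand-in for rising's DiscreteParameter, for illustration only."""

    def __init__(self, values):
        self.values = tuple(values)


def normalize_dims(dims):
    # A single int (e.g. dims=2) must be wrapped before len() is called,
    # otherwise `len(dims)` raises TypeError.
    if isinstance(dims, int):
        dims = (dims,)
    if not isinstance(dims, DiscreteParameter):
        if len(dims) > 2:
            # sample from all pairs of the given dimensions
            dims = list(combinations(dims, 2))
        else:
            dims = (dims,)
        dims = DiscreteParameter(dims)
    return dims
```

With this guard, `normalize_dims(2)` takes the same path as a length-1 sequence instead of raising.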
```python
import contextlib

SEARCHSORTED_AVAILABLE = True
try:
    from torchsearchsorted import searchsorted
```
Can you add a Python implementation of the searchsorted functionality?
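A pure-Python fallback could look roughly like this, using the standard-library `bisect` module. This is a sketch for 1-D inputs only; the function name is illustrative and the side semantics ('left' here) may differ from torchsearchsorted, so it would need checking against the real dependency.

```python
from bisect import bisect_left


def searchsorted_py(sorted_seq, values):
    """For each value, return the index at which it would be inserted
    into sorted_seq to keep it sorted (left insertion point)."""
    return [bisect_left(sorted_seq, v) for v in values]
```

For example, `searchsorted_py([1, 3, 5], [0, 3, 6])` returns `[0, 1, 3]`.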
This is a feature from #105 (see #105 (comment)) by @weningerleon, so I am not sure what exactly is done there. But to me it seems this was only necessary for PyTorch < v1.6, since 1.6 added https://pytorch.org/docs/stable/generated/torch.searchsorted.html?highlight=searchsorted#torch.searchsorted and https://pytorch.org/docs/stable/generated/torch.bucketize.html#torch.bucketize.
```python
# calling searchsorted on the x values.
ind = ynew.long()
searchsorted(v['x'].contiguous(), v['xnew'].contiguous(), ind)
```
Suggested change:
```diff
-searchsorted(v['x'].contiguous(), v['xnew'].contiguous(), ind)
+torch.searchsorted(v['x'].contiguous(), v['xnew'].contiguous(), ind)
```
To be honest, I have no clue how torch.searchsorted behaves compared to torchsearchsorted.searchsorted. Please check this yourself, as this is just an "educated" guess.
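One behavioural difference worth noting: unlike torchsearchsorted, `torch.searchsorted` (PyTorch >= 1.6) returns the index tensor rather than writing into a positional output argument; an output buffer, if needed, goes through the keyword-only `out=` argument. A small sketch:

```python
import torch  # requires PyTorch >= 1.6

x = torch.tensor([[1., 3., 5.]])
xnew = torch.tensor([[0., 3., 6.]])

# torch.searchsorted returns the indices; passing a tensor as a third
# positional argument (as with torchsearchsorted.searchsorted) would fail.
ind = torch.searchsorted(x, xnew)  # → tensor([[0, 1, 3]])
```

So the suggested one-line replacement above would likely need `out=ind` (or simply `ind = torch.searchsorted(...)`) rather than a positional `ind`.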
I'll have a look at it...
Yep, it works. I updated the torchinterp1d file; no need for the old dependency any more thanks to PyTorch 1.6.
I updated the other pull request. I guess it would be best if we just have one pull request instead of two with the same changes?
Yes, I agree.
I think it would be best to have a single PR. I only created a new one because I could not commit to yours (sorry, I am still a bit confused about GitHub and PRs). I propose that I rebase my additions on your PR again and change the default parameters according to your proposal in #105 (comment): I would set all absolute patch sizes to tuples of 0 and only use relative patch sizes if the absolute patch sizes are tuples of 0 (or every element is <= 0). This would make relative patch sizes the default, and users could override them with absolute patch sizes without having to change the relative patch sizes.
I think you need to allow me to commit to your PR, though. Let me know what you think :)
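The proposed default logic could be sketched roughly as follows. The function and parameter names are illustrative, not rising's actual API:

```python
def resolve_patch_size(absolute, relative, image_shape):
    """Sketch of the proposed defaults: absolute patch sizes default to
    tuples of 0; if every absolute entry is <= 0, fall back to the
    relative sizes scaled by the image shape."""
    if all(a <= 0 for a in absolute):
        return tuple(int(r * s) for r, s in zip(relative, image_shape))
    return tuple(absolute)
```

For an image of shape (32, 192, 192), relative sizes (0.25, 0.5, 0.5) with the default absolute sizes (0, 0, 0) would give a (8, 96, 96) patch, while any positive absolute sizes would win unchanged.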
```python
SEARCHSORTED_AVAILABLE = True
try:
    from torchsearchsorted import searchsorted
except ImportError:
    SEARCHSORTED_AVAILABLE = False
```
Suggested change (remove these lines):
```diff
-SEARCHSORTED_AVAILABLE = True
-try:
-    from torchsearchsorted import searchsorted
-except ImportError:
-    SEARCHSORTED_AVAILABLE = False
```
```python
if not SEARCHSORTED_AVAILABLE:
    raise Exception(
        'The interp1d function depends on the '
        'torchsearchsorted module, which is not available.\n'
        'You must get it at ',
        'https://github.com/aliutkus/torchsearchsorted \n')
```
Suggested change (remove these lines):
```diff
-if not SEARCHSORTED_AVAILABLE:
-    raise Exception(
-        'The interp1d function depends on the '
-        'torchsearchsorted module, which is not available.\n'
-        'You must get it at ',
-        'https://github.com/aliutkus/torchsearchsorted \n')
```
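Once the torchsearchsorted dependency is dropped, a minimal version guard could replace this check. This is only a sketch of the idea; rising's actual code may solve it differently:

```python
import torch

# Hypothetical guard: torch.searchsorted was added in PyTorch 1.6,
# so fail early on older versions instead of at call time.
major, minor = (int(p) for p in torch.__version__.split('.')[:2])
if (major, minor) < (1, 6):
    raise RuntimeError(
        'interp1d requires torch.searchsorted (PyTorch >= 1.6)')
```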
Short Description
This PR is based on PR #105 and fixes some small mistakes, e.g. block_size_y and block_size_z were sometimes based on img_rows. It extends the Genesis transforms to accept absolute or relative patch sizes, and all Inpainting transforms to accept custom border distances (which were hard-coded to 3 in #105). It also adds separate patch sizes for inpainting and outpainting in RandomInOrOutpainting.
Since this is one of my first PRs, I am not sure if everything is done properly, so please double-check everything and feel free to comment if anything is unclear.
Cheers
PR Checklist
PR Implementer
This is a small checklist for the implementation details of this PR.
If you submit a PR, please look at these points (don't worry about the RisingTeam and Reviewer workflows; their only purpose is to give a compact view of the steps).
I did not implement unit tests, but did some short tests for all changed transforms (ran with default and sometimes additional settings on images of size (32, 192, 192) for 100 iterations).
If there are any questions regarding code style or other conventions, check out our summary.
- `__all__` sections and `__init__`

RisingTeam
RisingTeam workflow
(Please make sure to communicate the current status of the PR.)
- closes #IssueNumber (at the bottom if not already in the description)

Reviewer
Reviewer workflow
- rising design conventions? Can you think of critical points which should be covered in an additional test?