This repository has been archived by the owner on Jan 12, 2024. It is now read-only.

Extension of PR#105 #112

Open · wants to merge 12 commits into base: master

Conversation

@p-wein (Contributor) commented Oct 15, 2020

Short Description

This PR is based on PR #105 and fixes some small mistakes, e.g. block_size_y and block_size_z were sometimes derived from img_rows. It extends the Genesis transforms to accept absolute or relative patch sizes and all Inpainting transforms to accept custom border distances (hard-coded to 3 in #105), and adds separate patch sizes for inpainting and outpainting in RandomInOrOutpainting.
Since this is one of my first PRs, I am not sure whether everything is done properly, so please double-check everything and feel free to comment if anything is unclear.
Cheers

PR Checklist

PR Implementer

This is a small checklist for the implementation details of this PR.
If you submit a PR, please look at these points (don't worry about the RisingTeam
and Reviewer workflows; their only purpose is to provide a compact view of
the steps). I did not implement unit tests, but I did run short tests for all changed transforms (with default and sometimes additional settings on images of size (32, 192, 192) for 100 iterations).

If there are any questions regarding code style or other conventions check out our
summary.

  • Implementation
  • Docstrings & Typing
  • Check __all__ sections and __init__
  • Unittests (look at the line coverage of your tests, the goal is 100%!)
  • Update notebooks & documentation if necessary
  • Pass all tests
  • Add the checksum of the last implementation commit to the Changelog

RisingTeam

RisingTeam workflow
  • Add pull request to project (optionally delete corresponding project note)
  • Assign the correct label (if you don't have permission to do this, someone will do it
    for you; please make sure to communicate the current status of the PR)
  • Does this PR close an Issue? (add closes #IssueNumber at the bottom if
    not already in description)

Reviewer

Reviewer workflow
  • Do all tests pass? (Unittests, NotebookTests, Documentation)
  • Does the implementation follow rising design conventions?
  • Are the tests useful? (!!!) Are additional tests needed?
    Can you think of critical points which should be covered in an additional test?
  • Optional: Check coverage locally / Check tests locally if GPU is necessary to execute

@review-notebook-app commented

Check out this pull request on ReviewNB to see visual diffs & provide feedback on Jupyter Notebooks.

Comment on lines +44 to +49
if not isinstance(dims, DiscreteParameter):
    if len(dims) > 2:
        dims = list(combinations(dims, 2))
    else:
        dims = (dims,)
    dims = DiscreteParameter(dims)
Contributor Author:
Missing correct handling for type(dims) == int; len(dims) on line 45 will raise a TypeError.
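
A minimal sketch of how the scalar case could be coerced before the len() check (DiscreteParameter is omitted so the snippet is self-contained; the wrapping itself follows the diff excerpt above, and the function name is hypothetical):

```python
from itertools import combinations

def normalize_dims(dims):
    # Hypothetical fix: wrap a bare int in a tuple so len(dims) below
    # does not raise TypeError.
    if isinstance(dims, int):
        dims = (dims,)
    if len(dims) > 2:
        # all 2-element combinations, as in the original snippet
        dims = list(combinations(dims, 2))
    else:
        dims = (dims,)
    return dims  # the diff would then wrap this in DiscreteParameter
```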

import contextlib
SEARCHSORTED_AVAILABLE = True
try:
    from torchsearchsorted import searchsorted
Member:
Can you add a Python implementation of the searchsorted functionality?

Contributor Author @p-wein commented Oct 22, 2020:

This is a feature from #105 (see #105 (comment)) by @weningerleon, so I am not sure what exactly is done there. To me it seems this was only necessary for PyTorch < 1.6, since 1.6 added https://pytorch.org/docs/stable/generated/torch.searchsorted.html?highlight=searchsorted#torch.searchsorted and https://pytorch.org/docs/stable/generated/torch.bucketize.html#torch.bucketize.
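
As a point of reference for the requested "Python implementation", the core of what torch.searchsorted computes (insertion indices into a sorted sequence) can be sketched with the standard library alone; this is only an illustration, not the tensor-aware implementation the PR would need:

```python
import bisect

def searchsorted_py(sorted_seq, values):
    # For each value, return the index at which it would be inserted into
    # sorted_seq to keep it sorted (left insertion point, matching the
    # default right=False behaviour of torch.searchsorted).
    return [bisect.bisect_left(sorted_seq, v) for v in values]
```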


# calling searchsorted on the x values.
ind = ynew.long()
searchsorted(v['x'].contiguous(), v['xnew'].contiguous(), ind)
Contributor Author:
Suggested change:
- searchsorted(v['x'].contiguous(), v['xnew'].contiguous(), ind)
+ torch.searchsorted(v['x'].contiguous(), v['xnew'].contiguous(), ind)

Contributor Author:

To be honest, I have no clue how torch.searchsorted behaves compared to torchsearchsorted.searchsorted. Please check this yourselves, as this is just an "educated" guess.

Contributor:

I'll have a look at it...

Contributor:

Yep, it works. I updated the torchinterp1d file; there is no need for the old dependency any more thanks to PyTorch 1.6.
I also updated the other pull request. I guess it would be best to have just one pull request instead of two with the same changes?

Contributor Author:

Yes, I agree.
I think it would be best to have a single PR. I only created a new one because I could not commit to yours (sorry, I am still a bit confused about GitHub and PRs). I propose that I rebase my additions on your PR again and change the default parameters according to your proposal in #105 (comment): set all absolute patch sizes to tuples of 0, and only use relative patch sizes if the absolute patch sizes are tuples of 0 (or each element <= 0). This would make relative patch sizes the default, and users could override them with absolute patch sizes without having to change the relative patch sizes.
I think you need to allow me to commit to your PR, though. Let me know what you think :)
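
The proposed default logic could look roughly like this (function and parameter names here are hypothetical, chosen only to illustrate the "absolute overrides relative" rule described above):

```python
def resolve_patch_size(img_shape,
                       abs_size=(0, 0, 0),
                       rel_size=(0.25, 0.25, 0.25)):
    # Relative patch sizes are the default; an absolute patch size with
    # any element > 0 overrides them. Names and default values are
    # illustrative, not taken from the PR.
    if any(s > 0 for s in abs_size):
        return tuple(abs_size)
    return tuple(int(round(r * d)) for r, d in zip(rel_size, img_shape))
```

With the image size used for the manual tests, (32, 192, 192), the relative default would yield a (8, 48, 48) patch, while passing e.g. abs_size=(16, 64, 64) would take precedence.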

Comment on lines +36 to +40
SEARCHSORTED_AVAILABLE = True
try:
    from torchsearchsorted import searchsorted
except ImportError:
    SEARCHSORTED_AVAILABLE = False
Contributor Author:

Suggested change:
- SEARCHSORTED_AVAILABLE = True
- try:
-     from torchsearchsorted import searchsorted
- except ImportError:
-     SEARCHSORTED_AVAILABLE = False

Comment on lines +74 to +79
if not SEARCHSORTED_AVAILABLE:
    raise Exception(
        'The interp1d function depends on the '
        'torchsearchsorted module, which is not available.\n'
        'You must get it at ',
        'https://github.com/aliutkus/torchsearchsorted \n')
Contributor Author:

Suggested change:
- if not SEARCHSORTED_AVAILABLE:
-     raise Exception(
-         'The interp1d function depends on the '
-         'torchsearchsorted module, which is not available.\n'
-         'You must get it at ',
-         'https://github.com/aliutkus/torchsearchsorted \n')


3 participants