This repository has been archived by the owner on Apr 4, 2023. It is now read-only.

Can't Visualise results #50

Open
Shubham0209 opened this issue Feb 26, 2021 · 7 comments

Comments

@Shubham0209

Shubham0209 commented Feb 26, 2021

@zsyzzsoft Thank you so much for all your help so far.

I also wanted to visualize the results, such as the moving images, warped moving images, optical flow, etc. For that I ran eval.py and added the keys you mentioned, like keys = ['pt_mask', 'landmark_dists', 'jaccs', 'dices', 'jacobian_det', 'real_flow', 'image_fixed', 'warped_moving'].

But in the end I couldn't see any images anywhere, just the .txt file of evaluation results in the evaluate folder. Could you please let me know what other changes I have to make to visualize the results?

@zsyzzsoft
Collaborator

You need to parse the validation results of the added keys manually as in these lines.
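A minimal sketch of what that manual parsing could look like, assuming results is the per-pair dict produced during validation and the extra keys were added to keys beforehand; the helper name and output path are illustrative, not the repository's exact API:

    import os
    import numpy as np

    def dump_extra_results(results, out_dir='evaluate/vis'):
        # Save the raw arrays for the extra keys; turning them into PNGs is a
        # separate step (see the flow-rendering sketch further down the thread).
        os.makedirs(out_dir, exist_ok=True)
        for key in ('real_flow', 'warped_moving', 'image_fixed', 'image_moving'):
            if key in results:
                np.save(os.path.join(out_dir, key + '.npy'), np.asarray(results[key]))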

@Shubham0209
Author

Could you please share a code snippet showing how to do this for visualizing the optical flow? From that I can follow the same approach for visualizing the warped image, etc.

Thank you

@zsyzzsoft
Collaborator

You may follow these lines as an example and copy the related functions to eval.py.
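For the flow specifically, the usual idea behind a RenderFlow-style helper is to map two displacement components of one slice onto colour channels. A rough sketch, not the repository's actual function, assuming the flow has shape [X, Y, Z, 3]:

    import numpy as np

    def render_flow_slice(flow, z=None, coef=15.0):
        # flow: displacement field of assumed shape [X, Y, Z, 3]
        if z is None:
            z = flow.shape[2] // 2                    # take the middle slice
        fx, fy = flow[:, :, z, 0], flow[:, :, z, 1]
        rgb = np.stack([
            np.clip(fx * coef + 127.5, 0, 255),       # x-displacement -> red
            np.clip(fy * coef + 127.5, 0, 255),       # y-displacement -> green
            np.full_like(fx, 127.5),                  # flat blue channel
        ], axis=-1)
        return rgb.astype(np.uint8)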

@Shubham0209
Author

I used the following lines in eval.py to parse the validation results:

            # RenderFlow and show_image are the helper functions copied into eval.py
            # from the repository code as suggested above (os and skimage.io also need to be imported)
            im_flow = RenderFlow(results['real_flow'][0])
            skimage.io.imsave(os.path.join('flow.png'), im_flow)

            warped_img = results['warped_moving'][0]
            show_image(warped_img, os.path.join('warped.png'))

            fixed_img = results['image_fixed'][3]
            show_image(fixed_img, os.path.join('fixed.png'))

            moving_img = results['image_moving'][3]
            show_image(moving_img, os.path.join('moving.png'))

And I got the following images:

Moving image: [moving]

Optical flow: [flow]

Warped image: [warped]

Fixed image: [fixed]

I don't think these are accurate results, since the moving and fixed images appear to be exactly the same. Moreover, the summary of results after running eval.py is as follows:

Summary
Dice score: nan (nan)
Jacc score: 0.0 (0.0)
Landmark distance: 0.0 (0.0)
Jacobian determinant: 0.1843898892402649 (2.9802322387695312e-08) - (it came out the same for every image pair).

Could you please help me!!
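As a side note, a Dice score of nan typically just means the compared segmentation masks were empty, so the score is 0/0. A small illustration:

    import numpy as np

    def dice(a, b):
        # Dice = 2 * |A ∩ B| / (|A| + |B|); undefined (nan) when both masks are empty
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else float('nan')

    print(dice(np.zeros((4, 4)), np.zeros((4, 4))))   # nan - both masks empty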

@Shubham0209
Author

@zsyzzsoft What do warped_moving_0, warped_moving_1, etc. and real_flow_0, real_flow_1, etc. indicate?

Why are my moving image (i.e. results['image_moving']) and fixed image (i.e. results['image_fixed']) the same?

Why are my warped moving (results['warped_moving']) and results['warped_moving_1'] also the same?

PS: I have run train.py and eval.py on my own data of size 288×288×96.

@zsyzzsoft
Collaborator

Looks like there is something wrong with your dataset config.
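One quick way to narrow this down is to check whether the fixed and moving volumes stored in the .h5 file are already identical. A sketch with h5py; the file path and group names are placeholders:

    import h5py
    import numpy as np

    with h5py.File('validation.h5', 'r') as f:        # placeholder path
        mov = np.asarray(f['68_mov']['volume'])       # placeholder group names
        fix = np.asarray(f['68_fix']['volume'])
        print('shapes:', mov.shape, fix.shape)
        print('identical:', np.array_equal(mov, fix))
        print('mean |diff|:', float(np.abs(mov.astype(np.float64) - fix.astype(np.float64)).mean()))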

@Shubham0209
Author

Shubham0209 commented Mar 13, 2021

I have pairs of moving and fixed images (let's say 68_mov and 68_fix), and I also have their respective segmentation masks (68_mov_seg and 68_fix_seg). To create a validation .h5 file, I created a group 68_mov containing two datasets: 'segmentation' corresponding to 68_mov_seg and 'volume' corresponding to 68_mov. This is followed by a group 68_fix containing two datasets: 'segmentation' corresponding to 68_fix_seg and 'volume' corresponding to 68_fix. Since I have to perform pairwise matching, I run the command python eval.py -c weights/Mar14-0215 -g 1 --paired. Please correct me if I am wrong somewhere.
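For comparison, a minimal sketch of building a validation .h5 with the layout described above; the file name and the dummy arrays are placeholders for the real volumes and masks:

    import h5py
    import numpy as np

    # Placeholder arrays; in practice these are the volumes and masks loaded
    # from the .nii files (68_mov, 68_mov_seg, 68_fix, 68_fix_seg)
    mov_img = np.zeros((288, 288, 96), dtype=np.float32)
    mov_seg = np.zeros((288, 288, 96), dtype=np.uint8)
    fix_img = np.zeros((288, 288, 96), dtype=np.float32)
    fix_seg = np.zeros((288, 288, 96), dtype=np.uint8)

    with h5py.File('validation.h5', 'w') as f:        # placeholder file name
        g_mov = f.create_group('68_mov')
        g_mov.create_dataset('volume', data=mov_img)
        g_mov.create_dataset('segmentation', data=mov_seg)
        g_fix = f.create_group('68_fix')
        g_fix.create_dataset('volume', data=fix_img)
        g_fix.create_dataset('segmentation', data=fix_seg)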

Still, I get output like:

d068_fix_img.nii d068_mov_img.nii nan 0.0 0.0 0.16778272
d068_mov_img.nii d068_fix_img.nii nan 0.0 0.0 0.16778272
d070_fix_img.nii d070_mov_img.nii nan 0.0 0.0 0.16778272
d070_mov_img.nii d070_fix_img.nii nan 0.0 0.0 0.16778272
d071_fix_img.nii d071_mov_img.nii nan 0.0 0.0 0.16778272
d071_mov_img.nii d071_fix_img.nii nan 0.0 0.0 0.16778272

Moreover, as my dataset size is 288×288×96, you told me: 'The network assumes that input resolution is a multiple of 64. In your case, you may remove the 6-th level layers (conv6, conv6_1, pred6, and upsamp6to5) in the network as a workaround.' Hence I removed these parts from network/base_networks.py. Below is the code from base_networks.py:

class VTN(Network):
    def __init__(self, name, flow_multiplier=1., channels=16, **kwargs):
        super().__init__(name, **kwargs)
        self.flow_multiplier = flow_multiplier
        self.channels = channels

    def build(self, img1, img2):
        '''
            img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
        '''
        concatImgs = tf.concat([img1, img2], 4, 'concatImgs')

        dims = 3
        c = self.channels
        conv1 = convolveLeakyReLU('conv1',   concatImgs, c,   3, 2)  # 64 * 64 * 64
        conv2 = convolveLeakyReLU('conv2',   conv1,      c*2,   3, 2)  # 32 * 32 * 32
        conv3 = convolveLeakyReLU('conv3',   conv2,      c*4,   3, 2)
        conv3_1 = convolveLeakyReLU('conv3_1', conv3,      c*4,   3, 1)
        conv4 = convolveLeakyReLU('conv4',   conv3_1,    c*8,  3, 2)  # 16 * 16 * 16
        conv4_1 = convolveLeakyReLU('conv4_1', conv4,      c*8,  3, 1)
        conv5 = convolveLeakyReLU('conv5',   conv4_1,    c*16,  3, 2)  # 8 * 8 * 8
        conv5_1 = convolveLeakyReLU('conv5_1', conv5,      c*16,  3, 1)
        # conv6 = convolveLeakyReLU('conv6',   conv5_1,    c*32,  3, 2)  # 4 * 4 * 4
        # conv6_1 = convolveLeakyReLU('conv6_1', conv6,      c*32,  3, 1)
        # # 16 * 32 = 512 channels

        shape0 = concatImgs.shape.as_list()
        shape1 = conv1.shape.as_list()
        shape2 = conv2.shape.as_list()
        shape3 = conv3.shape.as_list()
        shape4 = conv4.shape.as_list()
        shape5 = conv5.shape.as_list()
        #shape6 = conv6.shape.as_list()

        #pred6 = convolve('pred6', conv6_1, dims, 3, 1)
        #upsamp6to5 = upconvolve('upsamp6to5', pred6, dims, 4, 2, shape5[1:4])
        # deconv5 = upconvolveLeakyReLU('deconv5', conv6_1, shape5[4], 4, 2, shape5[1:4])
        # concat5 = tf.concat([conv5_1, deconv5, upsamp6to5], 4, 'concat5')

        # With the 6-th level removed, the decoder now starts directly from conv5_1
        pred5 = convolve('pred5', conv5_1, dims, 3, 1)
        upsamp5to4 = upconvolve('upsamp5to4', pred5, dims, 4, 2, shape4[1:4])
        deconv4 = upconvolveLeakyReLU('deconv4', conv5_1, shape4[4], 4, 2, shape4[1:4])
        concat4 = tf.concat([conv4_1, deconv4, upsamp5to4], 4, 'concat4')  # channel = 512+256+2

        # pred5 = convolve('pred5', concat5, dims, 3, 1)
        # upsamp5to4 = upconvolve('upsamp5to4', pred5, dims, 4, 2, shape4[1:4])
        # deconv4 = upconvolveLeakyReLU('deconv4', concat5, shape4[4], 4, 2, shape4[1:4])
        # concat4 = tf.concat([conv4_1, deconv4, upsamp5to4], 4, 'concat4')  # channel = 512+256+2

        pred4 = convolve('pred4', concat4, dims, 3, 1)
        upsamp4to3 = upconvolve('upsamp4to3', pred4, dims, 4, 2, shape3[1:4])
        deconv3 = upconvolveLeakyReLU('deconv3', concat4, shape3[4], 4, 2, shape3[1:4])
        concat3 = tf.concat([conv3_1, deconv3, upsamp4to3],4, 'concat3')  # channel = 256+128+2

        pred3 = convolve('pred3', concat3, dims, 3, 1)
        upsamp3to2 = upconvolve('upsamp3to2', pred3, dims, 4, 2, shape2[1:4])
        deconv2 = upconvolveLeakyReLU('deconv2', concat3, shape2[4], 4, 2, shape2[1:4])
        concat2 = tf.concat([conv2, deconv2, upsamp3to2],4, 'concat2')  # channel = 128+64+2

        pred2 = convolve('pred2', concat2, dims, 3, 1)
        upsamp2to1 = upconvolve('upsamp2to1', pred2, dims, 4, 2, shape1[1:4])
        deconv1 = upconvolveLeakyReLU('deconv1', concat2, shape1[4], 4, 2, shape1[1:4])
        concat1 = tf.concat([conv1, deconv1, upsamp2to1], 4, 'concat1')
        pred0 = upconvolve('upsamp1to0', concat1, dims, 4, 2, shape0[1:4])

        return {'flow': pred0 * 20 * self.flow_multiplier}
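As a rough check, removing the 6-th level leaves five stride-2 convolutions (conv1 to conv5), so each input dimension only needs to be a multiple of 2^5 = 32 rather than 64, which 288×288×96 satisfies:

    # Minimal sanity check: with conv6/conv6_1 removed there are five stride-2
    # convolutions left, so each input dimension must be divisible by 2**5 = 32
    for dim in (288, 288, 96):
        assert dim % 32 == 0, f"{dim} is not a multiple of 32"
    print("288x288x96 is compatible with a 5-level pyramid")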

You have been of great help to me. Please help again!!
