We normally perform several transformations (e.g. resizing, spacing, orientation) on our inputs to train our model. As such, when we need to predict on new input data, we also need to transform that data to get the segmentation. However, the segmentation produced is then in the transformed space, so when we want to overlay it on top of our original input data, it will not be aligned with it. Thanks a lot.
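For context, here is a minimal sketch of the kind of pipeline described above, assuming MONAI's dictionary transforms (the transform names, keys, and parameters are illustrative, not taken from this thread, and exact class names may vary across MONAI versions):

```python
from monai.transforms import (
    Compose, LoadImaged, EnsureChannelFirstd, Orientationd, Spacingd, Resized,
)

# Typical training-time preprocessing: each step changes the image geometry.
preprocess = Compose([
    LoadImaged(keys=["image"]),                          # load image + affine metadata
    EnsureChannelFirstd(keys=["image"]),                 # add channel dim for spatial transforms
    Orientationd(keys=["image"], axcodes="RAS"),         # reorient axes
    Spacingd(keys=["image"], pixdim=(1.0, 1.0, 1.0)),    # resample voxel spacing
    Resized(keys=["image"], spatial_size=(96, 96, 96)),  # resize to a fixed shape
])

data = preprocess({"image": "subject_01.nii.gz"})        # hypothetical file name
# A segmentation predicted from data["image"] lives in this transformed space,
# so overlaying it directly on the original volume will be misaligned.
```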
Hi @janetkok ,
Thanks for your interest here.
Currently, not all the transforms update the affine matrix, so we can't revert all the transforms yet. We will work on it soon; tracking: #365
For your case, can you use only the `spacingd` and `orientationd` transforms, then `save_nifti` with the meta_data? Refer to: https://github.com/Project-MONAI/tutorials/blob/master/3d_segmentation/torch/unet_evaluation_dict.py#L96
Thanks.
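A hedged sketch of the workaround suggested above: keep only the geometry transforms whose changes are tracked in the image metadata, then write the prediction back using that metadata so it is resampled onto the original image grid. This assumes a recent MONAI release where `SaveImaged` is available; the linked tutorial uses `NiftiSaver` for the same purpose, and saver names such as `save_nifti` differ across versions:

```python
from monai.transforms import (
    Compose, LoadImaged, EnsureChannelFirstd, Orientationd, Spacingd, SaveImaged,
)

# Preprocessing limited to transforms whose geometry changes are recorded in metadata.
pre = Compose([
    LoadImaged(keys=["image"]),
    EnsureChannelFirstd(keys=["image"]),
    Orientationd(keys=["image"], axcodes="RAS"),
    Spacingd(keys=["image"], pixdim=(1.5, 1.5, 2.0)),
])

# Saver that resamples the prediction back to the original affine stored in the metadata.
save_pred = SaveImaged(
    keys=["pred"],
    output_dir="./out",
    output_postfix="seg",
    resample=True,
)

data = pre({"image": "subject_01.nii.gz"})      # hypothetical file name
# data["pred"] = model(data["image"][None])[0]  # run inference here; pred keeps the image metadata
# save_pred(data)                               # writes a NIfTI aligned with the original image
```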