5ttgen: Add Deep Atropos algorithm #3057

Open · wants to merge 11 commits into base: dev

Conversation

@LucSam commented Jan 2, 2025

This PR introduces a new option to the 5ttgen script: 5ttgen deep_atropos.
It may need syntax updates to match the dev branch structure, and I welcome guidance on making those changes.

Summary of Changes:
- A new script: lib/mrtrix3/_5ttgen/deep_atropos.py.
- This option leverages ANTsPyNet's antspynet.deep_atropos for tissue segmentation, addressing challenges with lesion-affected T1-weighted images.
- Tissue label values in the generated deep_atropos.nii.gz are as follows:
  1 = CSF
  2 = GM
  3 = WM
  4 = SCGM
  5 = Brainstem (BS)
  6 = Cerebellum (CER)
- Brainstem (BS) and cerebellum (CER) are treated as white matter (WM), reflecting the nature of the segmentation results.
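For context, a minimal sketch of how such a segmentation could be generated with ANTsPyNet (filenames are illustrative, and the exact invocation used by the script may differ):

import ants
import antspynet

# Read the T1-weighted image
t1 = ants.image_read('T1w.nii.gz')

# Run the Deep Atropos segmentation
result = antspynet.deep_atropos(t1)

# Write the integer label image (1 = CSF, 2 = GM, 3 = WM,
# 4 = SCGM, 5 = brainstem, 6 = cerebellum)
ants.image_write(result['segmentation_image'], 'deep_atropos.nii.gz')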
Why This Is Useful:

Lesion-affected or clinical-quality brain images may produce poor results with the existing 5ttgen options, or may be difficult to process at all (e.g. recon-all failing due to pathology or image quality). The affected options include:

- 5ttgen fsl
- 5ttgen freesurfer
- 5ttgen hsvs

These methods can fail to segment grey matter (GM) correctly when lesions alter tissue intensities, resulting in incorrect tissue assignments. The deep_atropos option could provide a robust alternative for T1-weighted images in such cases.

Testing and Feedback:
The changes have been tested, but further validation and review by others would be highly appreciated to confirm reliability across various datasets.

@Lestropie (Member)

Thanks for the contribution @LucSam. See if you're able to get the command executing correctly on dev; there have been several changes there that mean a direct copy of code written for the old API won't immediately work. If you get stuck with any of it, just say so and I can do the requisite transition myself.

The other thing I'd like to include here is a test. If ANTs deep_atropos can be executed on one of the existing T1w images from the Python script test data, then I can upload the output of this command into that repository; that then serves both as proof that the command produces a sensible output for a known input, and as a mechanism to detect whether any future proposed changes have a deleterious effect on this new algorithm.

@LucSam (Author) commented Jan 6, 2025

Thank you very much for your help, Rob. I've updated and tested the command on dev with sub-01 from the test data repository. Similar to the fsl method, I used mrgrid sub-01_T1w.nii.gz regrid T1_1mm.nii.gz -voxel 1.0 -interp sinc to refine the deep_atropos result. Edit: if needed, I could provide the files deep_atropos_seg.nii.gz, deep_atropos_5TT.mif, deep_atropos_5TT_vis, sub-01_T1w_1mm.nii.gz and the script generate_deep_atropos_segmentation.py corresponding to BIDS/sub-01/anat/sub-01_T1w.nii.gz.

LucSam and others added 2 commits January 8, 2025 11:05
- Add capability to take as input the concatenated tissue probability images as an alternative to the segmentation label image.
- Change default allocation of brain stem to be the 5th volume, in line with 5ttgen hsvs behaviour; this can be overridden using the -white-stem option.
- Change allocation of cerebellum to be subcortical grey matter.
- Import updated script test data, and add tests for the new command.
- Add RS to command author list.
@Lestropie (Member)

I succeeded in running deep atropos on my own system on a T1w from the existing test data. I noticed that in addition to the "segmentation image", which is an integer label image, it also yields for each voxel the probability ascribed to each component of the model. While these are not technically partial volume fractions, they can nevertheless be interpreted as such to produce a 5TT image that doesn't have the sharp jagged edges that you get from a label image. So I've added a code path to use those data. I have generated the outputs from the command under various configurations, and integrated them into tests (which don't run as part of CI but can be run locally). The test data are here. Note that as per comments in c411b25 I have changed the mapping of deep atropos components to 5TT tissue classes to fit what I think is best for tractography.
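As an illustration of that idea (this is a sketch, not the code actually added to the script), the probability maps could be combined into the five 5TT volumes along the lines of the mapping described in the commits above, with the cerebellum assigned to subcortical grey matter and the brainstem to the fifth volume by default:

import nibabel as nib
import numpy as np

# Assumed 4D input: the 7 Deep Atropos probability maps in the order
# background, CSF, GM, WM, SCGM, brainstem, cerebellum
prob = nib.load('probability_images.nii.gz')
p = prob.get_fdata()
csf, gm, wm, scgm, bstem, cer = (p[..., i] for i in range(1, 7))

# 5TT volume order: cortical GM, subcortical GM, WM, CSF, fifth volume
fivett = np.stack([gm,
                   scgm + cer,  # cerebellum treated as subcortical GM
                   wm,
                   csf,
                   bstem],      # brainstem in the fifth volume by default
                  axis=-1)

nib.save(nib.Nifti1Image(fivett.astype(np.float32), prob.affine), '5tt_sketch.nii.gz')

(The actual command performs additional processing; this snippet only illustrates the tissue mapping.)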

Here's the difference in conversion from discrete segmentation vs. probability maps:

[Image: 5ttgen_deep_atropos]

I am activating the CI workflows. @LucSam, you should see that there are a bunch of tests that will fail. Some of those will require that I propagate the changes in #3054 from master to dev; however, others will be specific to the proposed additions here. See how you go following the instructions regarding the extra steps that need to be done here.

- Added MRtrix3 copyright license header (2025)
- Fixed indentation issues throughout file to conform with MRtrix3's two-space standard
- Fixed pylint warnings:
  - Corrected bad indentation (changed 4/8 spaces to 2/4 spaces)
  - Added pylint disable statement for unused variables in usage and execute
  - Addressed f-string error
- Verified functionality through testing:
  - All deep_atropos tests passing (fromseg and fromprob with default and whitestem)

Related to MRtrix3#3057
@LucSam (Author) commented Jan 13, 2025

Dear @Lestropie, I was able to mimic your ants deep_atropos usage to save two images (segmentation_image.nii.gz & probability_images.nii.gz) – thank you very much for this hint and your overall guidance and help. I've made further modifications to deep_atropos.py to address code style issues and have completed testing:

  • Fixed all pylint warnings by correcting indentation, adding pylint disable statements and addressing f-string errors, e.g. W1309: Using an f-string that does not have any interpolated variables (f-string-without-interpolation) or W0311: Bad indentation. Found 4 spaces, expected 2 (bad-indentation)
  • Added the MRtrix3 copyright license header (cf. years 2024/2025); I am not sure whether this is generated automatically
  • Ran thorough testing, with all related tests passing successfully with ctest -L 5ttgen:
      Start 356: 5ttgen_deepatropos_fromseg_default
 1/24 Test #356: 5ttgen_deepatropos_fromseg_default ......   Passed    7.01 sec
      Start 357: 5ttgen_deepatropos_fromseg_whitestem
 2/24 Test #357: 5ttgen_deepatropos_fromseg_whitestem ....   Passed    4.12 sec
      Start 358: 5ttgen_deepatropos_fromprob_default
 3/24 Test #358: 5ttgen_deepatropos_fromprob_default .....   Passed    4.84 sec
      Start 359: 5ttgen_deepatropos_fromprob_whitestem
 4/24 Test #359: 5ttgen_deepatropos_fromprob_whitestem ...   Passed    5.19 sec

as well as with ctest -R deepatropos:

     Start 356: 5ttgen_deepatropos_fromseg_default
 1/4 Test #356: 5ttgen_deepatropos_fromseg_default ......   Passed    4.82 sec
     Start 357: 5ttgen_deepatropos_fromseg_whitestem
 2/4 Test #357: 5ttgen_deepatropos_fromseg_whitestem ....   Passed    5.70 sec
     Start 358: 5ttgen_deepatropos_fromprob_default
 3/4 Test #358: 5ttgen_deepatropos_fromprob_default .....   Passed    6.37 sec
     Start 359: 5ttgen_deepatropos_fromprob_whitestem
 4/4 Test #359: 5ttgen_deepatropos_fromprob_whitestem ...   Passed    6.78 sec
 
 100% tests passed, 0 tests failed out of 4
 
 Label Time Summary:
 5ttgen      =  23.68 sec*proc (4 tests)
 pythonci    =  11.19 sec*proc (2 tests)
 script      =  23.68 sec*proc (4 tests)
 
 Total Test time (real) =  23.77 sec

I've pushed these changes to the add_deep_atropos branch. Could you please review when you have a chance and/or advise me on any further steps? Best, Lucius

@Lestropie (Member)

I found it slightly clunky to get the data from antspynet out to NIfTIs for use here.

  • The segmentation image needs to be explicitly indexed from the deep_atropos result.
  • The probability images need to be indexed twice (once to get into the "probabilities" dataset, then again to select one of the 7 3D images). I then used mrcat to concatenate the 7 manually exported images into a single 4D dataset.

It might be useful to have example usages that show the steps necessary to get the images out; in particular, whether there's a way within antspynet to concatenate the 7 probability volumes into an output NIfTI rather than having to do it after the fact.
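For reference, the manual route looked roughly like this (a sketch with illustrative filenames, not the exact commands used):

import ants
import antspynet

t1 = ants.image_read('sub-01_T1w.nii.gz')
result = antspynet.deep_atropos(t1)

# The label image is one entry of the returned dictionary
ants.image_write(result['segmentation_image'], 'segmentation_image.nii.gz')

# Each of the 7 probability maps has to be exported individually ...
for index, prob in enumerate(result['probability_images']):
    ants.image_write(prob, f'prob_{index}.nii.gz')

# ... and then concatenated externally, e.g.:
#   mrcat prob_*.nii.gz probability_images.nii.gz -axis 3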

There's some outstanding independent CI issues currently that are precluding assessment of the proposal; I need to merge CI fixes from master into dev before I can do a final evaluation.

LucSam added a commit to LucSam/mrtrix3_LF that referenced this pull request Jan 14, 2025
@LucSam (Author) commented Jan 14, 2025

I've updated the deep_atropos.py script with documentation showing how to use antspynet to generate the input files. The documentation includes a Python code example that demonstrates how to directly generate a 4D NIfTI file containing all probability maps without needing manual concatenation:

import numpy as np
import nibabel as nib

prob_maps = np.stack([np.array(img.numpy()) for img in segments['probability_images']], axis=-1)
nib.save(nib.Nifti1Image(prob_maps, t1_image.affine), 'probabilities.nii.gz')

This produces a single 4D .nii.gz containing all 7 probability volumes:

mrinfo probability_images.nii.gz 
************************************************
Image name:          "probability_images.nii.gz"
************************************************
  Dimensions:        173 x 240 x 240 x 7
  Voxel size:        1 x 1 x 1 x 1
  Data strides:      [ 1 2 3 4 ]
  Format:            NIfTI-1.1 (GZip compressed)
  Data type:         32 bit float (little endian)
  Intensity scaling: offset = 0, multiplier = 1
  Transform:               0.9994    -0.02016     0.02959      -86.96
                          0.02067      0.9996    -0.01723      -131.7
                         -0.02923     0.01784      0.9994        -145

which can be used directly with 5ttgen: 5ttgen deep_atropos probabilities.nii.gz 5tt.mif

When trying to run ./docs/generate_user_docs.sh --build-dir build on macOS to generate the documentation, I encountered an error:

mrtrix3 % ./docs/generate_user_docs.sh --build-dir build
ls: --ignore=CMakeLists.txt: No such file or directory
ls: --ignore=__init__.py*: No such file or directory
./docs/generate_user_docs.sh: line 100: CMakeLists.txt: command not found
./docs/generate_user_docs.sh: line 105: CMakeLists.txt: command not found
./docs/generate_user_docs.sh: line 100: __init__: command not found
./docs/generate_user_docs.sh: line 105: __init__: command not found
./docs/generate_user_docs.sh: line 100: __init__.py.in: command not found
./docs/generate_user_docs.sh: line 105: __init__.py.in: command not found

This might be because macOS uses BSD ls, which handles flags differently from GNU ls, and there may also be directory-structure differences between branches. I also tried creating symbolic links to the files that appeared to be needed, without success.

@Lestropie self-assigned this Jan 20, 2025
@Lestropie (Member)

@daljit46: Are you able to look into docs/generate_user_docs.sh on BSD?

- Fix use of single vs double quotes so that examples can be copy-pasted into a terminal and executed successfully.
- Load the input image a second time using nibabel to get the affine transformation; the ANTsImage class modifies both the content and the representation of this information, and does not contain a member variable ".affine" as utilised in the original example.
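For reference, the corrected export could look something like this sketch (nibabel is used only to recover the affine; filenames and variable names are illustrative, not the exact documentation text):

import ants
import antspynet
import nibabel as nib
import numpy as np

segments = antspynet.deep_atropos(ants.image_read('sub-01_T1w.nii.gz'))

# Re-load the input with nibabel purely to obtain the original NIfTI affine;
# the ANTsImage class does not expose an ".affine" member
t1_nib = nib.load('sub-01_T1w.nii.gz')

prob_maps = np.stack([img.numpy() for img in segments['probability_images']], axis=-1)
nib.save(nib.Nifti1Image(prob_maps.astype(np.float32), t1_nib.affine), 'probabilities.nii.gz')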
@daljit46 (Member) commented Jan 21, 2025

@LucSam is correct: our documentation generation script only works on Linux, because ls --ignore is not supported by BSD ls.
We can make a separate PR to fix that. For now, @LucSam, if you replace lines 85-87 in docs/generate_user_docs.sh with the following code, you should be able to generate the documentation locally (just make sure that you don't commit this change).

for filepath in "${mrtrix_root}/python/mrtrix3/commands/"*.py; do
  base=$(basename "$filepath" .py)
  if [[ "$base" == "__init__" || "$base" == "CMakeLists" ]]; then
    continue
  fi
  cmdlist+=$'\n'"$base"
done

@Lestropie (Member)

Don't worry about needing to re-run the documentation generation script here: I've replaced your commit with one that fixes the typo both in the source code as you did, and in the documentation (which is all that script does). But yes, we should rectify the operation of that script in a separate PR.

This should be ready to merge once I have rectified the CI tests via #3061; I'm just having a fight with Docker trying to update the container dependencies.
