Adding robustica option to ICA decomposition to achieve consistent results (#1013)

* Add robustica method

* Incorporation of major comments regarding robustica addition

Manual modification of commit f2cdb4e to remove unwanted file additions.

* Add robustica 0.1.3 to dependency list

Cherry-pick of 41354cb.

* Multiple fixes to RobustICA addition from code review

From: BahmanTahayori#2.

Co-authored-by: Robert E. Smith <[email protected]>

* Specify magic number fixed seed of 42 as a constant

Cherry-pick of da1b128 (with modification).

* Updated

* Robustica Updates

* Incorporating the third round of Robert E. Smith's comments

* Enhance the "ica_method" description suggested by D. Handwerker

Co-authored-by: Dan Handwerker <[email protected]>

* Enhancing the "n_robust_runs" description suggested by D. Handwerker

Co-authored-by: Dan Handwerker <[email protected]>

* RobustICA: Restructure code loop over robust methods (#4)

* RobustICA: Restructure code loop over robust methods

* Addressing the issue with try/except

---------

Co-authored-by: Bahman <[email protected]>

* Applied suggested changes

In this commit, some of the comments from Daniel Handwerker and Robert
Smith were incorporated.

* Incorporating more comments

* Fixing the problem of argument parser for n_robust_runs.

* Removing unnecessary tests from the test_integration. There are 3
  echo-count tests as before, but the ica_method is robustica for the five-
  and three-echo tests and fastica for the four-echo test.

* Adding already requested changes

* fixed failing tests

* updated documentation in faq.rst

* more documentation changes

* Update docs/faq.rst

Co-authored-by: Robert Smith <[email protected]>

* Update docs/faq.rst

Co-authored-by: Robert Smith <[email protected]>

* Aligning robustICA with current Main + (#5)

* Limit current adaptive mask method to brain mask (#1060)

* Limit adaptive mask calculation to brain mask.

Limit adaptive mask calculation to brain mask.

Expand on logic of first adaptive mask method.

Update tedana/utils.py

Improve docstring.

Update test.

Add decreasing-signal-based adaptive mask.

Keep removing.

Co-Authored-By: Dan Handwerker <[email protected]>

* Use `compute_epi_mask` in t2smap workflow.

* Try fixing the tests.

* Fix make_adaptive_mask.

* Update test_utils.py

* Update test_utils.py

* Improve docstring.

* Update utils.py

* Update test_utils.py

* Revert "Update test_utils.py"

This reverts commit 259b002.

* Don't take absolute value of echo means.

* Log echo-wise thresholds in adaptive mask.

* Add comment about non-zero voxels.

* Update utils.py

* Update test_utils.py

* Update test_utils.py

* Update test_utils.py

* Log the thresholds again.

* Address review.

* Fix test.

---------

Co-authored-by: Dan Handwerker <[email protected]>

* Update nilearn requirement from <=0.10.3,>=0.7 to >=0.7,<=0.10.4 (#1077)

* Add adaptive mask plot to report (#1073)

* Update scikit-learn requirement (#1075)

Updates the requirements on [scikit-learn](https://github.com/scikit-learn/scikit-learn) to permit the latest version.
- [Release notes](https://github.com/scikit-learn/scikit-learn/releases)
- [Commits](scikit-learn/scikit-learn@0.21.0...1.4.2)

---
updated-dependencies:
- dependency-name: scikit-learn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Update pandas requirement from <=2.2.1,>=2.0 to >=2.0,<=2.2.2 (#1076)

Updates the requirements on [pandas](https://github.com/pandas-dev/pandas) to permit the latest version.
- [Release notes](https://github.com/pandas-dev/pandas/releases)
- [Commits](pandas-dev/pandas@v2.0.0...v2.2.2)

---
updated-dependencies:
- dependency-name: pandas
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Update bokeh requirement from <=3.4.0,>=1.0.0 to >=1.0.0,<=3.4.1 (#1078)

Updates the requirements on [bokeh](https://github.com/bokeh/bokeh) to permit the latest version.
- [Changelog](https://github.com/bokeh/bokeh/blob/branch-3.5/docs/CHANGELOG)
- [Commits](bokeh/bokeh@1.0.0...3.4.1)

---
updated-dependencies:
- dependency-name: bokeh
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Load user-defined mask as expected by plot_adaptive_mask (#1079)

* DOC desc-optcomDenoised -> desc-denoised (#1080)

* docs: add mvdoc as a contributor for code, bug, and doc (#1082)

* docs: update README.md

* docs: update .all-contributorsrc

---------

Co-authored-by: allcontributors[bot] <46447321+allcontributors[bot]@users.noreply.github.com>

* Identify the last good echo in adaptive mask instead of sum of good echoes (#1061)

* Limit adaptive mask calculation to brain mask.

Limit adaptive mask calculation to brain mask.

Expand on logic of first adaptive mask method.

Update tedana/utils.py

Improve docstring.

Update test.

Add decreasing-signal-based adaptive mask.

Keep removing.

Co-Authored-By: Dan Handwerker <[email protected]>

* Use `compute_epi_mask` in t2smap workflow.

* Try fixing the tests.

* Fix make_adaptive_mask.

* Update test_utils.py

* Update test_utils.py

* Improve docstring.

* Identify the last good echo instead of sum.

Improve docstring.

Update test_utils.py

Update test_utils.py

Fix make_adaptive_mask.

Try fixing the tests.

Use `compute_epi_mask` in t2smap workflow.

Limit adaptive mask calculation to brain mask.

Limit adaptive mask calculation to brain mask.

Expand on logic of first adaptive mask method.

Update tedana/utils.py

Improve docstring.

Update test.

Add decreasing-signal-based adaptive mask.

Keep removing.

Co-Authored-By: Dan Handwerker <[email protected]>

* Fix.

* Update utils.py

* Update utils.py

* Try fixing.

* Update utils.py

* Update utils.py

* add checks

* Just loop over voxels.

* Update utils.py

* Update utils.py

* Update test_utils.py

* Revert "Update test_utils.py"

This reverts commit 259b002.

* Update test_utils.py

* Update test_utils.py

* Remove checks.

* Don't take absolute value of echo means.

* Log echo-wise thresholds in adaptive mask.

* Add comment about non-zero voxels.

* Update utils.py

* Update utils.py

* Update test_utils.py

* Update test_utils.py

* Update test_utils.py

* Log the thresholds again.

* Update test_utils.py

* Update test_utils.py

* Update test_utils.py

* Add simulated data to adaptive mask test.

* Clean up the tests.

* Add value that tests the base mask.

* Remove print in test.

* Update tedana/utils.py

Co-authored-by: Dan Handwerker <[email protected]>

* Update tedana/utils.py

Co-authored-by: Dan Handwerker <[email protected]>

---------

Co-authored-by: Dan Handwerker <[email protected]>

* Output RMSE map and time series for decay model fit (#1044)

* Draft function to calculate decay model fit.

* Calculate root mean squared error instead.

* Incorporate metrics.

* Output RMSE results.

* Output results in tedana.

* Hopefully fix things.

* Update decay.py

* Try improving performance.

* Update decay.py

* Fix again.

* Use tqdm.

* Update decay.py

* Update decay.py

* Update decay.py

* Update expected outputs.

* Add figures.

* Update outputs.

* Include global signal in confounds file.

* Update fiu_four_echo_outputs.txt

* Rename function.

* Rename function.

* Update tedana.py

* Update tedana/decay.py

Co-authored-by: Dan Handwerker <[email protected]>

* Update decay.py

* Update decay.py

* Whoops.

* Apply suggestions from code review

Co-authored-by: Dan Handwerker <[email protected]>

* Fix things maybe.

* Fix things.

* Update decay.py

* Remove any files that are built through appending.

* Update outputs.

* Add section on plots to docs.

* Fix the description.

* Update docs/outputs.rst

Co-authored-by: Dan Handwerker <[email protected]>

* Update docs/outputs.rst

* Fix docstring.

---------

Co-authored-by: Dan Handwerker <[email protected]>

* minimum nilearn 0.10.3 (#1094)

* Use nearest-neighbors interpolation in `plot_component` (#1098)

* Use nearest-neighbors interpolation in plot_stat_map.

* Only use NN interp for component maps.

* Update scipy requirement from <=1.13.0,>=1.2.0 to >=1.2.0,<=1.13.1 (#1100)

Updates the requirements on [scipy](https://github.com/scipy/scipy) to permit the latest version.
- [Release notes](https://github.com/scipy/scipy/releases)
- [Commits](scipy/scipy@v1.2.0...v1.13.1)

---
updated-dependencies:
- dependency-name: scipy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Update scikit-learn requirement from <=1.4.2,>=0.21 to >=0.21,<=1.5.0 (#1101)

Updates the requirements on [scikit-learn](https://github.com/scikit-learn/scikit-learn) to permit the latest version.
- [Release notes](https://github.com/scikit-learn/scikit-learn/releases)
- [Commits](scikit-learn/scikit-learn@0.21.0...1.5.0)

---
updated-dependencies:
- dependency-name: scikit-learn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Update numpy requirement from <=1.26.4,>=1.16 to >=1.16,<=2.0.0 (#1104)

* Update numpy requirement from <=1.26.4,>=1.16 to >=1.16,<=2.0.0

Updates the requirements on [numpy](https://github.com/numpy/numpy) to permit the latest version.
- [Release notes](https://github.com/numpy/numpy/releases)
- [Changelog](https://github.com/numpy/numpy/blob/main/doc/RELEASE_WALKTHROUGH.rst)
- [Commits](numpy/numpy@v1.16.0...v2.0.0)

---
updated-dependencies:
- dependency-name: numpy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>

* Use np.nan instead of np.NaN

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Taylor Salo <[email protected]>

* Filter out non-diagonal affine warning (#1103)

* Filter out non-diagonal affine warning.

* Fix warning capture.

* Update tedana/reporting/static_figures.py

Co-authored-by: Dan Handwerker <[email protected]>

* Update static_figures.py

---------

Co-authored-by: Dan Handwerker <[email protected]>

* Update bokeh requirement from <=3.4.1,>=1.0.0 to <=3.5.0,>=3.5.0 (#1109)

* Update bokeh requirement from <=3.4.1,>=1.0.0 to <=3.5.0,>=3.5.0

Updates the requirements on [bokeh](https://github.com/bokeh/bokeh) to permit the latest version.
- [Changelog](https://github.com/bokeh/bokeh/blob/branch-3.6/docs/CHANGELOG)
- [Commits](bokeh/bokeh@1.0.0...3.5.0)

---
updated-dependencies:
- dependency-name: bokeh
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>

* Update pyproject.toml

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Taylor Salo <[email protected]>

* Update scikit-learn requirement from <=1.5.0,>=0.21 to <=1.5.1,>=1.5.1 (#1108)

* Update scikit-learn requirement from <=1.5.0,>=0.21 to <=1.5.1,>=1.5.1

Updates the requirements on [scikit-learn](https://github.com/scikit-learn/scikit-learn) to permit the latest version.
- [Release notes](https://github.com/scikit-learn/scikit-learn/releases)
- [Commits](scikit-learn/scikit-learn@0.21.0...1.5.1)

---
updated-dependencies:
- dependency-name: scikit-learn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>

* Update pyproject.toml to restore minimum version of scikit-learn

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Dan Handwerker <[email protected]>

* Update scipy requirement from <=1.13.1,>=1.2.0 to <=1.14.0,>=1.14.0 (#1106)

* Update scipy requirement from <=1.13.1,>=1.2.0 to <=1.14.0,>=1.14.0

Updates the requirements on [scipy](https://github.com/scipy/scipy) to permit the latest version.
- [Release notes](https://github.com/scipy/scipy/releases)
- [Commits](scipy/scipy@v1.2.0...v1.14.0)

---
updated-dependencies:
- dependency-name: scipy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>

* Update pyproject.toml to retain minimum version of scipy

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Dan Handwerker <[email protected]>
Co-authored-by: Eneko Uruñuela <[email protected]>

* Update numpy requirement from <=2.0.0,>=1.16 to >=1.16,<=2.0.1 (#1112)

Updates the requirements on [numpy](https://github.com/numpy/numpy) to permit the latest version.
- [Release notes](https://github.com/numpy/numpy/releases)
- [Changelog](https://github.com/numpy/numpy/blob/main/doc/RELEASE_WALKTHROUGH.rst)
- [Commits](numpy/numpy@v1.16.0...v2.0.1)

---
updated-dependencies:
- dependency-name: numpy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Cleaning up installation instructions (#1113)

* install instructions

* Update docs/installation.rst

Co-authored-by: Taylor Salo <[email protected]>

* Update docs/installation.rst

Co-authored-by: Eneko Uruñuela <[email protected]>

---------

Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Eneko Uruñuela <[email protected]>

* Update bokeh requirement from <=3.5.0,>=1.0.0 to >=1.0.0,<=3.5.1 (#1116)

Updates the requirements on [bokeh](https://github.com/bokeh/bokeh) to permit the latest version.
- [Changelog](https://github.com/bokeh/bokeh/blob/3.5.1/docs/CHANGELOG)
- [Commits](bokeh/bokeh@1.0.0...3.5.1)

---
updated-dependencies:
- dependency-name: bokeh
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Update list of multi-echo datasets (#1115)

* Generate metrics from external regressors using F stats (#1064)

* Get required metrics from decision tree.

* Continue changes.

* More updates.

* Store necessary_metrics as a list.

* Update selection_nodes.py

* Update selection_utils.py

* Update across the package.

* Keep updating.

* Update tedana.py

* Add extra metrics to list.

* Update ica_reclassify.py

* Draft metric-based regressor correlations.

* Fix typo.

* Work on trees.

* Expand regular expressions in trees.

* Fix up the expansion.

* Really fix it though.

* Fix style issue.

* Added external regress integration test

* Got integration test with external regressors working

* Added F tests and options

* added corr_no_detrend.json

* updated names and reporting

* Run black.

* Address style issues.

* Try fixing test bugs.

* Update test_component_selector.py

* Update component_selector.py

* Use component table directly in selectcomps2use.

* Fix.

* Include generated metrics in necessary metrics.

* Update component_selector.py

* responding to feedback from tsalo

* Update component_selector.py

* Update test_component_selector.py

* fixed some testing failures

* fixed test_check_null_succeeds

* fixed ica_reclassify bug and selector_properties test

* ComponentSelector initialized before loading data

* fixed docstrings

* updated building decision tree docs

* using external regressors and most tests passing

* removed corr added tasks

* fit_model moved to stats

* removed and cleaned up external_regressors_config option

* Added task regressors and some tests. Now all in decision tree

* cleaning up decision tree json files

* removed mot12_csf.json changed task to signal

* fixed tests with task_keep signal

* Update tedana/metrics/external.py

Co-authored-by: Taylor Salo <[email protected]>

* Update tedana/metrics/_utils.py

Co-authored-by: Taylor Salo <[email protected]>

* Update tedana/metrics/collect.py

Co-authored-by: Taylor Salo <[email protected]>

* Update tedana/metrics/external.py

Co-authored-by: Taylor Salo <[email protected]>

* Update tedana/metrics/external.py

Co-authored-by: Taylor Salo <[email protected]>

* Responding to review comments

* reworded docstring

* Added type hints to external.py

* fixed external.py type hints

* type hints to _utils collect and component_selector

* type hints and doc improvements in selection_utils

* no expand_node recursion

* removed expand_nodes expand_node expand_dict

* docstring lines break on punctuation

* updating external tests and docs

* moved test data downloading to tests.utils.py and started test for fit_regressors

* fixed bug where task regressors retained in partial models

* matched testing external regressors to included mixing and fixed bugs

* Made single function for detrending regressors

* added tests for external fit_regressors and fix_mixing_to_regressors

* Full tests in test_external_metrics.py

* adding tests

* fixed extern regress validation warnings and added tests

* sorting set values for test outputs

* added to test_metrics

* Added docs to building_decision_trees.rst

* Added motion task decision tree flow chart

* made recommended change to external_regressor_config

* Finished documentation and renamed demo decision trees

* added link to example external regressors tsv file

* Apply suggestions from code review

Fixed nuisance typos

Co-authored-by: Taylor Salo <[email protected]>

* Minor documentation edits

---------

Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Neha Reddy <[email protected]>

* Link to the open-multi-echo-data website (#1117)

* Update multi-echo.rst

* Update multi-echo.rst

* Refactor `metrics.dependence` module (#1088)

* Add type hints to metric functions.

* Use keyword arguments.

* Update tests.

* Update dependence.py

* Update collect.py

* Fix other stuff.

* documentation and resource updates (#1114)

* documentation and resource updates

* Fixed citation numbering and updated posters

---------

Co-authored-by: Neha Reddy <[email protected]>

* Adding already requested changes

* fixed failing tests

* updated documentation in faq.rst

* more documentation changes

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Matteo Visconti di Oleggio Castello <[email protected]>
Co-authored-by: allcontributors[bot] <46447321+allcontributors[bot]@users.noreply.github.com>
Co-authored-by: Eneko Uruñuela <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Neha Reddy <[email protected]>

* align with main

* fixed ica.py docstring error

* added scikit-learn-extra to pyproject and changed ref name

* increment circleci version keys

* Removing the scikit-learn-extra dependency

* Updating pyproject.toml file

* Minor changes to make the help more readable

* Minor changes

* upgrading to robustica 0.1.4

* Update docs

Co-authored-by: Dan Handwerker <[email protected]>

* updating utils.py, toml file and the docs

* minor change to utils.py

* Incorporating Eneko's comments

Co-authored-by: Eneko Uruñuela <[email protected]>

* Added a warning when the clustering method is changed

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: Robert E. Smith <[email protected]>
Co-authored-by: Dan Handwerker <[email protected]>
Co-authored-by: handwerkerd <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Matteo Visconti di Oleggio Castello <[email protected]>
Co-authored-by: allcontributors[bot] <46447321+allcontributors[bot]@users.noreply.github.com>
Co-authored-by: Eneko Uruñuela <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Taylor Salo <[email protected]>
Co-authored-by: Neha Reddy <[email protected]>
12 people authored Sep 23, 2024
1 parent 30a9a66 commit df58347
Showing 13 changed files with 412 additions and 62 deletions.
36 changes: 18 additions & 18 deletions .circleci/config.yml
@@ -13,7 +13,7 @@ jobs:
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Generate environment
command: |
@@ -23,7 +23,7 @@
pip install -e .[tests]
fi
- save_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
paths:
- /opt/conda/envs/tedana_py38

@@ -34,7 +34,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Running unit tests
command: |
@@ -56,7 +56,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py39-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py39-v3-{{ checksum "pyproject.toml" }}
- run:
name: Generate environment
command: |
@@ -75,7 +75,7 @@
mkdir /tmp/src/coverage
mv /tmp/src/tedana/.coverage /tmp/src/coverage/.coverage.py39
- save_cache:
-          key: conda-py39-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py39-v3-{{ checksum "pyproject.toml" }}
paths:
- /opt/conda/envs/tedana_py39
- persist_to_workspace:
@@ -90,7 +90,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py310-v1-{{ checksum "pyproject.toml" }}
+          key: conda-py310-v3-{{ checksum "pyproject.toml" }}
- run:
name: Generate environment
command: |
@@ -109,7 +109,7 @@
mkdir /tmp/src/coverage
mv /tmp/src/tedana/.coverage /tmp/src/coverage/.coverage.py310
- save_cache:
-          key: conda-py310-v1-{{ checksum "pyproject.toml" }}
+          key: conda-py310-v3-{{ checksum "pyproject.toml" }}
paths:
- /opt/conda/envs/tedana_py310
- persist_to_workspace:
@@ -124,7 +124,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py311-v1-{{ checksum "pyproject.toml" }}
+          key: conda-py311-v3-{{ checksum "pyproject.toml" }}
- run:
name: Generate environment
command: |
@@ -143,7 +143,7 @@
mkdir /tmp/src/coverage
mv /tmp/src/tedana/.coverage /tmp/src/coverage/.coverage.py311
- save_cache:
-          key: conda-py311-v1-{{ checksum "pyproject.toml" }}
+          key: conda-py311-v3-{{ checksum "pyproject.toml" }}
paths:
- /opt/conda/envs/tedana_py311
- persist_to_workspace:
@@ -158,7 +158,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py312-v1-{{ checksum "pyproject.toml" }}
+          key: conda-py312-v3-{{ checksum "pyproject.toml" }}
- run:
name: Generate environment
command: |
@@ -177,7 +177,7 @@
mkdir /tmp/src/coverage
mv /tmp/src/tedana/.coverage /tmp/src/coverage/.coverage.py312
- save_cache:
-          key: conda-py312-v1-{{ checksum "pyproject.toml" }}
+          key: conda-py312-v3-{{ checksum "pyproject.toml" }}
paths:
- /opt/conda/envs/tedana_py312
- persist_to_workspace:
@@ -192,7 +192,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Style check
command: |
@@ -208,7 +208,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Run integration tests
no_output_timeout: 40m
@@ -233,7 +233,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Run integration tests
no_output_timeout: 40m
@@ -258,7 +258,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Run integration tests
no_output_timeout: 40m
@@ -283,7 +283,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Run integration tests
no_output_timeout: 40m
@@ -308,7 +308,7 @@
steps:
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Run integration tests
no_output_timeout: 40m
@@ -335,7 +335,7 @@
at: /tmp
- checkout
- restore_cache:
-          key: conda-py38-v2-{{ checksum "pyproject.toml" }}
+          key: conda-py38-v3-{{ checksum "pyproject.toml" }}
- run:
name: Merge coverage files
command: |
4 changes: 4 additions & 0 deletions docs/approach.rst
@@ -334,6 +334,8 @@ Next, ``tedana`` applies TE-dependent independent component analysis (ICA) in
order to identify and remove TE-independent (i.e., non-BOLD noise) components.
The dimensionally reduced optimally combined data are first subjected to ICA in
order to fit a mixing matrix to the whitened data.
``tedana`` can use a single iteration of FastICA or multiple iterations of robustICA,
with an explanation of those approaches `in our FAQ`_.
This generates a number of independent timeseries (saved as **desc-ICA_mixing.tsv**),
as well as parameter estimate maps which show the spatial loading of these components on the
brain (**desc-ICA_components.nii.gz**).
@@ -380,6 +382,8 @@ yielding a denoised timeseries, which is saved as **desc-denoised_bold.nii.gz**.

.. image:: /_static/a15_denoised_data_timeseries.png

.. _in our FAQ: faq.html#tedana-what-is-the-right-number-of-ica-components-what-options-let-me-get-it
.. _These decision trees are detailed here: included_decision_trees.html

*******************************
Manual classification with RICA
67 changes: 67 additions & 0 deletions docs/faq.rst
@@ -93,11 +93,78 @@ The TEDICA step may fail to converge if TEDPCA is either too strict
With updates to the ``tedana`` code, this issue is now rare, but it may happen
when preprocessing has not been applied to the data, or when improper steps have
been applied to the data (e.g. rescaling, nuisance regression).
It can also still happen when everything is seemingly correct
(see the answer to the next question).
If you are confident that your data have been preprocessed correctly prior to
applying tedana, and you encounter this problem, please submit a question to `NeuroStars`_.

.. _NeuroStars: https://neurostars.org

*********************************************************************************
[tedana] What is the right number of ICA components & what options let me get it?
*********************************************************************************

Part of the PCA step in ``tedana`` processing involves identifying the number of
components that contain meaningful signal.
The PCA components are then used to calculate the same number of ICA components.
The ``--tedpca`` option includes several options to identify the "correct" number
of PCA components.
``kundu`` and ``kundu-stabilize`` use several echo-based criteria to exclude PCA
components that are unlikely to contain T2* or S0 signal.
``mdl`` (conservative & fewest components), ``kic``,
& ``aic`` (liberal & more components) use `MAPCA`_.
Within the same general method, each uses a cost function to find a minimum
where more components no longer model meaningful variance.
For some datasets we see all methods fail and result in too few or too many components.
There is no consistent number of components or % variance explained to define the correct number.
The correct number of components will depend on the noise levels of the data.
For example, smaller voxels will result in more thermal noise and less total variance explained.
A dataset with more head motion artifacts will have more variance explained,
since more structured signal is within the head motion artifacts.
The clear failure cases are extreme: that is, getting fewer than 1/5 the number of components
compared to time points, or having nearly as many components as time points.
We are working on identifying why this happens and adding better solutions.
Our current guess is that most of the above methods assume data are
independent and identically distributed (IID),
and signal leakage from in-slice and multi-slice acceleration may violate this assumption.

We have one option that is generally useful and is also a partial solution.
``--ica_method robustica`` will run `robustica`_.
This is a method that, for a given number of PCA components,
will repeatedly run ICA and identify components that are stable across iterations.
While running ICA multiple times will slow processing, as a general benefit
it makes the ICA results less sensitive to the initialization parameters,
computer hardware, and software versions.
This will result in better stability and replicability of ICA results.
Additionally, `robustica`_ almost always results in fewer components than initially prescribed,
since there are fewer stable components across iterations than the total number of components.
This means, even if the initial PCA component estimate is a bit off,
the number of resulting robust ICA components will represent stable information in the data.
For a dataset where the PCA component estimation methods are failing,
one could use ``--tedpca`` with a fixed integer for a constant number of components,
that is on the high end of the typical number of components for a study,
and then `robustica`_ will reduce the number of components to only find stable information.
That said, if the fixed PCA component number is too high,
then the method will have too many unstable components,
and if the fixed PCA component number is too low, then there will be even fewer ICA components.
With this approach, the number of ICA components is more consistent,
but is still sensitive to the initial number of PCA components.
For example, for a single dataset 60 PCA components might result in 46 stable ICA components,
while 55 PCA components might result in 43 stable ICA components.
We are still testing how these interact to give better recommendations for even more stable results.
While the TEDANA developers expect that ``--ica_method robustica`` may become
the default configuration in future TEDANA versions,
it is first being released to the public as a non-default option
in hope of gaining insight into its behaviour
across a broader range of multi-echo fMRI data.
If users are having trouble with PCA component estimation failing on a dataset,
we recommend using RobustICA,
and we invite users to send us feedback on its behavior and efficacy.


.. _MAPCA: https://github.com/ME-ICA/mapca
.. _robustica: https://github.com/CRG-CNAG/robustica
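The run-and-cluster idea described in this FAQ answer can be sketched in plain Python. This is an illustration only, not robustica's or tedana's implementation: the function names, thresholds, and the simulated "ICA runs" below are all invented for the demo. Real ICA runs differ by random initialization; here that is mimicked by sign flips and noise on two recurring sources plus one non-recurring noise component per run.

```python
# Sketch of "robust ICA": repeat ICA, cluster components across runs by
# absolute correlation, keep only clusters that recur in most runs.
# All names/thresholds here are hypothetical, for illustration.
import math
import random

random.seed(42)  # tedana pins a fixed seed (42) for reproducibility


def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)


def cluster_components(runs, r_thresh=0.95, min_frac=0.8):
    """Greedily cluster components across runs by |correlation|.

    Returns the averaged time series of clusters seen in at least
    ``min_frac`` of the runs -- the "stable" components.
    """
    clusters = []  # each cluster: list of (run_index, component)
    for run_idx, components in enumerate(runs):
        for comp in components:
            for cl in clusters:
                r = corr(cl[0][1], comp)
                if abs(r) >= r_thresh:
                    # align sign before pooling (ICA is sign-ambiguous)
                    sign = 1 if r >= 0 else -1
                    cl.append((run_idx, [sign * v for v in comp]))
                    break
            else:
                clusters.append([(run_idx, comp)])
    stable = []
    for cl in clusters:
        if len({i for i, _ in cl}) >= min_frac * len(runs):
            members = [c for _, c in cl]
            # cluster centroid = the "robust" component
            stable.append([sum(v) / len(members) for v in zip(*members)])
    return stable


# Simulate 10 "ICA runs": two genuine sources recur (with sign flips and
# noise); each run also yields one pure-noise component that never recurs.
src1 = [math.sin(2 * math.pi * i / 50) for i in range(200)]
src2 = [1.0 if (i // 40) % 2 else -1.0 for i in range(200)]
runs = []
for _ in range(10):
    flip = random.choice([1, -1])
    runs.append([
        [flip * v + random.gauss(0, 0.02) for v in src1],
        [flip * v + random.gauss(0, 0.02) for v in src2],
        [random.gauss(0, 1) for _ in range(200)],  # non-recurring noise
    ])

robust = cluster_components(runs)
print(len(robust))  # -> 2: only the recurring sources survive
```

As in the text above, the number of surviving components depends on how many of the initial components are stable across runs, not on the initial component count alone.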

.. _manual classification:

********************************************************************************
1 change: 1 addition & 0 deletions pyproject.toml
@@ -30,6 +30,7 @@ dependencies = [
"pandas>=2.0,<=2.2.2",
"pybtex",
"pybtex-apa-style",
"robustica>=0.1.4,<=0.1.4",
"scikit-learn>=0.21, <=1.5.2",
"scipy>=1.2.0, <=1.14.1",
"threadpoolctl",
19 changes: 19 additions & 0 deletions tedana/config.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,19 @@
"""Setting default values for ICA decomposition."""

DEFAULT_ICA_METHOD = "fastica"
DEFAULT_N_MAX_ITER = 500
DEFAULT_N_MAX_RESTART = 10
DEFAULT_SEED = 42


"""Setting values for number of robust runs."""

DEFAULT_N_ROBUST_RUNS = 30
MIN_N_ROBUST_RUNS = 5
MAX_N_ROBUST_RUNS = 500
WARN_N_ROBUST_RUNS = 200


"""Setting the warning threshold for the index quality."""

WARN_IQ = 0.6
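The new ``tedana/config.py`` defines bounds without enforcing them; enforcement happens where ``--n_robust_runs`` is parsed. A hedged sketch of how such a check might look (this is not tedana's actual validation code; ``validate_n_robust_runs`` is a hypothetical name):

```python
# Hypothetical use of the constants above to validate --n_robust_runs.
import warnings

DEFAULT_N_ROBUST_RUNS = 30
MIN_N_ROBUST_RUNS = 5
MAX_N_ROBUST_RUNS = 500
WARN_N_ROBUST_RUNS = 200


def validate_n_robust_runs(n_robust_runs=DEFAULT_N_ROBUST_RUNS):
    """Reject out-of-range values and warn when runs will be slow."""
    if not MIN_N_ROBUST_RUNS <= n_robust_runs <= MAX_N_ROBUST_RUNS:
        raise ValueError(
            f"n_robust_runs must be between {MIN_N_ROBUST_RUNS} and "
            f"{MAX_N_ROBUST_RUNS}, got {n_robust_runs}"
        )
    if n_robust_runs > WARN_N_ROBUST_RUNS:
        # Each robust run is a full ICA decomposition, so this is slow.
        warnings.warn(
            f"n_robust_runs={n_robust_runs} may take a long time to run"
        )
    return n_robust_runs


print(validate_n_robust_runs())     # -> 30 (the default)
print(validate_n_robust_runs(100))  # -> 100
```

`WARN_IQ` plays a similar gating role for the cluster index quality reported by robustica, warning when stability falls below 0.6.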
