The HCP/DCAN output data has already been smoothed at 2 mm FWHM. Per the "Handbook of Functional MRI Data Analysis" by Poldrack et al., applying an additional smoothing kernel on top of that leaves the data smoother than the input being provided.
Is the --smoothing 2 applied to the ABCD and HCP-YA data in the preprint (https://pmc.ncbi.nlm.nih.gov/articles/PMC10690221/) there because of the despiking step, in which the data are mapped to NIfTI and then remapped to CIFTI? A NIfTI file already seems to exist in the path (it is used to extract the white matter and CSF signals), so why not do the despiking on that NIfTI file directly and also get the global signal from there?
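To make the concern concrete: Gaussian kernels applied in sequence combine in quadrature, so 2 mm of smoothing on top of data already smoothed at 2 mm gives an effective kernel of roughly 2.8 mm. A minimal sketch of that arithmetic (illustrative only, not part of either pipeline):

```python
import math

def combined_fwhm(*fwhms_mm):
    """Effective FWHM after applying several Gaussian kernels in sequence.

    Gaussian convolution adds variances, and FWHM is proportional to sigma,
    so the individual FWHMs combine in quadrature.
    """
    return math.sqrt(sum(f ** 2 for f in fwhms_mm))

# 2 mm already applied during surface projection, plus 2 mm from --smoothing 2
print(combined_fwhm(2.0, 2.0))  # ~2.83 mm effective FWHM
```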
Sorry, I'm not sure I understand. Are you saying that the DCAN and HCP pipelines include a smoothing step with a FWHM of 2 mm?
In the paper (it has now been published), we applied a 2 mm smoothing kernel with wb_command -cifti-smoothing, so that is on top of any other smoothing that might have occurred earlier in preprocessing or surface projection.
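For readers unfamiliar with that step, a rough sketch of what a 2 mm CIFTI smoothing call looks like is below; the file names are hypothetical and the exact invocation inside the pipeline may differ:

```python
import subprocess

# Sketch of a 2 mm FWHM CIFTI smoothing call (file names are hypothetical).
# -fwhm tells wb_command to interpret the kernel sizes as FWHM rather than sigma,
# and COLUMN smooths along the spatial dimension of a dtseries file.
subprocess.run(
    [
        "wb_command", "-cifti-smoothing",
        "sub-01_task-rest_den-91k_bold.dtseries.nii",                 # input CIFTI
        "2", "2",                                                     # surface / volume kernels (mm)
        "COLUMN",
        "sub-01_task-rest_den-91k_desc-smoothed_bold.dtseries.nii",   # output CIFTI
        "-fwhm",
        "-left-surface", "sub-01_hemi-L_midthickness.surf.gii",
        "-right-surface", "sub-01_hemi-R_midthickness.surf.gii",
    ],
    check=True,
)
```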
In the fMRISurface pipeline from Glasser et al. (2013, NeuroImage), "The minimal preprocessing pipelines for the Human Connectome Project", the data are smoothed when mapped from volume to surface:
"The main goal of the second functional pipeline, fMRISurface, is to bring the timeseries from the volume into the CIFTI grayordinate standard space. This is accomplished by mapping the voxels within the cortical gray matter ribbon onto the native cortical surface, transforming them according to the surface registration onto the 32k Conte69 mesh, and mapping the set of subcortical gray matter voxels from each subcortical parcel in each individual to a standard set of voxels in each atlas parcel. The result is a standard set of grayordinates in every subject (i.e. the same number in each subject, and with spatial correspondence) with 2 mm average surface vertex and subcortical volume voxel spacing. These data are smoothed with surface (novel algorithm, see below) and parcel constrained smoothing of 2 mm FWHM to regularize the mapping process."
So the HCP/DCAN output data has already been smoothed at 2 mm FWHM, and per the "Handbook of Functional MRI Data Analysis" by Poldrack et al. (https://www2.psychology.uiowa.edu/classes/31231/Readings/0919_Handbook_of_Functional_MRI_Data_Analysis_Ch3.pdf), the additional smoothing produces more smoothing than is present in the input.