Working in native space

Hi,

As I see it, DPABI has the option to calculate measures like ALFF in native space and then to normalize and smooth the derivatives. Which article or method is this based on?

Greetings,

David

We did it that way in: Yan, C.G., Cheung, B., Kelly, C., Colcombe, S., Craddock, R.C., Di Martino, A., Li, Q., Zuo, X.N., Castellanos, F.X., Milham, M.P., 2013. A comprehensive assessment of regional variation in the impact of head micromovements on functional connectomics. Neuroimage 76, 183-201.

Thanks for the reply,

I have a follow-up question regarding the Cambridge dataset you used in this paper. I tried preprocessing the Cambridge data with DPARSF 4.3 and SPM12, and I get a lot of failed DARTEL normalizations (>30 subjects). In the paper you only excluded 4 subjects due to bad normalization. I don't understand why I get so many, given that it is the same dataset and DPARSF as well.

Moreover, for almost all of the 1000 FCP datasets, either the segmentation fails with this error:

Failed  'Segment'

Error using svd

Input to SVD must not contain NaN or Inf.

or the DARTEL normalization is so poor in all subjects that I can't use the data. Since you also processed the FCP data with DPARSF in one of your papers, I was wondering whether you had similar problems, or if you have any idea what the cause could be?

Any help appreciated!

 

Hi,

I haven't had such problems.

1. Did you also try bet for the Cambridge dataset?

2. For the data with that error, check whether there are NaN values in the T1 image (a quick check is sketched below).
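Not part of the original reply, but as one possible way to do that check: a minimal Python sketch using nibabel, assuming the T1 is a NIfTI file. The file path and output name are placeholders.

```python
import numpy as np
import nibabel as nib

# Placeholder path -- point this at the subject's T1 image.
t1_path = "sub-01_T1w.nii.gz"

img = nib.load(t1_path)        # load the NIfTI image
data = img.get_fdata()         # voxel intensities as a float array

n_nan = np.isnan(data).sum()   # count NaN voxels
n_inf = np.isinf(data).sum()   # count Inf voxels
print(f"NaN voxels: {n_nan}, Inf voxels: {n_inf}")

if n_nan or n_inf:
    # One possible workaround: replace the bad voxels with zeros and
    # save a cleaned copy before re-running segmentation.
    cleaned = np.nan_to_num(data, nan=0.0, posinf=0.0, neginf=0.0)
    nib.save(nib.Nifti1Image(cleaned, img.affine, img.header),
             "sub-01_T1w_cleaned.nii.gz")
```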

Thank you! It seems that using the already skull-stripped files caused those problems. Using the anonymized non-skull-stripped images and running bet afterwards produces better results!
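For anyone hitting the same issue, a minimal sketch of running FSL's bet over the anonymized (non-skull-stripped) anatomicals before preprocessing. The directory layout, file naming, and -f threshold are assumptions and should be adapted to your dataset.

```python
import subprocess
from pathlib import Path

# Assumed layout: one anonymized T1 per subject, e.g.
#   data/sub-*/anat/mprage_anonymized.nii.gz  (adapt to your dataset).
data_dir = Path("data")

for t1 in sorted(data_dir.glob("sub-*/anat/mprage_anonymized.nii.gz")):
    out = t1.with_name("mprage_brain.nii.gz")  # BET output next to the input
    # -R: robust brain-centre estimation; -f 0.5: default fractional
    # intensity threshold (tune per dataset if the extraction looks off).
    subprocess.run(["bet", str(t1), str(out), "-R", "-f", "0.5"], check=True)
    print(f"Skull-stripped {t1} -> {out}")
```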