Hi Mike,

My recollection was that the unnamed .dscalar.nii files were zstats, not beta 
maps.  I added the beta maps later when it became clear that using statistical 
significance maps was inappropriate for parcellation.

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Reza Rajimehr 
<rajim...@gmail.com>
Date: Thursday, May 30, 2019 at 12:30 PM
To: "Harms, Michael" <mha...@wustl.edu>, Nooshin Abbasi 
<nooshinabbas...@gmail.com>
Cc: hcp-users <hcp-users@humanconnectome.org>
Subject: Re: [HCP-Users] Group-averaging of task fMRI data

Thanks Michael for your detailed and helpful answers.

Best,
Reza


On Thu, May 30, 2019 at 6:10 PM Harms, Michael <mha...@wustl.edu> wrote:

Hi Reza,

1) We’ve already generated Cohen’s d-style effect size maps for all contrasts, 
using all subjects, as part of the “Group Average Dataset” available at 
https://db.humanconnectome.org/data/projects/HCP_1200.  If you need it computed 
for a specific subset of subjects, then yes, you can use the approach that you 
outlined.  Note that the ensuing “effect size” does not account for the family
structure in the data (i.e., to the extent that the estimate of the standard
deviation across subjects is biased by the family structure, the estimate of
the effect size is biased as well).
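
For illustration, a minimal wb_command sketch of that subset computation might
look like the following (the subject IDs and directory layout are placeholder
examples; adjust them to your local copy of the data):

  # Hypothetical subset of subject IDs and the standard cope directory layout
  SUBJECTS="100307 100408 100610"
  DIR="MNINonLinear/Results/tfMRI_WM/tfMRI_WM_hp200_s2_level2.feat/GrayordinatesStats/cope20.feat"

  # Merge the per-subject Level2 copes into one multi-map CIFTI file
  ARGS=""
  for s in $SUBJECTS; do
    ARGS="$ARGS -cifti $s/$DIR/cope1.dtseries.nii"
  done
  wb_command -cifti-merge all_copes.dtseries.nii $ARGS

  # Cross-subject mean and stdev, then Cohen's d = mean/stdev
  # (use SAMPSTDEV instead of STDEV if you want the N-1 denominator)
  wb_command -cifti-reduce all_copes.dtseries.nii MEAN mean.dtseries.nii
  wb_command -cifti-reduce all_copes.dtseries.nii STDEV stdev.dtseries.nii
  wb_command -cifti-math 'mean / stdev' cohensd.dtseries.nii \
      -var mean mean.dtseries.nii -var stdev stdev.dtseries.nii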

2) A .dtseries.nii file is still a “spatial map”.  We just didn’t bother to 
formally convert those particular outputs to a .dscalar.nii (e.g., via 
-cifti-change-mapping).  A dscalar version of all the copes (merged into a
single file) for a given task and subject is available at the root level of
the .feat directory containing the Level2 task analysis results for that task
and subject.  In newer pipeline versions, we create separate merged files for
both the “zstat” and “cope” files of the individual contrasts.  However, at the 
time of the processing of the HCP-YA data, only a single merged dscalar was 
created, and that was for the copes (and it does not unfortunately have “cope” 
as part of its filename).
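
If you want a dscalar version of an individual map yourself, the conversion is
a one-liner (the file names here are just examples):

  wb_command -cifti-change-mapping cope1.dtseries.nii ROW cope1.dscalar.nii -scalar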

3) We recommend using PALM for group statistical analysis.  You can find a 
tutorial in the “tfMRI and PALM” practical available as part of the HCP Course: 
https://store.humanconnectome.org/courses/2018/exploring-the-human-connectome.php.
And no, you generally do *not* want to use the individual subject “zstat1”
maps as inputs to a statistical computation, which would be “computing 
statistics of a statistic” (rather than the statistic of an effect size).
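
As a rough sketch only (the design/contrast file names below are placeholders;
see the PALM documentation and the practical above for the CIFTI-specific
options), a group analysis on the merged copes might look like:

  # design.mat / design.con are FSL-style design and contrast files;
  # adding -eb <blocks.csv> lets PALM respect the HCP family structure
  # via exchangeability blocks
  palm -i all_copes.dscalar.nii -d design.mat -t design.con \
       -o palm_results -n 5000 -logp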

4) The outputs are simply the same as those produced by FSL’s FLAMEO, albeit
in CIFTI rather than NIFTI format, so see FSL’s FLAMEO documentation.

Cheers,
-MH

--
Michael Harms, Ph.D.
-----------------------------------------------------------
Associate Professor of Psychiatry
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.                         Tel: 314-747-6173
St. Louis, MO  63110                          Email: mha...@wustl.edu

From: <hcp-users-boun...@humanconnectome.org> on behalf of Reza Rajimehr
<rajim...@gmail.com>
Date: Wednesday, May 29, 2019 at 7:36 PM
To: hcp-users <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Group-averaging of task fMRI data

Hi,

For a group of subjects (e.g. 100 subjects in HCP S1200), we want to generate a
group-average Cohen’s d map for a particular contrast in the working memory
task.  For this, we take the level2 “cope1.dtseries.nii” file in the cope20.feat
folder of all those subjects, merge them using -cifti-merge, then apply
-cifti-reduce MEAN, -cifti-reduce STDEV, and -cifti-math mean/stdev.

Questions:

1) Is the above procedure correct? Or do you recommend other commands?

2) Why is the file named cope1.dtseries.nii when it is not time-series data?
Why not name it cope1.dscalar.nii, since it is a spatial map?

3) If we want to generate a group-average zstat map, what should we do? I guess
it should involve using the “zstat1.dtseries.nii” files, but we don’t know how.

4) Is there any documentation describing all the files within the cope folder?
E.g., these files:

mean_random_effects_var1.dtseries.nii
pe1.dtseries.nii
res4d.dtseries.nii
tdof_t1.dtseries.nii
tstat1.dtseries.nii
varcope1.dtseries.nii
weights1.dtseries.nii

Thanks,
Reza

_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

________________________________
The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.
