I've just created the following docs for mri_mcsim

doug

mri_mcsim
  --o top-output-dir
  --base csdbase
  --surface subjectname hemi
  --nreps nrepetitions
  --fwhm FWHM <FWHM2 FWHM3 ...>
  --fwhm-max FWHMMax : sim with fwhm=1:FWHMMax (default 30)
  --avgvtxarea : report cluster area based on average vtx area
  --seed randomseed : default is to choose based on ToD
  --label labelfile : default is ?h.cortex.label
  --mask maskfile : instead of label
  --no-label : do not use a label to mask
  --no-save-mask : do not save mask to output (good for mult jobs)
  --surfname surfname : default is white
  --log LogFile
  --done DoneFile : will create DoneFile when finished
  --stop stopfile : default is outdir/mri_mcsim.stop
  --save savefile : default is outdir/mri_mcsim.save
  --save-iter : save output after each iteration
  --sd SUBJECTS_DIR
  --debug : turn on debugging
  --checkopts : don't run anything, just check options and exit
  --help : print out information on how to use this program
  --version : print out version and exit

$Id: mri_mcsim.c,v 1.26 2014/04/11 15:31:56 greve Exp $

This program computes tables for performing multiple comparisons on the
surface based on a Monte Carlo simulation using white Gaussian noise
smoothed on the surface. mri_mcsim was used to generate the tables found
in $FREESURFER_HOME/average/mult-comp-cor and used by mri_glmfit-sim
with the --cache option. The tables in mult-comp-cor are for fsaverage
and fsaverage_sym for a search space over the whole cortex. mri_mcsim
can be used to create new tables for a new subject or for
fsaverage/fsaverage_sym with a smaller search space.

Example 1: Create tables for a new subject for whole hemisphere

mri_mcsim --o /path/to/mult-comp-cor/newsubject/lh/cortex --base mc-z \
  --save-iter --surf newsubject lh --nreps 10000

This may take hours (or even days) to run; see below for parallelizing.
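Because a full run can take days, one common approach (shown concretely in Example 3 below) is to split the iterations across several jobs, each with a distinct --base. The sketch below is a dry run that only prints the per-job commands; the output path, label, job count, and reps-per-job are illustrative and should be adapted to your setup:

```shell
# Dry-run sketch: print one mri_mcsim command per parallel job.
# All paths/values below are illustrative.
OUT=/path/to/mult-comp-cor/fsaverage/lh/superiortemporal
NJOBS=2            # number of parallel jobs
NREPS_PER_JOB=5000 # total iterations = NJOBS * NREPS_PER_JOB

for j in $(seq 1 $NJOBS); do
  base=$(printf "mc-z.j%03d" "$j")  # mc-z.j001, mc-z.j002, ...
  cmd="mri_mcsim --o $OUT --base $base --save-iter --surf fsaverage lh --nreps $NREPS_PER_JOB --label labeldir/lh.superiortemporal.label"
  echo "$cmd"  # replace echo with eval, or submit to your cluster scheduler
done
```

Each job writes its own csd files under the same output tree, which are merged afterwards with mri_surfcluster.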
When running mri_glmfit-sim, add --cache-dir /path/to/mult-comp-cor

Example 2: Create tables for superior temporal gyrus for fsaverage

First, create the label by running

mri_annotation2label --subject fsaverage --hemi lh --outdir labeldir

mri_mcsim --o /path/to/mult-comp-cor/fsaverage/lh/superiortemporal --base mc-z \
  --save-iter --surf fsaverage lh --nreps 10000 \
  --label labeldir/lh.superiortemporal.label

When running mri_glmfit, make sure to use --label labeldir/lh.superiortemporal.label
When running mri_glmfit-sim, add --cache-dir /path/to/mult-comp-cor --cache-label superiortemporal

Example 3: Running simulations in parallel (two jobs, 5000 iterations
each, for a total of 10000)

mri_mcsim --o /path/to/mult-comp-cor/fsaverage/lh/superiortemporal --base mc-z.j001 \
  --save-iter --surf fsaverage lh --nreps 5000 \
  --label labeldir/lh.superiortemporal.label

mri_mcsim --o /path/to/mult-comp-cor/fsaverage/lh/superiortemporal --base mc-z.j002 \
  --save-iter --surf fsaverage lh --nreps 5000 \
  --label labeldir/lh.superiortemporal.label

When those jobs are done, merge the results into a single table with

mri_surfcluster \
  --csd /path/to/mult-comp-cor/fsaverage/lh/superiortemporal/fwhm10/abs/th20/mc-z.j001.csd \
  --csd /path/to/mult-comp-cor/fsaverage/lh/superiortemporal/fwhm10/abs/th20/mc-z.j002.csd \
  --csd-out /path/to/mult-comp-cor/fsaverage/lh/superiortemporal/fwhm10/abs/th20/mc-z.csd \
  --csdpdf /path/to/mult-comp-cor/fsaverage/lh/superiortemporal/fwhm10/abs/th20/mc-z.cdf \
  --csdpdf-only

Repeat the above command for each FWHM, sign (pos, neg, abs), and threshold.

On 04/11/2014 11:08 AM, Douglas N Greve wrote:
> That should have been --nreps 10000 (not 100:)
>
> On 04/11/2014 11:04 AM, Douglas N Greve wrote:
>> On 04/11/2014 02:52 AM, Gautam Tammewar wrote:
>>> Hi all,
>>>
>>> I'm trying to run Monte Carlo simulations on cortical thickness data
>>> I got using mri_glmfit, but I'm running into some errors.
>>>
>>> 1) I used the command
>>>
>>> mri_glmfit-sim --glmdir lh.thickness.fwhm20.diagnosis_scanner.dods.glmdir/ --cache 2.0 pos
>>>
>>> The error I receive says "cannot find
>>> /usr/local/freesurfer_x86_64-5.1.0/average/mult-comp-cor/fsaverage/lh/cortex/fwhm40/pos/th20/mc-z.csd"
>>
>> Even though you've smoothed by 20, the actual FWHM in the data is 40,
>> which is pretty high, and the tables do not go that high. This may
>> indicate that there is something wrong with your data.
>>
>>> I'm wondering why the script is looking in the directory "fwhm40"
>>> when the data is smoothed using FWHM 20. The directories available
>>> under fsaverage/lh/cortex go from fwhm01 to fwhm30.
>>>
>>> 2) Also, for another test I'm running, I created my own average
>>> subjects (instead of using fsaverage). Is there a way to generate the
>>> files that are created under /average/mult-comp-cor for this average
>>> subject so that I don't have to use the "--sim" flag to run simulations?
>>>
>> Yes, you can use mri_mcsim, something like
>>
>> mri_mcsim --o /path/to/mult-comp-cor/yoursubject/lh/cortex --base mc-z \
>>   --save-iter --surf yoursubject lh --nreps 100
>>
>> This may take several hours or days to run.
>>
>> When running mri_glmfit-sim, add --cache-dir /path/to/mult-comp-cor
>>
>> doug
>>
>>> Thanks,
>>> GT
>>>
>>> _______________________________________________
>>> Freesurfer mailing list
>>> Freesurfer@nmr.mgh.harvard.edu
>>> https://mail.nmr.mgh.harvard.edu/mailman/listinfo/freesurfer

-- 
Douglas N. Greve, Ph.D.
MGH-NMR Center
gr...@nmr.mgh.harvard.edu
Phone Number: 617-724-2358
Fax: 617-726-7422

Bugs: surfer.nmr.mgh.harvard.edu/fswiki/BugReporting
FileDrop: https://gate.nmr.mgh.harvard.edu/filedrop2
         www.nmr.mgh.harvard.edu/facility/filedrop/index.html
Outgoing: ftp://surfer.nmr.mgh.harvard.edu/transfer/outgoing/flat/greve/

_______________________________________________
Freesurfer mailing list
Freesurfer@nmr.mgh.harvard.edu
https://mail.nmr.mgh.harvard.edu/mailman/listinfo/freesurfer
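P.S. The "Repeat the above command for each FWHM, sign (pos, neg, abs), and threshold" step in Example 3 is easy to script. This dry-run sketch just prints the mri_surfcluster merge command for every combination; the top path and the FWHM/threshold lists are illustrative (use the directory names actually present under your output tree):

```shell
# Dry-run sketch: print one mri_surfcluster merge command per
# FWHM / sign / threshold directory. Adjust the lists to match the
# subdirectories mri_mcsim actually created under TOP.
TOP=/path/to/mult-comp-cor/fsaverage/lh/superiortemporal

for fwhm in fwhm01 fwhm10 fwhm30; do          # illustrative FWHM list
  for sign in pos neg abs; do
    for th in th13 th20 th23 th30 th33 th40; do  # illustrative thresholds
      d=$TOP/$fwhm/$sign/$th
      cmd="mri_surfcluster --csd $d/mc-z.j001.csd --csd $d/mc-z.j002.csd --csd-out $d/mc-z.csd --csdpdf $d/mc-z.cdf --csdpdf-only"
      echo "$cmd"  # replace echo with the real call once paths are verified
    done
  done
done
```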