On 9/20/2022 5:01 PM, Harriott, Emily M wrote:


Dear FreeSurfer experts,

I hope this email finds you all doing well. I'm a second-year doctoral student (and beginner FreeSurfer user) using FreeSurfer to segment gray and white matter in 90 T1 images of children/adolescents (with reading disorders, Neurofibromatosis type 1, and typically developing) for a project. I have two questions about processing these images, and I would be very grateful for any thoughts, suggestions, or advice from FreeSurfer experts such as yourselves.

I ran recon-all on each of the 90 images, but a single run of recon-all left too much dura in the gray matter segmentation and too much white matter out of the white matter segmentation (the amount of white matter left out varied with the intensity contrast of each image). To fix the dura/gray matter issue, I re-ran recon-all with the -gcut flag and edited out the remaining dura by hand; this seemed to help. To fix the white matter issue, I tried adding control points, adding white matter voxels, and re-running recon-all; unfortunately this did not seem to help, as the control points led to worse segmentation and the added white matter voxels had minimal to no effect. So, after some experimentation, I have created a full processing pipeline for my data. But before I apply it to all 90 images, I wanted to confirm with you experts that it will 1) work and 2) do what I want it to do.

Question 1: If I process my data in FS 7.2.0, can I go back and analyze it using QDEC in FS 6.0.0? Or do I have to process it in 6.0.0 to use QDEC?
Yes, that is fine, though keep in mind that we are no longer supporting qdec.

Question 2: Here is my pipeline (see below). Will this work? Will it do what I want it to do? Do you have any suggestions? I'm a beginner; is there anything I should be careful of or watch out for that I don't know about? After some experimentation on a subsample, this pipeline appears to work, but I really want to ensure it is doing what I want it to do and will do that consistently.
I did not entirely follow your pipeline description. What you have below looks like it will do what you want, but I'd change a few things. First, you need to add -autorecon3 so that the process completes. You can also use -autorecon2-cp instead of -autorecon2 (this will skip the most time-consuming part). Finally, I would not run -qcache until after you are satisfied with the results (you can/should then run recon-all with just the -qcache flag).
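For example, a single second-pass run following those suggestions might look something like this (the -seg-wlo value of 70 and the <subjID>/<subjects_dir> placeholders are just examples):

$ recon-all -autorecon2-cp -autorecon3 -s <subjID> -sd <subjects_dir> -seg-wlo 70

and then, once you are satisfied with the surfaces:

$ recon-all -s <subjID> -sd <subjects_dir> -qcache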


For one subject....

I first run the initial recon-all (with some flags).
$ recon-all -i <input image> -s <subjID> -sd <subjects_dir> -wsthresh 10 -gcut -qcache

Next, I open brainmask.mgz in FreeView and edit out the remaining dura mater by hand. When I save it, I overwrite the old brainmask.mgz with the edited version.
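(One way to keep the unedited version around before overwriting it, assuming the standard recon-all mri/ layout and a backup file name of your choosing:)

$ cp <subjects_dir>/<subjID>/mri/brainmask.mgz <subjects_dir>/<subjID>/mri/brainmask.before-edits.mgz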

I then run multiple second iterations of recon-all so that I can apply different seg-wlo thresholds (thresholds of 40, 50, 60, 70, 80, and 90 are applied to each image).
The multiple second iterations of recon-all are:
$ recon-all -autorecon2 -s <subjID> -sd <subjects_90> -seg-wlo 90 -qcache
$ recon-all -autorecon2 -s <subjID> -sd <subjects_80> -seg-wlo 80 -qcache
$ recon-all -autorecon2 -s <subjID> -sd <subjects_70> -seg-wlo 70 -qcache
$ recon-all -autorecon2 -s <subjID> -sd <subjects_60> -seg-wlo 60 -qcache
$ recon-all -autorecon2 -s <subjID> -sd <subjects_50> -seg-wlo 50 -qcache
$ recon-all -autorecon2 -s <subjID> -sd <subjects_40> -seg-wlo 40 -qcache

(Note that I have 6 distinct subjects_?0 folders for the 6 distinct seg-wlo values.)
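(For reference, these six runs could also be scripted in one loop; the /path/to/ prefix and <subjID> below are placeholders to fill in, and per the reply above the -qcache flag could be dropped here and run later:)

for wlo in 40 50 60 70 80 90; do
    recon-all -autorecon2 -s <subjID> -sd /path/to/subjects_${wlo} -seg-wlo ${wlo} -qcache
done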

In FreeView, I then visually assess the image/segmentation produced by each seg-wlo threshold and select the threshold that best segments the gray and white matter. I save the subject folder with the optimal seg-wlo threshold, as determined visually in FreeView, to a new subjects directory; this new subjects directory is labeled "SELECT".
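(As a sketch of that copy step, assuming 70 happened to be the chosen threshold for a given subject and using placeholder paths:)

$ cp -r /path/to/subjects_70/<subjID> /path/to/subjects_SELECT/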

I now have measures of cortical thickness, surface area, and volume, but I also want measures of local gyrification index (LGI). With MATLAB 2022a active (I'm working on a large computing cluster, so I have to activate MATLAB/2022a as a module), I run the LGI recon-all.
$ recon-all -s <subjID> -sd <subjects_SELECT> -localGI -qcache
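(For reference, on a cluster that uses environment modules, the MATLAB step before this command might look something like the following; the module name mirrors the MATLAB/2022a module mentioned above, but exact names vary by cluster:)

$ module load MATLAB/2022a   # makes the matlab binary available for the -localGI stream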
I then obtain the measurements of LGI. To run this command, I also have a .bashrc file in my subjects_SELECT folder that contains one line: MEASURE1 = pial_lgi.
$ recon-all -s <subjID> -sd <subjects_SELECT> -measure pial_lgi -qcache

Then the data go into QDEC and I should be able to analyze from there.
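(For reference, QDEC reads a plain-text subject table, usually named qdec.table.dat and kept in the subjects directory, with one row per subject; the column names and values below are made up just to show the format:)

fsid     group   age
subj001  NF1     10.2
subj002  TD      11.5
subj003  RD       9.8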

Any confirmations/denials, suggestions, or advice you all have would be so very greatly appreciated.


Thank you so very much!
Emily Harriott
Doctoral Student, Vanderbilt University
Nashville, TN


_______________________________________________
Freesurfer mailing list
Freesurfer@nmr.mgh.harvard.edu
https://mail.nmr.mgh.harvard.edu/mailman/listinfo/freesurfer