Re: [HCP-Users] Cleaning up intermediate files from the minimal pre-processing pipelines

2018-02-21 Thread Glasser, Matthew
Yes that is particularly true when using the latest version of the pipelines.  
There are also files in T2w and T1w that could be deleted, but will not save as 
much space as Mike’s suggestion.

Peace,

Matt.

From: "Harms, Michael" <mha...@wustl.edu>
Date: Wednesday, February 21, 2018 at 12:18 PM
To: "Cook, Philip" <coo...@pennmedicine.upenn.edu>, "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Cc: Matt Glasser <glass...@wustl.edu>
Subject: Re: [HCP-Users] Cleaning up intermediate files from the minimal 
pre-processing pipelines


Hi,
While the documentation is overall very good, I don’t know if I’d rely on that 
pdf for a detailed list of all the files that we recommend “keeping”.  For 
that, you could download and unpack the packages for a subject with complete 
data (e.g., 100307), and see what you all get.

As a relatively simple clean-up, I *think* that if you keep the entire 
contents of $subj/T1w and $subj/MNINonLinear, you'll have most of what you 
need for any further downstream processing, while achieving substantial 
space savings.  I.e., most of the intermediates in the fMRI processing end 
up in the $subj/$task directories, and I think that any that have been 
deemed important (e.g., .native.func.gii) have been copied to 
$subj/MNINonLinear/Results/$task.  @Matt: Can you confirm that?

E.g., for a subject from the HCP-Young Adult study, the output from the MPP of 
a single REST run (e.g., $subj/MNINonLinear/Results/rfMRI_REST1_LR) is about 
3.7 GB, whereas the contents of $subj/rfMRI_REST1_LR are about 28 GB.
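Michael's keep-list above could be sketched as a small shell function. This is a hypothetical example, not an official HCP script; the kept-directory names follow the suggestion above, but verify them against your own downstream needs (and back up) before deleting anything:

```shell
#!/bin/sh
# Hypothetical clean-up sketch based on the suggestion above: keep only
# $subj/T1w and $subj/MNINonLinear, and delete the other top-level pipeline
# directories (the fMRI/diffusion intermediates). Not an official HCP script;
# check the keep-list against your own needs and back up before running.
cleanup_subject() {
    subj_dir="$1"
    for d in "$subj_dir"/*/; do
        case "$(basename "$d")" in
            T1w|MNINonLinear) ;;   # recommended outputs: keep
            *) rm -rf "$d" ;;      # intermediates: delete
        esac
    done
}

# usage (hypothetical path): cleanup_subject /data/study/100307
```
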

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.        Tel: 314-747-6173
St. Louis, MO 63110          Email: mha...@wustl.edu

From: <hcp-users-boun...@humanconnectome.org> on behalf of "Cook, Philip" <coo...@pennmedicine.upenn.edu>
Date: Wednesday, February 21, 2018 at 11:49 AM
To: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Cleaning up intermediate files from the minimal 
pre-processing pipelines

Hi,

I am trying to reduce disk usage after running the HCP minimal pre-processing 
pipelines. I would like to clean up intermediate files but retain things needed 
for ongoing analysis. As a reference I have found a list of file names in

WU-Minn HCP 900 Subjects Data Release: Reference Manual
Appendix III - File Names and Directory Structure for 900 Subjects Data

https://www.humanconnectome.org/storage/app/media/documentation/s900/HCP_S900_Release_Appendix_III.pdf

I would like to retain these and clean up the remainder of the output. Are 
there any scripts available to help with this?


Thanks

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] Extracting Thalamus Brainordinates

2018-02-21 Thread Timothy Coalson
It may be better to use the individual subject's "native space"
definitions.  The files in the T1w folder are in what we refer to as native
volume space (it is actually rigidly-aligned MNI space, but rigid alignment
preserves shape, so it can be used as if it were distortion-corrected
scanner coordinates, after the appropriate resamplings).  There are
aparc*+aseg.nii.gz files in that folder, which should be the FreeSurfer
segmentation/parcellation for that single subject and contain a thalamus
definition.

If you prefer to use our standard cifti MNINonLinear group thalamus
definition, and work that backwards through each subject's atlas warpfield,
you can get the entire ROI of left and right thalamus by using
-cifti-separate with -volume THALAMUS_LEFT <volume-out> -roi <roi-out> (and
similar with RIGHT) on any small full-brain cifti file (91282 grayordinates
for 3T data), for instance the template in the Pipelines at
global/templates/91282_Greyordinates/91282_Greyordinates.dscalar.nii .
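The -cifti-separate call described above could look like the following sketch. The output filenames are hypothetical; by default it only prints the command (a dry run), and setting WB=wb_command would actually execute it:

```shell
#!/bin/sh
# Sketch of the -cifti-separate call described above. Output filenames are
# hypothetical; THALAMUS_LEFT/THALAMUS_RIGHT are the standard CIFTI structure
# names. Defaults to a dry run (prints the command line); set WB=wb_command
# to actually run it on real data.
extract_thalamus_rois() {
    cifti_in="$1"                    # any 91282-grayordinate cifti file
    wb="${WB:-echo wb_command}"      # dry run by default
    $wb -cifti-separate "$cifti_in" COLUMN \
        -volume THALAMUS_LEFT  thal_L.nii.gz -roi thal_L_roi.nii.gz \
        -volume THALAMUS_RIGHT thal_R.nii.gz -roi thal_R_roi.nii.gz
}

# e.g. on the Pipelines template (dry run prints the full command):
extract_thalamus_rois global/templates/91282_Greyordinates/91282_Greyordinates.dscalar.nii
```
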

Tim


On Wed, Feb 21, 2018 at 3:24 AM, Claude Bajada wrote:

> Dear all,
>
> I would like to perform tractography seeding from the thalamus. So as to
> ensure that I have correspondence between individuals, I would like to
> use the thalamus defined in terms of the brainordinates already used in
> each participant.
>
> I cannot find a way to identify the brainordinates that define the
> thalamus and extract their voxels / MNI coordinates. Can someone point
> me in the right direction?
>
> Regards,
>
> Claude
>
>
>
> 
> 
> 
> 
> Forschungszentrum Juelich GmbH
> 52425 Juelich
> Sitz der Gesellschaft: Juelich
> Eingetragen im Handelsregister des Amtsgerichts Dueren Nr. HR B 3498
> Vorsitzender des Aufsichtsrats: MinDir Dr. Karl Eugen Huthmacher
> Geschaeftsfuehrung: Prof. Dr.-Ing. Wolfgang Marquardt (Vorsitzender),
> Karsten Beneke (stellv. Vorsitzender), Prof. Dr.-Ing. Harald Bolt,
> Prof. Dr. Sebastian M. Schmidt
> 
> 
> 
> 
>
>
>



Re: [HCP-Users] Extracting Thalamus Brainordinates

2018-02-21 Thread Ely, Benjamin
Hi Claude,

Are you using workbench viewer? If so, you should be able to identify
particular thalamus or other subcortical MNI coordinates in the Volume tab
(brainordinate subcortical coordinates = MNI coordinates). You can also
split the thalamus structure out from a subject's CIFTI file, if you want to
view/manipulate the data with other tools. The command would look
something like:

wb_command -cifti-separate ${input_file}.dscalar.nii COLUMN -volume
THALAMUS_LEFT ${extracted_thalamus_file}.nii

Best,
-Ely




Re: [HCP-Users] gif surface to obj

2018-02-21 Thread Timothy Coalson
It appears that .obj is a fairly simple text format (
https://en.wikipedia.org/wiki/Wavefront_.obj_file), so you could hack it
together in shell, if nothing else.  You can use wb_command -gifti-convert
to convert to ASCII encoding, and that would allow you to get the
coordinate and triangle information required (gifti is always XML, but
usually uses encoded binary to save space; ASCII encoding is more like
.obj, human readable).
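The "hack it together in shell" route could look something like the sketch below. It assumes the ASCII-converted file stores the vertex coordinates in the first <Data> block and the triangle indices in the second, which is the usual GIFTI surface layout, and that the file has been converted with `wb_command -gifti-convert ASCII` first:

```shell
#!/bin/sh
# Sketch: turn an ASCII-encoded GIFTI surface into Wavefront .obj text.
# Assumes the first <Data> block holds vertex coordinates and the second the
# triangle indices (the usual surface layout); convert the file first with
#   wb_command -gifti-convert ASCII in.surf.gii ascii.surf.gii
# .obj vertex indices are 1-based, hence the +1 on triangle entries.
gifti_ascii_to_obj() {
    awk '
        /<Data>/  { grab = 1; block++; buf = "" }
        grab {
            line = $0
            sub(/.*<Data>/,   "", line)   # strip leading tag, if present
            sub(/<\/Data>.*/, "", line)   # strip trailing tag, if present
            buf = buf " " line
        }
        /<\/Data>/ {
            grab = 0
            n = split(buf, t)
            for (i = 1; i + 2 <= n; i += 3) {
                if (block == 1) {
                    printf "v %s %s %s\n", t[i], t[i+1], t[i+2]
                } else {
                    printf "f %d %d %d\n", t[i]+1, t[i+1]+1, t[i+2]+1
                }
            }
        }
    ' "$1"
}

# usage (hypothetical file): gifti_ascii_to_obj ascii.surf.gii > surface.obj
```
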

Tim


On Wed, Feb 21, 2018 at 2:55 PM, Glasser, Matthew wrote:

> mris_convert from FreeSurfer may be able to.  Or at least to convert a
> GIFTI file to another format that another converter can convert to obj.
>
> Peace,
>
> Matt.
>
> From:  on behalf of "Harwell,
> John" 
> Date: Wednesday, February 21, 2018 at 12:55 PM
> To: "Shadi, Kamal" 
> Cc: "hcp-users@humanconnectome.org" 
> Subject: Re: [HCP-Users] gif surface to obj
>
> Hello,
>
> No.  Connectome Workbench is unable to convert GIFTI files to OBJ files.
>
> John Harwell
>
> On Feb 21, 2018, at 10:18 AM, Shadi, Kamal 
> wrote:
>
> Hi all,
>
> Does human connectome workbench have a tool to convert gii surfaces to obj?
>
> Thanks,
> Kamal
>
>



Re: [HCP-Users] An error in Post-FreeSurfer Pipeline

2018-02-21 Thread Glasser, Matthew
That is the shell command to source a script.  Are you using bash as your shell?
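For reference, a minimal illustration of what that ". file" line does (variable name below is hypothetical, standing in for what SetUpHCPPipeline.sh sets):

```shell
#!/bin/sh
# Minimal illustration of the ". file" (source) syntax from the log above:
# it runs the file's commands in the *current* shell, so variables it sets
# persist afterwards. POSIX sh/bash understand this syntax; csh-family
# shells do not, and will choke on scripts written for bash.
# The variable below is a hypothetical stand-in for SetUpHCPPipeline.sh.
setup="$(mktemp)"
printf 'HCPPIPEDIR=/opt/HCPpipelines\n' > "$setup"
. "$setup"                               # source it into this shell
echo "HCPPIPEDIR is now: $HCPPIPEDIR"
rm -f "$setup"
```
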

Peace,

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of Darko Komnenić <komnen...@gmail.com>
Date: Wednesday, February 21, 2018 at 2:20 PM
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: Re: [HCP-Users] An error in Post-FreeSurfer Pipeline

Dear HCP experts,
I wrote to this mailing list about a month ago with a problem that the 
FreeSurfer Pipeline would get interrupted after finishing recon-all, and 
starting FreeSurferHiResWhite script. The error log showed
START: FreeSurferHighResWhite
Unmatched ".
and below
set -- --subject=VIMS_HC_070
--subjectDIR=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w
--t1=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w/T1w_acpc_dc_restore.nii.gz
--t1brain=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w/T1w_acpc_dc_restore_brain.nii.gz
--t2=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w/T2w_acpc_dc_restore.nii.gz
--printcom=
. /home/lisak/Desktop/Pipelines/Examples/Scripts/SetUpHCPPipeline.sh


The suggestion I got was that I probably removed or added a quote somewhere 
while editing the scripts. However, after looking carefully and multiple times 
at FreeSurferPipelineBatch.sh, FreeSurferPipeline.sh, as well as 
FreeSurferHiresWhite.sh, I have not been able to find any unmatched quotes.

Looking at the message above, the only thing that seems a bit weird is a dot 
between the equal sign and the beginning of the file path for the 
SetUpHCPPipeline script, after the --printcom segment. We don't see that dot 
and space after other headings, such as subject, t1, etc. However, I haven't 
been able to identify what is causing that space and dot to appear only after 
printcom. The corresponding segments of my script files seem identical to the 
scripts that are in the latest release from the GitHub website.

Does this seem like a reasonable line of thought (that a space and a dot 
between the equal sign and the beginning of the path are not normal)? And if 
so, do you have any idea what might be causing it?
Thanks in advance for any suggestions!
Best,
Darko


On Mon, Jan 29, 2018 at 10:39 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
gedit with bash shell highlighting (sh).

Matt

On 1/29/18, 11:50 AM, "Darko Komnenić" <komnen...@gmail.com> wrote:

>Hi Matt,
>could you please tell me which editor with shell syntax highlighting
>you are using? I don't see any option to change the view/display mode
>of the text in the one I have installed on my computer.
>Thanks,
>Darko
>
>On 1/26/18, Glasser, Matthew <glass...@wustl.edu> wrote:
>> Here is the relevant part of the log:
>>
>>  START: FreeSurferHighResWhite
>> Unmatched ".
>>
>> You don't ever get a message that this script finishes.  If I paste the
>> current version of the script from GitHub into an editor with shell
>>syntax
>> highlighting there are no unopposed quotes, so you must have done
>>something
>> to your version of the script.
>>
>> Peace,
>>
>> Matt.
>>
>> From: Darko Komnenić <komnen...@gmail.com>
>> Date: Friday, January 26, 2018 at 11:58 AM
>> To: Matt Glasser <glass...@wustl.edu>
>> Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
>> Subject: Re: [HCP-Users] An error in Post-FreeSurfer Pipeline
>>
>> If I interpreted the error message correctly, the unmatched quote error
>> appears after recon-all, and right after the pipeline tries to perform
>> FreeSurferHiresWhite.sh
>> This is what that part of script looks like in my case:
>> #Highres white stuff and Fine Tune T2w to T1w Reg
>> log_Msg "High resolution white matter and fine tune T2w to T1w
>> registration"
>> "$PipelineScripts"/FreeSurferHiresWhite.sh "$SubjectID" "$SubjectDIR"
>> "$T1wImage" "$T2wImage"
>>
>> There doesn't seem to me to be a quote problem, but I'm not sure? Also,
>>the
>> FreeSurferHiresWhite.sh script was never edited, and I've had the
>>pipelines
>> run successfully before, so I doubt that the problem is there.
>> Any ideas on where I should keep looking (i.e. which particular script
>>or
>> section)? Thanks!
>>
>> On Fri, Jan 26, 2018 at 6:34 PM, Darko Komnenić <komnen...@gmail.com> wrote:
>> I did the usual changes of subject list and folder paths, as well as the
>> number of cores in the FreeSurfer script (not the
>>FreeSurferPipelineBatch,
>> but the one that's found in the Pipelines/FreeSurfer folder), I will go
>>over
>> them now, thanks!
>>
>> On

Re: [HCP-Users] gif surface to obj

2018-02-21 Thread Glasser, Matthew
mris_convert from FreeSurfer may be able to.  Or at least to convert a GIFTI 
file to another format that another converter can convert to obj.

Peace,

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of "Harwell, John" <jharw...@wustl.edu>
Date: Wednesday, February 21, 2018 at 12:55 PM
To: "Shadi, Kamal" <kamal.shad...@gatech.edu>
Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: Re: [HCP-Users] gif surface to obj

Hello,

No.  Connectome Workbench is unable to convert GIFTI files to OBJ files.

John Harwell

On Feb 21, 2018, at 10:18 AM, Shadi, Kamal <kamal.shad...@gatech.edu> wrote:

Hi all,

Does human connectome workbench have a tool to convert gii surfaces to obj?

Thanks,
Kamal



Re: [HCP-Users] An error in Post-FreeSurfer Pipeline

2018-02-21 Thread Darko Komnenić
Dear HCP experts,
I wrote to this mailing list about a month ago with a problem that the
FreeSurfer Pipeline would get interrupted after finishing recon-all, and
starting FreeSurferHiResWhite script. The error log showed
START: FreeSurferHighResWhite
Unmatched ".
and below
set -- --subject=VIMS_HC_070
--subjectDIR=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w
--t1=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w/T1w_acpc_dc_restore.nii.gz
--t1brain=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w/T1w_acpc_dc_restore_brain.nii.gz
--t2=/home/lisak/Desktop/NMDA_HCP/controls/VIMS_HC_070/T1w/T2w_acpc_dc_restore.nii.gz
--printcom=
. /home/lisak/Desktop/Pipelines/Examples/Scripts/SetUpHCPPipeline.sh


The suggestion I got was that I probably removed or added a quote somewhere
while editing the scripts. However, after looking carefully and multiple
times at FreeSurferPipelineBatch.sh, FreeSurferPipeline.sh, as well as
FreeSurferHiresWhite.sh, I have not been able to find any unmatched quotes.

Looking at the message above the only thing that seems a bit weird is a dot
between the equal sign and the beginning of the file path for the SetUp
HCPPipeline script, after the --printcom segment. We don't see that dot and
space after other headings, such as subject, t1, etc. However, I haven't
been able to identify what is causing that space and dot to appear only
after printcom. The corresponding segments of my script files seem
identical to the scripts that are in the latest release from the GitHub
website.

Does this seem like a reasonable line of thought (that a space and a dot
between the equal sign and the beginning of the path are not normal)? And
if so, do you have any idea what might be causing it?
Thanks in advance for any suggestions!
Best,
Darko


On Mon, Jan 29, 2018 at 10:39 PM, Glasser, Matthew wrote:

> gedit with bash shell highlighting (sh).
>
> Matt
>
> On 1/29/18, 11:50 AM, "Darko Komnenić"  wrote:
>
> >Hi Matt,
> >could you please tell me which editor with shell syntax highlighting
> >you are using? I don't see any option to change the view/display mode
> >of the text in the one I have installed on my computer.
> >Thanks,
> >Darko
> >
> >On 1/26/18, Glasser, Matthew  wrote:
> >> Here is the relevant part of the log:
> >>
> >>  START: FreeSurferHighResWhite
> >> Unmatched ".
> >>
> >> You don't ever get a message that this script finishes.  If I paste the
> >> current version of the script from GitHub into an editor with shell
> >>syntax
> >> highlighting there are no unopposed quotes, so you must have done
> >>something
> >> to your version of the script.
> >>
> >> Peace,
> >>
> >> Matt.
> >>
> >> From: Darko Komnenić <komnen...@gmail.com>
> >> Date: Friday, January 26, 2018 at 11:58 AM
> >> To: Matt Glasser <glass...@wustl.edu>
> >> Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
> >> Subject: Re: [HCP-Users] An error in Post-FreeSurfer Pipeline
> >>
> >> If I interpreted the error message correctly, the unmatched quote error
> >> appears after recon-all, and right after the pipeline tries to perform
> >> FreeSurferHiresWhite.sh
> >> This is what that part of script looks like in my case:
> >> #Highres white stuff and Fine Tune T2w to T1w Reg
> >> log_Msg "High resolution white matter and fine tune T2w to T1w
> >> registration"
> >> "$PipelineScripts"/FreeSurferHiresWhite.sh "$SubjectID" "$SubjectDIR"
> >> "$T1wImage" "$T2wImage"
> >>
> >> There doesn't seem to me to be a quote problem, but I'm not sure? Also,
> >>the
> >> FreeSurferHiresWhite.sh script was never edited, and I've had the
> >>pipelines
> >> run successfully before, so I doubt that the problem is there.
> >> Any ideas on where I should keep looking (i.e. which particular script
> >>or
> >> section)? Thanks!
> >>
> >> On Fri, Jan 26, 2018 at 6:34 PM, Darko Komnenić <komnen...@gmail.com> wrote:
> >> I did the usual changes of subject list and folder paths, as well as the
> >> number of cores in the FreeSurfer script (not the
> >>FreeSurferPipelineBatch,
> >> but the one that's found in the Pipelines/FreeSurfer folder), I will go
> >>over
> >> them now, thanks!
> >>
> >> On Fri, Jan 26, 2018 at 6:24 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
> >> Did you accidentally change one of the pipeline scripts?  If you look at
> >>the
> >> end of that log file it is complaining about an unmatched quote.
> >>
> >> Peace,
> >>
> >> Matt.
> >>
> >> From: <hcp-users-bounces@humanconnectome.org> on behalf of Darko
> >> Komnenić <komnen...@gmail.com>
> >> Date: Friday, January 26, 2018 at 10:58 AM
> >>
> >> Cc: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
> >> Subject: Re: [HCP-Users] An error in Post-FreeSurfer Pipeline
> >>
> >> Dear HCP experts, I reran the PreFS and FS pipeline, with an added
> >>argument
> >> for

Re: [HCP-Users] gif surface to obj

2018-02-21 Thread Harwell, John
Hello,

No.  Connectome Workbench is unable to convert GIFTI files to OBJ files.

John Harwell

On Feb 21, 2018, at 10:18 AM, Shadi, Kamal <kamal.shad...@gatech.edu> wrote:

Hi all,

Does human connectome workbench have a tool to convert gii surfaces to obj?

Thanks,
Kamal



Re: [HCP-Users] Cleaning up intermediate files from the minimal pre-processing pipelines

2018-02-21 Thread Harms, Michael

Hi,
While the documentation is overall very good, I don’t know if I’d rely on that 
pdf for a detailed list of all the files that we recommend “keeping”.  For 
that, you could download and unpack the packages for a subject with complete 
data (e.g., 100307), and see what you all get.

As a relatively simple clean-up, I *think* that if you keep the entire 
contents of $subj/T1w and $subj/MNINonLinear, you'll have most of what you 
need for any further downstream processing, while achieving substantial 
space savings.  I.e., most of the intermediates in the fMRI processing end 
up in the $subj/$task directories, and I think that any that have been 
deemed important (e.g., .native.func.gii) have been copied to 
$subj/MNINonLinear/Results/$task.  @Matt: Can you confirm that?

E.g., for a subject from the HCP-Young Adult study, the output from the MPP of 
a single REST run (e.g., $subj/MNINonLinear/Results/rfMRI_REST1_LR) is about 
3.7 GB, whereas the contents of $subj/rfMRI_REST1_LR are about 28 GB.

Cheers,
-MH

--
Michael Harms, Ph.D.
---
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.        Tel: 314-747-6173
St. Louis, MO 63110          Email: mha...@wustl.edu

From: <hcp-users-boun...@humanconnectome.org> on behalf of "Cook, Philip" <coo...@pennmedicine.upenn.edu>
Date: Wednesday, February 21, 2018 at 11:49 AM
To: "hcp-users@humanconnectome.org" <hcp-users@humanconnectome.org>
Subject: [HCP-Users] Cleaning up intermediate files from the minimal 
pre-processing pipelines

Hi,

I am trying to reduce disk usage after running the HCP minimal pre-processing 
pipelines. I would like to clean up intermediate files but retain things needed 
for ongoing analysis. As a reference I have found a list of file names in

WU-Minn HCP 900 Subjects Data Release: Reference Manual
Appendix III - File Names and Directory Structure for 900 Subjects Data

https://www.humanconnectome.org/storage/app/media/documentation/s900/HCP_S900_Release_Appendix_III.pdf

I would like to retain these and clean up the remainder of the output. Are 
there any scripts available to help with this?


Thanks



The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.



[HCP-Users] Cleaning up intermediate files from the minimal pre-processing pipelines

2018-02-21 Thread Cook, Philip
Hi,

I am trying to reduce disk usage after running the HCP minimal pre-processing 
pipelines. I would like to clean up intermediate files but retain things needed 
for ongoing analysis. As a reference I have found a list of file names in

WU-Minn HCP 900 Subjects Data Release: Reference Manual
Appendix III - File Names and Directory Structure for 900 Subjects Data

https://www.humanconnectome.org/storage/app/media/documentation/s900/HCP_S900_Release_Appendix_III.pdf

I would like to retain these and clean up the remainder of the output. Are 
there any scripts available to help with this?


Thanks



Re: [HCP-Users] scan order, session day, scan time variables

2018-02-21 Thread Elam, Jennifer
Hi Csaba,

I believe that information is still only available per subject in the 
{Subject_ID}_3T.csv file in the Structural unprocessed and Structural 
preprocessed packages. Unpacked, the file should appear in the 
{Study_ID}/{Subject_ID}/unprocessed/3T and  {Study_ID}/{Subject_ID}/T1w 
directories.


Best,

Jenn

Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu
www.humanconnectome.org



From: hcp-users-boun...@humanconnectome.org on behalf of Csaba Orban
Sent: Wednesday, February 21, 2018 2:05:53 AM
To: hcp-users@humanconnectome.org
Subject: [HCP-Users] scan order, session day, scan time variables

Hi,

I was wondering if it’s now possible to download scan order, session day, and 
scan time variables as part of the bulk csv downloads?

A post from 2016 mentioned that it might become available in the S1200 release.
https://www.mail-archive.com/hcp-users@humanconnectome.org/msg02922.html

Best wishes,

Csaba

~
Csaba Orban, PhD
Postdoctoral Research Fellow
Department of Electrical & Computer Engineering
National University of Singapore







[HCP-Users] gif surface to obj

2018-02-21 Thread Shadi, Kamal
Hi all,

Does human connectome workbench have a tool to convert gii surfaces to obj?

Thanks,
Kamal



[HCP-Users] Extracting Thalamus Brainordinates

2018-02-21 Thread Claude Bajada
Dear all,

I would like to perform tractography seeding from the thalamus. So as to
ensure that I have correspondence between individuals, I would like to
use the thalamus defined in terms of the brainordinates already used in
each participant.

I cannot find a way to identify the brainordinates that define the
thalamus and extract their voxels / MNI coordinates. Can someone point
me in the right direction?

Regards,

Claude





Forschungszentrum Juelich GmbH
52425 Juelich
Sitz der Gesellschaft: Juelich
Eingetragen im Handelsregister des Amtsgerichts Dueren Nr. HR B 3498
Vorsitzender des Aufsichtsrats: MinDir Dr. Karl Eugen Huthmacher
Geschaeftsfuehrung: Prof. Dr.-Ing. Wolfgang Marquardt (Vorsitzender),
Karsten Beneke (stellv. Vorsitzender), Prof. Dr.-Ing. Harald Bolt,
Prof. Dr. Sebastian M. Schmidt






[HCP-Users] scan order, session day, scan time variables

2018-02-21 Thread Csaba Orban
Hi,

I was wondering if it’s now possible to download scan order, session day, and 
scan time variables as part of the bulk csv downloads?

A post from 2016 mentioned that it might become available in the S1200 release.
https://www.mail-archive.com/hcp-users@humanconnectome.org/msg02922.html

Best wishes,

Csaba

~
Csaba Orban, PhD
Postdoctoral Research Fellow
Department of Electrical & Computer Engineering 
National University of Singapore




