Dear Sir,
When I use wb_shortcuts for my analysis, I get the following warning
message. Could you please let me know why I am getting it?
wb_shortcuts: line 635: global_temporary_files: unbound variable
Thanks
Vasudev
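For what it is worth, an "unbound variable" error from a bash script usually means the script runs with set -u (nounset) and reads a variable that was never assigned. A minimal sketch of the failure mode and the usual workaround (expanding with an empty default), reusing the variable name from the message purely as an illustration:

```shell
#!/bin/bash
# Under 'set -u', reading an unset variable aborts the script with
# an "unbound variable" error, which matches the message above.
set -u

# This line would abort if global_temporary_files is unset:
#   echo "$global_temporary_files"

# Workaround: expand with an empty default so the read is safe.
echo "temp files: ${global_temporary_files:-}"
```

This only silences the symptom; the underlying script presumably expects the variable to have been set earlier, so the real fix belongs upstream.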
Dear all ,
I am using PALM to perform a two-sample unpaired t-test for my group
study.
I followed the instructions provided in the PALM examples for working with
CIFTI files.
When I run palm I get the following error:
1. palm -i L_IX_AB_sub.nii -d design.mat -t
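For reference, PALM accepts FSL-style VEST text files for the design and contrast. A sketch of what a two-sample unpaired design could look like (the subject counts here, two per group, and the output name are illustrative only):

```shell
#!/bin/bash
# Sketch of the VEST text files PALM expects for a two-sample
# unpaired t-test: one column per group, one row per subject.
# Subject counts (2 + 2) are illustrative only.
cat > design.mat <<'EOF'
/NumWaves 2
/NumPoints 4
/Matrix
1 0
1 0
0 1
0 1
EOF

# Contrast testing group1 > group2; a second row "-1 1" would
# test the reverse direction.
cat > design.con <<'EOF'
/NumWaves 2
/NumContrasts 1
/Matrix
1 -1
EOF

# These would then be passed to palm roughly as:
#   palm -i data.nii -d design.mat -t design.con -o results
```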
Dear all ,
If possible, could you please let me know how I can download the S1200
Group Average Data
(https://www.humanconnectome.org/study/hcp-young-adult/article/s1200-group-average-data-release)?
The Aspera Connect plugin is not working well in my Firefox; I have tried
it with the Chrome browser and I am
Dear Sir,
I am facing problems capturing an image using wb_view. Currently I am
running Ubuntu 16.04 (Xenial).
I am attaching the error I am getting in a text file, along with the image
(label.png); please kindly review the attachments and let me know if there
is any fault in my system.
d consulting the documentation
> first and e-mail the list second if something in the documentation doesn’t
> make sense.
>
> Peace,
>
> Matt,
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Friday, March 31, 2017 at 8:42 AM
> To: Matt Glasser <glass...@wustl
Dear all ,
I am encountering some problems when I try to capture an image in the
Connectome Workbench viewer; it seems some binaries are not working
properly. A sample image output is attached for your reference (.png
file). The error I am getting is listed below:
" WARNING: X Error:
onnectome.org> on behalf of Dev vasu <
> vasudevamurthy.devulapa...@gmail.com>
> Date: Thursday, March 30, 2017 at 7:59 PM
> To: Timothy Coalson <tsc...@mst.edu>
> Cc: "<hcp-users@humanconnectome.org>" <hcp-users@humanconnectome.org>
> Subject: Re:
Dear all,
I have label files in NIFTI format and I am unable to visualize the color
bars and associated labels. Should I refine the labels using
-label-to-volume-mapping, and is there any mandatory naming convention or
format (NIFTI or GIFTI) for Connectome Workbench to recognize the labels?
echo "0 1 0 ""$MatrixY" >> "$FreeSurferFolder"/mri/c_ras.mat
echo "0 0 1 ""$MatrixZ" >> "$FreeSurferFolder"/mri/c_ras.mat
echo "0 0 0 1" >> "$FreeSurferFolder"/mri/c_ras.mat
Thanks
Vasudev
On 23 February 2017 at
Dear all,
When I tried to run the FreeSurfer pipeline, I hit an error where the
pipeline could not generate brain.finalsurfs.mgz. Could you please let me
know the reason for this error?
START: FS2CaretConvertRegisterNonlinear
FreeSurfer2CaretConvertAndRegisterNonLinear.sh: RegName: MSMSulc
Dear Sir,
Is there an approach by which I can perform group-wise multi-modal
parcellation? I want to include T1w, rfMRI, and DTI data from 2 groups of
subjects (25 healthy and 25 diseased controls with BLVP) and perform
multi-modal parcellation on the whole brain.
If you have some journal articles or some approaches
ouis, MO 63110
> 314-362-9387
> e...@wustl.edu
> www.humanconnectome.org
>
> --
> *From:* hcp-users-boun...@humanconnectome.org <hcp-users-bounces@
> humanconnectome.org> on behalf of Dev vasu <vasudevamurthy.devulapally@
> gmail.com>
> *Sent:* Mond
Dear Sir /madam,
I am interested in group-wise parcellation of multimodal data (T1w, rfMRI,
DTI) from 50 subjects divided into 2 groups (25 healthy controls and 25
patients with BLVP). I have read your recent journal article in Nature, "A
multi-modal parcellation of human cerebral cortex", in which
Dear Sir/madam,
I would like to know if HCP has uploaded 7T structural scans; so far I
only see 3T 1.6mm structural data when I download any 7T subject.
Please kindly update me when you have uploaded the 7T structural scans,
or, if possible, kindly send me some 7T structural data.
Thanks
Dear Sir,
How can I measure the FOV of an image, and what are the specific
requirements pertaining to FOV for the HCP pipelines?
Thanks
Vasudev
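As a quick aside, the FOV along one axis is just the voxel size times the matrix dimension, both of which can be read from the image header (for example with fslhd, as dim1 and pixdim1). A sketch of the arithmetic, with illustrative values only:

```shell
# FOV along one axis = voxel size (mm) x matrix dimension.
# dim1/pixdim1 would normally be read from the header (e.g. with
# fslhd); the values below are illustrative only.
dim1=320
pixdim1=0.7
awk -v d="$dim1" -v p="$pixdim1" 'BEGIN { printf "FOV_x = %.1f mm\n", d * p }'
# -> FOV_x = 224.0 mm
```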
On 31 August 2016 at 15:06, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
wrote:
> Dear Sir,
>
> Can I possibly use an rfMRI image in pla
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Wednesday, August 31, 2016 at 7:57 AM
>
> To: Matt Glasser <glass...@wustl.edu>
> Cc: "<hcp-users@humanconnectome.org>" <hcp-users@humanconnectome.org>
> Subj
Dear Sir,
I would like to parcellate a nucleus-specific region, i.e. the brain stem
nuclei, as an ROI and use it as a mask. I have asked the FreeSurfer
community about parcellation of brain stem structures and they have
recommended the following functionality (
ote:
> You need to use 1mm templates instead of 0.7mm templates. Keep the 2mm
> ones the same.
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Thursday, August 25, 2016 at 12:00 PM
>
> To: Matt Glasser <glass...@wus
August 2016 at 18:14, Glasser, Matthew <glass...@wustl.edu> wrote:
> You should use the 1mm templates instead if your data are 1mm isotropic.
> This might fix the issue…
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date
Dear Sir ,
In the MNINonLinear/ROIs folder of the 7T preprocessed HCP data, I see the
data is not parcellated for the brain stem and thalamus regions. Could you
please let me know how I could perform parcellation for these two specific
regions of interest (i.e. the brain stem and thalamus)?
I am
r}/${Subject}/RawData folder and then put files in that.
>
> You can use fslhd to find out the voxel resolution of your images.
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Tuesday, August 23, 2016 at 12:39 AM
>
ck to make sure the
> templates you are using match the input data in terms of the voxel
> resolution.
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Monday, August 22, 2016 at 9:35 AM
>
> To: Matt Glasser <glass...@wust
on’t understand.
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Saturday, August 20, 2016 at 10:00 AM
>
> To: Matt Glasser <glass...@wustl.edu>
> Cc: "<hcp-users@humanconnectome.org>" <hcp-users@hum
the settings. You could read
> through the example script and see which places you need to make changes.
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Saturday, August 20, 2016 at 9:27 AM
>
> To: Matt Glasser <glass.
e to make changes to the settings so that they are appropriate
> for your study.
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Saturday, August 20, 2016 at 9:23 AM
>
> To: Matt Glasser <glass...@wustl.edu>
Dear Sir / madam,
The HCP team has mentioned that you will upload bedpostX-processed
diffusion data after uploading the 7T data; could you please let me know
when this will be done?
Is there any alternate procedure to obtain bedpostX-processed diffusion
data from your end?
Thanks
Vasudev
Dear Sir / madam,
I have tried to download a few HCP 7T datasets and I am unable to download
any subject data; Aspera Connect does not launch when I click the download
button. Could you please let me know if there are any problems with the
HCP server?
Thanks
vasudev
Dear Sir,
In order to run the FreeSurfer pipeline I installed the libnetcdf.so.6 and
libhdf5_hl.so.6 libraries in the /usr/lib directory. Since installing
these libraries, the free disk space on my computer keeps shrinking
because /var/log/cups/error_log becomes huge, approximately 190
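As an immediate stopgap, a runaway log can be emptied in place with truncate; the sketch below demonstrates this on a scratch file, since touching the real /var/log/cups/error_log would need root. Turning down the CUPS debug log level would be the longer-term fix.

```shell
#!/bin/bash
# Demonstrate emptying a runaway log file in place with truncate.
# Shown on a scratch file; the real target would be
# /var/log/cups/error_log (and would require sudo).
log="$(mktemp)"
yes "debug line" | head -n 1000 > "$log"   # simulate a bloated log

truncate -s 0 "$log"                       # empty it without deleting it
stat -c 'size after: %s bytes' "$log"
# -> size after: 0 bytes
```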
sorry about that. wb_view: ../../src/xcb_io.c:274: poll_for_event:
Assertion `!xcb_xlib_threads_sequence_lost' failed. Aborted (core dumped)"
Thanks
Vasudev
On 14 July 2016 at 01:05, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
wrote:
> Dear Sir,
>
> I am using Ubuntu 14.04-
Dear Sir,
I have tried to capture an image in wb_view but I am getting the following
error; could you please let me know the reason for this:
" [xcb] Unknown sequence number while processing queue [xcb] Most likely
this is a multi-threaded client and XInitThreads has not been called [xcb]
nsider sooner rather than later.
>
>
> On Jul 11, 2016, at 10:36 AM, Dev vasu <
> vasudevamurthy.devulapa...@gmail.com> wrote:
>
> > Dear madam,
> >
> > Is there any way to measure the distance from Pial surface boundary with
> GM/WM boundary in freesur
use Freesurfer to
> generate cortical thickness measurements from T1w volumes:
>
> https://surfer.nmr.mgh.harvard.edu/fswiki
>
> Donna
>
>
> On Jul 11, 2016, at 5:20 AM, Dev vasu <
> vasudevamurthy.devulapa...@gmail.com> wrote:
>
> > Dear Sir,
> >
Dear Sir /madam,
I was able to successfully run the structural preprocessing pipeline on a
group of 24 subjects. I would like to know how I can create cortical
thickness maps for all the subjects in the MNI152_T1_0.7mm.nii.gz template
space.
Thanks
Vasudev
Dear sir,
I would like to threshold my results at a p-value of p < 0.05 and I
couldn't do it from the Workbench GUI; I can only threshold the data by
percentage changes.
Please kindly let me know how I can threshold the results using real
p-statistics.
Thanks
Vasudev
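The underlying operation is just "keep values below 0.05". The sketch below shows that logic with awk on a plain-text column of p-values as a stand-in for the real data; the wb_command call mentioned in the comment is the Workbench-side analogue and should be checked against the installed version before relying on it.

```shell
#!/bin/bash
# Threshold a column of p-values at p < 0.05, keeping survivors.
# Plain-text stand-in for the real data; on CIFTI files the same
# expression could be tried with wb_command, e.g. (unverified here):
#   wb_command -cifti-math 'x * (x < 0.05)' out.dscalar.nii \
#       -var x pvals.dscalar.nii
printf '0.010\n0.200\n0.049\n0.800\n' | awk '$1 < 0.05'
# -> 0.010
#    0.049
```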
.
Is there any way I can obtain coordinates for these regions?
Thanks
Vasudev
On 28 June 2016 at 16:05, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
wrote:
> Dear madam,
>
>
> I would like to study the distribution of the vestibular tracts from the
> thalamus to the vestibular cortex for w
o create a cortical ROI.
>
> For drawing volumetric ROIs, there are all sorts of options out there.
>
> Donna
>
>
> On Jun 27, 2016, at 9:45 AM, Dev vasu <
> vasudevamurthy.devulapa...@gmail.com> wrote:
>
> > Dear Sir,
> >
> > I would lik
Dear Sir,
I would like to know how to define structural and functional ROIs in
Workbench. Should we define the ROI manually, or is there an alternate
procedure for it? I am aware of the process in SPM but not in Workbench.
Thanks
Vasudev
er for the Neuroscience of Mental Disorders
> Washington University School of Medicine
> Department of Psychiatry, Box 8134
> 660 South Euclid Ave. Tel: 314-747-6173
> St. Louis, MO 63110 Email: mha...@wustl.edu
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of
iversity School of Medicine
> Department of Psychiatry, Box 8134
> 660 South Euclid Ave. Tel: 314-747-6173
> St. Louis, MO 63110 Email: mha...@wustl.edu
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date: Tuesday, June 21, 2016 at 2:36 PM
>
> To:
Dear sir,
Could you please let me know the spatial resolution of the raw DTI data in
the HCP 7T release.
Thanks
Vasudev
___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
Dear Sir,
We are using a VERIO MRI scanner with a spatial resolution of about 2mm,
and we have only 32 diffusion directions. That is definitely a difference
from the HCP data set. Any suggestions on how to resolve this issue would
be greatly appreciated.
Thanks
Vasudev
On 21 June 2016 at 21:36, Dev vasu
4-747-6173
> St. Louis, MO 63110 Email: mha...@wustl.edu
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of Dev vasu <
> vasudevamurthy.devulapa...@gmail.com>
> Date: Tuesday, June 21, 2016 at 2:02 PM
> To: "Harms, Michael" <mha...@wustl.edu>
lass 32 GB
> hcp_tmegpreproc 32 GB
> hcp_eravg 32 GB
> hcp_tfavg 32 GB
> hcp_srcavglcmv 16 GB
> hcp_srcavgdics 16 GB
> hcp_tmegconnebasic 16 GB
>
> Best
>
> Giorgos
>
> On 5/25/2016 12:02
ry
> Articles and Tutorials by LRZ" I would suggest you follow the links
> provided in that section, read that documentation, and submit any questions
> you have to the service desk for the LRZ (a link to the service desk is
> also on the page you supplied a link to.)
>
> Tim
>
>
Dear Sir,
Currently I am running the HCP pipelines on a standalone computer, but I
would like to set up the pipelines on a Linux cluster. If possible, could
you please provide me some details concerning the procedures I have to
follow.
Thanks
Vasudev
of
> > gburg...@wustl.edu> wrote:
> >
> > The form itself states how long it takes to process.
> >
> >
> http://www.humanconnectome.org/data/data-use-terms/DataUseTerms_HCP_Restric
> > tedAccess_26Jan2016.pdf
> >
> > --Greg
> >
> > ______
you read the terms on how you are allowed to use the data. I
> hope that helps. Good luck!
>
> Best regards,
> Magda.
>
> On Fri, May 20, 2016 at 8:46 AM, Dev vasu <
> vasudevamurthy.devulapa...@gmail.com> wrote:
>
>> Dear Sir,
>>
>> I would like to kno
Dear Sir,
I would like to know if there is any description of the subject data in
the HCP 900 Subjects release. I need right-handed healthy controls for my
study; could you please let me know how I can verify these details about
the HCP subject data.
Thanks
Vasudev
> Use a machine with more memory, install more memory in that machine, or
> increase your swap space and expect it to take a while to run.
>
> Tim
>
> On Tue, May 17, 2016 at 6:58 AM, Dev vasu <
> vasudevamurthy.devulapa...@gmail.com> wrote:
>
>> Dear Sir,
>>
>>
Dear Sir,
I am encountering a conversion error using single() when I run the
hcp_tmegconnebasic pipeline.
The following is the error:
" Error using single: Out of memory. Type HELP MEMORY for your options.
Error in hcp_tmegconnebasic_contrasts (line 732)
Dear Sir,
When I run hcp_tmegconnebasic.m for connectivity analysis I am getting the
following warnings:
" Warning: could not find the file "102816_MEG_6-Wrkmem_tmegpreproc_TIM"
on the path > In hcp_which (line 40) In hcp_read_matlab (line 32) In
hcp_tmegconnebasic_contrasts
Dear Sir,
When I run GenericfMRIVolumeProcessingPipeline.sh I am hitting an error:
the Warpfield.nii.gz image is not getting created.
I have set up the path for gradient_unwarp.py correctly but I am still
getting this error during fMRI volume processing.
" So 3. Apr 12:53:12 CEST 2016 -
Dear Sir,
When I run GenericfMRIVolumeProcessingPipeline.sh I am getting an error
during:
" So 3. Apr 12:53:12 CEST 2016 -
DistortionCorrectionAndEPIToT1wReg_FLIRTBBRAndFreeSurferBBRbased.sh -
create a spline interpolated image of scout (distortion corrected in same
space)
Image Exception :
Dear all,
I have resolved the error; you can close this thread.
Thanks
Vasudev
On 1 April 2016 at 14:55, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
wrote:
> Dear Sir,
>
> When i am running the DiffusionPreprocessing.sh i am encountering
> following error
>
> *"
Dear sir,
Could you please let me know how I can get the shared library
libhdf5_hl.so.6? I have installed HDF5 under /usr/lib but I couldn't find
the file libhdf5_hl.so.6 there.
Thanks
Vasudev
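A common cause is that the installed HDF5 package ships a newer soname (libhdf5_hl.so.10, say) while the binary was linked against libhdf5_hl.so.6. One workaround people use is to locate whatever version exists and symlink it under the requested name; the sketch below demonstrates the mechanics in a scratch directory with an empty fake library. Note the caveat: symlinking across major sonames can break ABI compatibility, so building or installing the matching HDF5 version is the safer route.

```shell
#!/bin/bash
# Locate an installed libhdf5_hl and symlink it under the soname a
# binary asks for. Demonstrated in a scratch dir with a fake empty
# library; on a real system you would search /usr/lib* (or run
# 'ldconfig -p | grep hdf5_hl') and likely need sudo.
libdir="$(mktemp -d)"
touch "$libdir/libhdf5_hl.so.10"    # stand-in for the installed version

found="$(find "$libdir" -name 'libhdf5_hl.so.*' | head -n 1)"
ln -s "$(basename "$found")" "$libdir/libhdf5_hl.so.6"

ls -l "$libdir/libhdf5_hl.so.6"     # symlink now points at the real lib
```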
dicine
> Department of Psychiatry, Box 8134
> 660 South Euclid Ave. Tel: 314-747-6173
> St. Louis, MO 63110 Email: mha...@wustl.edu
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of Dev vasu <
> vasudevamurthy.devulapa...@gmail.com>
> Date: Satur
Dear Sir,
While running the pipeline I am getting the following error:
"mris_make_surfaces: error while loading shared libraries: libnetcdf.so.6:
cannot open shared object file: No such file or directory"
I have seen that this question was asked before and you suggested an
answer here
e file again and
checking the options that I have set. If you can spot something that I
cannot, could you please correct me.
Thanks
Vasudev
On 17 March 2016 at 18:38, Timothy B. Brown <tbbr...@wustl.edu> wrote:
> Vasudev,
>
> On 14 Mar 2016 at 12:13, Dev vasu <
int.
>
> Also, if you visit the following web page
> http://www.humanconnectome.org/courses/2015/exploring-the-human-connectome.php,
> scroll down to the section labelled "Course Schedule and Resources", then
> find and select the link for Practical 2 on Day 1
> (Day1_Practical2
or (stderr)
>from your successful run of the PreFreeSurferPipeline.sh script.
>- Exact text of your invocation of the FreeSurferPipeline.sh script.
>- The captured standard output (stdout) and standard error (stderr)
>from you run of the FreeSurferPipeline.sh script.
>
> Tim
&
Dear Sir,
I would like to know whether the "acpc_dc2standard" and "standard2acpc_dc"
transforms are provided by HCP, or whether they are outputs generated by
the FreeSurfer.sh script. Also, could you please let me know if the
"acpc_dc_restore" files, which are used as a naming convention in
Dear Sir,
When I run the PreFreeSurfer pipeline, the execution fails at the start of
Field Map Preprocessing and Gradient Unwarping in
T2WToT1wDistortionCorrectAndReg.sh. The following is the error I am
getting:
" START: T2WToT1wDistortionCorrectAndReg.sh START: Field Map
Preprocessing
Dear sir,
I am hitting a segmentation fault during the course of running the
FreeSurfer pipeline.
The following is the error:
"/home/vasudev/Documents/Pipelines-master/FreeSurfer/scripts/FreeSurferHiresPial.sh:
line 48: 24964 Segmentation fault (core dumped) mris_make_surfaces
Dear Sir,
I was able to solve the problem.
Thanks
Vasudev
On 13 March 2016 at 20:34, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
wrote:
> Dear Sir,
>
> When i started running Freesurfer Pipeline by specifying the path to
> T1wImage,T1wimagebrain,
> T2wimage , I am e
d line above, and you still get the
> "Permission denied" error, then try issuing the following command and
> sending the output to me.
>
> $ ls -l /usr/share/fsl/5.0/etc/fslconf
>
> Regards,
>
> Tim
>
> On Thu, Mar 10, 2016, at 14:41, Dev vasu wrote:
>
Dear Sir,
I have started running PreFreeSurferPipeline.sh, and every time I run it I
am getting the following error:
"START: ACPCAlignment Final FOV is: 0.00 160.00 0.00 256.00
86.00 150.00 Traceback (most recent call last): File
"/usr/share/fsl/5.0/bin/aff2rigid",
ption'
Aborted (core dumped)
cp: cannot stat ‘Whole_Brain_Trajectory_1.25.nii.gz’: No such file or
directory
./DTIV1toWorkbench.sh: 43: ./DTIV1toWorkbench.sh: /wb_command: not found "
Thanks
Vasudev
On 1 March 2016 at 11:59, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
wr
uld be included in the HCP course material. Ideally in the future
> the Connectome Workbench could directly support such functionality.
>
> Best
> Stam
>
>
>
> On 29 Feb 2016, at 11:51, Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> wrote:
>
>
>
>
Dear Professor,
I would like to visualize the DTI eigenvector map over fractional
anisotropy in Workbench. In your slides for DTI analysis it was mentioned
that this was done using the script DTIV1_to_Workbench.sh; could you
please provide me a sample example of the script?
Thanks
Vasudev
> inputs to the PreFreeSurferPipeline.sh script (and the Structural
> Preprocessing Pipeline of which the PreFreeSurferPipeline is the initial
> step) *are the T1w and T2w images*.
>
> I am not experienced enough to suggest alternatives for you when
> structural images are not available. I will de
at 05:25, Glasser, Matthew wrote:
>
> Read the part about environment script:
> https://github.com/Washington-University/Pipelines/wiki/v3.4.0-Release-Notes,-Installation,-and-Usage
>
>
> Peace,
>
> Matt.
>
> From: Dev vasu <vasudevamurthy.devulapa...@gmail.com>
> Date:
Dear Professor,
I would like to use the DiffusionPreprocessing pipeline for my DTI
structural connectivity analysis. I have completed the tutorial on how to
use the HCP pipelines for preprocessing; I have followed the instructions
and provided the path to the study folder and subject. When I run the
line scripts.
>
> Tim
>
>
> On Mon, Feb 8, 2016 at 5:52 PM, Dev vasu <
> vasudevamurthy.devulapa...@gmail.com> wrote:
>
>> Dear Professor :
>>
>> I have resolved this issue on my own, but I still couldn't perform
>> preprocessing of the DTI data. This time
e:
> That looks like you don’t have the pipelines environment script properly
> set up (empty/global/…). Have another look at the set up instructions.
>
> Peace,
>
> Matt.
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of Dev vasu <
> vasudevamurthy.dev