You can use fscalc to multiply the PET and area maps, e.g.:
fscalc pet.lh.nii.gz mul lh.white.avg.area.mgh -o weighted.nii.gz
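A minimal sketch of what this area weighting computes, using made-up per-vertex lists in place of the actual surface files (the real data would be loaded from pet.lh.nii.gz and lh.white.avg.area.mgh with a tool such as nibabel):

```python
# Hypothetical per-vertex values standing in for pet.lh.nii.gz and
# lh.white.avg.area.mgh (real files would be read with nibabel or similar).
pet = [8.0, 10.0, 12.0]   # PET value at each vertex
area = [0.5, 1.0, 1.5]    # average surface area associated with each vertex (mm^2)

# fscalc's "mul" yields the per-vertex product pet * area; summing that over
# the ROI and dividing by the summed area gives an area-weighted mean.
weighted = [p * a for p, a in zip(pet, area)]
area_weighted_mean = sum(weighted) / sum(area)

# For comparison, the unweighted per-vertex mean:
unweighted_mean = sum(pet) / len(pet)
```

Here the area-weighted mean (32/3 ≈ 10.67) differs from the unweighted mean (10.0) because larger-area vertices pull the average toward their values.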
On 11/14/2016 11:53 AM, Matthieu Vanhoutte wrote:
> Ok I will try both.
>
> Best,
> Matthieu
>
> 2016-11-14 17:50 GMT+01:00 Douglas N Greve
If you did not smooth them explicitly, then there will be only a little
vertex-like smoothing. As to which weighting is better, I'm not sure. If
the ROI is big, it probably won't make much difference. Can you try both
and see?
On 11/14/2016 11:39 AM, Matthieu Vanhoutte wrote:
> Dear Douglas,
Good question. I'm not sure, as there could be reasons to do either. I
guess if you are not smoothing your data, then weighting by area would
be the most appropriate.
doug
On 11/11/2016 05:41 PM, Matthieu Vanhoutte wrote:
Is it better, when computing the mean, to weight by number of vertices or by
surface area?
Best,
Matthieu
On 11 Nov 2016, at 11:33 PM, "Douglas N Greve" wrote:
Vertices do not have equal areas and are not equally spaced
On 11/11/2016 05:11 PM, Matthieu Vanhoutte wrote:
Thank you Douglas for giving me a way to compute area from segmented
surface data.
Are vertices equally spaced along the cortex, and do all triangles have the
same area?
Best,
Matthieu
On 11 Nov 2016, at 10:55 PM, "Douglas N Greve" wrote:
If you want to do it on fsaverage, then
mri_segstats --i $SUBJECTS_DIR/fsaverage/surf/lh.white.avg.area.mgh --seg
lh.sign_clust.bin.mgh
--excludeid 0 --sum lh.bin.area.sum --accumulate
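With --accumulate, mri_segstats sums the input over each segmentation label instead of averaging it, so summing the per-vertex area map over the cluster yields the cluster's surface area. A rough sketch of that reduction, with made-up lists rather than actual FreeSurfer I/O:

```python
# Made-up per-vertex data standing in for lh.white.avg.area.mgh and
# lh.sign_clust.bin.mgh (hypothetical values, not real file contents).
vertex_area = [0.7, 0.9, 1.1, 0.8, 1.0]  # per-vertex avg area (mm^2)
cluster = [0, 1, 1, 0, 1]                # segmentation labels (0 = excluded)

# --accumulate sums (rather than averages) the input within each label:
# the cluster's surface area is the sum of its vertices' areas.
cluster_area = sum(a for a, c in zip(vertex_area, cluster) if c == 1)
```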
On 11/11/2016 04:48 PM, Matthieu Vanhoutte wrote:
Dear Douglas,
Yes, I would like to, in order to compute a mean of several means. Maybe not if
it is equivalent to the number of vertices (are vertices equally spaced on the
cortical surface?)
In case I need surface area, do I have to convert the overlay to an annotation
file to be used with
NVox is the number of vertices. The Volume_mm3 is not meaningful. Do you
want area?
On 11/11/2016 04:53 AM, Matthieu Vanhoutte wrote:
Dear Douglas,
I come back to you concerning stats computed from a binary .mgh surface data file:
mri_segstats --i lh.fsaverage.sm10.mgh --seg lh.sign_clust.bin.mgh --excludeid
0 --sum lh.bin.sum --avgwf lh.wav.bin.txt
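What the per-label mean in that summary reduces to, sketched with made-up lists (label 0 is dropped by --excludeid 0; these values are hypothetical, not real file contents):

```python
# Made-up per-vertex intensities and a binary segmentation, standing in for
# lh.fsaverage.sm10.mgh and lh.sign_clust.bin.mgh.
intensity = [2.0, 4.0, 6.0, 8.0]  # per-vertex PET intensity
seg = [0, 1, 1, 0]                # binary segmentation; --excludeid 0 drops label 0

# The summary's NVox column counts the label's vertices, and its Mean column
# is the unweighted average of the input over those vertices.
in_label = [v for v, s in zip(intensity, seg) if s == 1]
nvox = len(in_label)
mean_in_label = sum(in_label) / nvox
```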
In the output file « lh.bin.sum », everything is reported as volume input/output.
Thank you, Douglas!
On 10 Nov 2016, at 7:21 PM, "Douglas N Greve" wrote:
You need to weight by the number of vertices
n = [27805 2321 552];
m = [8.8194 10.3661 10.3365];
sum(n.*m)/sum(n)
ans =
8.9637
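The same vertex-weighted mean can be checked in Python (the MATLAB expression above translates directly):

```python
n = [27805, 2321, 552]           # number of vertices in each cluster
m = [8.8194, 10.3661, 10.3365]   # mean intensity within each cluster

# Weight each cluster mean by its vertex count, then divide by total vertices;
# this recovers the overall mean across all vertices of the three clusters.
weighted_mean = sum(ni * mi for ni, mi in zip(n, m)) / sum(n)
print(round(weighted_mean, 4))  # → 8.9637
```

The largest cluster (27805 vertices) dominates, which is why the result sits close to its mean of 8.8194 rather than the simple average of the three means.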
On 11/10/2016 06:44 AM, Matthieu Vanhoutte wrote:
Dear Freesurfer's experts,
Could anyone please explain to me the difference I got with the command lines
in the mail below?
Best regards,
Matthieu
Hello FS's experts,
Would anyone have advice for my problem?
Best regards,
Matthieu
2016-08-12 12:07 GMT+02:00 Matthieu Vanhoutte :
Dear experts,
I am having trouble with two ways of computing mean intensity with mri_segstats.
First I used it on an .annot file with three different labels inside (SegId
1 to 3):
mri_segstats --annot fsaverage lh cache.th23.pos.sig.ocn.annot --i
lh.PET.fsaverage.sm10.mgh --sum lh.pet.sum
which