Dear HCP maintainers,
for the automated code testing in http://mne-tools.github.io/mne-hcp/ I
created a testing dataset that consists of all essential MEG files for 1
subject. However, to facilitate repeated automated testing at scale, we
cropped the raw files to 2 seconds and decimated the data [...] directory
structure of symbolic links. After looking over those instructions, if it
is not obvious to you which script I'm referring to and how to use it,
feel free to send me a follow-up question.
Hope that's helpful,
Tim
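For readers who want the gist of that preparation step: below is a minimal
sketch of how such a testing dataset could be assembled with MNE-Python.
The 2-second crop comes from the message above; the paths, the decimation
factor, and the choice to write FIF output (rather than keeping the native
4D format) are assumptions, not the actual script.

    import os
    import mne

    # Hypothetical paths; the real HCP layout and script may differ.
    src_dir = '/data/HCP/105923/unprocessed/MEG/3-Restin/4D'
    dst_dir = '/data/mne-hcp-testing/105923/unprocessed/MEG/3-Restin/4D'
    os.makedirs(dst_dir, exist_ok=True)

    # Load the 4D/BTi recording, crop it to 2 seconds, and decimate it
    # (the factor of 4 is an assumed value).
    raw = mne.io.read_raw_bti(
        pdf_fname=os.path.join(src_dir, 'c,rfDC'),
        config_fname=os.path.join(src_dir, 'config'),
        head_shape_fname=os.path.join(src_dir, 'hs_file'),
        preload=True)
    raw.crop(tmax=2.0)
    raw.resample(raw.info['sfreq'] / 4.0)
    raw.save(os.path.join(dst_dir, 'cropped_raw.fif'))

    # Mirror the small side-car files as symbolic links instead of copies.
    for name in ('config', 'hs_file'):
        os.symlink(os.path.join(src_dir, name),
                   os.path.join(dst_dir, name))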
On 10/18/2016 10:51 AM, Denis-Alexander Engemann wrote:
Dear HCPers,
I recently had a conversation with Robert, who suggested that it should be
possible to mount the HCP data directly, like an EBS volume, instead of
using the s3 tools to copy the data file by file.
Any hint would be appreciated.
Cheers,
Denis
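One way to get close to that without a block-device mount, sketched here
for reference: the s3fs Python package exposes a bucket as a
file-system-like object, so files can be listed and read lazily instead of
copied one by one. The bucket name and prefix below are assumptions, and a
true kernel-level mount would still need a FUSE tool such as s3fs-fuse or
goofys.

    import s3fs

    # Credentials are picked up from the standard AWS config/environment.
    fs = s3fs.S3FileSystem(anon=False)

    # Bucket and prefix are assumed, not confirmed by this thread.
    prefix = 'hcp-openaccess/HCP_900'

    # List 'directories' without downloading anything.
    for entry in fs.ls(prefix)[:10]:
        print(entry)

    # Open one object as a file handle; only the bytes read are fetched.
    first_file = fs.find(prefix, maxdepth=2)[0]
    with fs.open(first_file, 'rb') as f:
        print(f.read(64))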
[...] see any other issues.
>
> Best,
>
> Michael
>
> --
> *From:* Hodge, Michael
> *Sent:* Thursday, September 8, 2016 2:18 PM
> *To:* Denis-Alexander Engemann; Jennifer Elam;
> hcp-users@humanconnectome.org
> *Subject:* RE: [HCP-Users]
> Regards,
>
> Mike
>
> *From:* hcp-users-boun...@humanconnectome.org [mailto:
> hcp-users-boun...@humanconnectome.org] *On Behalf Of *Denis-Alexander
> Engemann
> *Sent:* Thursday, September 8, 2016 12:42 PM
> *To:* Hodge, Michael <hod...@wustl.edu>
[...] frozen at 0).
Would you have any suggestions on how to access the files?
Or has there been any progress with updating the files on AWS?
Best,
Denis
On Wed, Apr 20, 2016 at 1:56 AM Denis-Alexander Engemann <
denis.engem...@gmail.com> wrote:
> Hi Jenn,
>
> thanks for your reply, [...]
>
> Jennifer Elam, Ph.D.
> Scientific Outreach, Human Connectome Project
> Washington University School of Medicine
> Department of Neuroscience, Box 8108
> 660 South Euclid Avenue
> St. Louis, MO 63110
> 314-362-9387
> e...@wustl.edu
> www.humanconnectome.org
On Sat, Apr 16, 2016 at 11:51 AM Denis-Alexander Engemann <
denis.engem...@gmail.com> wrote:
> Great, thanks for the update! Something seems to have moved indeed, but
> apparently not too much. I just checked some 90 MEG subjects, and still
> the structural preprocessing is not fully complete. [...]
> Best,
>
> Jenn
>
>
>
> Jennifer Elam, Ph.D.
> Scientific Outreach, Human Connectome Project
> Washington University School of Medicine
> Department of Neuroscience, Box 8108
> 660 South Euclid Avenue
> St. Louis, MO 63110
> 314-362-9387
> e...@wustl.edu
> www.humanconnectome.org
It turns out that the structural extended preprocessing package is not
available for the new subjects.
Is there any fix planned for this or is there a way to at least get some of
the outputs from 'surf' and 'mri'?
Denis
On Tue, Mar 29, 2016 at 10:35 AM Denis-Alexander Engemann <
denis.engem...@gmail.com> wrote:
Dear HCPers,
it seems that the FreeSurfer files included in the structural extended
preprocessing packages are not available on AWS for the HCP_900 subjects,
at least not for the MEG subjects. I am referring to the following outputs:
${subject}/T1w/${subject}/mri
${subject}/T1w/${subject}/surf
Adding to this: it seems us-east-2-virginia is the right region to use; I
measure 27 seconds instead of 4 minutes to get the same file.
Thoughts?
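As a quick aside for anyone wanting to reproduce that check: with the s3fs
package mentioned above, probing which subjects are missing the 'mri' and
'surf' outputs is a short loop. The bucket, prefix, and subject list are
placeholders.

    import s3fs

    fs = s3fs.S3FileSystem(anon=False)
    subjects = ['105923']  # hypothetical subject list

    missing = []
    for sub in subjects:
        for folder in ('mri', 'surf'):
            prefix = ('hcp-openaccess/HCP_900/%s/T1w/%s/%s'
                      % (sub, sub, folder))
            try:
                fs.ls(prefix)
            except FileNotFoundError:
                missing.append(prefix)
    print(missing)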
On Fri, Mar 11, 2016 at 1:22 AM, Denis-Alexander Engemann <
denis.engem...@gmail.com> wrote:
Hi HCPers,
I was wondering if the data are mirrored, or whether access to the data
would be faster from a specific region.
Currently it takes me 4 minutes to fetch resting-state fMRI nii.gz files
on a big EC2 instance with SSD.
Any hints would be appreciated.
Denis
Hi Michael,
For what you are describing, and what I was trying initially, I'm getting
these nasty errors:
A client error (PermanentRedirect) occurred when calling the ListObjects
operation: The bucket you are attempting to access must be addressed using
the specified endpoint. Please send all future requests to this endpoint.
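For context on that error: PermanentRedirect means the request went to the
wrong regional S3 endpoint for the bucket; pointing the client at the
bucket's region normally resolves it (with the CLI, the --region flag does
the same). A minimal boto3 sketch; the bucket name and 'us-east-1' are
assumptions:

    import boto3

    # Create a client pinned to the bucket's region ('us-east-1' assumed).
    s3 = boto3.client('s3', region_name='us-east-1')

    # Rough equivalent of `aws s3 ls` for one prefix (bucket/prefix assumed).
    resp = s3.list_objects_v2(Bucket='hcp-openaccess',
                              Prefix='HCP_900/', MaxKeys=10)
    for obj in resp.get('Contents', []):
        print(obj['Key'], obj['Size'])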
Dear HCPs,
currently I'm trying to access the HCP data from the command line on Linux.
I configured the access keys based on the console options in the
ConnectomeDB. With other buckets a command like this would work:
aws s3 ls s3://BUCKET-NAME --recursive --human-readable --summarize
--profile
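For anyone scripting this from Python rather than the shell, the --profile
flag maps onto a boto3 Session; a short sketch, with the profile and
bucket names as placeholders:

    import boto3

    # 'hcp' is a placeholder for whichever profile holds the ConnectomeDB keys.
    session = boto3.Session(profile_name='hcp')
    s3 = session.client('s3')

    # Same listing as the CLI command above, one page of results.
    resp = s3.list_objects_v2(Bucket='BUCKET-NAME', MaxKeys=10)
    for obj in resp.get('Contents', []):
        print(obj['Key'])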
I was wondering whether the transforms would be suitable for mapping the
FreeSurfer surface coordinates to the helmet space.
If this step works, the transforms and the coreg would be usable with
other software, such as the MNE suite.
We could then write a routine that skips the [...]
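To make the proposed mapping concrete: once such a transform is available
as a 4x4 affine, applying it to surface vertices is a one-liner in
MNE-Python. The matrix below is made up, and whether the HCP transforms
are expressed in the right coordinate frames and units for this is exactly
the open question above.

    import numpy as np
    from mne.transforms import apply_trans

    # Hypothetical 4x4 affine taking FreeSurfer surface coordinates (in m)
    # to the MEG helmet/device space; identity plus a made-up translation.
    trans = np.eye(4)
    trans[:3, 3] = [0.0, 0.0, 0.04]

    # rr: n_vertices x 3 surface coordinates (random stand-ins here).
    rr = np.random.RandomState(0).randn(10, 3) * 0.05
    rr_helmet = apply_trans(trans, rr)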