Hello,
I’ve been mounting the HCP 1200 subject release by editing my /etc/fstab file to
point at HCP_1200 instead of HCP_900 (since I’m using the HCP v.044 AMI, which I
know will be unavailable soon) and adding an s3fs passwd file that includes my
AWS access key.
With the recent migration of HCP data, I
Hello all,
I'm a little confused as to whether the timings in the EVs folder
(fear.txt/neutral.txt) need adjusting. The GLM .png output for the EMO
task (attached below) seems to have no padding at the end of the run.
Any clarification would be much appreciated, thank you!!
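(For context, each EV file is in FSL's 3-column format — one row per event, with onset in seconds, duration in seconds, and weight — so "adjusting" here would mean shifting the onsets. An invented fear.txt for illustration, not the real EMO task timings:)

```
8.0     18.0    1
60.0    18.0    1
112.0   18.0    1
```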
-Michelle
Hello all,
I was running preprocessing on subjects from the HCP 1200 release (the gambling
task specifically) with AWS and noticed that the following four subjects in the
first ~200 don’t have the same directory structure as the rest. Everyone else
seems to have
Hello all,
Following the steps on
https://wiki.humanconnectome.org/display/PublicData/How+to+Create+an+EC2+instance+for+HCP+Pipeline+Processing
allows me to mount HCP_900 onto /s3/hcp in my EC2 spot instance.
I've changed my /etc/fstab from "HCP_900" to "HCP_1200" for the 1200
release data,
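In case the exact line helps anyone, my /etc/fstab entry now looks roughly like this (the bucket name, mount point, and passwd-file path are from my own setup and may differ on yours):

```
# Mount the HCP_1200 prefix of the hcp-openaccess bucket read-only via s3fs
s3fs#hcp-openaccess:/HCP_1200 /s3/hcp fuse _netdev,allow_other,ro,passwd_file=/etc/passwd-s3fs 0 0
```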
Hello,
I recently tried to transfer my EBS volume to an S3 Standard bucket for cheaper,
longer-term storage but was met with the following error:
The authorization mechanism you have provided is not supported. Please
use AWS4-HMAC-SHA256.
I originally created a spot instance with EBS volume
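In case it helps: that error usually means the S3 endpoint only accepts Signature Version 4 while the client is still signing requests with the older v2 scheme. With the AWS CLI the signature version can be forced (the setting name is from the AWS CLI docs; the bucket name and local path below are placeholders):

```
# Force SigV4 signing for S3 requests
aws configure set default.s3.signature_version s3v4

# Then retry the copy from the mounted EBS volume into the bucket
aws s3 sync /mnt/ebs-data s3://my-archive-bucket/
```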
Does anyone know how I can download a complete excel spreadsheet for the
restricted S1200 dataset (I have restricted access) that includes the
columns denoting whether 3T MR is available? Right now I seem to only be
able to download restricted behavioral data and view 3T availability in a
ity/Pipelines/tree/master/ICAFIX
>
> We hope to run this on all the HCP task fMRI data at some point, but there
> are many priorities and few resources…
>
> Peace,
>
> Matt.
>
> From: <hcp-users-boun...@humanconnectome.org> on behalf of Michelle Chiu <
>
I'm trying to run ICA_AROMA on subjects in the HCP through AWS but I'm
getting an error back, for a few subject numbers, informing me there is no
such file or directory:
/s3/hcp/126426/MNINonLinear/Results/tfMRI_GAMBLING_RL/Movement_Regressors.txt
I see on
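Before launching ICA_AROMA, a quick shell function can list which subjects are missing the regressors file (the mount point and run name in the example call follow the path in the error above; adjust both for your setup):

```shell
# Print the subject IDs under ROOT that lack Movement_Regressors.txt for run RUN.
check_missing() {
    root=$1
    run=$2
    for subj in "$root"/*/; do
        [ -d "$subj" ] || continue
        f="${subj}MNINonLinear/Results/${run}/Movement_Regressors.txt"
        [ -f "$f" ] || basename "$subj"
    done
}

# Example: check_missing /s3/hcp tfMRI_GAMBLING_RL
```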
Thank you very much for the detailed answer Tim! We followed the steps you
provided and successfully created our spot instances with NITRC data!
-Michelle
___
HCP-Users mailing list
HCP-Users@humanconnectome.org
Hi,
Does anyone know how to create a spot instance on AWS with the NITRC
Computational AMI?
Configuring and using on-demand instances has been really straightforward
thanks to the wiki page, but when I try to make a spot instance, I have
two main issues: 1) There's no NITRC option for the AMI
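For anyone else stuck on this: the spot-request wizard only offers a short AMI list, but the AWS CLI accepts any AMI ID, including a community AMI such as NITRC-CE. A sketch (the AMI ID, price, instance type, and key name are all placeholders):

```
aws ec2 request-spot-instances \
    --spot-price "0.10" \
    --instance-count 1 \
    --launch-specification '{
        "ImageId": "ami-xxxxxxxx",
        "InstanceType": "m4.xlarge",
        "KeyName": "my-key"
      }'
```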