[HCP-Users] Mounting HCP temp data

2018-07-08 Thread Michelle Chiu
Hello, I’ve been mounting the HCP 1200 subject release by editing my fstab entry to read HCP_1200 instead of HCP_900 (since I’m using the HCP v0.44 AMI, which I know will be unavailable soon) and adding an s3fs-passwd file that includes my AWS access key. With the recent migration of HCP data, I
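For context, a minimal sketch of an s3fs-style /etc/fstab entry of the kind described here; the bucket name, prefix, and option list are assumptions based on typical s3fs usage, not the exact NITRC-CE configuration:

    # Illustrative s3fs fstab entry; bucket name, prefix, and options are assumptions
    s3fs#hcp-openaccess:/HCP_1200 /s3/hcp fuse _netdev,ro,allow_other,passwd_file=/etc/s3fs-passwd 0 0
    # /etc/s3fs-passwd holds one line of the form ACCESS_KEY_ID:SECRET_ACCESS_KEY, readable only by root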

[HCP-Users] Adjusting EV timing text files in EMO task

2018-05-01 Thread Michelle Chiu
Hello all, I'm a little confused as to whether the timings in the EVs folder (fear.txt / neutral.txt) need adjusting. The GLM model .png output for the EMO task (attached below) seems to have no padding at the end of the run. Any clarification would be much appreciated, thank you!! -Michelle
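For readers unfamiliar with the files in question: these EVs are FSL-style three-column text files (onset, duration, weight), so any adjustment would be to the onset/duration values. A sketch with made-up numbers, not the actual EMOTION task timings:

    # Inspect an EV file under the task's EVs folder (path as described in the thread)
    cat EVs/fear.txt
    # Expected three-column layout: onset(s)  duration(s)  weight, e.g.
    # 0.0     18.0    1.0
    # 36.0    18.0    1.0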

[HCP-Users] Mismatch in directory structure HCP 1200 subject release

2018-04-17 Thread Michelle Chiu
Hello all, I was running preprocessing on subjects from the HCP 1200 release (the gambling task specifically) on AWS and noticed that the following four subjects in the first ~200 don’t have the same directory structure as the rest. Everyone else seems to have
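A quick way to spot such subjects is to test for the expected results directory under the mount; the path and subject IDs below are placeholders, not the four subjects referred to in the thread:

    # Flag subjects whose expected GAMBLING results directory is missing (illustrative path and IDs)
    for subj in 100206 100307 100408 100610; do
        dir="/s3/hcp/${subj}/MNINonLinear/Results/tfMRI_GAMBLING_LR"
        [ -d "$dir" ] || echo "nonstandard layout: ${subj}"
    done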

[HCP-Users] How to safely unmount /s3/hcp

2018-04-09 Thread Michelle Chiu
Hello all, Following the steps on https://wiki.humanconnectome.org/display/PublicData/How+to+Create+an+EC2+instance+for+HCP+Pipeline+Processing allows me to mount HCP_900 onto /s3/hcp in my EC2 spot instance. I've changed my /etc/fstab from "HCP_900" to "HCP_1200" for the 1200 release data,
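For anyone following along, a sketch of the usual unmount/remount cycle for a FUSE mount like this, assuming nothing is still reading from /s3/hcp:

    # Make sure no process is using the mount, then detach the FUSE filesystem
    lsof /s3/hcp               # should print nothing before unmounting
    sudo fusermount -u /s3/hcp # or: sudo umount /s3/hcp
    sudo mount /s3/hcp         # remount using the edited /etc/fstab entry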

[HCP-Users] NITRC-CE HCP v0.44 vs. v0.45 AMI

2018-02-04 Thread Michelle Chiu
Hello, I recently tried to transfer my EBS volume to a Standard bucket for cheaper, longer-term storage but was met with the following error: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256. I originally created a spot instance with an EBS volume
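The AWS4-HMAC-SHA256 message usually means the client is signing S3 requests with the older Signature Version 2. If the transfer is being done with the AWS CLI (an assumption here), forcing Signature Version 4 is the usual fix:

    # Force Signature Version 4 for S3 requests in the AWS CLI (common fix for this error)
    aws configure set default.s3.signature_version s3v4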

[HCP-Users] S1200 complete dataset download

2018-01-30 Thread Michelle Chiu
Does anyone know how I can download a complete Excel spreadsheet for the restricted S1200 dataset (I have restricted access) that includes the columns denoting whether 3T MR is available? Right now I only seem to be able to download the restricted behavioral data and view 3T availability in a

Re: [HCP-Users] missing movement_regressors.txt file

2018-01-30 Thread Michelle Chiu
…ity/Pipelines/tree/master/ICAFIX > We hope to run this on all the HCP task fMRI data at some point, but there are many priorities and few resources… > Peace, > Matt. From: <hcp-users-boun...@humanconnectome.org> on behalf of Michelle Chiu

[HCP-Users] missing movement_regressors.txt file

2018-01-30 Thread Michelle Chiu
I'm trying to run ICA_AROMA on subjects in the HCP through AWS, but I'm getting an error back informing me that there is no such file or directory (for a few subject numbers) at the following path: /s3/hcp/126426/MNINonLinear/Results/tfMRI_GAMBLING_RL/Movement_Regressors.txt I see on
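One way to see how widespread the gap is, assuming the S3 data is mounted at /s3/hcp as in the earlier threads, is a quick per-subject existence check:

    # List subjects with no Movement_Regressors.txt for the GAMBLING RL run (sketch)
    for subj in /s3/hcp/*/; do
        f="${subj}MNINonLinear/Results/tfMRI_GAMBLING_RL/Movement_Regressors.txt"
        [ -f "$f" ] || basename "$subj"
    done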

Re: [HCP-Users] Processing NITRC data using spot instances on AWS

2018-01-23 Thread Michelle Chiu
Thank you very much for the detailed answer Tim! We followed the steps you provided and successfully created our spot instances with NITRC data! -Michelle

[HCP-Users] Processing NITRC data using spot instances on AWS

2017-12-06 Thread Michelle Chiu
Hi, Does anyone know how to create a spot instance on AWS with the NITRC Computational AMI? Configuring and using on-demand instances has been really straightforward thanks to the wiki page, but when I try to make a spot instance, I have two main issues: 1) There's no NITRC option for the AMI
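One possible workaround, not necessarily the approach suggested in the replies, is to request the spot instance by AMI ID from the CLI; the AMI ID, price, instance type, and key name below are placeholders, not the actual NITRC-CE values:

    # Request a spot instance directly from an AMI ID (all values are placeholders)
    aws ec2 request-spot-instances \
        --spot-price "0.50" \
        --instance-count 1 \
        --launch-specification '{"ImageId":"ami-0123456789abcdef0","InstanceType":"r4.xlarge","KeyName":"my-key"}'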