Re: [HCP-Users] a problem in opening multi-volume .nii file
The files are fine in FSL and Connectome Workbench, so I'm not sure what the problem is. If you need to read the NIfTIs into MATLAB you could use FSL's read_avw.m for this.

Peace,
Matt

From: L Deng l.deng_s...@yahoo.com
Reply-To: L Deng l.deng_s...@yahoo.com
Date: Saturday, March 1, 2014 7:28 AM
To: hcp-users@humanconnectome.org
Subject: [HCP-Users] a problem in opening multi-volume .nii file

Hello. I have a problem opening a multi-volume .nii file. I downloaded the preprocessed resting-state fMRI data (10 unrelated subjects), checked the integrity of the files, and unzipped them according to the instructions in the Q3_Release_Reference_Manual. However, when I try to load rfMRI_REST1_LR (or RL).nii into the MATLAB workspace using the function 'spm_read_vols' in SPM8, it always fails. I found that MATLAB can actually access the first 10 volumes of the .nii file but fails to read the rest of it. This problem appeared in the rfMRI_REST1_preproc data of ALL 10 unrelated subjects. I believe the problem is caused by the .nii file itself, for the additional reasons below:
1) When I tried to convert the 4D .nii into 1200 3D .nii files using the 'dcm2nii' tool in the MRIcron package, it converted only the first 10 volumes.
2) When I viewed the LAST 10 volumes of the .nii file in MRIcron, very interestingly, the brain was misplaced in those volumes.
3) The problem appeared on different platforms (Windows, Linux, 32-bit, and 64-bit).
The files are unlikely to have been corrupted during download, because I have checked their hashes. So I am quite puzzled by this problem. Please offer some advice. Thank you for your time!

Lifu

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

The materials in this message are private and may contain Protected Healthcare Information or other information of a sensitive nature.
If you are not the intended recipient, be advised that any unauthorized use, disclosure, copying or the taking of any action in reliance on the contents of this information is strictly prohibited. If you have received this email in error, please immediately notify the sender via telephone or return mail.
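When a reader such as spm_read_vols stops after a handful of volumes, a quick sanity check is to parse the NIfTI-1 header directly and compare the dim[] array against what the reader returns. Below is a minimal stdlib-only sketch (the field offsets follow the published NIfTI-1 header layout; the synthetic header values are illustrative, not actual HCP data):

```python
import struct

def read_nifti1_dims(header_bytes):
    """Parse the dim[] array and vox_offset from a NIfTI-1 header.

    dim is 8 int16 values starting at byte offset 40; vox_offset is a
    float32 at byte offset 108 (per the NIfTI-1 header layout).
    """
    if len(header_bytes) < 348:
        raise ValueError("NIfTI-1 header must be at least 348 bytes")
    sizeof_hdr = struct.unpack_from("<i", header_bytes, 0)[0]
    if sizeof_hdr != 348:
        raise ValueError("not a little-endian NIfTI-1 header")
    dim = struct.unpack_from("<8h", header_bytes, 40)
    vox_offset = struct.unpack_from("<f", header_bytes, 108)[0]
    return dim, vox_offset

# Build a synthetic header for a 91x109x91 image with 1200 time points,
# roughly mimicking an HCP rfMRI run (illustrative values only).
hdr = bytearray(348)
struct.pack_into("<i", hdr, 0, 348)                              # sizeof_hdr
struct.pack_into("<8h", hdr, 40, 4, 91, 109, 91, 1200, 1, 1, 1)  # dim[]
struct.pack_into("<f", hdr, 108, 352.0)                          # vox_offset

dim, vox_offset = read_nifti1_dims(bytes(hdr))
print(dim[0], dim[4])  # 4 1200: a 4D image with 1200 volumes
```

If the header reports 1200 volumes but a tool returns only 10, the file itself is likely intact and the reader is the likelier culprit; note that a 91x109x91x1200 float32 image is roughly 4 GB, which can trip up tools with 32-bit size handling.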
Re: [HCP-Users] nibabel gifti files with workbench
John or Tim, can you comment on this?

Thanks,
Matt

From: Russ Poldrack poldr...@utexas.edu
Date: Saturday, March 1, 2014 3:13 PM
To: hcp-users@humanconnectome.org
Subject: [HCP-Users] nibabel gifti files with workbench

hi all - it turns out that gifti files created by nibabel.gifti.giftiio are not readable by Connectome Workbench, due to two main problems. first, the files generated by nibabel use the endian code 'GIFTI_ENDIAN_LITTLE', whereas workbench seems to require the alternate code 'LittleEndian'. second, nibabel inserts line breaks within the Data section of the file, but workbench doesn't seem to be able to handle those. I don't know the gifti standard well enough to know which of these is breaking the standard, but it would be great if workbench could handle the files generated by nibabel. in the meantime I have written a hacky little script that can fix the files generated by nibabel to make them compatible with workbench: https://github.com/poldrack/wbench/blob/master/wbify_gifti.py

cheers
russ
Re: [HCP-Users] nibabel gifti files with workbench
From the .dtd on https://www.nitrc.org/projects/gifti/, I see this:

Endian (BigEndian | LittleEndian) #REQUIRED

From GIFTI_Surface_Format.pdf on the same site (which also echoes the above on page 7), it looks like line breaks are also implicitly forbidden:

"The second encoding, Base64Binary (http://www.ietf.org/rfc/rfc3548.txt or http://email.about.com/cs/standards/a/base64_encoding.htm), encodes binary data into a sequence of ASCII characters."

From http://www.ietf.org/rfc/rfc3548.txt:

"Implementations MUST NOT add line feeds to base encoded data unless the specification referring to this document explicitly directs base encoders to add line feeds after a specific number of characters."

I think nibabel needs to change how it writes gifti files.

Tim
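Given the two spec violations above, the repair a converter script has to make can be sketched as plain string/regex transforms: rename the endian token to the DTD-required value and strip the line feeds inside each Data element. This is an illustrative re-implementation under those assumptions, not the actual wbify_gifti.py (the Endian attribute name is taken from the DTD quoted above):

```python
import re

def workbench_compatible(gifti_xml: str) -> str:
    """Rewrite a nibabel-written GIFTI XML string so it matches the spec.

    Two transforms: (1) replace the nibabel endian token with the
    DTD-required one, (2) remove whitespace inside <Data> elements,
    since RFC 3548 forbids line feeds in base64-encoded data.
    """
    fixed = gifti_xml.replace('Endian="GIFTI_ENDIAN_LITTLE"',
                              'Endian="LittleEndian"')
    # Collapse each Data element's content onto one line.
    fixed = re.sub(
        r"<Data>(.*?)</Data>",
        lambda m: "<Data>" + re.sub(r"\s+", "", m.group(1)) + "</Data>",
        fixed,
        flags=re.DOTALL,
    )
    return fixed

sample = ('<DataArray Endian="GIFTI_ENDIAN_LITTLE">\n'
          '<Data>AAAA\nBBBB\nCCCC</Data></DataArray>')
print(workbench_compatible(sample))
# <DataArray Endian="LittleEndian">
# <Data>AAAABBBBCCCC</Data></DataArray>
```

Regex-on-XML is fragile in general, but for files with this known shape it mirrors the two fixes the thread identifies; a robust version would use an XML parser instead.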
Re: [HCP-Users] nibabel gifti files with workbench
thanks Tim - I'm cc'ing Matthew Brett here since I'm not sure if he's on this list.

cheers
russ