On Thu, 16 Aug 2018, Harms, Michael wrote:
>Sorry, but not to my knowledge. We use ‘dcm2niix’ currently for DICOM to
>NIFTI conversion (‘dcm2niix’ generates nice .json files containing a bunch
>of relevant parameters of the scan). That particular step is pretty
>straightforward.
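For what it's worth, those dcm2niix .json sidecars are plain JSON, so pulling scan parameters out of them needs only the standard library. A minimal sketch — the sidecar content and the `dcm2niix -o out/ dicoms/` invocation mentioned in the comment are illustrative assumptions, not taken from this thread:

```python
import json
import pathlib
import tempfile

# Stand-in for a sidecar that `dcm2niix` (e.g. `dcm2niix -o out/ dicoms/`)
# would write next to the .nii.gz; a real one holds many more parameters.
sidecar_text = '{"RepetitionTime": 0.72, "EchoTime": 0.033, "Manufacturer": "Siemens"}'

with tempfile.TemporaryDirectory() as d:
    sidecar = pathlib.Path(d) / 'scan.json'
    sidecar.write_text(sidecar_text)
    # Parse the sidecar and pick out the acquisition parameters of interest
    params = json.loads(sidecar.read_text())

print(params['RepetitionTime'])  # -> 0.72
```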
On Wed, 28 Feb 2018, Shankar Tumati wrote:
> Hello experts,
> I would like to get a list of neighbors of each vertex in a 32k surface file.
> How can I do this with workbench? If not with workbench, is there another way
> to get this info?
quick one: we rely on neighborhood information in Py
http://nipy.org/nibabel/reference/nibabel.cifti2.html
?
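For the record, once you have the triangle array of a surface (with nibabel this could come from something like `nibabel.load('sub.L.midthickness.32k_fs_LR.surf.gii').darrays[1].data` — the darray index is an assumption and depends on the file), the per-vertex neighbor lists fall out of the triangles directly. A minimal pure-Python sketch:

```python
def vertex_neighbors(triangles):
    """Map each vertex index to the sorted list of vertices sharing an edge.

    `triangles` is any iterable of (a, b, c) vertex-index triples, e.g. the
    triangle data array of a GIFTI surface loaded with nibabel.
    """
    nbrs = {}
    for a, b, c in triangles:
        # each vertex of a triangle is a neighbor of the other two
        nbrs.setdefault(a, set()).update((b, c))
        nbrs.setdefault(b, set()).update((a, c))
        nbrs.setdefault(c, set()).update((a, b))
    return {v: sorted(s) for v, s in nbrs.items()}

# Two triangles sharing the edge (1, 2)
print(vertex_neighbors([(0, 1, 2), (1, 2, 3)]))
# -> {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
```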
On October 12, 2017 12:18:32 AM EDT, Aaron Crank wrote:
>Dear HCP experts,
>
>
>I have a question about loading CIFTI files in the Python. Would you
>please suggest if there are any well-established tools in the Python to
>work with CIFTI fo
Thank you Michael and Jennifer again!
On Tue, 06 Dec 2016, Elam, Jennifer wrote:
>To add to what Mike Harms just wrote, it still sounds like you are
>thinking of the packages as data bundles for groups of subjects.
yes -- indeed -- it was probably the main reason for a bit of disconnect
On Tue, 06 Dec 2016, Elam, Jennifer wrote:
>A listing of the by-subject unpacked files, organized by modality and
>processing level, is available in Appendix 3 of the Reference Manual.
>The files are listed there as they unpack into a standard directory
>structure.

On Tue, 06 Dec 2016, Hodge, Michael wrote:
> I've attached unzip -l output for the packages of a couple of subjects. One
> has MEG data in addition to the standard 3T data, and the other has 7T data
> in addition to the 3T data, so you can see what's in the packages.
Thank you Michael!
S
Dear HCP gurus,
db.humanconnectome.org/ provides convenient bundles of subjects/data.
Is it possible to obtain the lists of files (I guess as a subset
of the files within the hcp-openaccess/HCP or hcp-openaccess/HCP_900 s3 buckets)
which come within each bundle? (without downloading all those bundles
f
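While the per-package manifests ended up coming from the list (and from Appendix 3 of the Reference Manual), the same "list without extracting" trick that `unzip -l` performs is also a single call in Python's stdlib; shown here on an in-memory zip with made-up HCP-like member paths:

```python
import io
import zipfile

# Build a small zip in memory as a stand-in for a downloaded HCP package
# (the member paths are illustrative, not an actual package manifest).
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as z:
    z.writestr('100307/MNINonLinear/T1w_restore.nii.gz', b'')
    z.writestr('100307/MNINonLinear/fsaverage_LR32k/'
               '100307.L.midthickness.32k_fs_LR.surf.gii', b'')

# Equivalent of `unzip -l package.zip`: list members without extracting
names = zipfile.ZipFile(buf).namelist()
print(names)
```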
On Fri, 26 Aug 2016, Elam, Jennifer wrote:
>WB v1.2.3 is compatible with the WB v1.0 tutorial and the processed 900
>Subjects Group Average Data available to download at
>http://humanconnectome.org/connectome/get-connectome-workbench.html and on
>the ConnectomeDB HCP project page.
On Thu, 19 May 2016, Elam, Jennifer wrote:
>Connectome Workbench v1.2.0 Released
congrats!
>WB v1.2.0 is compatible with the WB v1.0 tutorial and the processed 900
>Subjects Group Average Data available to download at
>http://humanconnectome.org/connectome/get-connectome-workben
Dear HCP folks,
I wondered if there are some videos somewhere from the past courses?
e.g. for
http://www.humanconnectome.org/courses/2015/exploring-the-human-connectome.php
On Wed, 11 May 2016, Elam, Jennifer wrote:
>We are pleased to announce the 2016 HCP Course: "Exploring the Human
>
awesome -- thanks!
On Thu, 24 Apr 2014, Harms, Michael wrote:
> You can get the behavior you desire by opening the Information Box and
> then clicking "Volume ID".
--
Yaroslav O. Halchenko, Ph.D.
http://neuro.debian.net http://www.pymvpa.org http://www.fail2ban.org
Senior Research Associate,
Dear Developers of the workbench,
in the Volume view, where you have e.g. all 3 panes visible, it would be
great if there were an easy way to point at a location to define
which slices should be visible.
I did find a previous question/discussion:
https://www.mail-archive.com/hcp-users@humanconnectome
On Mon, 07 Apr 2014, Timothy Coalson wrote:
>As per the announcement, yes, it does include the source (on github,
>[1]https://github.com/Washington-University/workbench, see releases for
>the code matching the released binaries), with some basic instructions for
>building (src/READ
On Mon, 07 Apr 2014, Jennifer Elam wrote:
>The latest version Workbench beta 0.85 release has not yet gone live, but
>is set to occur later today, hopefully by 5pm CDT. Look for the
>announcement, download the new version and check to see if the problem
>still occurs.
ah - awesome
On Mon, 10 Mar 2014, Timothy Coalson wrote:
> > As a quick update, we are aiming for the next workbench release (v0.85;
> > cifti-2 compatibility), including source code, to occur in about a month.
> That is great -- thanks for the update!
> It would be es
On Sat, 08 Mar 2014, David Van Essen wrote:
>As a quick update, we are aiming for the next workbench release (v0.85;
>cifti-2 compatibility), including source code, to occur in about a month.
That is great -- thanks for the update!
It would be especially nice if source release would be
On Sat, 01 Mar 2014, Russ Poldrack wrote:
>hi all - it turns out that gifti files created by nibabel.gifti.giftiio
>are not readable by connectome workbench
a reciprocally related question, to which I have not gotten an answer in my
previous inquiries: are/will be(?) sources of connectome workbench
Dear Dr. Van Essen,
It is great to see scientific data formats being developed with community
awareness. I have no experience working with CIFTI-2 myself yet, so I can
only praise the effort and wish CIFTI-2 wide adoption and lo-o-ong years
before any need for a CIFTI-3 develops.
Would s
On Tue, 20 Aug 2013, Marcus, Dan wrote:
> There are a couple of additional data access methods that we'll be
> implementing down the road. Working with the INCF, we'll be making the
> data accessible over INCF Dataspace. We will also be putting the packages
> up on the Amazon cloud.
Thank you D
s to what the limitations were of globus? and whether you
>looked into owncloud at all?
>cheers,
>satra
>On Tue, Aug 20, 2013 at 12:17 PM, Archie, Kevin <[1]arch...@mir.wustl.edu>
>wrote:
> On Aug 20, 2013, at 10:44 AM, Yaroslav Halchenko wrote:
On Tue, 20 Aug 2013, Archie, Kevin wrote:
> > that would be great! Whenever initiating S3 repository for that please
> > enable "versioning"... will it be available via HTTP?
> The Q1 data was put up on S3 for the OHBM hackathon -- if you have opinions
> on how HCP data should be organized on
humanconnectome.org/display/DataUse/Exploring+ConnectomeDB+with+Python
> , and a script that uses pyxnat+ascp to download packages here:
> https://wiki.humanconnectome.org/display/DataUse/Downloading+data+for+10+%28or+40%29+unrelated+subjects
> - Kevin
> On Aug 20, 2013, at 10:11 A
Thank you Matthew and Dan for replies!
On Tue, 20 Aug 2013, Marcus, Dan wrote:
> We were unable to find an open source alternative to Aspera that matches
> its performance and feature set. I do want to emphasize that the Aspera
> client software is free to end users, so while the Aspera infrastruc
Dear HCP Gurus,
The Aspera client seems to perform really well at utilizing the available
bandwidth, but it is closed-source proprietary software (although it
happily uses open-source libraries such as Qt, which it is permitted to do).
I wonder if there is any other alternative method to access data,
possibly throu