Dear HCP maintainers,

For the automated code testing of http://mne-tools.github.io/mne-hcp/ I
created a testing dataset that consists of all essential MEG files for one
subject. However, to facilitate repeated automated testing at scale, we
cropped the raw files to 2 seconds and decimated the preprocessed epochs by
a factor of 100 in the time-sample domain. This yields roughly 500MB instead of 23GB.
Currently, the testing relies on an encrypted download from my HCP account.
It would be more convenient to download the data from some alternative
public repository instead. This would have the advantage that we could get
rid of the encryption, which currently prevents automated builds on pull
requests to the MNE-HCP repository and hence makes it more difficult to
contribute.
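
As an aside, here is a minimal sketch of how the cropping and decimation
could be reproduced with MNE-Python. The file names below are placeholders
only; the actual HCP files follow the HCP/MNE-HCP naming conventions and are
read through the MNE-HCP readers rather than plain FIF files:

    import mne

    # Hypothetical file names -- placeholders for this sketch only.
    raw = mne.io.read_raw_fif('subject_meg_raw.fif', preload=True)
    raw.crop(tmin=0., tmax=2.)   # keep only the first 2 seconds
    raw.save('subject_meg_cropped_raw.fif', overwrite=True)

    epochs = mne.read_epochs('subject_task-epo.fif', preload=True)
    epochs.decimate(100)         # keep every 100th time sample
    epochs.save('subject_task_decim-epo.fif', overwrite=True)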

Here is my question: from the point of view of the data usage terms, and
from your gut feeling, would it be OK if I made these testing data available
for testing purposes?
Making them available would mean that there is a repository from which the
data can be accessed, and that the code in MNE-HCP exposes that URL in an
un-obfuscated way.

I would appreciate any feedback,
Denis

