Dear Aaron,
I think the Zenodo limit is as their email to you states: per dataset cited
from an article, i.e., one dataset equals one DOI. I recall that at
International Data Week in Denver in 2016 I mentioned, in open discussion at
the session on data repositories, the Zenodo limit per dataset of 5 GB, and
This is what Zenodo emailed me: "By default, we provide a one-time quota
increase up to 100 GB for a dataset that will be cited from a peer-reviewed
article. Zenodo is a free-to-use service, and in order to keep it this way, we
have to restrict the incoming data volume rate as very large datasets
The Zenodo policies seem to be the most workable as a start. I would suggest
contacting them for the cases that go over 50 GB, but at worst splitting
into 50 GB chunks. -- Herbert
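For anyone who ends up going the chunking route, a minimal sketch of what "splitting into 50 GB chunks" could look like in Python (the chunk size, file naming, and `split_file` helper here are illustrative assumptions, not anything Zenodo prescribes; recording per-chunk checksums lets downloaders verify each piece after reassembly):

```python
# Hypothetical sketch: split a large dataset file into fixed-size chunks
# (e.g. 50 GB for a repository's per-dataset limit) and record a SHA-256
# checksum for each chunk so downloaders can verify them individually.
import hashlib
import os

def split_file(path, chunk_size, out_dir="."):
    """Split `path` into numbered chunks; return (chunk_path, sha256) pairs."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part_path = os.path.join(
                out_dir, f"{os.path.basename(path)}.part{index:03d}")
            with open(part_path, "wb") as dst:
                dst.write(data)
            # Checksum of this chunk, for post-download verification
            parts.append((part_path, hashlib.sha256(data).hexdigest()))
            index += 1
    return parts
```

On Unix systems, `split -b 50G dataset.h5 dataset.h5.part` does the splitting in one line, and `cat dataset.h5.part* > dataset.h5` reassembles the original.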
On Fri, Jan 18, 2019 at 10:49 AM Andreas Förster <
andreas.foers...@dectris.com> wrote:
> Hi Aaron,
>
> can you slice
Hi,
> Is anyone aware of online repositories that will store huge sets of
> raw data (>100 GB)? I’m aware of Zenodo and SBGrid, but Zenodo’s
> limit is typically 50 GB and their absolute limit is 100 GB. SBGrid
> has yet to respond to my emails.
The Coherent X-ray Imaging Data Bank may be
Hi Aaron,
can you slice your data and then link to the bits?
We're currently trying to find out what "unlimited Google Drive storage"
means by uploading pi in chunks of 70 GB or so.
All best.
Andreas
On Fri, Jan 18, 2019 at 4:31 PM Aaron Finke wrote:
> Dear CCP4ites,
>
> Is anyone aware
Hi Aaron
I would guess most places would start to want $$ for storing multiples of 100 GB.
Google, Amazon, and Microsoft all offer this kind of thing. Getting the data in
and out can be slow, and I would expect that as the data size grows and the
storage time lengthens it would be
Dear CCP4ites,
Is anyone aware of online repositories that will store huge sets of raw data
(>100 GB)? I’m aware of Zenodo and SBGrid, but Zenodo’s limit is typically 50
GB and their absolute limit is 100 GB. SBGrid has yet to respond to my emails.
I could host them myself, but the involuntary