Hi,
in 3.0 the arangod server has a request body size limit of 512 MB, so
every request sent to ArangoDB must be at most 512 MB in size. That is
the hard upper limit.
In practice it is much better to send batches of 100 or 1,000
documents at a time until you have reached the end of the input data.
This way it is very unlikely that you will ever hit the request body
size limit. Whether you use 100 or 1,000 or any other number of
documents per batch is up to you. The smaller the documents are, the
more of them you can fit into a batch.
To import multiple documents with AQL, you can do the following:

    FOR doc IN @docs
      INSERT doc INTO collection

and pass the documents as an array in the bind parameter @docs, e.g.

    [ { "value1": "test", "value2": "something" },
      { "foo": "bar", "baz": "bat" },
      ... ]
Best regards
J
On Tuesday, August 30, 2016 at 10:01:47 AM UTC+2, shivaraj naidu wrote:
>
> Recently I came across a situation where I need to bulk import data
> from a 10 MiB CSV file (with arangoJS in a NodeJS project)...
>
> It works fine when using Bulk Import,
> but my colleague said not to import that much data at once,
>
> because that will affect the process/performance negatively.
>
> So he said: "Import the records in batches of 100."
>
> So I want to know: is there a safe limit for Bulk Import, or can I
> use it to import as much data as I want?
>
> And if batches of 100 are the safest way, is there an easy way to do
> this on an array of docs with AQL?
>