Hi,

In one of our applications there is a need for bulk updates of documents. A
user can make changes across a set of documents. This set may contain 10
documents or more than a thousand, and the changes may touch some nodes or
all nodes of each document in the set.

Also, the normal insertion of an XML document into the database goes through
an ingestion pipeline, and by adding some extra nodes during the ingestion
process it inserts three copies of the document into three different database
folders called Admin, Live and History.
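
Roughly, the fan-out step looks something like the sketch below (simplified;
the folder prefixes, URI and sample document are just placeholders, not our
actual pipeline code):

xquery version "1.0-ml";

(: Simplified sketch of the fan-out insert in the ingestion pipeline.
   The real pipeline adds extra nodes first; here we only insert the
   same document under three placeholder folder prefixes. :)
let $name := "order-123.xml"
let $doc  := <document><status>new</status></document>
for $folder in ("/Admin/", "/Live/", "/History/")
return xdmp:document-insert(fn:concat($folder, $name), $doc)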

So our question is whether we should do the bulk update through this pipeline
structure, which automatically re-inserts the document in all three places,
or only replace the nodes the user wants to update.

In the second approach we would need to explicitly update all three places
using MarkLogic transactions, so that we can make sure the data is consistent
everywhere.
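
For example, something like this is what I have in mind for the second
approach (a rough sketch; the folder prefixes and the /document/status path
are made up for illustration):

xquery version "1.0-ml";

(: Sketch of the second approach: replace the changed node in every copy.
   As far as I understand, all updates in a single XQuery module run in
   one MarkLogic transaction, so the three copies commit together.
   Folder prefixes and the /document/status path are placeholders. :)
let $name     := "order-123.xml"
let $new-node := <status>shipped</status>
for $folder in ("/Admin/", "/Live/", "/History/")
let $old := fn:doc(fn:concat($folder, $name))/document/status
where fn:exists($old)
return xdmp:node-replace($old, $new-node)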

Going through the xdmp:document-insert documentation, I found that if a
document already exists at the specified URI, the function replaces the
content of the existing document with the specified content (the $root
parameter) as an update operation.
Does this mean that the overhead of replacing a single node of a document and
the overhead of re-inserting the whole document after changing that node are
the same?
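
To make the question concrete, for a single document I am comparing these two
alternatives (URIs and paths are placeholders; the snippets are alternatives,
not meant to run in the same transaction). Replacing only the changed node:

(: Option A: replace only the changed node in place. :)
xdmp:node-replace(
  fn:doc("/Live/order-123.xml")/document/status,
  <status>shipped</status>)

versus rebuilding the document in memory and re-inserting it at the same URI:

(: Option B: rebuild the document and re-insert it at the same URI,
   which xdmp:document-insert treats as an update of the existing doc.
   (The new <status> ends up as the last child in this sketch.) :)
let $uri := "/Live/order-123.xml"
let $old := fn:doc($uri)/document
return
  xdmp:document-insert($uri,
    element document {
      $old/@*,
      $old/node()[fn:not(self::status)],
      <status>shipped</status>
    })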

If not, please suggest a good way to handle this in MarkLogic.


Thanks,
Varun Varunesh
_______________________________________________
General mailing list
General@developer.marklogic.com
http://developer.marklogic.com/mailman/listinfo/general
