I'm writing a replicator-style application for CouchDB. My problem is that I 
have very long _revisions.ids lists, and fetching them takes a lot of 
bandwidth, especially since all I want to do is prepend a new id to them.

Is there a way to do a bulk update and have CouchDB automatically prepend the 
inserted rev?
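
To be concrete: with new_edits: false CouchDB stores whatever revision tree you 
hand it, so every doc in the bulk update has to carry its full ancestry in 
_revisions (ids listed newest first), roughly like this (illustrative values, 
not my real data):

// shape of one doc in a bulkDocs / new_edits: false payload
const doc = {
  _id: 'some-doc',
  _rev: '4-ddd',                       // the new revision being written
  _revisions: {
    start: 4,                          // generation number of the newest rev
    ids: ['ddd', 'ccc', 'bbb', 'aaa']  // newest first; this is the list that gets huge
  },
  // ...rest of the document body...
}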

Currently I'm doing this:

const revisions = await this._getRevisions(values) // bulkGet({ revs: true })
const docs = []
for (let n = 0; n < values.length; ++n) {
  const { id, data, rev } = values[n]
  const [ start, revId ] = rev.split('-')
  const revIds = revisions[n].ids // ids is a very big array

  const doc = {
    ...data,
    _id: id,
    _rev: rev,
    _revisions: {
      ids: [revId, ...revIds],
      start: parseInt(start, 10)
    }
  }
  docs.push(doc)
}

await this._db.bulkDocs(docs, { new_edits: false })
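
(For reference, _getRevisions is roughly the sketch below; as the comment above 
says, it is built on bulkGet({ revs: true }), though the real implementation 
differs in the details:)

// rough sketch of _getRevisions, assuming a PouchDB-style bulkGet
// (a method on the same class as the code above)
async _getRevisions (values) {
  const { results } = await this._db.bulkGet({
    docs: values.map(({ id }) => ({ id })), // fetch the current winning rev of each doc
    revs: true                              // include its full _revisions history
  })
  // results come back in request order; take the winning leaf's _revisions
  return results.map(({ docs }) => docs[0].ok._revisions)
}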

What I'd like to do instead (and achieve the same thing!) is something like this:

const docs = []
for (let n = 0; n < values.length; ++n) {
  const { id, data, rev } = values[n]

  const doc = {
    ...data,
    _id: id,
    _rev: rev,
  }
  docs.push(doc)
}

await this._db.bulkDocs(docs, { new_edits: false })

Is this possible?

Best Regards,

Robert Nagy
