If you're using the 'data' event, go with pause/resume; they are designed to
work together.
You could implement a writable stream that collects enough XML to
transform and write, something like this:
MyStream.prototype._write = function (chunk, encoding, callback) {
  this.xmlBuffer.push(chunk);
  var json = this.parse();
  if (json != null) {
    // parse() returns JSON once it has buffered a complete db entity
    db.store(json, callback);
  } else {
    callback();
  }
};
On Monday, September 9, 2013 9:12:38 PM UTC+2, HungryHippo wrote:
>
> I have a read stream which is a HUGE file of XML. I want to parse the file,
> transforming the data and converting it to JSON. It will then be written to
> a database.
>
> The write to the database (done over HTTP) is really slow, so I need to
> somehow wait for the writes to complete before receiving and parsing more
> data. How do I do this? Is it acceptable for the write to the database to
> be done synchronously if it's in a pipe? I don't see how I can slow down the
> read stage without blocking.
>
--
Job Board: http://jobs.nodejs.org/
Posting guidelines:
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en