On Thu, Jun 9, 2022 at 5:58 PM Ken Hancock <hancoc...@gmail.com> wrote:
>
> I have a very large (> JVM memory) payload that requires the extraction of 
> certain fields for processing.  The twist is the fields have escaped json as 
> the payload. Does Jackson have any capability of parsing this as a stream?
>
> Example:
> {
>   "field1":"value1",
>   "field2":"value2",
>   "verylargedatablock": 
> "{\"large1\":\"value1\",\"large2\":\"value2\",\"bigarray\":[{\"foo\":\"bar\"}]}"
> }
>
> I need to be able to  parse out fields from "verylargedatablock" without 
> loading the entire value into memory in order to deserialize it.

There is one way to stream String values: the JsonParser method

  getText(Writer w)

which may handle decoding incrementally (implementations may or may
not buffer content). For the JSON parser in particular I think actual
streaming is implemented. So you would construct a JsonParser, iterate
over tokens until you reach the big String value, and then call this
method with a `Writer` you implement to process the content.
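
For example, something along these lines, just as a rough sketch (the
file name is a placeholder, and the Writer here only counts characters,
to show that the huge value need not be materialized as a single String):

  import java.io.File;
  import java.io.IOException;
  import java.io.Writer;

  import com.fasterxml.jackson.core.JsonFactory;
  import com.fasterxml.jackson.core.JsonParser;
  import com.fasterxml.jackson.core.JsonToken;

  public class OuterParseExample {
      public static void main(String[] args) throws IOException {
          JsonFactory factory = new JsonFactory();
          try (JsonParser p = factory.createParser(new File("huge-payload.json"))) {
              while (p.nextToken() != null) {
                  if (p.currentToken() == JsonToken.FIELD_NAME
                          && "verylargedatablock".equals(p.getCurrentName())) {
                      p.nextToken(); // advance to the huge String value
                      final long[] chars = { 0L };
                      // Writer that just counts characters; content arrives in chunks
                      Writer counting = new Writer() {
                          @Override public void write(char[] buf, int off, int len) { chars[0] += len; }
                          @Override public void flush() { }
                          @Override public void close() { }
                      };
                      p.getText(counting);
                      System.out.println("Streamed " + chars[0] + " chars of embedded JSON");
                  }
              }
          }
      }
  }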

But the problem then becomes how to parse the content passed via that Writer.

This is almost doable since Jackson has an async (non-blocking) parser
implementation too... unfortunately it expects byte-based input. But you
could re-encode the content as UTF-8 bytes and feed those in, so
technically this would be doable.
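
Not tested, but the Writer could look something like this (the class
name, and picking out "large1" from your example, are just for
illustration; a chunk boundary splitting a surrogate pair would need
extra handling, omitted here):

  import java.io.IOException;
  import java.io.Writer;
  import java.nio.ByteBuffer;
  import java.nio.CharBuffer;
  import java.nio.charset.StandardCharsets;

  import com.fasterxml.jackson.core.JsonFactory;
  import com.fasterxml.jackson.core.JsonParser;
  import com.fasterxml.jackson.core.JsonToken;
  import com.fasterxml.jackson.core.async.ByteArrayFeeder;

  public class EmbeddedJsonWriter extends Writer {
      private final JsonParser parser;
      private final ByteArrayFeeder feeder;

      public EmbeddedJsonWriter() throws IOException {
          parser = new JsonFactory().createNonBlockingByteArrayParser();
          feeder = (ByteArrayFeeder) parser.getNonBlockingInputFeeder();
      }

      @Override
      public void write(char[] buf, int off, int len) throws IOException {
          // Re-encode the chunk as UTF-8 since the non-blocking parser is byte-based.
          // (Caveat: a surrogate pair split across chunks would need buffering here.)
          ByteBuffer bb = StandardCharsets.UTF_8.encode(CharBuffer.wrap(buf, off, len));
          byte[] bytes = new byte[bb.remaining()];
          bb.get(bytes);
          feeder.feedInput(bytes, 0, bytes.length);
          drainTokens();
      }

      private void drainTokens() throws IOException {
          JsonToken t;
          // NOT_AVAILABLE means the parser needs more input to continue
          while ((t = parser.nextToken()) != null && t != JsonToken.NOT_AVAILABLE) {
              if (t == JsonToken.FIELD_NAME && "large1".equals(parser.getCurrentName())) {
                  parser.nextToken();
                  System.out.println("large1 = " + parser.getText());
              }
          }
      }

      @Override
      public void flush() { }

      @Override
      public void close() throws IOException {
          feeder.endOfInput();
          drainTokens();
          parser.close();
      }
  }

You would pass an instance of this in place of the counting Writer in
the earlier sketch.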

Other than this approach, individual JSON values are always parsed as
single units and cannot be streamed.

I hope this helps,

-+ Tatu +-
