This just reduces to finding a library that can translate a String of
JSON into a POJO, Map, or other representation of the JSON. There are
plenty of these, like Gson or Jackson. You can easily use one of them
in a function that you apply to the JSON string in each line of the
file. It's no different when this is run in Spark.
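As a minimal sketch of that idea, assuming Jackson's `jackson-databind` is on the classpath (Spark's distribution bundles it), the parsing function might look like this; the class name `JsonLineParser` and the sample record are just illustrative:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;

public class JsonLineParser {
    // ObjectMapper is thread-safe once configured; reuse a single instance
    // rather than constructing one per record.
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse one line of JSON into a Map<String, Object>.
    @SuppressWarnings("unchecked")
    public static Map<String, Object> parse(String line) throws Exception {
        return MAPPER.readValue(line, Map.class);
    }

    public static void main(String[] args) throws Exception {
        Map<String, Object> record = parse("{\"user\":\"jk\",\"count\":3}");
        System.out.println(record.get("user"));
    }
}
```

In a Spark job you would then apply it inside a map transformation, e.g. `lines.map(JsonLineParser::parse)` (or the equivalent anonymous `Function` in pre-Java-8 code), turning a stream of Strings into a stream of Maps. For lower overhead you could instead build the ObjectMapper once per partition inside `mapPartitions`.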
On Thu, Apr 2, 2015 at 2:22 PM, James King wrote:
> I'm reading a stream of string lines that are in JSON format.
>
> I'm using Java with Spark.
>
> Is there a way to do this with a transformation, so that I end up with a
> stream of JSON objects?
>
> I would also welcome any feedback about this approach or alternative
> approaches.
>
> thanks
> jk
>
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org