Adding a new data type is an enormous undertaking and very invasive. I
don't think it is worth it in this case given there are clear, simple
workarounds.
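The workarounds aren't spelled out here; a common one is to keep the JSON payload in an ordinary StringType column and pull fields out on demand. A minimal sketch, with made-up column names and JSON paths:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object

val spark = SparkSession.builder().appName("json-as-string").getOrCreate()
import spark.implicits._

// The JSON stays in a plain StringType column; no dedicated JSONType is needed.
val events = Seq(
  """{"user":"alice","count":1}""",
  """{"user":"bob","count":7}"""
).toDF("payload")

// Pull individual fields out with JSON path expressions.
val users = events.select(
  get_json_object($"payload", "$.user").as("user"),
  get_json_object($"payload", "$.count").cast("long").as("count"))
users.show()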
On Thu, Nov 17, 2016 at 12:24 PM, kant kodali wrote:
Can we have a JSONType for Spark SQL?
On Wed, Nov 16, 2016 at 8:41 PM, Nathan Lande wrote:
If you are dealing with a bunch of different schemas in one field, figuring
out a strategy to deal with that will depend on your data and does not
really have anything to do with Spark, since mapping your JSON payloads to
tractable data structures will depend on business logic.
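One common shape for that business logic, sketched here with a hypothetical "type" discriminator field and made-up schemas, is to route each payload to a per-type schema and parse it with from_json (the Spark 2.1+ function mentioned below):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{from_json, get_json_object}
import org.apache.spark.sql.types.{StructType, StructField, StringType, DoubleType}

val spark = SparkSession.builder().appName("mixed-json").getOrCreate()
import spark.implicits._

// Mixed payloads in one string column, told apart by a hypothetical "type" field.
val raw = Seq(
  """{"type":"order","orderId":"o-1","amount":9.99}""",
  """{"type":"click","url":"/home","userId":"u-42"}"""
).toDF("payload")

val orderSchema = StructType(Seq(
  StructField("orderId", StringType),
  StructField("amount", DoubleType)))

// Business logic: pick the rows a schema applies to, then parse just those rows.
val orders = raw
  .filter(get_json_object($"payload", "$.type") === "order")
  .select(from_json($"payload", orderSchema).as("order"))
  .select($"order.orderId", $"order.amount")
orders.show()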
On Wed, Nov 16, 2016 at 2:49 AM, Hyukjin Kwon wrote:
Maybe it sounds like you are looking for from_json/to_json functions after
en/decoding properly, which are new built-in functions that will be released
with Spark 2.1.
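A quick sketch of those two functions as they land in 2.1 (the schema and column names are invented for the example):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{from_json, to_json}
import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType}

val spark = SparkSession.builder().appName("from-to-json").getOrCreate()
import spark.implicits._

val schema = StructType(Seq(
  StructField("device", StringType),
  StructField("temp", IntegerType)))

val readings = Seq("""{"device":"d1","temp":21}""").toDF("json")

// from_json: string column -> struct column with the given schema.
val parsed = readings.select(from_json($"json", schema).as("reading"))
parsed.select($"reading.device", $"reading.temp").show()

// to_json: struct column -> JSON string column.
parsed.select(to_json($"reading").as("json_again")).show()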
On 16 Nov 2016 6:45 p.m., "kant kodali" wrote:
https://spark.apache.org/docs/2.0.2/sql-programming-guide.html#json-datasets
"Spark SQL can automatically infer the schema of a JSON dataset and load it
as a DataFrame. This conversion can be done using SQLContext.read.json() on
either an RDD of String, or a JSON file."
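A minimal sketch of the two load paths the quoted passage describes; the file path is the sample one shipped with Spark, as used in the linked docs:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("json-datasets").getOrCreate()

// From a JSON-lines file, with the schema inferred automatically.
val df = spark.read.json("examples/src/main/resources/people.json")
df.printSchema()

// Or from an RDD of JSON strings.
val jsonRDD = spark.sparkContext.parallelize(Seq(
  """{"name":"Yin","address":{"city":"Columbus","state":"Ohio"}}"""))
val df2 = spark.read.json(jsonRDD)
df2.show()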