I have a similar situation in an app of mine. I implemented a custom ML 
Transformer that wraps the Jackson ObjectMapper - this gives you full control 
over how your custom entities / structs are serialized.
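For illustration, here is a minimal sketch of the same idea, written as a plain UDF rather than a full Transformer for brevity. The names (JsonUtil, rowToMap, structToJson) are illustrative, not from any library, and the sketch assumes nested fields are structs or JVM primitives (Scala collections inside the struct would need jackson-module-scala):

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

object JsonUtil {
  // ObjectMapper is thread-safe once configured, so one shared
  // instance can be reused across UDF invocations.
  private val mapper = new ObjectMapper()

  // Recursively convert a struct (Row) into a java.util.Map,
  // which the plain ObjectMapper can serialize without extra modules.
  private def rowToMap(row: Row): java.util.Map[String, Any] = {
    val m = new java.util.LinkedHashMap[String, Any]()
    row.schema.fields.zipWithIndex.foreach { case (field, i) =>
      row.get(i) match {
        case nested: Row => m.put(field.name, rowToMap(nested))
        case value       => m.put(field.name, value)
      }
    }
    m
  }

  // Struct columns arrive in a Scala UDF as org.apache.spark.sql.Row.
  val structToJson = udf { (r: Row) =>
    if (r == null) null else mapper.writeValueAsString(rowToMap(r))
  }
}
```

You would then replace the struct with its JSON string via something like df.withColumn("event", JsonUtil.structToJson(df("event"))). Note that on Spark 2.1 and later, the built-in org.apache.spark.sql.functions.to_json does this directly, with no custom code.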

> On Mar 11, 2016, at 11:53 AM, Caires Vinicius <caire...@gmail.com> wrote:
> 
> Hmm. I think my problem is a little more complex. I'm using 
> https://github.com/databricks/spark-redshift, and when I read from a JSON file 
> I get this schema:
> 
> root
>  |-- app: string (nullable = true)
>  |-- ct: long (nullable = true)
>  |-- event: struct (nullable = true)
>  |    |-- attributes: struct (nullable = true)
>  |    |    |-- account: string (nullable = true)
>  |    |    |-- accountEmail: string (nullable = true)
>  |    |    |-- accountId: string (nullable = true)
> I want to transform the event column into a String (formatted as JSON). 
> 
> I was trying to use a UDF, but without success.
> 
> 
> On Fri, Mar 11, 2016 at 1:53 PM Tristan Nixon <st...@memeticlabs.org> wrote:
> Have you looked at DataFrame.write.json( path )?
> https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.DataFrameWriter
> 
> > On Mar 11, 2016, at 7:15 AM, Caires Vinicius <caire...@gmail.com> wrote:
> >
> > I have one DataFrame with a nested StructField, and I want to convert it to a 
> > JSON String. Is there any way to accomplish this?
> 
