You can use the struct function from org.apache.spark.sql.functions to
combine two columns into a struct column.
Something like:

val nestedCol = struct(df("d"), df("e")).as("newCol")
df.select(df("a"), df("b"), df("c"), nestedCol)
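
Here is a minimal end-to-end sketch of the same idea. It assumes a local
SparkSession (Spark 2.x API; on the 1.x releases current when this thread
was written you would build a SQLContext instead), and invents a one-row
sample frame just to show the resulting schema:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.struct

object NestColumns {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("nest-columns")
      .master("local[*]")   // local mode, for illustration only
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data matching the schema in the thread (a..f)
    val df = Seq(("a1", "b1", "c1", "d1", "e1", "f1"))
      .toDF("a", "b", "c", "d", "e", "f")

    // struct packs d and e into a single column; .as names it newCol
    val newDF = df.select(df("a"), df("b"), df("c"),
      struct(df("d"), df("e")).as("newCol"))

    newDF.printSchema()
    spark.stop()
  }
}
```

No join against the original DF is needed; this is a single select, so it
stays a narrow transformation.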
On Aug 7, 2015 3:14 PM, "Rishabh Bhardwaj" <rbnex...@gmail.com> wrote:

> I am doing it by creating a new data frame out of the fields to be nested
> and then join with the original DF.
> Looking for some optimized solution here.
>
> On Fri, Aug 7, 2015 at 2:06 PM, Rishabh Bhardwaj <rbnex...@gmail.com>
> wrote:
>
>> Hi all,
>>
>> I want to have some nesting structure from the existing columns of
>> the dataframe.
>> For that, I am trying to transform a DF in the following way, but
>> couldn't do it.
>>
>> scala> df.printSchema
>> root
>>  |-- a: string (nullable = true)
>>  |-- b: string (nullable = true)
>>  |-- c: string (nullable = true)
>>  |-- d: string (nullable = true)
>>  |-- e: string (nullable = true)
>>  |-- f: string (nullable = true)
>>
>> *To*
>>
>> scala> newDF.printSchema
>> root
>>  |-- a: string (nullable = true)
>>  |-- b: string (nullable = true)
>>  |-- c: string (nullable = true)
>>  |-- newCol: struct (nullable = true)
>>  |    |-- d: string (nullable = true)
>>  |    |-- e: string (nullable = true)
>>
>>
>> Please help me out here.
>>
>> Regards,
>> Rishabh.
>>
>
>