Hi Grisha,
This is great :) It worked, thanks a lot!

I have a requirement: I will be running my Spark application on EMR and want
to build custom logging that writes logs to S3. Any idea how I should do this?
Or, in general, if I create a custom log (with my application name), where
will the logs be generated when running in cluster mode (since in cluster mode
jobs are executed across different machines)?

On Sat, Dec 30, 2023 at 1:56 PM Grisha Weintraub <grisha.weintr...@gmail.com>
wrote:

> In Java, it expects an array of Columns, so you can simply cast your list
> to an array:
>
> array_df.select(fields.toArray(new Column[0]))
>
>
> On Fri, Dec 29, 2023 at 10:58 PM PRASHANT L <prashant...@gmail.com> wrote:
>
>>
>> Team,
>> I am using Java and want to select columns from a DataFrame; the columns
>> are stored in a List<Column>.
>> I'm looking for the equivalent of the Scala code below:
>> array_df = array_df.select(fields: _*)
>>
>>
>> When I try array_df = array_df.select(fields), I get an error saying
>> "Cast to Column".
>>
>> I am using Spark 3.4
>>
>
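For anyone else hitting this: the toArray trick quoted above works because
Dataset.select(Column... cols) in the Java API is a varargs method, and
list.toArray(new T[0]) is the standard Java idiom for passing a List to
varargs. A minimal, self-contained sketch of that same pattern, using a
hypothetical select method with plain Strings standing in for Spark Columns:

```java
import java.util.List;

public class SelectVarargsDemo {
    // Hypothetical stand-in for Spark's Dataset.select(Column... cols)
    static String select(String... cols) {
        return String.join(",", cols);
    }

    public static void main(String[] args) {
        List<String> fields = List.of("a", "b", "c");
        // Same pattern as fields.toArray(new Column[0]) in the Spark snippet:
        // toArray(new T[0]) turns the List into the array a varargs method expects.
        System.out.println(select(fields.toArray(new String[0])));  // prints a,b,c
    }
}
```

Passing a List directly fails to compile (or, with a raw list, fails the cast
to Column) because Java does not auto-convert a List into a varargs array the
way Scala's `: _*` ascription does.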
