Re: how to use DataFrame.na.fill based on condition

2015-11-23 Thread Davies Liu
You could create a new column based on the expression: IF (condition1,
value1, old_column_value)

On Mon, Nov 23, 2015 at 11:57 AM, Vishnu Viswanath
 wrote:
> Thanks for the reply Davies
>
> I think replace replaces one value with another, but what I want to do is
> fill in the null values of a column (I don't have a to_replace here).
>
> Regards,
> Vishnu
>
> On Mon, Nov 23, 2015 at 1:37 PM, Davies Liu  wrote:
>>
>> DataFrame.replace(to_replace, value, subset=None)
>>
>>
>> http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.replace
>>
>> On Mon, Nov 23, 2015 at 11:05 AM, Vishnu Viswanath
>>  wrote:
>> > Hi
>> >
>> > Can someone tell me if there is a way I can use the fill method in
>> > DataFrameNaFunctions based on some condition?
>> >
>> > e.g., df.na.fill("value1","column1","condition1")
>> > df.na.fill("value2","column1","condition2")
>> >
>> > I want to fill nulls in column1 with either value1 or value2, based on
>> > some condition.
>> >
>> > Thanks,
>
>
>
>
> --
> Thanks and Regards,
> Vishnu Viswanath
> +1 309 550 2311
> www.vishnuviswanath.com

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: how to use DataFrame.na.fill based on condition

2015-11-23 Thread Vishnu Viswanath
Thanks for the reply Davies

I think replace replaces one value with another, but what I want to do is
fill in the null values of a column (I don't have a to_replace here).

Regards,
Vishnu

On Mon, Nov 23, 2015 at 1:37 PM, Davies Liu  wrote:

> DataFrame.replace(to_replace, value, subset=None)
>
>
> http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.replace
>
> On Mon, Nov 23, 2015 at 11:05 AM, Vishnu Viswanath
>  wrote:
> > Hi
> >
> > Can someone tell me if there is a way I can use the fill method in
> > DataFrameNaFunctions based on some condition?
> >
> > e.g., df.na.fill("value1","column1","condition1")
> > df.na.fill("value2","column1","condition2")
> >
> > I want to fill nulls in column1 with either value1 or value2, based on
> > some condition.
> >
> > Thanks,
>



-- 
Thanks and Regards,
Vishnu Viswanath
+1 309 550 2311
www.vishnuviswanath.com


Re: how to use DataFrame.na.fill based on condition

2015-11-23 Thread Davies Liu
DataFrame.replace(to_replace, value, subset=None)

http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.replace

On Mon, Nov 23, 2015 at 11:05 AM, Vishnu Viswanath
 wrote:
> Hi
>
> Can someone tell me if there is a way I can use the fill method in
> DataFrameNaFunctions based on some condition?
>
> e.g., df.na.fill("value1","column1","condition1")
> df.na.fill("value2","column1","condition2")
>
> I want to fill nulls in column1 with either value1 or value2, based on
> some condition.
>
> Thanks,




how to use DataFrame.na.fill based on condition

2015-11-23 Thread Vishnu Viswanath
Hi

Can someone tell me if there is a way I can use the fill method
in DataFrameNaFunctions based on some condition?

e.g., df.na.fill("value1","column1","condition1")
df.na.fill("value2","column1","condition2")

I want to fill nulls in column1 with either value1 or value2,
based on some condition.

Thanks,