You could create a new column based on the expression: IF(condition1, value1, old_column_value)
On Mon, Nov 23, 2015 at 11:57 AM, Vishnu Viswanath wrote:
Thanks for the reply Davies
I think replace replaces one value with another value, but what I want to do
is fill in the null values of a column. (I don't have a to_replace here.)
Regards,
Vishnu
On Mon, Nov 23, 2015 at 1:37 PM, Davies Liu wrote:
DataFrame.replace(to_replace, value, subset=None)
http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.replace
On Mon, Nov 23, 2015 at 11:05 AM, Vishnu Viswanath wrote:
Hi
Can someone tell me if there is a way I can use the fill method
in DataFrameNaFunctions based on some condition?
e.g., df.na.fill("value1","column1","condition1")
df.na.fill("value2","column1","condition2")
I want to fill nulls in column1 with either value1 or value2,
based on which condition holds.