There is an isNotNull method on any column.

df._1.isNotNull()

or

from pyspark.sql.functions import col
col("myColumn").isNotNull()

On Wed, Jul 1, 2015 at 3:07 AM, Olivier Girardot <ssab...@gmail.com> wrote:

> I must admit I've been using the same "back to SQL" strategy for now :p
> So I'd be glad to have insights into that too.
>
> On Tue, Jun 30, 2015 at 11:28 PM, pedro <ski.rodrig...@gmail.com> wrote:
>
>> I am trying to find the correct way to programmatically check for null
>> values in the rows of a DataFrame. For example, below is the code using
>> pyspark and sql:
>>
>> df = sqlContext.createDataFrame(
>>     sc.parallelize([(1, None), (2, "a"), (3, "b"), (4, None)]))
>> df.where('_2 is not null').count()
>>
>> However, this won't work:
>> df.where(df._2 != None).count()
>>
>> It seems there is no native Python way with DataFrames to do this, but I
>> find that difficult to believe and more likely that I am missing the
>> "right way" to do this.
