I am trying to find the correct way to programmatically check for null
values in DataFrame rows. For example, here is code using PySpark and SQL:

df = sqlContext.createDataFrame(
    sc.parallelize([(1, None), (2, "a"), (3, "b"), (4, None)]))
df.where('_2 is not null').count()

However, this won't work; it always returns 0, presumably because
comparisons with null follow SQL three-valued logic and never evaluate
to true:
df.where(df._2 != None).count()

It seems there is no obvious native Python way to do this with
DataFrames, but I find that hard to believe; more likely I am missing
the "right way" to do it.


