Ed Lee created SPARK-20617:
------------------------------

             Summary: pyspark.sql: isin gives wrong results when the column contains null
                 Key: SPARK-20617
                 URL: https://issues.apache.org/jira/browse/SPARK-20617
             Project: Spark
          Issue Type: Bug
          Components: PySpark, SQL
    Affects Versions: 2.2.0
         Environment: Ubuntu Xenial 16.04
            Reporter: Ed Lee


Hello, I encountered a filtering bug using 'isin' in pyspark.sql on version
2.2.0, Ubuntu 16.04.

Enclosed below is an example to replicate it:

import pandas as pd
import pyspark.sql.functions as sf  # needed for sf.col below

test_df = pd.DataFrame({"col1": [None, None, "a", "b", "c"],
                        "col2": range(5)
                        })

# 'spark' is the SparkSession provided by e.g. the pyspark shell
test_sdf = spark.createDataFrame(test_df)
test_sdf.show()

# +----+----+
# |col1|col2|
# +----+----+
# |null|   0|
# |null|   1|
# |   a|   2|
# |   b|   3|
# |   c|   4|
# +----+----+


# Below shows that rows where col1 is null are treated as if null were
# 'isin' the list ["a"], so negating the predicate drops them:

test_sdf.filter(sf.col("col1").isin(["a"]) == False).show()

# Or, equivalently:
test_sdf.filter(~sf.col("col1").isin(["a"])).show()

# Expected:
# +----+----+
# |col1|col2|
# +----+----+
# |null|   0|
# |null|   1|
# |   b|   3|
# |   c|   4|
# +----+----+

# Got:
# +----+----+
# |col1|col2|
# +----+----+
# |   b|   3|
# |   c|   4|
# +----+----+
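
# This looks like SQL three-valued logic at work: NULL IN ('a') evaluates
# to NULL rather than false, NOT NULL is still NULL, and filter() only
# keeps rows where the predicate is true. Selecting the predicate directly
# makes this visible (a minimal sketch against test_sdf above):

test_sdf.select(
    "col1",
    sf.col("col1").isin(["a"]).alias("isin_a"),
    (~sf.col("col1").isin(["a"])).alias("not_isin_a")
).show()

# +----+------+----------+
# |col1|isin_a|not_isin_a|
# +----+------+----------+
# |null|  null|      null|
# |null|  null|      null|
# |   a|  true|     false|
# |   b| false|      true|
# |   c| false|      true|
# +----+------+----------+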


# My workarounds:

# 1.
# null is considered 'in', so add an OR isNull condition:
test_sdf.filter(
    (sf.col("col1").isin(["a"]) == False) | sf.col("col1").isNull()
).show()

# 2.  Use left join and filter
join_df = pd.DataFrame({"col1": ["a"],
                        "isin": 1
                        })

join_sdf = spark.createDataFrame(join_df)

test_sdf.join(join_sdf, on="col1", how="left") \
    .filter(sf.col("isin").isNull()) \
    .show()

# +----+----+----+
# |col1|col2|isin|
# +----+----+----+
# |null|   0|null|
# |null|   1|null|
# |   c|   4|null|
# |   b|   3|null|
# +----+----+----+
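
# For what it's worth, the same rows disappear when going through plain
# SQL, so this seems to come from the engine's NOT IN semantics rather
# than the Python Column API (quick sketch; the view name is arbitrary):
test_sdf.createOrReplaceTempView("test_table")
spark.sql("SELECT * FROM test_table WHERE col1 NOT IN ('a')").show()
# This also returns only the 'b' and 'c' rows; the nulls are dropped too.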




