[ https://issues.apache.org/jira/browse/SPARK-19732?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Len Frodgers updated SPARK-19732:
---------------------------------

Description:

In PySpark, the fillna function of DataFrame inadvertently casts bools to ints, so fillna cannot be used to fill True/False. e.g.

{code}
spark.createDataFrame([Row(a=True), Row(a=None)]).fillna(True).collect()
{code}

yields

{code}
[Row(a=True), Row(a=None)]
{code}

It should be a=True for the second Row. The cause is this bit of code:

{code}
if isinstance(value, (int, long)):
    value = float(value)
{code}

There needs to be a separate isinstance(value, bool) check before the int check, since in Python, bool is a subclass of int.

> DataFrame.fillna() does not work for bools in PySpark
> -----------------------------------------------------
>
>                 Key: SPARK-19732
>                 URL: https://issues.apache.org/jira/browse/SPARK-19732
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.1.0
>            Reporter: Len Frodgers

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
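The root cause can be shown without a Spark runtime. The sketch below is a minimal illustration of the fix described above, assuming a hypothetical helper name (`coerce_fill_value`) that is not Spark's actual internal function; it only demonstrates why the bool check must precede the int check:

```python
def coerce_fill_value(value):
    """Hypothetical sketch of the proposed fix: test bool before int,
    because isinstance(True, int) is True in Python."""
    if isinstance(value, bool):
        return value          # keep True/False untouched
    if isinstance(value, int):
        return float(value)   # existing numeric coercion path
    return value

# bool is a subclass of int -- this is why the original
# `isinstance(value, (int, long))` check swallows True/False:
print(isinstance(True, int))    # True
print(coerce_fill_value(True))  # True  (no longer coerced to 1.0)
print(coerce_fill_value(3))     # 3.0
```

With the original check order, `fillna(True)` would coerce the fill value to `1.0`, which no longer matches the boolean column's type, so the null is silently left unfilled.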