Github user lw-lin commented on the issue:

    https://github.com/apache/spark/pull/14118
  
    Here are some findings from digging in a little:
    
    1. Since https://github.com/databricks/spark-csv/pull/102 (Jul 2015), we 
would cast `""` to `null` for all types other than strings. For strings, `""` 
would still be `""`;
    
    2. Then we added `treatEmptyValuesAsNulls` in 
https://github.com/databricks/spark-csv/pull/147 (Sep 2015), after which `""` 
would be `null` when `treatEmptyValuesAsNulls == true` and would still be `""` 
otherwise;
    
    3. Then we added `nullValue` in 
https://github.com/databricks/spark-csv/pull/224 (Dec 2015), so people could 
specify a string such as `"MISSING"` instead of the default `""` to represent 
null values.
    
    After the changes in 1, 2, and 3 above, we have the following behavior, 
which seems reasonable and is backward-compatible:
    
    <table>
    <tr>
        <td align="center"></td>
    <td align="center">(default) when <i>nullValue == ""</i></td>
        <td align="center">when <i>nullValue == "MISSING"</i></td>
    </tr>
    <tr>
        <td align="center">(default) when <i>treatEmptyValuesAsNulls == 
false</i></td>
        <td align="center">"" would cast to ""</td>
        <td align="center">"" would cast to ""</td>
    </tr>
    <tr>
        <td align="center">when <i>treatEmptyValuesAsNulls == true</i></td>
        <td align="center">"" would cast to null</td>
        <td align="center">"" would cast to ""</td>
    </tr>
    </table>
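    The table above can be sketched as a small decision rule. (This is a plain-Python illustration of the behavior summarized in the table, not the actual spark-csv parsing code; the function name `cast_empty` and its parameters are made up for this sketch.)

    ```python
    def cast_empty(raw, null_value="", treat_empty_values_as_nulls=False):
        """Return what a string-typed CSV field would be parsed as,
        following the table above. None stands for null."""
        if raw == "":
            # "" becomes null only when both the flag is on and
            # nullValue is still the default ""
            if treat_empty_values_as_nulls and null_value == "":
                return None
            return ""
        # any non-empty field matching nullValue becomes null
        if raw == null_value:
            return None
        return raw

    # The four cells of the table:
    cast_empty("")                                                        # ""
    cast_empty("", null_value="MISSING")                                  # ""
    cast_empty("", treat_empty_values_as_nulls=True)                      # None
    cast_empty("", null_value="MISSING", treat_empty_values_as_nulls=True)  # ""
    ```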
    
    However, we don't have this `treatEmptyValuesAsNulls` option in Spark 2.0. 
@falaki would it be OK with you if I add it back?

