[ 
https://issues.apache.org/jira/browse/SPARK-17039?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-17039.
-------------------------------
    Resolution: Duplicate

Oh I understand the issue now, my fault. I agree this is a duplicate, 
regardless of what the specific final fix is.

> cannot read null dates from csv file
> ------------------------------------
>
>                 Key: SPARK-17039
>                 URL: https://issues.apache.org/jira/browse/SPARK-17039
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 2.0.0
>            Reporter: Barry Becker
>
> I see the exact same bug reported in this [stack overflow 
> post|http://stackoverflow.com/questions/38265640/spark-2-0-pre-csv-parsing-error-if-missing-values-in-date-column], 
> using Spark 2.0.0 (the released version).
> In Scala, I read a CSV using:
> {code}
> sqlContext.read
>   .format("csv")
>   .option("header", "false")
>   .option("inferSchema", "false")
>   .option("nullValue", "?")
>   .option("dateFormat", "yyyy-MM-dd'T'HH:mm:ss")
>   .schema(dfSchema)
>   .csv(dataFile)
> {code}
> The data contains some null dates, represented with {{?}}.
> The error I get is:
> {code}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in 
> stage 8.0 failed 1 times, most recent failure: Lost task 0.0 in stage 8.0 
> (TID 10, localhost): java.text.ParseException: Unparseable date: "?"
> {code}
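The exception above can be reproduced outside Spark. A minimal sketch, assuming the same {{dateFormat}} pattern: {{java.text.SimpleDateFormat}} (which Spark 2.0's CSV parser used for date columns) receives the raw {{?}} token, because the {{nullValue}} substitution was not applied to non-string columns in that release. The {{parseOrNull}} helper here is purely illustrative of the marker-check-before-parse fix, not Spark's actual patch.

```scala
import java.text.{ParseException, SimpleDateFormat}

val fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss")

// A well-formed token parses without error.
val ok = fmt.parse("2016-08-12T10:30:00")

// The null marker "?" reaches the formatter untouched, so parse throws.
val throws =
  try { fmt.parse("?"); false }
  catch { case _: ParseException => true }

// Illustrative workaround shape: check for the marker before parsing.
def parseOrNull(token: String, marker: String): java.util.Date =
  if (token == marker) null else fmt.parse(token)

println(throws)                        // true
println(parseOrNull("?", "?") == null) // true
```

Until the fix, a practical workaround is to read the affected column with a {{StringType}} schema and convert it to a date after the null markers have been handled.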



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
