[ https://issues.apache.org/jira/browse/SPARK-23308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16354250#comment-16354250 ]

Márcio Furlani Carmona commented on SPARK-23308:
------------------------------------------------

Yeah, I set it back to `ignoreCorruptFiles=false` to prevent this. But then, if 
there really is a corrupt file, our job will never succeed until we fix it.
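For reference, here is how we keep it off; a minimal sketch using the standard 
SparkSession config API (the app name is just illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Keep ignoreCorruptFiles off so read failures fail the job loudly
// instead of silently dropping data. The trade-off: a genuinely
// corrupt file blocks the job until it is fixed or removed.
val spark = SparkSession.builder()
  .appName("example") // illustrative name
  .config("spark.sql.files.ignoreCorruptFiles", "false")
  .getOrCreate()

// The flag can also be flipped at runtime:
spark.conf.set("spark.sql.files.ignoreCorruptFiles", "false")
```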

The biggest problem for me was the silent failure you mentioned. I only found 
out something was wrong after running a job on the same input multiple times 
and noticing missing data; from there I investigated and traced it back to 
this flag and the SocketTimeoutException I mentioned.

I agree the documentation should at least mention the risks of setting this 
flag and which exceptions cause the data to be treated as corrupt. Right now I 
believe this flag is not even officially documented, is it? 
https://spark.apache.org/docs/latest/configuration.html
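For context, the behavior boils down to a broad catch around the file read in 
FileScanRDD (see the link in the quoted issue below). A simplified, 
self-contained sketch of that pattern, with illustrative names rather than the 
exact Spark source:

```scala
import java.io.IOException

// Simplified sketch of the ignoreCorruptFiles pattern in FileScanRDD;
// names here are illustrative, not the exact Spark source.
def readIgnoringCorruptFiles[T](
    file: String,
    ignoreCorruptFiles: Boolean)(read: String => Iterator[T]): Iterator[T] = {
  try {
    read(file)
  } catch {
    // Any IOException is treated as corruption when the flag is on,
    // including transient, retryable ones such as
    // java.net.SocketTimeoutException. The rest of the file is then
    // silently skipped, which is how data can go missing.
    case e: IOException if ignoreCorruptFiles =>
      println(s"Skipped the rest of the content in the corrupted file: $file ($e)")
      Iterator.empty
  }
}
```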

> ignoreCorruptFiles should not ignore retryable IOException
> ----------------------------------------------------------
>
>                 Key: SPARK-23308
>                 URL: https://issues.apache.org/jira/browse/SPARK-23308
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.1
>            Reporter: Márcio Furlani Carmona
>            Priority: Minor
>
> When `spark.sql.files.ignoreCorruptFiles` is set, it silently ignores any 
> RuntimeException or IOException, but some IOExceptions can occur even when 
> the file is not corrupted.
> One example is SocketTimeoutException, where the read can simply be retried 
> to fetch the data; it does not mean the data is corrupted.
>  
> See: 
> https://github.com/apache/spark/blob/e30e2698a2193f0bbdcd4edb884710819ab6397c/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FileScanRDD.scala#L163
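
For what it's worth, one possible direction (purely hypothetical; the 
`isRetryable` helper and retry loop below are my own illustration, not 
anything in Spark) would be to retry transient IOExceptions before treating 
the file as corrupt:

```scala
import java.io.IOException
import java.net.SocketTimeoutException

// Hypothetical sketch: classify which IOExceptions are transient.
// This predicate is illustrative and not part of Spark.
def isRetryable(e: IOException): Boolean = e match {
  case _: SocketTimeoutException => true
  case _                         => false
}

// Retry transient failures a few times; only give up (and let the
// ignoreCorruptFiles handling kick in) once retries are exhausted.
def readWithRetries[T](file: String, attempts: Int = 3)(read: String => T): T = {
  try read(file)
  catch {
    case e: IOException if isRetryable(e) && attempts > 1 =>
      readWithRetries(file, attempts - 1)(read)
  }
}
```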


