I wasn't using Spark SQL before, but by default Spark should retry a failed task 4 times.
I'm curious why it aborted after a single failure.
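For reference, a minimal sketch of where that retry count comes from, assuming Spark 1.x: the per-task retry limit is the `spark.task.maxFailures` setting, which defaults to 4. (Note that the stage is aborted as soon as any one task exhausts its attempts, so the retries may all happen before the first error surfaces in the driver log.)

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: spark.task.maxFailures controls how many times a single
// task may fail before the whole stage is aborted (default is 4).
val conf = new SparkConf()
  .setAppName("ParquetExample")
  .set("spark.task.maxFailures", "4") // the default; raise it to tolerate more failures
val sc = new SparkContext(conf)
```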
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark1-0-spark-sql-saveAsParquetFile-Error-tp7006p7252.html
I got an exception when running the Spark SQL example "Using Parquet".
When I call the saveAsParquetFile method on the people SchemaRDD, it throws
the exception below. I don't know why; can anyone help me with it?
Exception:
parquet.io.ParquetDecodingException: Can not read value at 0 in block -1
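For context, this is roughly the code path in question, a sketch following the Spark 1.0 SQL programming guide's Parquet example (file paths and the Person case class are taken from that guide, not from this thread):

```scala
import org.apache.spark.sql.SQLContext

// Schema for the example data, as in the Spark 1.0 programming guide
case class Person(name: String, age: Int)

val sqlContext = new SQLContext(sc)
// Implicit conversion from RDD[Person] to SchemaRDD (Spark 1.0 API)
import sqlContext.createSchemaRDD

val people = sc.textFile("examples/src/main/resources/people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// This is the call that throws the ParquetDecodingException reported above
people.saveAsParquetFile("people.parquet")

// Reading the file back returns another SchemaRDD
val parquetFile = sqlContext.parquetFile("people.parquet")
```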