        return getSplits(arg0);
    } catch (InvalidInputException e) {
        return Collections.emptyList();
    }
}
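The idea in the fragment above — catch Hadoop's InvalidInputException and return an empty split list instead of failing — can be sketched self-contained. Note this is a minimal sketch of the idiom only: `InvalidInputException` and `getSplits` below are local stand-ins for the Hadoop types, not the real ones, so it runs without a cluster.

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class IgnoreMissingSplits {
    // Stand-in for Hadoop's InvalidInputException (hypothetical local class).
    static class InvalidInputException extends RuntimeException {
        InvalidInputException(String msg) { super(msg); }
    }

    // Stand-in for FileInputFormat.getSplits: throws when the pattern matches no files.
    static List<String> getSplits(String pattern) {
        if (pattern.isEmpty()) {
            throw new InvalidInputException("Input path does not exist");
        }
        return Arrays.asList(pattern + "/part-0", pattern + "/part-1");
    }

    // The idiom from the thread: delegate, but map "no input files" to an empty list
    // so the job proceeds with zero splits instead of dying.
    static List<String> getSplitsIgnoringMissing(String pattern) {
        try {
            return getSplits(pattern);
        } catch (InvalidInputException e) {
            return Collections.emptyList();
        }
    }

    public static void main(String[] args) {
        System.out.println(getSplitsIgnoringMissing("s3n://bucket/day=1").size()); // 2
        System.out.println(getSplitsIgnoringMissing("").size()); // 0
    }
}
```

In the real setup this would be done by subclassing the input format and overriding the split computation, as the snippet above suggests.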
Thanks!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Ignoring-S3-0-files-exception-tp6101p6252.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
to ignore 0 files?
Thanks
Regards,
Laurent T
----- Original Message -----
From: "Mayur Rustagi"
To: "laurent thoulon"
Sent: Wednesday, 21 May 2014 13:51:46
Subject: Re: Ignoring S3 0 files exception
You can try the new Hadoop API in the Spark context. Should be able to c
JavaRDD b = sc.textFile("s3n://" + existingFilenamePattern);
JavaRDD aPlusB = a.union(b);
aPlusB.reduceByKey(MyReducer); // <-- This throws the error
I'd like to ignore the exception caused by a, so that b can be processed without trouble. Thanks
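The failing shape described here can be mimicked without Spark: two sources are loaded and unioned, loading a source whose pattern matches no files throws, and the workaround is to fall back to an empty collection for the missing source so the union and the per-key reduce still run. All names below are local stand-ins, not Spark API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class UnionWithMissingSource {
    // Stand-in loader: throws for a missing source, like textFile over an
    // S3 pattern that matches zero files.
    static List<String> load(Map<String, List<String>> store, String pattern) {
        List<String> lines = store.get(pattern);
        if (lines == null) {
            throw new RuntimeException("Input path does not exist: " + pattern);
        }
        return lines;
    }

    // Workaround: treat a missing source as empty instead of failing the job.
    static List<String> loadOrEmpty(Map<String, List<String>> store, String pattern) {
        try {
            return load(store, pattern);
        } catch (RuntimeException e) {
            return Collections.emptyList();
        }
    }

    public static void main(String[] args) {
        Map<String, List<String>> store = new HashMap<>();
        store.put("s3n://bucket/b/*", Arrays.asList("x", "y"));

        List<String> a = loadOrEmpty(store, "s3n://bucket/a/*"); // no files -> empty
        List<String> b = loadOrEmpty(store, "s3n://bucket/b/*");

        List<String> aPlusB = new ArrayList<>(a); // union of both sources
        aPlusB.addAll(b);

        // Count occurrences per line, standing in for reduceByKey.
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : aPlusB) counts.merge(line, 1, Integer::sum);
        System.out.println(counts); // {x=1, y=1}
    }
}
```

The same fallback can be applied per source in the driver before the union, which is coarser but simpler than customizing the input format.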
erns" so that it can work with the RDDs that actually found files?
Thanks
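The question above — keeping only the patterns that actually found files — can be sketched by probing each pattern before building an RDD from it. The sketch below uses the local filesystem's glob matching as a stand-in for listing S3; `patternsWithFiles` is a hypothetical helper, not Spark or Hadoop API:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FilterExistingPatterns {
    // Keep only the glob patterns that match at least one file, so downstream
    // code (e.g. one RDD per pattern, then a union) never sees an empty source.
    static List<String> patternsWithFiles(Path dir, List<String> globs) throws IOException {
        List<String> kept = new ArrayList<>();
        for (String glob : globs) {
            try (DirectoryStream<Path> matches = Files.newDirectoryStream(dir, glob)) {
                if (matches.iterator().hasNext()) {
                    kept.add(glob);
                }
            }
        }
        return kept;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("s3-stand-in");
        Files.createFile(dir.resolve("part-00000"));
        System.out.println(patternsWithFiles(dir, List.of("part-*", "missing-*")));
        // prints [part-*]
    }
}
```

Against real S3 the equivalent probe would be a listing call on the bucket prefix before creating each RDD; the trade-off is one extra listing per pattern in exchange for never constructing an RDD over zero files.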
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Ignoring-S3-0-files-exception-tp6101.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.