Re: pyspark exception catch

2014-12-19 Thread imazor
catch them and decide on the next step?

Re: pyspark exception catch

2014-12-16 Thread cfregly
s #2, but use empty array for a failure and a single-element array for a success. hope that helps!
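For context, this appears to point at wrapping the per-record parsing in a try/except inside a function passed to flatMap. A minimal sketch of that pattern, assuming the input is text lines that should split into (key, value) pairs; the function name, input path, and parsing logic are illustrative assumptions, not taken from the thread:

from pyspark import SparkContext

sc = SparkContext(appName="flatmap-catch-sketch")

def parse_pair(line):
    # Return a single-element list on success and an empty list on
    # failure, so flatMap simply drops records that cannot be parsed.
    try:
        key, value = line.split(",", 1)
        return [(key, int(value))]
    except ValueError:
        return []

lines = sc.textFile("/path/to/input")  # hypothetical input path
pairs = lines.flatMap(parse_pair)      # malformed lines are skipped
print(pairs.reduceByKey(lambda a, b: a + b).collect())

One consequence of this design is that malformed records disappear silently; if they need to be inspected later, the function could tag and route them instead, but that goes beyond what the truncated message shows.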

pyspark exception catch

2014-12-05 Thread Igor Mazor
Hi, is it possible to catch exceptions using pyspark, so that in case of an error the program will not fail and exit? For example, if I am using (key, value) RDD functionality but the data doesn't actually have a (key, value) format, pyspark will throw an exception (like ValueError) that I am unable to catch.
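For reference, a hypothetical snippet reproducing the kind of failure described above (the sample data are made up): each element is a plain string rather than a (key, value) tuple, so a pair-RDD operation such as reduceByKey raises a ValueError on the executors once an action runs, and the whole job aborts instead of skipping the bad records.

from pyspark import SparkContext

sc = SparkContext(appName="exception-demo")

# Plain strings instead of (key, value) tuples: unpacking each record
# as a pair fails on the executors when the action is evaluated.
rdd = sc.parallelize(["not", "really", "key-value pairs"])
rdd.reduceByKey(lambda a, b: a + b).collect()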