Hi,

➜  spark git:(master) ✗ ./bin/spark-submit whatever || echo $?
Error: Cannot load main class from JAR file:/Users/jacek/dev/oss/spark/whatever
Run with --help for usage help or --verbose for debug output
1

I see exit code 1 here, and there are other code paths that return 1 as well.
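For anyone who needs to act on that exit code from a wrapper script, here is a minimal sketch of capturing it. Note that `fake_submit` is a hypothetical stand-in for `spark-submit` (so the snippet is self-contained); it just mimics the failed-submission case shown above:

```shell
# Sketch: capture and act on the exit code of a submit command.
# "fake_submit" stands in for spark-submit here and mimics the
# failed-submission output shown above.
fake_submit() {
  echo "Error: Cannot load main class" >&2
  return 1
}

fake_submit whatever.jar
status=$?
echo "exit code: $status"

if [ "$status" -ne 0 ]; then
  echo "submission failed"
fi
```

This prints `exit code: 1` followed by `submission failed`, which is the behavior a wrapper (e.g. an Oozie shell action) can branch on.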

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Feb 3, 2017 at 10:46 PM, Ali Gouta <ali.go...@gmail.com> wrote:
> Hello,
>
> +1, I have exactly the same issue. I need the exit code to make a decision
> in Oozie when executing actions. Spark-submit always returns 0 when the
> exception is caught. From Spark 1.5 to 1.6.x, I still have the same
> issue... It would be great to fix it, or to know if there is a workaround.
>
> Ali Gouta.
>
> On Feb 3, 2017, at 22:24, "Jacek Laskowski" <ja...@japila.pl> wrote:
>
> Hi,
>
> An interesting case. You don't use Spark resources whatsoever.
> Creating a SparkConf does not use YARN...yet. I think any run mode
> would have the same effect. So, although spark-submit could have
> returned exit code 1, the use case touches Spark very little.
>
> What version is that? Do you see "There is an exception in the script
> exiting with status 1" printed out to stdout?
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Fri, Feb 3, 2017 at 8:06 PM, Shashank Mandil
> <mandil.shash...@gmail.com> wrote:
>> Hi All,
>>
>> I wrote a test script that always throws an exception, as below:
>>
>> import org.apache.spark.SparkConf
>>
>> object Test {
>>
>>   def main(args: Array[String]) {
>>     try {
>>       val conf =
>>         new SparkConf()
>>           .setAppName("Test")
>>
>>       throw new RuntimeException("Some Exception")
>>
>>       println("all done!")  // unreachable: the throw above always fires
>>     } catch {
>>       case e: RuntimeException =>
>>         println("There is an exception in the script exiting with status 1")
>>         System.exit(1)
>>     }
>>   }
>> }
>>
>> When I run this code using spark-submit, I expect an exit code of 1;
>> however, I keep getting exit code 0.
>>
>> Any ideas how I can force spark-submit to return with code 1 ?
>>
>> Thanks,
>> Shashank
>>
>>
>>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
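A workaround sketch for the Oozie case above, under one assumption: the driver prints a recognizable failure marker (such as the "There is an exception..." line from the quoted script). The wrapper then treats the run as failed when that marker appears in the captured output, even if spark-submit itself returned 0. `fake_submit` is again a hypothetical stand-in for the real spark-submit invocation:

```shell
# Workaround sketch: fail the wrapper when the driver logged a known
# failure marker, even though the submit command exited 0.
# "fake_submit" stands in for spark-submit and mimics the case where
# the driver's failure is swallowed.
fake_submit() {
  echo "There is an exception in the script exiting with status 1"
  return 0   # mimics spark-submit reporting success anyway
}

out=$(fake_submit 2>&1)
status=$?

if [ "$status" -eq 0 ] && printf '%s\n' "$out" | grep -q "There is an exception"; then
  status=1
fi
echo "effective exit code: $status"
```

This is fragile (it depends on a stable log line), but it gives Oozie a nonzero code to branch on until spark-submit propagates the failure itself.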

