Re: Application not showing in Spark History

2016-08-02 Thread Sun Rui
bin/spark-submit sets several environment variables, such as SPARK_HOME, that Spark 
later uses to locate spark-defaults.conf, from which the default Spark settings 
are loaded.

My guess is that a configuration option such as spark.eventLog.enabled in 
spark-defaults.conf is being skipped because you call the SparkSubmit class 
directly instead of going through “bin/spark-submit”.
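
For context, the History Server only lists applications that wrote an event log, 
which is normally enabled through spark-defaults.conf. A minimal sketch of the 
relevant settings (the log directory below is an assumption, use whatever your 
cluster is actually configured with):

# conf/spark-defaults.conf -- example values only
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///spark-history
spark.history.fs.logDirectory    hdfs:///spark-history

If these settings are never loaded, the driver writes no event log and the History 
Server has nothing to show. As a workaround they can also be passed explicitly on 
the argument list, e.g. args.add("--conf"); args.add("spark.eventLog.enabled=true");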

The proper way to launch a Spark application from within Java is to use 
SparkLauncher. Remember to call SparkLauncher.setSparkHome() to set the Spark 
home directory, for example:
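
A minimal sketch (the Spark home, jar path, and main class below are placeholders, 
not real values):

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchExample {
  public static void main(String[] args) throws Exception {
    // Build and start the application; this spawns bin/spark-submit
    // from the given Spark home as a child process.
    SparkAppHandle handle = new SparkLauncher()
        .setSparkHome("/opt/spark")                   // placeholder path
        .setAppResource("/path/to/my-spark-app.jar")  // placeholder jar
        .setMainClass("com.example.MySparkJob")       // placeholder class
        .setMaster("yarn")
        .setDeployMode("cluster")
        .setVerbose(true)
        .startApplication();

    // Poll until the application reaches a terminal state.
    while (!handle.getState().isFinal()) {
      Thread.sleep(1000);
    }
    System.out.println("Final state: " + handle.getState());
  }
}

Because SparkLauncher goes through bin/spark-submit under the hood, 
spark-defaults.conf (including the event log settings) should be picked up the 
same way it is on the command line.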

> On Aug 2, 2016, at 16:53, Rychnovsky, Dusan 
>  wrote:
> 
> Hi,
> 
> I am trying to launch my Spark application from within my Java application 
> via the SparkSubmit class, like this:
> 
> 
> List<String> args = new ArrayList<>();
> 
> args.add("--verbose");
> args.add("--deploy-mode=cluster");
> args.add("--master=yarn");
> ...
> 
> SparkSubmit.main(args.toArray(new String[args.size()]));
> 
> 
> This works fine, with one catch - the application does not appear in Spark 
> History after it's finished.
> 
> If, however, I run the application using `spark-submit.sh`, like this:
> 
> 
> spark-submit \
>   --verbose \
>   --deploy-mode=cluster \
>   --master=yarn \
>   ...
> 
> 
> the application appears in Spark History correctly.
> 
> What am I missing?
> 
> Also, is this a good way to launch a Spark application from within a Java 
> application or is there a better way?
> 
> Thanks,
> Dusan



Re: Application not showing in Spark History

2016-08-02 Thread Noorul Islam Kamal Malmiyoda
Have you tried https://github.com/spark-jobserver/spark-jobserver

On Tue, Aug 2, 2016 at 2:23 PM, Rychnovsky, Dusan
 wrote:
> Hi,
>
> I am trying to launch my Spark application from within my Java application
> via the SparkSubmit class, like this:
>
> List<String> args = new ArrayList<>();
>
> args.add("--verbose");
> args.add("--deploy-mode=cluster");
> args.add("--master=yarn");
> ...
>
> SparkSubmit.main(args.toArray(new String[args.size()]));
>
> This works fine, with one catch - the application does not appear in Spark
> History after it's finished.
>
> If, however, I run the application using `spark-submit.sh`, like this:
>
> spark-submit \
>   --verbose \
>   --deploy-mode=cluster \
>   --master=yarn \
>   ...
>
> the application appears in Spark History correctly.
>
> What am I missing?
>
> Also, is this a good way to launch a Spark application from within a Java
> application or is there a better way?
>
> Thanks,
> Dusan
