Hi Kant,

Answer: Yes.

The org.apache.spark.launcher package
<https://spark.apache.org/docs/latest/api/java/index.html?org/apache/spark/launcher/package-summary.html>
provides classes for launching Spark jobs as child processes using a simple
Java API.

*Doc:* https://spark.apache.org/docs/latest/rdd-programming-guide.html


*Library for launching Spark applications.*

This library allows applications to launch Spark programmatically. There's
only one entry point to the library: the SparkLauncher class
<https://spark.apache.org/docs/latest/api/java/org/apache/spark/launcher/SparkLauncher.html>.

The SparkLauncher.startApplication(SparkAppHandle.Listener...) method
<https://spark.apache.org/docs/latest/api/java/org/apache/spark/launcher/SparkLauncher.html#startApplication-org.apache.spark.launcher.SparkAppHandle.Listener...->
can be used to start Spark and provide a handle to monitor and control the
running application:


   import org.apache.spark.launcher.SparkAppHandle;
   import org.apache.spark.launcher.SparkLauncher;

   public class MyLauncher {
     public static void main(String[] args) throws Exception {
       SparkAppHandle handle = new SparkLauncher()
         .setAppResource("/my/app.jar")               // application jar to run
         .setMainClass("my.spark.app.Main")           // entry point inside the jar
         .setMaster("local")                          // master URL / cluster manager
         .setConf(SparkLauncher.DRIVER_MEMORY, "2g")  // driver memory
         .startApplication();                         // starts the app, returns a handle
       // Use handle API to monitor / control application.
     }
   }
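
startApplication() also accepts SparkAppHandle.Listener callbacks, so you
can react to state changes instead of polling. A minimal sketch, reusing the
same hypothetical jar and main class as above (the latch is just one way to
wait for a final state):

   import java.util.concurrent.CountDownLatch;

   import org.apache.spark.launcher.SparkAppHandle;
   import org.apache.spark.launcher.SparkLauncher;

   public class MyMonitoredLauncher {
     public static void main(String[] args) throws Exception {
       CountDownLatch done = new CountDownLatch(1);
       SparkAppHandle handle = new SparkLauncher()
         .setAppResource("/my/app.jar")      // hypothetical jar, as above
         .setMainClass("my.spark.app.Main")
         .setMaster("local")
         .startApplication(new SparkAppHandle.Listener() {
           @Override
           public void stateChanged(SparkAppHandle h) {
             System.out.println("State: " + h.getState());
             if (h.getState().isFinal()) {   // FINISHED, FAILED or KILLED
               done.countDown();
             }
           }

           @Override
           public void infoChanged(SparkAppHandle h) {
             System.out.println("App id: " + h.getAppId());
           }
         });
       done.await();  // block until the application reaches a final state
     }
   }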


It's also possible to launch a raw child process, using the
SparkLauncher.launch() method
<https://spark.apache.org/docs/latest/api/java/org/apache/spark/launcher/SparkLauncher.html#launch-->:



   import org.apache.spark.launcher.SparkLauncher;

   public class MyLauncher {
     public static void main(String[] args) throws Exception {
       Process spark = new SparkLauncher()
         .setAppResource("/my/app.jar")
         .setMainClass("my.spark.app.Main")
         .setMaster("local")
         .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
         .launch();                    // returns a plain java.lang.Process
       int exitCode = spark.waitFor(); // block until the child process exits
     }
   }
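
One caveat with launch(): the caller has to manage the child process itself,
including draining its output streams, or the child can block once its pipe
buffers fill. A minimal sketch using the standard java.lang.Process API (jar
and class are the same hypothetical placeholders; redirectError() merges
spark-submit's stderr into its stdout before launching):

   import java.io.BufferedReader;
   import java.io.InputStreamReader;

   import org.apache.spark.launcher.SparkLauncher;

   public class MyRawLauncher {
     public static void main(String[] args) throws Exception {
       Process spark = new SparkLauncher()
         .setAppResource("/my/app.jar")   // hypothetical jar, as above
         .setMainClass("my.spark.app.Main")
         .setMaster("local")
         .redirectError()                 // send the child's stderr to stdout
         .launch();

       // Drain the child's combined output so it never blocks on a full buffer.
       try (BufferedReader out = new BufferedReader(
           new InputStreamReader(spark.getInputStream()))) {
         String line;
         while ((line = out.readLine()) != null) {
           System.out.println("[spark] " + line);
         }
       }
       System.out.println("Exit code: " + spark.waitFor());
     }
   }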


*Note:*

A user application is normally launched using the bin/spark-submit script.
This script takes care of setting up the classpath with Spark and its
dependencies, and supports the different cluster managers and deploy modes
that Spark offers.
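
For reference, the spark-submit command line equivalent to the launcher
configuration above would look something like this (same hypothetical paths
and class name):

   ./bin/spark-submit \
     --class my.spark.app.Main \
     --master local \
     --driver-memory 2g \
     /my/app.jar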

Regards,
Vaquar Khan

On Wed, Aug 30, 2017 at 3:58 PM, Irving Duran <irving.du...@gmail.com>
wrote:

> I don't know how this would work, but maybe your .jar calls spark-submit
> from within your jar if you were to compile the jar with the spark-submit
> class.
>
>
> Thank You,
>
> Irving Duran
>
> On Wed, Aug 30, 2017 at 10:57 AM, kant kodali <kanth...@gmail.com> wrote:
>
>> Hi All,
>>
>> I understand spark-submit sets up its own class loader and other things
>> but I am wondering if it is possible to just compile the code and run it
>> using "java -jar mysparkapp.jar" ?
>>
>> Thanks,
>> kant
>>
>
>


-- 
Regards,
Vaquar Khan
+1 -224-436-0783
Greater Chicago