We are trying to submit a Spark application from a Tomcat application running
our business logic. The Tomcat app lives in a separate, non-Hadoop cluster.
We were initially doing this by using the spark-yarn package to call
Client#runApp() directly, but found that the API we were using in Spark is being
If you are able to use YARN in your Hadoop cluster, then the following
technique is pretty straightforward:
http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/
We use this in our system and it's super easy to execute from our Tomcat
application.
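
In case it helps, here is a rough sketch of what that post describes, assuming
Spark 1.x with the YARN client classes on the classpath. Note that
org.apache.spark.deploy.yarn.Client and ClientArguments are internal APIs whose
constructors have shifted between releases, and the app name, jar path, and job
class below are only placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.spark.SparkConf;
    import org.apache.spark.deploy.yarn.Client;
    import org.apache.spark.deploy.yarn.ClientArguments;

    public class YarnSubmitter {
        public static void submit() {
            // These mirror the flags you would normally pass to spark-submit;
            // the jar location and main class are placeholders for your own job.
            String[] args = new String[] {
                "--name", "tomcat-submitted-job",
                "--class", "com.example.MySparkJob",
                "--jar", "hdfs:///apps/my-spark-job.jar",
                "--arg", "input-path"
            };

            // Required so the YARN client code knows it is running in YARN mode.
            System.setProperty("SPARK_YARN_MODE", "true");

            SparkConf sparkConf = new SparkConf();
            // Picks up yarn-site.xml / core-site.xml from the classpath, so the
            // Tomcat webapp needs the cluster's Hadoop config available.
            Configuration hadoopConf = new Configuration();

            ClientArguments clientArgs = new ClientArguments(args, sparkConf);
            new Client(clientArgs, hadoopConf, sparkConf).run();
        }
    }

Because these classes are internal, double-check the signatures against the
exact Spark version you deploy.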
--
We are unable to run more than one application at a time using Spark 1.0.0 on
CDH5. We submit two applications using two different SparkContexts on the
same Spark Master. The Spark Master was started using the following command
and parameters and is running in standalone mode:
You can also look into using ooyala's job server at
https://github.com/ooyala/spark-jobserver
It already has a Spray-based REST server built in that lets you do what has
already been explained above. Sounds like it should solve your problem.
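
For a feel of what that looks like from a Tomcat app, here is a minimal sketch
of calling the job server's REST API over plain HTTP. The host, port, app name,
jar path, and job class are all placeholders, and you should confirm the exact
endpoints against the spark-jobserver README for the version you run:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class JobServerSubmit {
        public static void main(String[] args) throws Exception {
            String base = "http://jobserver-host:8090";  // placeholder job server address

            // Upload the application jar under an app name.
            HttpURLConnection upload =
                (HttpURLConnection) new URL(base + "/jars/myapp").openConnection();
            upload.setRequestMethod("POST");
            upload.setDoOutput(true);
            try (OutputStream out = upload.getOutputStream()) {
                out.write(Files.readAllBytes(Paths.get("/path/to/my-spark-job.jar")));
            }
            System.out.println("jar upload -> HTTP " + upload.getResponseCode());

            // Kick off a job against that jar; the job class is a placeholder.
            URL runUrl = new URL(base + "/jobs?appName=myapp&classPath=com.example.MySparkJob");
            HttpURLConnection run = (HttpURLConnection) runUrl.openConnection();
            run.setRequestMethod("POST");
            run.setDoOutput(true);
            run.getOutputStream().close();  // empty request body
            System.out.println("job submit -> HTTP " + run.getResponseCode());
        }
    }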
Enjoy!