Running spark-submit from a remote machine using a YARN application

2014-12-11 Thread ryaminal
We are trying to submit a Spark application from a Tomcat application running our business logic. The Tomcat app lives in a separate non-Hadoop cluster. We first did this by using the spark-yarn package to directly call Client#runApp(), but found that the API we were using in Spark is being
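Where calling the YARN client API directly is not an option, one common alternative is to shell out to `spark-submit` itself. A minimal sketch of assembling such an invocation from a JVM process; the Spark home, jar path, and class name below are placeholder assumptions, not values from this thread:

```java
import java.util.ArrayList;
import java.util.List;

public class SparkSubmitCommand {
    // Assemble a spark-submit invocation targeting YARN. All paths and
    // names here are hypothetical placeholders.
    static List<String> build(String sparkHome, String appJar, String mainClass) {
        List<String> cmd = new ArrayList<>();
        cmd.add(sparkHome + "/bin/spark-submit");
        cmd.add("--master");
        cmd.add("yarn-cluster"); // Spark 1.x form; later versions use --master yarn --deploy-mode cluster
        cmd.add("--class");
        cmd.add(mainClass);
        cmd.add(appJar); // application jar goes last, before any app arguments
        return cmd;
    }

    public static void main(String[] args) {
        List<String> cmd = build("/opt/spark", "/tmp/app.jar", "com.example.Main");
        System.out.println(String.join(" ", cmd));
        // To actually launch (requires Spark plus Hadoop client configs on this machine):
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

The launch itself is left commented out because it only works on a machine with Spark installed and the cluster's Hadoop configuration on hand.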

Re: Calling spark from a java web application.

2014-12-01 Thread ryaminal
If you are able to use YARN in your Hadoop cluster, then the following technique is pretty straightforward: http://blog.sequenceiq.com/blog/2014/08/22/spark-submit-in-java/ We use this in our system and it's super easy to execute from our Tomcat application.
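The linked post boils down to invoking Spark's YARN client class in-process instead of spawning the `spark-submit` script. A sketch of the argument assembly is below; the actual `org.apache.spark.deploy.yarn.Client` invocation is shown only in comments because it needs Spark on the classpath, and the class names follow the Spark 1.x API, which may change between versions. The app name, class, and jar path are hypothetical:

```java
import java.util.Arrays;

public class YarnClientArgs {
    // Build the flag/value array that Spark 1.x's ClientArguments expects.
    // The application name, main class, and jar location are placeholders.
    static String[] build(String appName, String mainClass, String jarPath) {
        return new String[] {
            "--name", appName,
            "--class", mainClass,
            "--jar", jarPath,
        };
    }

    public static void main(String[] args) {
        String[] a = build("tomcat-launched-app", "com.example.Main", "hdfs:///apps/app.jar");
        System.out.println(Arrays.toString(a));
        // With Spark 1.x on the classpath, the blog post then does roughly:
        //   System.setProperty("SPARK_YARN_MODE", "true");
        //   SparkConf conf = new SparkConf();
        //   ClientArguments cArgs = new ClientArguments(a, conf);
        //   new Client(cArgs, new Configuration(), conf).run();
    }
}
```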

Multiple Applications (Spark Contexts) Concurrently Fail With Broadcast Error

2014-11-07 Thread ryaminal
We are unable to run more than one application at a time using Spark 1.0.0 on CDH5. We submit two applications using two different SparkContexts on the same Spark Master. The Spark Master was started using the following command and parameters and is running in standalone mode:

Re: application as a service

2014-08-17 Thread ryaminal
You can also look into using Ooyala's job server at https://github.com/ooyala/spark-jobserver This already has a Spray server built in that allows you to do what has already been explained above. Sounds like it should solve your problem. Enjoy!
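The job server is driven over plain HTTP, which is what makes it convenient from a web application: upload the jar once, then start jobs with simple POSTs. A sketch of the two endpoints involved; the paths follow the project's README, while the host, port, and names are placeholder assumptions:

```java
public class JobServerUrls {
    // Upload endpoint: POST the application jar body to /jars/<appName>.
    static String uploadJarUrl(String host, int port, String appName) {
        return String.format("http://%s:%d/jars/%s", host, port, appName);
    }

    // Job-start endpoint: POST to /jobs with the app and job class as query params.
    static String startJobUrl(String host, int port, String appName, String classPath) {
        return String.format("http://%s:%d/jobs?appName=%s&classPath=%s",
                host, port, appName, classPath);
    }

    public static void main(String[] args) {
        // 8090 is the job server's default port; host and names are hypothetical.
        System.out.println(uploadJarUrl("jobserver.internal", 8090, "demo"));
        System.out.println(startJobUrl("jobserver.internal", 8090, "demo", "com.example.WordCount"));
        // From a servlet you would issue these with HttpURLConnection or any HTTP
        // client; the command-line equivalent is roughly:
        //   curl --data-binary @app.jar http://jobserver.internal:8090/jars/demo
        //   curl -d "" "http://jobserver.internal:8090/jobs?appName=demo&classPath=com.example.WordCount"
    }
}
```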