Spark Streaming as a Service

2017-07-28 Thread ajit roshen
We have a few Spark Streaming apps running on our AWS Spark 2.1 YARN cluster.
We currently log on to the master node of the cluster and start each app
with "spark-submit", passing the jar.

We would like to open this up to our users so that they can submit their
own apps, but we cannot give users access to the master or any other
nodes of the cluster. I have the questions below.

   - What would be a good interface for invoking spark-submit in this
   case? I have read about Livy and spark-jobserver but couldn't find out
   whether either provides a UI.
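As far as I know, Livy does not ship a submission UI beyond a minimal session list; it exposes a REST API that a small web front end could call on the users' behalf. Below is a sketch of what a submission through Livy's POST /batches endpoint might look like. The host name, jar path, and class name are placeholders, and the actual curl call is left commented out so the sketch does not require a live Livy server:

```shell
# Build the JSON payload for Livy's POST /batches endpoint.
# livy-host, the jar path, and the class name are hypothetical -- adjust
# for your own cluster.
LIVY_URL="http://livy-host:8998"
read -r -d '' PAYLOAD <<'EOF' || true
{
  "file": "hdfs:///user/apps/streaming-app.jar",
  "className": "com.example.StreamingApp",
  "numExecutors": 2,
  "executorCores": 2,
  "executorMemory": "10g"
}
EOF

# Sanity-check that the payload is valid JSON before sending it.
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"

# Live submission (needs network access to the Livy server):
# curl -s -X POST -H "Content-Type: application/json" \
#      -d "$PAYLOAD" "$LIVY_URL/batches"
```

A gateway like this also keeps users off the cluster nodes entirely, since Livy performs the spark-submit on their behalf.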


   - Is there a way to specify a start date and end date for the
   streaming apps? We also need a way to check an app's current status and
   to kill an app. I know the Resource Manager provides this, but I am not
   sure it is a good idea to open up the RM to users.
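One alternative to exposing the RM web UI is to proxy just the two ResourceManager REST calls users need: reading an application's state and killing it. The sketch below only prints the requests; the RM host and application id are placeholders, and the live curl calls are commented out:

```shell
# Query and kill a YARN application via the ResourceManager REST API.
# rm-host and the application id below are placeholders.
RM="http://rm-host:8088"
APP_ID="application_1500000000000_0001"

STATE_URL="$RM/ws/v1/cluster/apps/$APP_ID/state"
KILL_BODY='{"state":"KILLED"}'

echo "GET  $STATE_URL"
echo "PUT  $STATE_URL  $KILL_BODY"

# Live calls (need network access to the RM):
# curl -s "$STATE_URL"
# curl -s -X PUT -H "Content-Type: application/json" \
#      -d "$KILL_BODY" "$STATE_URL"
```

For start and end dates: YARN itself does not schedule apps by date, so a wrapper (e.g. cron on the gateway host) that submits at the start time and issues the kill PUT at the end time seems like the simplest approach.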


   - Is there a way to control access to the apps so that, for example,
   User-A can execute only App-1 and App-3?


   - Is there a way to limit resource usage so that, for example, User-A
   can use at most 2 executors, 2 cores per executor, 10 GB per executor, etc.?
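One way to enforce caps like these is to have the submission gateway compose the spark-submit command itself, pinning the resource flags and the YARN queue rather than trusting user input; the queue's overall capacity can then be limited by the cluster admin in the scheduler configuration. A sketch, with the queue name, class, and jar path as placeholders:

```shell
# Compose a spark-submit invocation with hard per-app resource caps.
# team-a, the class name, and the jar path are hypothetical examples.
SUBMIT_CMD=(spark-submit
  --master yarn
  --queue team-a              # YARN queue whose total capacity the admin controls
  --num-executors 2           # cap: 2 executors
  --executor-cores 2          # cap: 2 cores per executor
  --executor-memory 10g       # cap: 10 GB per executor
  --class com.example.StreamingApp
  hdfs:///user/apps/streaming-app.jar)

# Print the composed command; uncomment the last line to actually submit
# from a machine with cluster access.
printf '%s ' "${SUBMIT_CMD[@]}"; echo
# "${SUBMIT_CMD[@]}"
```

Building the command as an array keeps user-supplied values (if any) out of shell interpolation, which matters once untrusted users drive the submissions.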

Thank you.
Ajit

