I'm also in the early stages of setting up long-running Spark jobs. The
easiest way I've found is to set up a cluster and submit the job via YARN.
Then I can come back and check on progress when I need to. The trick seems
to be tuning the queue priority and YARN preemption so the job runs in a
reasonable amount of time without disrupting the other jobs.
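A minimal sketch of what that submit side could look like, assuming YARN cluster mode on a Capacity Scheduler setup; the queue name "background", the main class, and the jar name are hypothetical placeholders:

```shell
# Submit in YARN cluster mode so the driver runs inside the cluster and the
# client can disconnect right after submitting. (In Spark 1.0-era syntax the
# YARN cluster master is "yarn-cluster".)
#
# "background" is a hypothetical low-priority queue that would be defined in
# capacity-scheduler.xml; preemption is enabled on the ResourceManager with
#   yarn.resourcemanager.scheduler.monitor.enable=true
spark-submit \
  --master yarn-cluster \
  --queue background \
  --class com.example.LongRunningJob \
  long-running-job.jar
```

Once submitted, progress can be checked later with `yarn application -list` or in the ResourceManager web UI.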

- SteveN


From:  Chris Schneider <ch...@christopher-schneider.com>
Reply-To:  <user@spark.apache.org>
Date:  Wednesday, July 23, 2014 at 7:39
To:  <user@spark.apache.org>
Subject:  Cluster submit mode - only supported on Yarn?

We are starting to use Spark, but we don't have any existing big-data
infrastructure, so we decided to set up the standalone cluster rather than
mess around with YARN or Mesos.

But it appears that the driver program has to stay up on the client for the
full duration of the job ("client mode").

What is the simplest way to set up "cluster" submission mode, so our client
boxes can submit jobs and then move on to the other work they need to do
without keeping a potentially long-running Java process up?

Thanks,
Chris
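For reference, the standalone master does accept a cluster deploy mode, which runs the driver on a worker so the submitting client can exit. A minimal sketch, assuming a reachable standalone master; the host, class, and jar path are placeholders:

```shell
# Submit against a standalone master in cluster deploy mode: the driver is
# launched on a worker node rather than on the client machine.
# Note: in this mode the jar path must be reachable from the cluster nodes,
# and (in early Spark versions) this works for Java/Scala apps, not Python.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyJob \
  /path/on/cluster/my-job.jar
```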





