On Wed, Apr 12, 2017 at 4:11 PM, Sam Elamin wrote:
>
> When it comes to scheduling Spark jobs, you can either submit to an
> already running cluster using tools like Oozie or bash scripts, or have a
> workflow manager like Airflow or Data Pipeline create new clusters for
> you. We went down t
Hi All,
Really useful information on this thread. We moved a bit off topic, since
the initial question was how to schedule Spark jobs in AWS. I do think,
however, that there are loads of great insights here within the community,
so I have renamed the subject to "Deploying Spark Applications. Best
Practices".
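As a rough sketch of the first approach quoted above (submitting to an already running cluster), a scheduler or bash wrapper ultimately just assembles a spark-submit invocation; something like the helper below. All of the names, paths, and arguments here are hypothetical placeholders, not anything from this thread:

```python
# Minimal sketch: build a spark-submit command for an already-running
# cluster. The master URL, jar path, main class, and app arguments are
# hypothetical examples.

def build_spark_submit(master, app_jar, main_class, *app_args):
    """Return the spark-submit command as a list of argv tokens."""
    cmd = [
        "spark-submit",
        "--master", master,      # e.g. "yarn" for an existing YARN cluster
        "--class", main_class,   # entry point inside the application jar
        app_jar,
    ]
    cmd.extend(app_args)         # arguments passed through to the app
    return cmd

cmd = build_spark_submit(
    "yarn",                                # submit to an existing cluster
    "s3://my-bucket/jars/etl-job.jar",     # hypothetical jar location
    "com.example.EtlJob",                  # hypothetical main class
    "--date", "2017-04-12",
)
print(" ".join(cmd))
```

A workflow manager like Airflow would typically wrap exactly this kind of command in a task (e.g. a bash-style operator), with the scheduler handling retries and dependencies rather than the script itself.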