I know there have been some community efforts presented at past Spark Summits,
mostly around reusing the same Spark context across multiple “jobs”.
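
The idea, roughly, is to pay the JVM and context startup cost once and then run many short jobs against the same session. A minimal sketch (the object name and the specific actions are illustrative, not from this thread; assumes a local Spark installation):

```scala
import org.apache.spark.sql.SparkSession

object SharedContextDemo {
  def main(args: Array[String]): Unit = {
    // Start ONE SparkSession up front; this is where the multi-second
    // startup latency is paid.
    val spark = SparkSession.builder()
      .appName("shared-context-demo")
      .master("local[*]")
      .getOrCreate()

    // Each action below triggers a separate Spark job, but none of them
    // pays the context startup cost again -- only task scheduling overhead.
    val total = spark.range(1000000L).selectExpr("sum(id)").first().getLong(0)
    val count = spark.range(10L).count()

    println(s"total=$total count=$count")
    spark.stop()
  }
}
```

Long-running services built on this pattern (e.g. a job server that keeps the session alive and accepts work over RPC) follow the same structure; the session, not the job, is the expensive resource.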

As far as I know, reducing Spark job startup time is not a community priority.

Tim
On Fri, Jul 6, 2018 at 7:12 PM Tien Dat <tphan....@gmail.com> wrote:

> Dear Timothy,
>
> It works like a charm now.
>
> BTW (don't judge me if I am too greedy :-)): the latency to start a Spark
> job is around 2-4 seconds, unless I am missing some awesome optimization
> in Spark. Do you know if the Spark community is working on reducing this
> latency?
>
> Best
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
