Using FAIR mode.
If no other way. I think there is a limit on the number of parallel jobs
that Spark can run. Is there a way to run more jobs in parallel? This is
alright because this SparkContext would only be used during web service
calls.
I looked at the Spark configuration page.
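The scheduler mode discussed below is set through Spark configuration: switch `spark.scheduler.mode` from the default FIFO to FAIR, and optionally define pools in an allocation file. A minimal sketch (the pool name, weight, and file path here are illustrative, not from the thread):

```properties
# spark-defaults.conf (or set on SparkConf before creating the context)
spark.scheduler.mode            FAIR
spark.scheduler.allocation.file /path/to/fairscheduler.xml
```

```xml
<!-- fairscheduler.xml: one pool for web-service jobs (name and shares are examples) -->
<allocations>
  <pool name="webservice">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
```

Jobs submitted from a request-handling thread can then be routed to that pool with `sc.setLocalProperty("spark.scheduler.pool", "webservice")`, so concurrent web-service requests share the cluster fairly instead of queuing FIFO.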
Are you using the scheduler in FAIR mode instead of FIFO mode?
> On Sep 22, 2018, at 12:58 AM, Jatin Puri wrote:
>
> Hi.
>
> What tactics can I apply for such a scenario.
>
> I have a pipeline of 10 stages. Simple text processing. I train the data with
> the pipeline
Hi.
What tactics can I apply for such a scenario?
I have a pipeline of 10 stages doing simple text processing. I fit the
pipeline on the data and, for the fitted data, do some modelling and
store the results.
I also have a web-server, where I receive requests. For each request
(dataframe of