Tobias
>
> From: Andy Davidson <a...@santacruzintegration.com>
> Date: Tuesday 3 May 2016 at 17:26
> To: Tobias Eriksson <tobias.eriks...@qvantel.com>, "user@spark.apache.org"
> <user@spark.apache.org>
> Subject: Re: Multiple Spark Applications that use Cassandra
You can run multiple Spark applications simultaneously. Just limit the number of
cores and the amount of memory allocated to each application. For example, if each
node has 8 cores and there are 10 nodes (80 cores in total) and you want to run 4
applications simultaneously, limit each application to 20 cores.
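A minimal sketch of capping one application this way, assuming the standalone scheduler; the application name and the 4g executor memory are just illustrative values:

import org.apache.spark.{SparkConf, SparkContext}

// Cap this application at 20 cores cluster-wide so that four such
// applications can share the 80 cores (10 nodes x 8 cores) concurrently.
val conf = new SparkConf()
  .setAppName("app-1")                  // illustrative name
  .set("spark.cores.max", "20")         // total cores this app may claim (standalone/Mesos)
  .set("spark.executor.memory", "4g")   // example per-executor memory cap
val sc = new SparkContext(conf)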
Hi Tobias
I am very interested in implementing a REST-based API on top of Spark. My REST-based
system would make predictions from data provided in the request, using models
trained in batch. My SLA is 250 ms.
Would you mind sharing how you implemented your REST server?
I am using spark-1.6.1. I have
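For reference, a minimal sketch of one way such a prediction server could look, not a description of the implementation asked about above. It assumes a model trained offline with spark.mllib and scored on the driver (so no Spark job per request, which is what keeps latency within a tight SLA); the JDK's built-in HttpServer, the model path, and the comma-separated request format are all placeholder assumptions:

import java.net.InetSocketAddress
import com.sun.net.httpserver.{HttpExchange, HttpHandler, HttpServer}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionModel
import org.apache.spark.mllib.linalg.Vectors
import scala.io.Source

object PredictionServer {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("prediction-server"))
    // Model is trained by a separate batch job and saved to this (hypothetical) path.
    val model = LogisticRegressionModel.load(sc, "hdfs:///models/latest")

    val server = HttpServer.create(new InetSocketAddress(8080), 0)
    server.createContext("/predict", new HttpHandler {
      override def handle(exchange: HttpExchange): Unit = {
        // Request body: comma-separated feature values, e.g. "1.0,0.5,3.2"
        val body = Source.fromInputStream(exchange.getRequestBody).mkString
        val features = Vectors.dense(body.split(",").map(_.trim.toDouble))
        // predict() on a local Vector runs on the driver, no cluster round trip.
        val prediction = model.predict(features).toString
        exchange.sendResponseHeaders(200, prediction.getBytes.length)
        exchange.getResponseBody.write(prediction.getBytes)
        exchange.getResponseBody.close()
      }
    })
    server.start()
  }
}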