Re: Multiple Spark Applications that use Cassandra, how to share resources/nodes

2016-05-04 Thread Alonso Isidoro Roman

Re: Multiple Spark Applications that use Cassandra, how to share resources/nodes

2016-05-04 Thread Tobias Eriksson

RE: Multiple Spark Applications that use Cassandra, how to share resources/nodes

2016-05-03 Thread Mohammed Guller
You can run multiple Spark applications simultaneously. Just limit the number of cores and the amount of memory allocated to each application. For example, if each node has 8 cores and there are 10 nodes, the cluster has 80 cores in total; to run 4 applications simultaneously, limit each application to 80 / 4 = 20 cores.
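As a sketch of how that cap might be applied on a standalone cluster, the `--total-executor-cores` and `--executor-memory` flags of `spark-submit` (or the equivalent `spark.cores.max` setting) limit what each application can claim; the application name, jar, and master URL below are placeholders:

```shell
# Cap this application at 20 of the cluster's 80 cores and 4 GB per executor,
# leaving room for three more applications of the same size.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 20 \
  --executor-memory 4g \
  --class com.example.MyApp \
  my-app.jar
```

The same limit can be set programmatically with `spark.cores.max` in the application's SparkConf instead of on the command line.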

Re: Multiple Spark Applications that use Cassandra, how to share resources/nodes

2016-05-03 Thread Andy Davidson
Hi Tobias, I am very interested in implementing a REST-based API on top of Spark. My REST-based system would make predictions from data provided in the request, using models trained in batch. My SLA is 250 ms. Would you mind sharing how you implemented your REST server? I am using spark-1.6.1. I have
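One common way to meet a tight per-request SLA like 250 ms is to keep Spark out of the request path entirely: train in batch, export the model parameters, and score each request with plain in-process code. The sketch below illustrates that pattern with Python's standard-library HTTP server; the linear model (`MODEL`, `predict`) is a hypothetical stand-in for whatever artifact the batch job produces, not anything from the original thread:

```python
# Minimal sketch: serve predictions from an in-memory model behind a REST
# endpoint. The model is a placeholder dict of coefficients; in practice it
# would be loaded from a batch-trained artifact at startup.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

MODEL = {"weights": [0.5, -1.2, 3.0], "bias": 0.1}  # hypothetical model

def predict(features):
    """Score a feature vector with the pre-loaded model (pure Python, fast)."""
    return sum(w * x for w, x in zip(MODEL["weights"], features)) + MODEL["bias"]

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body like {"features": [1.0, 2.0, 3.0]}.
        length = int(self.headers.get("Content-Length", 0))
        features = json.loads(self.rfile.read(length))["features"]
        body = json.dumps({"prediction": predict(features)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

# To run standalone:
#   HTTPServer(("127.0.0.1", 8080), PredictHandler).serve_forever()
```

A production server would use a proper framework and connection pooling, but the key design choice is the same: the model lives in memory and no Spark job is launched per request, so latency stays in the low milliseconds.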