Date: Tuesday 3 May 2016 at 17:26
To: Tobias Eriksson <tobias.eriks...@qvantel.com>,
"user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Multiple
Hi
We are using Spark for a long-running job; in fact, it is a REST server that
joins some tables in Cassandra and returns the result.
Now we need to run multiple applications in the same Spark cluster,
and from what I understand this is not possible, or should I say