Hi,
You can run multiple Spark applications per cluster. You will have one
streaming context per application.
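The "one streaming context per application" point can be sketched roughly as follows (the app name and batch interval are made up for illustration; each application/JVM creates its own single StreamingContext):

```scala
// Minimal sketch: one Spark application, one StreamingContext.
// A second application would be a separate main class / spark-submit,
// with its own StreamingContext.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamAppOne {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("stream-app-one")
    val ssc  = new StreamingContext(conf, Seconds(5)) // the one context for this app
    // ... define DStreams and transformations here ...
    ssc.start()
    ssc.awaitTermination()
  }
}
```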
On 24 Dec 2016 at 18:22, "shyla deshpande"
wrote:
Hi All,
Thank you for the response.
As per
https://docs.cloud.databricks.com/docs/latest/databricks_guide/index.html#07%20Spark%20Streaming/15%20Streaming%20FAQs.html
there can be only one streaming context in a cluster, which implies only one
streaming job.
So I am still confused. Can anyone clarify?
If you have enough cores/resources, run them separately depending on your
use case.
On Thursday 15 December 2016, Divya Gehlot wrote:
It depends on the use case.
Spark always depends on resource availability.
As long as you have the resources to accommodate them, you can run as many
Spark / Spark Streaming applications as you like.
Thanks,
Divya
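The resource point above can be sketched in configuration terms: capping each application's footprint so several can share one cluster. The values below are examples only, and `spark.cores.max` applies to standalone/Mesos deployments; the keys themselves are standard Spark configuration properties.

```scala
// Hedged sketch: limit this application's share of the cluster so other
// streaming apps can run alongside it. Values are illustrative.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("stream-app-two")
  .set("spark.cores.max", "4")        // standalone mode: total cores this app may claim
  .set("spark.executor.memory", "2g") // memory per executor
```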
On 15 December 2016 at 08:42, shyla deshpande
wrote:
How many Spark Streaming applications can be run at a time on a Spark
cluster?
Is it better to have one Spark Streaming application consume all the Kafka
topics, or to have multiple streaming applications where possible, to keep
things simple?
Thanks
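For the single-application option in the question above, one StreamingContext can subscribe to several Kafka topics at once. A hedged sketch, assuming the Kafka 0.10 direct-stream integration (`spark-streaming-kafka-0-10`); topic names, group id, and the broker address are placeholders:

```scala
// Sketch: one application, one StreamingContext (ssc), several Kafka topics.
// ssc is assumed to be this application's StreamingContext.
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",       // placeholder broker
  "key.deserializer"  -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"          -> "example-group"         // placeholder consumer group
)

val topics = Seq("topicA", "topicB", "topicC")   // all topics in one stream
val stream = KafkaUtils.createDirectStream[String, String](
  ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
```

The alternative is one such application per topic (or per group of topics), each with its own StreamingContext, submitted separately; which is better depends on isolation needs and available resources, as the replies above note.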