Scaling Kafka Direct Streaming application

2017-03-14 Thread Pranav Shukla
How can we scale, or possibly auto-scale, a Spark Streaming application
that consumes from Kafka using the Kafka direct stream approach? We are
using Spark 1.6.3 and cannot move to 2.x unless there is a strong reason.

Scenario:
Kafka topic with 10 partitions
Standalone cluster running on Kubernetes with 1 master and 2 workers

What we would like to do:
Increase the number of partitions (say, from 10 to 15)
Add an additional worker node, without restarting the streaming
application, and start consuming from the additional partitions.

Is this possible? That is, can we start additional workers in the
standalone cluster to auto-scale an existing Spark Streaming application
that is already running, or do we have to stop and resubmit the
streaming app?
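To make the scaling question concrete, here is a toy sketch in plain Python (not Spark code) of the 1:1 mapping the direct stream approach creates between Kafka partitions and stream tasks, and how those tasks would spread across workers before and after scaling. The round-robin placement below is an assumption for illustration, not Spark's actual scheduling policy; worker names are made up.

```python
def assign_partitions(num_partitions, executors):
    """Toy model: each Kafka partition becomes one task; spread tasks
    round-robin over executor names. Simplified sketch only -- Spark's
    real scheduler considers locality, cores, and failures."""
    assignment = {name: [] for name in executors}
    for p in range(num_partitions):
        assignment[executors[p % len(executors)]].append(p)
    return assignment

# Current scenario: 10 partitions over 2 workers -> 5 tasks per worker.
print(assign_partitions(10, ["worker-1", "worker-2"]))

# Desired scenario: 15 partitions over 3 workers -> 5 tasks per worker,
# but only if the already-running stream actually picks up both the new
# partitions and the new worker, which is exactly what we are asking.
print(assign_partitions(15, ["worker-1", "worker-2", "worker-3"]))
```

The sketch only shows why we want partitions and workers to grow together: with the direct approach, the per-worker load is (partitions / workers) tasks per batch.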

Best Regards,
Pranav Shukla