Hi, I am new to Spark Streaming.
In our project we want to implement a custom receiver to subscribe to our log
data, roughly along the lines of the sketch below.
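
Here is a rough sketch of the kind of receiver we have in mind (the
LogReceiver name and the socket-based log source are just placeholders,
not our real implementation):

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.receiver.Receiver

    class LogReceiver(host: String, port: Int)
      extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

      def onStart(): Unit = {
        // Start a background thread that pulls log lines and pushes them to Spark
        new Thread("Log Receiver") {
          override def run(): Unit = receive()
        }.start()
      }

      def onStop(): Unit = {
        // Nothing to do here: receive() exits once isStopped returns true
      }

      private def receive(): Unit = {
        try {
          val socket = new java.net.Socket(host, port)
          val reader = new java.io.BufferedReader(
            new java.io.InputStreamReader(socket.getInputStream, "UTF-8"))
          var line = reader.readLine()
          while (!isStopped && line != null) {
            store(line)               // hand each log line to Spark Streaming
            line = reader.readLine()
          }
          reader.close()
          socket.close()
          restart("Trying to reconnect")
        } catch {
          case e: Throwable => restart("Error receiving data", e)
        }
      }
    }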
I have two questions:

1. Do multiple DStream receivers run in different processes or in different
threads?
2. When we union multiple DStreams, for example 10 DStreams (as in the sketch
after these questions), we observe that Spark creates 10 jobs. How many
receivers will be started on each worker? We find that if the application gets
5 executors, the 10 receivers are started on those executors at random. Is
that right?
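
For reference, this is roughly how we create and union the 10 DStreams (the
host name and ports are placeholders for our log endpoints):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("LogStreaming")
    val ssc = new StreamingContext(conf, Seconds(10))

    // 10 receiver-based input DStreams, one receiver each
    val streams = (1 to 10).map(i =>
      ssc.receiverStream(new LogReceiver("loghost", 9000 + i)))

    // Union them into a single DStream for downstream processing
    val unioned = ssc.union(streams)

    unioned.count().print()

    ssc.start()
    ssc.awaitTermination()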


-- 
Cyanny LIANG
