Hi

I am running a Spark job whose executors run inside a Docker container. I want
to push the job's output records, one at a time, to a Kafka broker that sits
outside the Docker container.

Has anyone tried a setup like this, where the Kafka producer runs inside Docker
and the broker is outside? I am a newbie to both Spark and Kafka and am looking
for some pointers to start exploring.
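
For context, this is roughly the shape I have in mind: a minimal sketch, not
working code. The topic name "spark-output", the input path, and the broker
address "host.docker.internal:9092" are only placeholders; the real broker
address would have to be whatever is reachable from inside the container.

import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.spark.sql.SparkSession

object PushToKafka {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("push-to-kafka").getOrCreate()

    // Placeholder input; in practice this would be the actual job output.
    val output = spark.read.textFile("/data/job-output").rdd

    // Assumption: the broker is reachable from inside the container under
    // this address, and its advertised listeners resolve from the
    // container's network.
    val brokers = "host.docker.internal:9092"

    output.foreachPartition { records =>
      val props = new Properties()
      props.put("bootstrap.servers", brokers)
      props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer")
      props.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer")

      // One producer per partition, created on the executor itself, so no
      // non-serializable producer object has to be shipped from the driver.
      val producer = new KafkaProducer[String, String](props)
      try {
        records.foreach { rec =>
          producer.send(new ProducerRecord[String, String]("spark-output", rec))
        }
      } finally {
        producer.close()
      }
    }

    spark.stop()
  }
}

Does opening a producer per partition like this sound like a reasonable
starting point, or is there a better-established pattern for this?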

Thanks.

-- 
Raghav
