Re: Spark 1.6.1. How to prevent serialization of KafkaProducer

2016-04-22 Thread Alexander Gallego
… then just call:

    val kafkaWriter: KafkaWriter =
      KafkaWriter(KafkaStreamFactory.getBrokersFromConfig(config),
        config.getString(Parameters.topicName),
        numPartitions = kafkaWritePartitions)

    detectionWriter.write(dataToWriteToKafka)

Hope …
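KafkaWriter and KafkaStreamFactory in the quoted snippet are the poster's own classes and are not shown in the thread. A minimal sketch, assuming a hand-rolled wrapper, of how such a writer can stay serializable: the underlying KafkaProducer is marked @transient lazy, so it is skipped during closure serialization and rebuilt on first use in each executor JVM (all names below are illustrative, not the poster's actual code):

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

    // Serializable wrapper: only the broker list and topic travel with the closure;
    // the KafkaProducer itself is never serialized.
    class KafkaWriter(brokers: String, topic: String) extends Serializable {

      // @transient lazy val: dropped when the closure is serialized,
      // re-initialized lazily on each executor.
      @transient private lazy val producer: KafkaProducer[String, String] = {
        val props = new Properties()
        props.put("bootstrap.servers", brokers)
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        new KafkaProducer[String, String](props)
      }

      def write(value: String): Unit = {
        producer.send(new ProducerRecord[String, String](topic, value))
      }
    }

With a wrapper like this, something like rdd.foreach(record => writer.write(record)) only captures the broker string and topic name; the producer is instantiated lazily where the task runs.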

Re: Spark 1.6.1. How to prevent serialization of KafkaProducer

2016-04-21 Thread Alexander Gallego
… Count, the String is sent back and producer.send() is called. I guess if you don't find a viable solution in your current design, you can consider the above.

On Thu, Apr 21, 2016 at 10:04 AM, Alexander Gallego <agall...@concord.io> wrote:
> Hello, …
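The quoted suggestion appears to describe keeping the producer on the driver: the computed strings are sent back to the driver, and producer.send() is called there, so the producer never enters a Spark closure. A minimal sketch of that pattern, assuming the per-batch results are small enough to collect (object and method names below are illustrative, not from the thread):

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.rdd.RDD

    object DriverSideWriter {
      // The producer lives only on the driver; it is never captured by a closure.
      def sendResults(results: RDD[String], brokers: String, topic: String): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", brokers)
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)

        // collect() brings the (assumed small) results back to the driver,
        // so producer.send() runs locally and serialization never comes up.
        results.collect().foreach { s =>
          producer.send(new ProducerRecord[String, String](topic, s))
        }
        producer.close()
      }
    }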

Spark 1.6.1. How to prevent serialization of KafkaProducer

2016-04-21 Thread Alexander Gallego
Hello, I understand that you cannot serialize KafkaProducer. So I've tried (as suggested here: https://forums.databricks.com/questions/369/how-do-i-handle-a-task-not-serializable-exception.html):

- Make the class Serializable - not possible
- Declare the instance only within the lambda …
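For the second attempt - declaring the instance only within the lambda - the usual variant is to construct the producer inside foreachPartition so it is created on the executor and never shipped from the driver. A minimal sketch, assuming a DStream[String] and kafka-clients on the executor classpath (names below are illustrative):

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.streaming.dstream.DStream

    object PerPartitionWriter {
      // The producer is built inside foreachPartition, i.e. on the executor,
      // so it never has to be serialized as part of the task closure.
      def writeToKafka(stream: DStream[String], brokers: String, topic: String): Unit = {
        stream.foreachRDD { rdd =>
          rdd.foreachPartition { records =>
            val props = new Properties()
            props.put("bootstrap.servers", brokers)
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
            val producer = new KafkaProducer[String, String](props)
            try {
              records.foreach(r => producer.send(new ProducerRecord[String, String](topic, r)))
            } finally {
              producer.close()
            }
          }
        }
      }
    }

Only the broker string and topic name are captured by the closure here; opening and closing one producer per partition per batch is the trade-off compared with the @transient lazy wrapper sketched earlier.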