Hi Stefan,
Have you got any response from the Spark team regarding LZ4 library
compatibility? To avoid this kind of problem, lz4 should be shaded in the
Spark distribution, IMHO.
Currently I'm not able to update Spark in my application due to this issue.
It is not possible to consume compressed topics.
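For reference, shading lz4 on the application side can be sketched with sbt-assembly shade rules (a sketch, assuming the sbt-assembly plugin is available; the relocated package prefix `shaded.` is an arbitrary choice):

```scala
// build.sbt — relocate the lz4 classes inside the fat jar so the
// application's copy cannot clash with the one bundled by Spark.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("net.jpountz.lz4.**" -> "shaded.net.jpountz.lz4.@1").inAll
)
```

This only protects the application's assembly; the thread's point is that the same relocation done in the Spark distribution itself would fix the conflict for everyone.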
On Tue, Aug 18, 2015 at 10:49 PM, Marcin Kuthan marcin.kut...@gmail.com
wrote:
As long as the Kafka producer is thread-safe you don't need any pool at
all. Just share a single producer on every executor. Please look at my blog
post for more details: http://allegro.tech/spark-kafka-integration.html
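The single-producer-per-executor pattern described above can be sketched as follows (a minimal sketch, assuming the kafka-clients `KafkaProducer` API; the broker address, topic name, and `stream` — a `DStream[String]` from the application — are hypothetical):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// One producer per executor JVM: KafkaProducer is thread-safe, so a lazy
// singleton can be shared by all tasks running on the same executor.
object ProducerHolder {
  lazy val producer: KafkaProducer[String, String] = {
    val props = new Properties()
    props.put("bootstrap.servers", "broker:9092") // hypothetical broker
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    new KafkaProducer[String, String](props)
  }
}

// In the streaming job: the holder is referenced inside the closure, so
// each executor initializes its own single producer on first use.
stream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    records.foreach { r =>
      // "events" is a hypothetical topic name
      ProducerHolder.producer.send(new ProducerRecord("events", r))
    }
  }
}
```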
On 19 Aug 2015 at 2:00 AM, Shenghua(Daniel) Wan wansheng...@gmail.com
wrote:
All of
to try and limit the number of internals it was touching).
On Sunday, March 1, 2015, Marcin Kuthan marcin.kut...@gmail.com wrote:
I have started using Spark and Spark Streaming and I'm wondering how you
test your applications? Especially Spark Streaming applications with
window-based transformations.
I would expect a base trait for testing purposes in the Spark distribution.
ManualClock should be exposed as well, along with some documentation on how
to configure SBT to avoid problems with multiple Spark contexts. I'm
going to create an improvement proposal on the Spark issue tracker about it.
Right now I
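The SBT configuration mentioned above can be sketched as follows (a minimal sketch; running suites serially in a forked JVM is one common way to keep multiple SparkContexts from clashing in tests):

```scala
// build.sbt — run test suites one at a time, in a forked JVM, so that
// only a single SparkContext is alive at any moment.
parallelExecution in Test := false
fork in Test := true
```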
I have started using Spark and Spark Streaming and I'm wondering how you
test your applications? Especially Spark Streaming applications with
window-based transformations.
After some digging I found the ManualClock class, which takes full control
over stream processing. Unfortunately the class is not
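For reference, one way to switch a streaming test onto a manual clock is the `spark.streaming.clock` configuration property (a sketch; the ManualClock package has moved between Spark versions, so the fully qualified class name below is an assumption for the Spark 1.x line):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("streaming-test")
  // Replace the system clock with a manually advanced one so batch
  // boundaries become deterministic in tests.
  .set("spark.streaming.clock", "org.apache.spark.streaming.util.ManualClock")

val ssc = new StreamingContext(conf, Seconds(1))
// Batches now fire only when the test advances the clock — which, as the
// thread notes, requires reflection, since ManualClock is not public.
```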