I looked at the Streaming UI for my job and it reports that it has
processed many batches, but that none of the batches had any records in
them. Unfortunately, that’s what I expected. :-(
I’ve tried multiple test programs and I’m seeing the same thing. The
Kafka sources are alive and well and
What does the kafka receiver status on the streaming UI say when you are
connected to the Kafka sources? Does it show any error?
Can you find out which machine the receiver is running on and check the
worker logs for any exceptions / error messages? Try turning on the DEBUG
level in log4j.
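For reference, one way to do that is to copy Spark's bundled template and raise the root logging level (a sketch; adjust the appender name to match your template):

```
# conf/log4j.properties -- copied from conf/log4j.properties.template,
# then the root level changed from INFO to DEBUG
log4j.rootCategory=DEBUG, console
```

The worker logs will be much noisier at DEBUG, so it is easiest to grep them for the receiver and Kafka class names.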
TD
On May
I'm trying out 1.0 on a set of small Spark Streaming tests and am running
into problems. Here's one of the little programs I've used for a long
time -- it reads a Kafka stream that contains Twitter JSON tweets and does
some simple counting. The program starts OK (it connects to the Kafka
stream
Also, one other thing to try: remove all of the logic from inside of
foreach and just print something. It could be that somehow an
exception is being triggered inside of your foreach block and as a
result the output goes away.
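A minimal version of that check might look like this (a sketch, assuming the program's DStream is called `tweets`; the name is illustrative, not from the original code):

```scala
// Replace the real processing with a bare count so that an exception
// thrown by the original foreach body cannot swallow the output.
tweets.foreachRDD { rdd =>
  println("batch arrived, records = " + rdd.count())
}
```

If this prints non-zero counts, the problem is in the processing logic rather than in the Kafka receiver.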
On Fri, May 23, 2014 at 6:00 PM, Patrick Wendell
A few more suggestions.
1. Check the web UI: is the system running any jobs? If not, then you may
need to give the system more nodes. Basically, the system should have more
cores than the number of receivers.
2. There is also a streaming-specific web UI which shows more
streaming-related data.
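Concretely, each Kafka receiver permanently occupies one core, so a context started with too few cores will receive data but never schedule the batch jobs. A sketch of a local setup that leaves room for one receiver (names are illustrative, not from the original program):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// "local[4]" gives 4 threads: 1 is taken by the Kafka receiver,
// leaving 3 for processing the batches. "local[1]" would starve
// the job and produce exactly the empty-looking UI described above.
val conf = new SparkConf().setMaster("local[4]").setAppName("KafkaTest")
val ssc = new StreamingContext(conf, Seconds(2))
```

On a cluster the same rule applies: total executor cores must exceed the number of receivers.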