Check that you are getting data from the Kafka producer:
lines.foreachRDD(new Function<JavaRDD<String>, Void>() {
    @Override
    public Void call(JavaRDD<String> rdd) throws Exception {
        List<String> collect = rdd.collect();
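If collect() does return records but no counts appear, it can also help to check the counting logic itself outside Spark. Below is a minimal plain-Java sketch of the same flatMap-and-count step on a single line of input (the class and method names are made up for illustration; no Spark dependency):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Plain-Java sketch of what the streaming word count should produce
// for one batch -- split each line on spaces, then count occurrences.
public class WordCount {
    public static Map<String, Long> wordCounts(String line) {
        return Arrays.stream(line.split(" "))
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(wordCounts("hello world hello"));
    }
}
```

If this prints the expected counts but the streaming job stays silent, the problem is in the Spark/Kafka wiring, not the counting.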
Here's a simple working version.
import com.google.common.collect.Lists;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;
import
I am embarrassed to admit it, but I can't get a basic 'word count' to work
under Kafka/Spark streaming. My code looks like this. I don't see any
word counts in the console output. Also, I don't see any output in the UI.
Needless to say, I am a newbie in both 'Spark' and 'Kafka'.
Please help. Thanks.
I am not running locally. The Spark master is:
spark://machine name:7077
On Mon, Nov 10, 2014 at 3:47 PM, Tathagata Das tathagata.das1...@gmail.com
wrote:
What is the Spark master that you are using? Use local[4], not local,
if you are running locally.
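The reason plain local fails for streaming jobs: each receiver (here, the Kafka receiver) permanently occupies one thread, so with a single-threaded master there is no thread left to process the received batches. A sketch of the master setting, assuming the standard SparkConf API:

```java
// At least two local threads are needed: one for the Kafka
// receiver, and at least one more to process the batches.
SparkConf conf = new SparkConf()
        .setAppName("JavaKafkaWordCount")
        .setMaster("local[4]");
```

With a cluster master like spark://host:7077, the equivalent requirement is that the application is allocated more cores than it has receivers.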
On Mon, Nov 10, 2014 at 3:01 PM,