Hi,
I am integrating Kafka and Spark using Spark Streaming. I have created a topic
on the Kafka broker:
bin/kafka-topics.sh --create --zookeeper localhost:2181
--replication-factor 1 --partitions 1 --topic test
I am publishing messages to Kafka and trying to read them with Spark
Streaming, but the job logs:
14/11/27 11:56:05 WARN scheduler.TaskSchedulerImpl: Initial job has not
accepted any resources; check your cluster UI to ensure that workers are
registered and have sufficient memory
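For reference, a minimal receiver-based consumer for this setup might look like the sketch below (the class and object names are hypothetical; `KafkaUtils.createStream` is the Spark 1.x receiver API, and the ZooKeeper address and topic are taken from the commands above):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaTest {
  def main(args: Array[String]): Unit = {
    // Note: the receiver occupies one thread, so the master needs at
    // least two cores (e.g. "local[2]" when testing locally) or the
    // stream will receive data but never process it.
    val conf = new SparkConf().setAppName("KafkaTest")
    val ssc = new StreamingContext(conf, Seconds(2))

    // ZooKeeper quorum, consumer group id, and topic -> receiver-thread map
    val messages = KafkaUtils.createStream(
      ssc, "localhost:2181", "test-group", Map("test" -> 1))

    // Each record is a (key, value) pair; print the values
    messages.map(_._2).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This requires the `spark-streaming-kafka` artifact on the classpath and a reachable Kafka/ZooKeeper, so it is a cluster-dependent sketch rather than something runnable standalone.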
A quick guess would be that you are giving the wrong master URL
(spark://192.168.88.130:7077). Open the cluster UI and verify that workers
are actually registered with the master and have sufficient memory.
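One way to rule out a master-URL mismatch (a config sketch; the driver class and jar name are hypothetical, the host/port are from the guess above) is to open the standalone master's web UI, which by default listens on port 8080, and pass the exact `spark://` URL shown at the top of that page to `spark-submit`:

```shell
# The URL shown at the top of http://192.168.88.130:8080 must match
# what the application is given, byte for byte (hostname vs. IP matters).
bin/spark-submit \
  --class KafkaTest \
  --master spark://192.168.88.130:7077 \
  target/kafka-test.jar
```

If the UI shows zero registered workers, or workers with no free memory, the "Initial job has not accepted any resources" warning is expected regardless of the Kafka side.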
...@sigmoidanalytics.com]
Sent: Monday, December 01, 2014 3:56 PM
To: Sarosh, M.
Cc: user@spark.apache.org
Subject: Re: Kafka+Spark-streaming issue: Stream 0 received 0 blocks