Re: Kafka Consumer in Spark Streaming

2014-11-05 Thread Something Something
On Tue, Nov 4, 2014 at 11:03 PM, Jain Rahul wrote: I think you are running it locally. Do you have local[1] as the master URL here? If yes, change it to local[2] or a higher number of threads. It may be due to …
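The advice above comes up often with receiver-based Kafka streams: the receiver permanently occupies one core, so a `local[1]` master leaves no thread for batch processing. A minimal sketch of the fix (app name and batch interval are illustrative, not from the thread):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// local[2] (or more): one thread for the Kafka receiver,
// at least one for processing the batches. local[1] would
// let the receiver run but never schedule any batch work.
val sparkConf = new SparkConf()
  .setAppName("KafkaConsumerExample")
  .setMaster("local[2]")

val ssc = new StreamingContext(sparkConf, Seconds(2))
```

In general, `local[n]` must satisfy n > (number of receivers) for any data to actually be processed.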

Re: Kafka Consumer in Spark Streaming

2014-11-04 Thread Akhil Das
… mismatch also. sparkConf.setMaster("local[1]"); Regards, Rahul. From: Something Something. Date: Wednesday, November 5, 2014 at 12:23 PM. To: "Shao, Saisai". Cc: "user@spark.apache.org …

Re: Kafka Consumer in Spark Streaming

2014-11-04 Thread Something Something
… or a higher number of threads. It may be due to topic name mismatch also. sparkConf.setMaster("local[1]"); Regards, Rahul. From: Something Something. Date: Wednesday, November 5, 2014 at 12:23 PM. To: "Shao, Saisai". Cc: …

Re: Kafka Consumer in Spark Streaming

2014-11-04 Thread Jain Rahul
…l.com> Date: Wednesday, November 5, 2014 at 12:23 PM. To: "Shao, Saisai" <saisai.s...@intel.com>. Cc: "user@spark.apache.org". Subject: Re: Kafka Consumer in Spark Streaming. Added foreach as f…

Re: Kafka Consumer in Spark Streaming

2014-11-04 Thread Something Something
…ay, November 05, 2014 2:28 PM. To: Shao, Saisai. Cc: user@spark.apache.org. Subject: Re: Kafka Consumer in Spark Streaming. The Kafka broker definitely has messages coming in. But your #2 point is valid. Needless to say I am a newbie to Spark. I can't f…

RE: Kafka Consumer in Spark Streaming

2014-11-04 Thread Shao, Saisai
…, Saisai. Cc: user@spark.apache.org. Subject: Re: Kafka Consumer in Spark Streaming. The Kafka broker definitely has messages coming in. But your #2 point is valid. Needless to say I am a newbie to Spark. I can't figure out where the 'executor' logs would be. How would I find them? All I …

Re: Kafka Consumer in Spark Streaming

2014-11-04 Thread Sean Owen
This code only expresses a transformation and so does not actually cause any action. I think you intend to use foreachRDD. On Wed, Nov 5, 2014 at 5:57 AM, Something Something wrote: I have the following code in my program. I don't get any error, but it's not consuming the messages either. Shouldn…
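The distinction Sean Owen is drawing is between lazy DStream transformations (map, filter, etc.) and output operations, which are what make Spark schedule jobs. A hedged sketch of what the fixed code might look like, using the receiver-based `KafkaUtils.createStream` from that era (the ZooKeeper address, group id, and topic name are placeholders, not values from the thread):

```scala
import org.apache.spark.streaming.kafka.KafkaUtils

// createStream(ssc, zkQuorum, groupId, topics) yields (key, value) pairs;
// keep only the message payloads. All of this is still lazy.
val lines = KafkaUtils
  .createStream(ssc, "localhost:2181", "my-group", Map("my-topic" -> 1))
  .map(_._2)

// foreachRDD is an *output operation*: without one, the transformations
// above are never executed and no messages are consumed.
lines.foreachRDD { rdd =>
  rdd.foreach(record => println(record)) // note: runs on the executors
}

ssc.start()
ssc.awaitTermination()
```

A `map` with a side effect inside it would have the same problem as the original code: with no output operation downstream, the batch jobs are never triggered.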

Re: Kafka Consumer in Spark Streaming

2014-11-04 Thread Something Something
The Kafka broker definitely has messages coming in. But your #2 point is valid. Needless to say I am a newbie to Spark. I can't figure out where the 'executor' logs would be. How would I find them? All I see printed on my screen is this: 14/11/04 22:21:23 INFO Slf4jLogger: Slf4jLogger started

RE: Kafka Consumer in Spark Streaming

2014-11-04 Thread Shao, Saisai
Hi, would you mind describing your problem a little more specifically? 1. Does the Kafka broker currently have any data feeding in? 2. This code will print the lines, but not on the driver side; the code runs on the executor side, so you can check the log in the worker dir to see if there's a…
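Saisai's second point is the one that trips up newcomers: a `println` inside an RDD closure executes on the executors, so its output lands in the worker logs (in standalone mode, under `work/<app-id>/<executor-id>/` on each worker node), not on the console that launched the driver. A small sketch of the two options, assuming `lines` is the DStream from earlier in the thread:

```scala
// Option 1: executor-side print. Output goes to each executor's
// stdout/stderr files in the worker directory, not the driver console.
lines.foreachRDD(rdd => rdd.foreach(record => println(record)))

// Option 2: driver-side print. DStream.print() brings the first 10
// elements of each batch back to the driver, so the output shows up
// on the screen where the application was submitted.
lines.print()
```

For quick debugging, `print()` is usually what people want; the executor-side variant is the right shape once the side effect is writing to an external system rather than a console.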