If you’re running in standalone mode, the logs are under the <SPARK_HOME>/work/ 
directory. I’m not sure about YARN or Mesos; you can check the Spark 
documentation for the details.
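
On a standalone cluster, each application gets its own subdirectory under work/, 
and the executor stdout/stderr typically end up somewhere like this (the app 
and executor ids below are placeholders, not exact values):

        <SPARK_HOME>/work/app-20141105xxxxxx-0000/0/stdout
        <SPARK_HOME>/work/app-20141105xxxxxx-0000/0/stderr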

Thanks
Jerry

From: Something Something [mailto:mailinglist...@gmail.com]
Sent: Wednesday, November 05, 2014 2:28 PM
To: Shao, Saisai
Cc: user@spark.apache.org
Subject: Re: Kafka Consumer in Spark Streaming

The Kafka broker definitely has messages coming in.  But your #2 point is 
valid.  Needless to say, I am a newbie to Spark.  I can't figure out where the 
'executor' logs would be.  How would I find them?
All I see printed on my screen is this:

14/11/04 22:21:23 INFO Slf4jLogger: Slf4jLogger started
14/11/04 22:21:23 INFO Remoting: Starting remoting
14/11/04 22:21:24 INFO Remoting: Remoting started; listening on addresses 
:[akka.tcp://spark@mymachine:60743]
14/11/04 22:21:24 INFO Remoting: Remoting now listens on addresses: 
[akka.tcp://spark@mymachine:60743]
14/11/04 22:21:24 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
14/11/04 22:21:24 INFO JniBasedUnixGroupsMappingWithFallback: Falling back to 
shell based
-------------------------------------------
Time: 1415168520000 ms
-------------------------------------------
-------------------------------------------
Time: 1415168520000 ms
-------------------------------------------
Keeps repeating this...

On Tue, Nov 4, 2014 at 10:14 PM, Shao, Saisai <saisai.s...@intel.com> wrote:
Hi, would you mind describing your problem a little more specifically?


1.      Does the Kafka broker currently have any data feeding in?

2.      This code will print the lines, but not on the driver side; the code 
runs on the executor side, so you can check the logs in the worker directory to 
see whether anything is printed there (see the sketch after this list).

3.      Did you see any exceptions when running the app? That would help to 
pin down the problem.
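
To illustrate point 2, here is a minimal sketch; the lines variable is 
hypothetical (any JavaDStream<String> will do). The println inside map() runs 
on the executors and lands in the worker logs, while print() is an output 
operation that prints to the driver's console:

        // Transformation: call() runs on the executors, so this println
        // goes to the executor/worker logs, not to your console.
        JavaDStream<String> echoed = lines.map(
                new Function<String, String>() {
                    public String call(String s) {
                        System.out.println(s);  // executor-side stdout
                        return s;
                    }
                }
        );

        // Output operation: runs on the driver and prints the first
        // elements of each batch to the driver's console.
        echoed.print();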

Thanks
Jerry

From: Something Something [mailto:mailinglist...@gmail.com]
Sent: Wednesday, November 05, 2014 1:57 PM
To: user@spark.apache.org
Subject: Kafka Consumer in Spark Streaming

I have the following code in my program.  I don't get any error, but it's not 
consuming the messages either.  Shouldn't the following code print each line in 
the 'call' method?  What am I missing?

Please help.  Thanks.



        JavaStreamingContext ssc = new JavaStreamingContext(sparkConf,
                new Duration(60 * 1 * 1000));  // 60-second batch interval

        // KafkaUtils.createStream returns a pair DStream of (key, message)
        JavaPairReceiverInputDStream<String, String> tweets =
                KafkaUtils.createStream(ssc, "<machine>:2181", "1", map);

        // The elements are Tuple2<String, String>, so the map function must
        // take a Tuple2 rather than a String
        JavaDStream<String> statuses = tweets.map(
                new Function<Tuple2<String, String>, String>() {
                    public String call(Tuple2<String, String> tuple) {
                        System.out.println(tuple._2());  // prints on the executors
                        return tuple._2();
                    }
                }
        );
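
For reference, DStreams are lazy: the map() above does not execute until there 
is an output operation, and nothing runs until the context is started. A 
minimal sketch of that part, assuming it is not already elsewhere in the 
program:

        // Without an output operation, transformations such as map() never run.
        statuses.print();

        // Start the streaming computation and block until it terminates.
        ssc.start();
        ssc.awaitTermination();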
