It's not working because you haven't collected the data.

Try something like:

dstream.foreachRDD(rdd => rdd.foreach(println))
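In the Java API your snippet uses, a rough equivalent would be the sketch below. It assumes the inputDS variable from your code; rdd.collect() pulls each batch back to the driver, so in yarn-cluster mode the output lands in the driver's container log rather than your terminal:

```java
// Sketch only - assumes inputDS is the JavaReceiverInputDStream<String>
// from your snippet, declared before jssc.start().
inputDS.foreachRDD(rdd -> {
    // collect() brings the batch's records to the driver; with large
    // batches prefer printing a sample, e.g. rdd.take(10).
    for (String record : rdd.collect()) {
        System.out.println(record);
    }
});
```

Note that rdd.foreach(println) instead of collect() would run on the executors, so its output would appear in the executor logs, not the driver log.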

Thanks,
Rabin


On Wed, Jul 6, 2016 at 5:05 PM, Yu Wei <yu20...@hotmail.com> wrote:

> Hi guys,
>
>
> It seemed that when launching the application via yarn on a single node,
> JavaDStream.print() did not work. However, occasionally it worked.
>
> If I launched the same application in local mode, it always worked.
>
>
> The code is as below:
> SparkConf conf = new SparkConf().setAppName("Monitor&Control");
> JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));
> JavaReceiverInputDStream<String> inputDS = MQTTUtils.createStream(jssc, "tcp://114.55.145.185:1883", "Control");
> inputDS.print();
> jssc.start();
> jssc.awaitTermination();
>
>
> Command for launching via yarn (did not work):
> spark-submit --master yarn --deploy-mode cluster --driver-memory 4g --executor-memory 2g target/CollAna-1.0-SNAPSHOT.jar
>
> Command for launching in local mode (works):
> spark-submit --master local[4] --driver-memory 4g --executor-memory 2g --num-executors 4 target/CollAna-1.0-SNAPSHOT.jar
>
>
> Any thoughts about the problem?
>
>
> Thanks,
> Jared
>
>