[jira] [Updated] (SPARK-16762) Spark hangs when the print() action is called

2016-07-27 Thread Anh Nguyen (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-16762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anh Nguyen updated SPARK-16762:
---
Attachment: Screen Shot 2016-07-28 at 12.34.22 PM.png

> Spark hangs when the print() action is called
> --
>
> Key: SPARK-16762
> URL: https://issues.apache.org/jira/browse/SPARK-16762
> Project: Spark
>  Issue Type: Bug
>  Components: Deploy
>Reporter: Anh Nguyen
> Attachments: Screen Shot 2016-07-28 at 12.33.35 PM.png, Screen Shot 2016-07-28 at 12.34.22 PM.png

[jira] [Updated] (SPARK-16762) Spark hangs when the print() action is called

2016-07-27 Thread Anh Nguyen (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-16762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anh Nguyen updated SPARK-16762:
---
Attachment: Screen Shot 2016-07-28 at 12.33.35 PM.png


[jira] [Updated] (SPARK-16762) Spark hangs when the print() action is called

2016-07-27 Thread Anh Nguyen (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-16762?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anh Nguyen updated SPARK-16762:
---
Description: 
I wrote a Spark Streaming consumer that integrates with Kafka and deployed it on the Mesos framework:
import org.apache.spark.SparkConf
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._

object consumer {

  def main(args: Array[String]) {

    if (args.length < 4) {
      System.err.println("Usage: KafkaWordCount <zkQuorum> <group> <topics> <numThreads>")
      System.exit(1)
    }

    val Array(zkQuorum, group, topics, numThreads) = args
    val sparkConf = new SparkConf()
      .setAppName("KafkaWordCount")
      .set("spark.rpc.netty.dispatcher.numThreads", "4")
    val ssc = new StreamingContext(sparkConf, Seconds(2))
    // Checkpointing is required because reduceByKeyAndWindow is used with an inverse function.
    ssc.checkpoint("checkpoint")
    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap

    // Receiver-based Kafka stream; keep only the message value of each (key, message) pair.
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)
    lines.print()

    val words = lines.flatMap(_.split(" "))
    val wordCounts = words.map(x => (x, 1L))
      .reduceByKeyAndWindow(_ + _, _ - _, Minutes(10), Seconds(2), 2)

    ssc.start()
    ssc.awaitTermination()
  }
}
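For comparison, here is a minimal local sketch of the same receiver-based pipeline, which can help separate the streaming logic from the Mesos deployment. The local[2] master, ZooKeeper address, consumer group and topic below are assumptions for illustration only; with the receiver-based KafkaUtils.createStream, one core/thread is permanently occupied by the receiver, so at least one more must be available for output operations such as print() to make progress.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Hypothetical local check: same consumer shape, run outside Mesos.
// "local[2]" leaves one thread for the Kafka receiver and one for the print() jobs.
object ConsumerLocalCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaWordCountLocal").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(2))
    ssc.checkpoint("checkpoint")

    // ZooKeeper quorum, consumer group and topic map are placeholders.
    val lines = KafkaUtils
      .createStream(ssc, "localhost:2181", "test-group", Map("test" -> 1))
      .map(_._2)
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}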

The executor log:
I0728 04:28:05.439469  4295 exec.cpp:143] Version: 0.28.2
I0728 04:28:05.443464  4296 exec.cpp:217] Executor registered on slave 8e00b1ea-3f70-428a-8125-fb0eed88aede-S0
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/28 04:28:07 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
16/07/28 04:28:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/07/28 04:28:09 INFO SecurityManager: Changing view acls to: vagrant
16/07/28 04:28:09 INFO SecurityManager: Changing modify acls to: vagrant
16/07/28 04:28:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant)
16/07/28 04:28:11 INFO SecurityManager: Changing view acls to: vagrant
16/07/28 04:28:11 INFO SecurityManager: Changing modify acls to: vagrant
16/07/28 04:28:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant)
16/07/28 04:28:12 INFO Slf4jLogger: Slf4jLogger started
16/07/28 04:28:12 INFO Remoting: Starting remoting
16/07/28 04:28:13 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutorActorSystem@slave1:57380]
16/07/28 04:28:13 INFO Utils: Successfully started service 'sparkExecutorActorSystem' on port 57380.
16/07/28 04:28:13 INFO DiskBlockManager: Created local directory at /home/vagrant/mesos/slave1/work/slaves/8e00b1ea-3f70-428a-8125-fb0eed88aede-S0/frameworks/3b189004-daea-47a3-9ba5-dfac5143f4ab-/executors/0/runs/0433234f-332d-4858-9527-64558fe7fb90/blockmgr-4e94f4ea-f074-4dbe-bfa1-34054e6e079c
16/07/28 04:28:13 INFO MemoryStore: MemoryStore started with capacity 517.4 MB
16/07/28 04:28:13 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@192.168.33.30:36179
16/07/28 04:28:13 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
16/07/28 04:28:13 INFO Executor: Starting executor ID 8e00b1ea-3f70-428a-8125-fb0eed88aede-S0 on host slave1
16/07/28 04:28:13 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 48672.
16/07/28 04:28:13 INFO NettyBlockTransferService: Server created on 48672
16/07/28 04:28:13 INFO BlockManagerMaster: Trying to register BlockManager
16/07/28 04:28:13 INFO BlockManagerMaster: Registered BlockManager
16/07/28 04:28:13 INFO CoarseGrainedExecutorBackend: Got assigned task 0
16/07/28 04:28:13 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/07/28 04:28:13 INFO Executor: Fetching http://192.168.33.30:52977/jars/spaka-1.0-SNAPSHOT-jar-with-dependencies.jar with timestamp 1469680084686
16/07/28 04:28:14 INFO Utils: Fetching http://192.168.33.30:52977/jars/spaka-1.0-SNAPSHOT-jar-with-dependencies.jar to /home/vagrant/mesos/slave1/work/slaves/8e00b1ea-3f70-428a-8125-fb0eed88aede-S0/frameworks/3b189004-daea-47a3-9ba5-dfac5143f4ab-/executors/0/runs/0433234f-332d-4858-9527-64558fe7fb90/spark-b833a8b6-b16c-46c6-92a7-89a8f0b3f713/fetchFileTemp4218016376984854956.tmp
16/07/28 04:28:14 INFO Utils: Copying /home/vagrant/mesos/slave1/work/slaves/8e00b1ea-3f70-428a-8125-fb0eed88aede-S0/frameworks/3b189004-daea-47a3-9ba5-dfac5143f4ab-/executors/0/runs/0433234f-332d-4858-9527-64558fe7fb90/spark-b833a8b6-b16c-46c6-92a7-89a8f0b3f713/-10744548291469680084686_cache to /home/vagrant/mesos/slave1/work/slaves/8e00b1ea-3f70-428a-8125-fb0eed88aede-S0/frameworks/3b189004-daea-47a3-9ba5-dfac5143f4ab-/executors/0/runs/0433234f-3