Hi All,

I have some Spark Streaming code in Java that works on 0.9.1.  After
upgrading to 1.0 (with fixes for minor API changes), the DStreams do not seem
to be executing.  The tasks get killed by the worker within 1 second.  Any
idea what is causing this?

The worker log file does not contain my debug statements.  The console
output below shows that something is executing, but neither my debug logs
nor the output of stream.print() appears anywhere.  I do see my console
println statements up to the point where the context is started.  I am
running against a Spark standalone cluster.  A sketch of the driver is below.
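For reference, here is a minimal sketch of the driver structure I'm describing
(class, variable, and app names are placeholders; the master URL and the
10-second batch duration are taken from the log below; the intermediate
transformations visible in the log as MappedDStream / TransformedDStream /
StateDStream are omitted):

    import java.util.LinkedList;
    import java.util.Queue;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class StreamingSkeleton {   // placeholder class name
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf()
            .setAppName("csv-streaming")               // placeholder app name
            .setMaster("spark://10.206.133.73:7077");  // standalone master from the log

        // 10-second batches, matching "Slide time = 10000 ms" in the log
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(10000));

        System.out.println("Reading CSV File");
        Queue<JavaRDD<String>> queue = new LinkedList<JavaRDD<String>>();
        // ... CSV rows are parallelized into RDDs and pushed onto the queue here ...

        JavaDStream<String> lines = jssc.queueStream(queue);
        lines.print();  // this is the output I am not seeing anywhere

        System.out.println("SSC Context Start");
        jssc.start();
        jssc.awaitTermination();  // keep the driver alive so batches keep running
      }
    }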

Any help is much appreciated.

Console Log:
===========
14/06/13 22:54:07 INFO AppClient$ClientActor: Connecting to master
spark://10.206.133.73:7077...
14/06/13 22:54:07 INFO SparkDeploySchedulerBackend: Connected to Spark
cluster with app ID app-20140613225220-0001
14/06/13 22:54:07 INFO AppClient$ClientActor: Executor added:
app-20140613225220-0001/0 on worker-20140613223732-tahiti-ins.xxx.com-46311
(tahiti-ins.xxx.com:46311) with 1 cores
14/06/13 22:54:07 INFO SparkDeploySchedulerBackend: Granted executor ID
app-20140613225220-0001/0 on hostPort tahiti-ins.xxx.com:46311 with 1 cores,
512.0 MB RAM
14/06/13 22:54:07 INFO AppClient$ClientActor: Executor updated:
app-20140613225220-0001/0 is now RUNNING
Reading CSV File
SSC Context Start
14/06/13 22:54:08 INFO StateDStream: Checkpoint interval automatically set
to 10000 ms
14/06/13 22:54:08 INFO StateDStream: Checkpoint interval automatically set
to 10000 ms
14/06/13 22:54:08 INFO MappedDStream: Duration for remembering RDDs set to
20000 ms for org.apache.spark.streaming.dstream.MappedDStream@1158876
14/06/13 22:54:08 INFO QueueInputDStream: Duration for remembering RDDs set
to 20000 ms for org.apache.spark.streaming.dstream.QueueInputDStream@b2100c
14/06/13 22:54:08 INFO TransformedDStream: Duration for remembering RDDs set
to 20000 ms for org.apache.spark.streaming.dstream.TransformedDStream@3ba710
14/06/13 22:54:08 INFO ShuffledDStream: Duration for remembering RDDs set to
20000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@550ea2
14/06/13 22:54:08 INFO MappedDStream: Duration for remembering RDDs set to
20000 ms for org.apache.spark.streaming.dstream.MappedDStream@33c98c
14/06/13 22:54:08 INFO ShuffledDStream: Duration for remembering RDDs set to
20000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@1f50b8f
14/06/13 22:54:08 INFO MappedDStream: Duration for remembering RDDs set to
20000 ms for org.apache.spark.streaming.dstream.MappedDStream@ca0145
. . .
14/06/13 22:54:08 INFO ShuffledDStream: Initialized and validated
org.apache.spark.streaming.dstream.ShuffledDStream@1f50b8f
14/06/13 22:54:08 INFO TransformedDStream: Slide time = 10000 ms
14/06/13 22:54:08 INFO TransformedDStream: Storage level =
StorageLevel(false, false, false, false, 1)
14/06/13 22:54:08 INFO TransformedDStream: Checkpoint interval = null
14/06/13 22:54:08 INFO TransformedDStream: Remember duration = 20000 ms
14/06/13 22:54:08 INFO TransformedDStream: Initialized and validated
org.apache.spark.streaming.dstream.TransformedDStream@3ba710
14/06/13 22:54:08 INFO StateDStream: Slide time = 10000 ms
14/06/13 22:54:08 INFO StateDStream: Storage level = StorageLevel(false,
true, false, false, 1)
14/06/13 22:54:08 INFO StateDStream: Checkpoint interval = 10000 ms
14/06/13 22:54:08 INFO StateDStream: Remember duration = 20000 ms
14/06/13 22:54:08 INFO StateDStream: Initialized and validated
org.apache.spark.streaming.dstream.StateDStream@140bbad
14/06/13 22:54:08 INFO ForEachDStream: Slide time = 10000 ms
14/06/13 22:54:08 INFO ForEachDStream: Storage level = StorageLevel(false,
false, false, false, 1)
14/06/13 22:54:08 INFO ForEachDStream: Checkpoint interval = null
14/06/13 22:54:08 INFO ForEachDStream: Remember duration = 10000 ms
14/06/13 22:54:08 INFO ForEachDStream: Initialized and validated
org.apache.spark.streaming.dstream.ForEachDStream@cf3c09
14/06/13 22:54:08 INFO RecurringTimer: Started timer for JobGenerator at
time 1402725250000
14/06/13 22:54:08 INFO JobGenerator: Started JobGenerator at 1402725250000
ms
14/06/13 22:54:08 INFO JobScheduler: Started JobScheduler
