>> at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:225)
>> at org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1.apply(JavaRDDLike.scala:225)
>> at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$appl
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala
...d to Kafka, not Spark.
What do you think the problem is?
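For context, the frames in the trace point at a foreachPartition call on the RDD API. Below is a rough sketch of what that code path typically looks like when publishing to Kafka; the class names, broker address, and topic are illustrative assumptions, not taken from the original job. One common reason such a streaming job dies only after running for a while is a per-partition KafkaProducer that is never closed, leaking connections and memory batch after batch:

```scala
// Hypothetical sketch of the foreachPartition pattern implicated in the
// stack trace. Broker address and topic name are assumptions.
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.spark.rdd.RDD

def publish(rdd: RDD[String]): Unit = {
  rdd.foreachPartition { records =>
    // Build the producer inside the partition closure so it is created
    // on the executor rather than serialized from the driver.
    val props = new Properties()
    props.put("bootstrap.servers", "broker:9092") // assumed address
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)
    try {
      records.foreach { r =>
        producer.send(new ProducerRecord("events", r)) // assumed topic
      }
    } finally {
      producer.close() // omitting this leaks a connection every batch
    }
  }
}
```

Whether this matches the failing job is a guess; comparing it against the actual foreachPartition body, and checking executor memory and connection counts over the hour before the kill, would narrow down whether the fault lies on the Kafka side.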
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Job-get-killed-after-running-for-about-1-hour-tp26823.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.