org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
>
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>
> at org.apache.spark.scheduler.Task.run(Task.scala:88)
>
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor
or.default-dispatcher-4]: Sent stop signal to all 42 receivers
From: Shixiong(Ryan) Zhu [mailto:shixi...@databricks.com]
Sent: January 16, 2016 6:28
To: 邓刚[技术中心]
Cc: Yogesh Mahajan; user
Subject: Re: Re: Re: Re: spark streaming context trigger invoke stop why?
I see. So when your job fails, `jsc.awaitTermination();` will throw an
exception. Your app's main method will then exit, which triggers the shutdown
hook, and the hook calls `jsc.stop()`.
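The mechanism described above can be sketched in plain Java without Spark: an exception escaping a simulated `awaitTermination()` lets the main method finish, and only then does the JVM run any registered shutdown hook (where `jsc.stop()` would be called). `ShutdownHookDemo`, this `awaitTermination`, and the printed messages are illustrative stand-ins, not Spark API.

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class ShutdownHookDemo {
    public static final AtomicBoolean caught = new AtomicBoolean(false);

    // Stand-in for jsc.awaitTermination(), which rethrows the failure
    // that killed the streaming job.
    static void awaitTermination() throws Exception {
        throw new Exception("streaming job failed");
    }

    public static void run() {
        // Stand-in for the shutdown hook that ends up calling jsc.stop().
        Runtime.getRuntime().addShutdownHook(new Thread(
                () -> System.out.println("shutdown hook: jsc.stop() runs here")));
        try {
            awaitTermination();
        } catch (Exception e) {
            caught.set(true);
            System.out.println("awaitTermination threw: " + e.getMessage());
        }
        // run() returns, main exits, and only then does the JVM fire the hook.
    }

    public static void main(String[] args) {
        run();
    }
}
```

The point of the sketch is ordering: the `catch` block runs first, and the hook runs at JVM exit, which is why a failing job ends up stopping the context even though no code calls stop explicitly.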
On Thu, Jan 14, 2016 at 10:20 PM, Triones,Deng(vip.com) <
triones.d...@vipshop.com> wrote:
Thanks for your response.
Our code is as below:
public void process(){
    logger.info("streaming process start !!!");
    SparkConf sparkConf = createSparkConf(this.getClass().getSimpleName());
    JavaStreamingContext jsc = this.createJavaStreamingContext(sparkConf);
    if(th
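As a side note on the behavior being debugged: in Spark Streaming of this era, the shutdown hook's stop can be made graceful (finish queued batches before stopping) via a real configuration key. A minimal fragment, shown here only as a hedged suggestion for the setup above:

```
# Whether the JVM shutdown hook stops the StreamingContext gracefully
# (waiting for received data to be processed) instead of immediately.
# Default is false.
spark.streaming.stopGracefullyOnShutdown  true
```

This does not change why the hook fires (the failed job still makes `awaitTermination()` throw), only how the context is stopped once it does.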