time. Then Spark may crash because of the exception.
From: Charles O. Bajomo [mailto:charles.baj...@pretechconsulting.co.uk]
Sent: 28 February 2017 20:10
To: 邓刚 [Product Technology Center]
Cc: user; d...@spark.apache.org
Subject: Re: RE: spark append files to the same hdfs dir issue for
LeaseExpiredException
Unless this is a
" , d...@spark.apache.org
Sent: Tuesday, 28 February, 2017 10:47:47
Subject: RE: spark append files to the same hdfs dir issue for
LeaseExpiredException
I am writing data to HDFS files, and the HDFS dir is also a Hive partition dir.
Hive does not support sub-dirs; for example m
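To make that constraint concrete, here is a minimal sketch; the warehouse path, table layout, column names and the use of Parquet are my assumptions, not details from this thread. Every micro-batch has to append its files directly into the Hive partition directory, because Hive will not read data placed in sub-directories under the partition path.

// Minimal sketch; paths and file format are assumptions, not from the thread.
import org.apache.spark.sql.{SaveMode, SparkSession}

object AppendIntoPartitionDir {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("append-into-partition-dir").getOrCreate()
    import spark.implicits._

    // Hypothetical Hive partition directory. Files must land directly here;
    // a per-batch sub-directory such as .../dt=2017-02-28/batch-0042/ would
    // not be visible to Hive.
    val partitionDir = "hdfs:///warehouse/my_table/dt=2017-02-28"

    val batch = Seq(("a", 1), ("b", 2)).toDF("key", "value")
    batch.write.mode(SaveMode.Append).parquet(partitionDir)

    spark.stop()
  }
}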
From: Charles O. Bajomo [mailto:charles.baj...@pretechconsulting.co.uk]
Sent: 28 February 2017 18:04
To: 邓刚 [Product Technology Center]
Cc: user; d...@spark.apache.org
主题: Re: spark append files to the same hdfs dir issue for LeaseExpiredException
I see this problem as well with the _temporary directory, but from what I have
been able to gather, there is no way around it though.
Kind Regards
From: "Triones,Deng(vip.com)"
To: "user" , d...@spark.apache.org
Sent: Tuesday, 28 February, 2017 09:35:00
Subject: spark append files to the same hdfs dir issue for
LeaseExpiredException
Hi dev and users,
Now I am running Spark Streaming (Spark version 2.0.2) to write
files to HDFS. When my spark.streaming.concurrentJobs is more than one, like 20,
I meet the exception below.
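For reference, a minimal sketch of the setup described above, assuming a DStream job; the input source, batch interval, paths and column name are placeholders I made up, not details from the original post:

import org.apache.spark.SparkConf
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ConcurrentHdfsAppend {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("concurrent-hdfs-append")
      // Let up to 20 batch jobs run at the same time, as in the report above.
      .set("spark.streaming.concurrentJobs", "20")

    val spark = SparkSession.builder().config(conf).getOrCreate()
    val ssc = new StreamingContext(spark.sparkContext, Seconds(10))

    // Placeholder source; the real job presumably reads from Kafka or a receiver.
    val lines = ssc.socketTextStream("localhost", 9999)

    lines.foreachRDD { rdd =>
      if (!rdd.isEmpty()) {
        import spark.implicits._
        // Every concurrent batch appends into the same directory, so their
        // output commits (_temporary handling, _SUCCESS marker) overlap.
        rdd.toDF("line").write
          .mode(SaveMode.Append)
          .text("hdfs:///warehouse/my_table/dt=2017-02-28")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}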
We know that when the batch finishes, there will be a _SUCCESS file.
As I guess
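Not something stated in this thread, but one knob that is sometimes suggested when concurrent jobs trip over each other's job-commit files: the Hadoop output committer's _SUCCESS marker can be disabled. A minimal sketch, assuming the marker is part of the problem; paths and data are placeholders:

import org.apache.spark.sql.{SaveMode, SparkSession}

object NoSuccessMarker {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("no-success-marker").getOrCreate()

    // Tell FileOutputCommitter not to write a _SUCCESS file into the output dir,
    // so concurrent batches at least stop rewriting that shared marker.
    spark.sparkContext.hadoopConfiguration
      .set("mapreduce.fileoutputcommitter.marksuccessfuljobs", "false")

    import spark.implicits._
    val batch = Seq(("a", 1), ("b", 2)).toDF("key", "value")
    batch.write
      .mode(SaveMode.Append)
      .parquet("hdfs:///warehouse/my_table/dt=2017-02-28")

    spark.stop()
  }
}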