time. Then Spark may crash with that exception.
From: Charles O. Bajomo [mailto:charles.baj...@pretechconsulting.co.uk]
Sent: 28 February 2017 20:10
To: 邓刚 [Product Technology Center]
Cc: user; d...@spark.apache.org
Subject: Re: RE: spark append files to the same hdfs dir issue for
LeaseExpiredException
Unless
From: "Charles O. Bajomo" <charles.baj...@pretechconsulting.co.uk>
Cc: "user" <user@spark.apache.org>, d...@spark.apache.org
Sent: Tuesday, 28 February, 2017 10:47:47
Subject: RE: spark append files to the same hdfs dir issue for
LeaseExpiredException
I am writing data to HDFS files, and the HDFS directory is a Hive partition directory.
Hive does not support sub-directories, so for example my partition folder is
***/dt=20170224/hm=1400, which means I need to write all the data between 14:00
and 15:00 to the same folder.
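Since HDFS grants a write lease on a file to only one client at a time, concurrent writers appending to the same file in that partition folder will trip LeaseExpiredException. A common workaround is to give every writer its own uniquely named file inside the shared partition directory; Hive reads all files in the partition folder, so the partition still looks like one dataset. A minimal sketch of the naming scheme (the base directory, helper name, and `part-<uuid>` convention are assumptions for illustration, not from this thread):

```python
import os
import uuid


def partition_file_path(base_dir: str, dt: str, hm: str) -> str:
    """Build a unique file path inside a Hive-style partition directory.

    Each writer task gets its own file, so no two writers ever hold an
    HDFS lease on the same file, which is what triggers
    LeaseExpiredException when tasks append to a shared file.
    """
    partition_dir = os.path.join(base_dir, f"dt={dt}", f"hm={hm}")
    unique_name = f"part-{uuid.uuid4().hex}"
    return os.path.join(partition_dir, unique_name)


# Two writers targeting the same hour land in the same partition
# directory but write to distinct files.
a = partition_file_path("/warehouse/events", "20170224", "1400")
b = partition_file_path("/warehouse/events", "20170224", "1400")
print(os.path.dirname(a) == os.path.dirname(b))  # same partition dir
print(a != b)                                    # different files
```

With this layout, the 14:00-15:00 data still all lands under `dt=20170224/hm=1400`, but each Spark task writes its own part file instead of appending to one shared file.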
From: Charles O. Bajomo