For more details, see
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-ppc/333/
[Jun 1, 2017 2:29:29 PM] (brahma) HDFS-11893. Fix
TestDFSShell.testMoveWithTargetPortEmpty failure.
[Jun 1, 2017 4:28:33 PM] (brahma) HDFS-11905. Fix license header inconsistency
in hdfs. Contributed by
Sean Mackrory created HADOOP-14484:
Summary: Ensure deleted parent directory tombstones are
overwritten when implicitly recreated
Key: HADOOP-14484
URL: https://issues.apache.org/jira/browse/HADOOP-14484
Steve Loughran created HADOOP-14483:
Summary: increase default value of fs.s3a.multipart.size to 128M
Key: HADOOP-14483
URL: https://issues.apache.org/jira/browse/HADOOP-14483
Project: Hadoop Common
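For context, raising the S3A multipart size would be a one-line configuration change. A minimal core-site.xml sketch, assuming the property name and value given in the summary (not taken from the actual patch):

```xml
<!-- core-site.xml: proposed new default per HADOOP-14483 -->
<property>
  <name>fs.s3a.multipart.size</name>
  <value>128M</value>
  <description>Size of each multipart upload part for S3A.</description>
</property>
```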
Wei-Chiu Chuang created HADOOP-14482:
Summary: Update BUILDING.txt to include the correct steps to
install zstd library
Key: HADOOP-14482
URL: https://issues.apache.org/jira/browse/HADOOP-14482
Project: Hadoop Common
Wei-Chiu Chuang created HADOOP-14481:
Summary: Print stack trace when native bzip2 library does not load
Key: HADOOP-14481
URL: https://issues.apache.org/jira/browse/HADOOP-14481
Project: Hadoop Common
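The gist of HADOOP-14481 is that a failed native-library load should surface the full stack trace, not just a one-line message. A minimal generic sketch of that pattern (the class and library names here are illustrative, not from the actual patch):

```java
// NativeLoadDemo.java: log the full stack trace when a native library fails to load.
public class NativeLoadDemo {

    /**
     * Attempt to load a native library; on failure, print the full
     * stack trace (rather than just the message) and report the error.
     */
    static String tryLoad(String lib) {
        try {
            System.loadLibrary(lib);
            return "loaded";
        } catch (UnsatisfiedLinkError e) {
            // Printing the whole trace shows *where* the load failed,
            // which a bare getMessage() hides.
            e.printStackTrace();
            return "failed: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryLoad("no_such_native_lib_xyz"));
    }
}
```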
Xiao Chen created HADOOP-14480:
Summary: Remove Oracle JDK usage in Dockerfile
Key: HADOOP-14480
URL: https://issues.apache.org/jira/browse/HADOOP-14480
Project: Hadoop Common
Issue Type: Improvement
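Removing the Oracle JDK from the Dockerfile presumably means switching the build image to OpenJDK. A hypothetical before/after fragment (the package names are assumptions, not taken from the actual patch):

```dockerfile
# Before: Oracle JDK from a third-party PPA
# RUN apt-get install -y oracle-java8-installer

# After: OpenJDK from the distribution's own repositories
RUN apt-get install -y openjdk-8-jdk
```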
For more details, see
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/422/
[Jun 1, 2017 2:29:29 PM] (brahma) HDFS-11893. Fix
TestDFSShell.testMoveWithTargetPortEmpty failure.
[Jun 1, 2017 4:28:33 PM] (brahma) HDFS-11905. Fix license header inconsistency
in hdfs. Contributed by
Ayappan created HADOOP-14479:
Summary: Erasurecode testcase failures with ISA-L
Key: HADOOP-14479
URL: https://issues.apache.org/jira/browse/HADOOP-14479
Project: Hadoop Common
Issue Type: Bug
> On 1 Jun 2017, at 06:15, Akira Ajisaka wrote:
>
> Hi folks,
>
> https://wiki.apache.org/hadoop/ is no longer editable.
> If you want to edit a wiki page in wiki.apache.org,
> you need to migrate the page to
> https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Home.
>
> If you want to