[jira] [Created] (HDFS-12389) Ozone: oz commandline list calls should return valid JSON format output

2017-09-03 Thread Weiwei Yang (JIRA)
Weiwei Yang created HDFS-12389:
--

 Summary: Ozone: oz commandline list calls should return valid JSON 
format output
 Key: HDFS-12389
 URL: https://issues.apache.org/jira/browse/HDFS-12389
 Project: Hadoop HDFS
  Issue Type: Sub-task
  Components: ozone
Affects Versions: HDFS-7240
Reporter: Weiwei Yang
Assignee: Weiwei Yang


At present the outputs of {{listVolume}}, {{listBucket}} and {{listKey}} are 
hard to parse. For example, the following call

{code}
./bin/hdfs oz -listVolume http://localhost:9864 -user wwei
{code}

lists all volumes in my cluster and returns

{noformat}
{
"version" : 0,
"md5hash" : null,
"createdOn" : "Mon, 04 Sep 2017 03:25:22 GMT",
"modifiedOn" : "Mon, 04 Sep 2017 03:25:22 GMT",
"size" : 10240,
"keyName" : "key-0-22381",
"dataFileName" : null
  }
 {  
"version" : 0,
"md5hash" : null,
"createdOn" : "Mon, 04 Sep 2017 03:25:22 GMT",
"modifiedOn" : "Mon, 04 Sep 2017 03:25:22 GMT",
"size" : 10240,
"keyName" : "key-0-22381",
"dataFileName" : null
  }
  ...
{noformat}

This is not valid JSON output, so it is hard to parse in client scripts for 
further interaction. I propose reformatting these outputs as valid JSON.
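
A minimal sketch of the proposed output shape, assuming Jackson (already on the 
Hadoop classpath) and a hypothetical {{KeyInfo}} class that mirrors the fields in 
the example above; the point is that the records are emitted as a single JSON 
array rather than concatenated objects:

{code}
import java.util.Arrays;
import java.util.List;
import com.fasterxml.jackson.databind.ObjectMapper;

// Sketch only, not the Ozone CLI code: collect the per-item records into a
// list and serialize the whole list once, so the output is one parseable
// JSON document.
public class ListOutputSketch {

  // Hypothetical record mirroring the fields shown in the example output.
  public static class KeyInfo {
    public int version;
    public String md5hash;
    public String createdOn;
    public String modifiedOn;
    public long size;
    public String keyName;
    public String dataFileName;
  }

  public static void main(String[] args) throws Exception {
    KeyInfo k = new KeyInfo();
    k.size = 10240;
    k.keyName = "key-0-22381";
    k.createdOn = "Mon, 04 Sep 2017 03:25:22 GMT";
    k.modifiedOn = "Mon, 04 Sep 2017 03:25:22 GMT";
    List<KeyInfo> results = Arrays.asList(k, k);

    // Emit one JSON array instead of back-to-back objects.
    String json = new ObjectMapper()
        .writerWithDefaultPrettyPrinter()
        .writeValueAsString(results);
    System.out.println(json);
  }
}
{code}

With that shape, a client script can hand the whole response to any JSON parser 
in a single call.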





[jira] [Created] (HDFS-12388) A bad error message in DFSStripedOutputStream

2017-09-03 Thread Kai Zheng (JIRA)
Kai Zheng created HDFS-12388:


 Summary: A bad error message in DFSStripedOutputStream
 Key: HDFS-12388
 URL: https://issues.apache.org/jira/browse/HDFS-12388
 Project: Hadoop HDFS
  Issue Type: Bug
Reporter: Kai Zheng


Noticed a failure reported by Jenkins in HDFS-11882. The reported error message 
isn't correct: it currently reads {{the number of failed blocks = 4 > the number 
of data blocks = 3}}, but it should read {{the number of failed blocks = 4 > the 
number of parity blocks = 3}}.
{noformat}
Regression

org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure030.testBlockTokenExpired

Failing for the past 1 build (Since Failed#20973 )
Took 6.4 sec.
Error Message

Failed at i=6294527
Stacktrace

java.io.IOException: Failed at i=6294527
at 
org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.write(TestDFSStripedOutputStreamWithFailure.java:559)
at 
org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:534)
at 
org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.testBlockTokenExpired(TestDFSStripedOutputStreamWithFailure.java:273)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
Caused by: java.io.IOException: Failed: the number of failed blocks = 4 > the 
number of data blocks = 3
at 
org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamers(DFSStripedOutputStream.java:392)
at 
org.apache.hadoop.hdfs.DFSStripedOutputStream.handleStreamerFailure(DFSStripedOutputStream.java:410)
at 
org.apache.hadoop.hdfs.DFSStripedOutputStream.flushAllInternals(DFSStripedOutputStream.java:1262)
at 
org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamerFailures(DFSStripedOutputStream.java:627)
at 
org.apache.hadoop.hdfs.DFSStripedOutputStream.writeChunk(DFSStripedOutputStream.java:563)
at 
org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:217)
at 
org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:164)
at 
org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:145)
at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:79)
at 
org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:48)
at java.io.DataOutputStream.write(DataOutputStream.java:88)
at 
org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.write(TestDFSStripedOutputStreamWithFailure.java:557)
at 
org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:534)
at 
org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.testBlockTokenExpired(TestDFSStripedOutputStreamWithFailure.java:273)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
{noformat}
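
The threshold in the check is the parity block count (the number of tolerable 
failures), so only the wording of the message needs to change. A hypothetical 
sketch of that kind of check, not the actual {{DFSStripedOutputStream}} source, 
with names invented for illustration:

{code}
import java.io.IOException;

// Hypothetical illustration only (not the actual Hadoop code). The failure
// check compares the failed-streamer count against the parity block count,
// so the exception message should name parity blocks, not data blocks.
class StripedFailureCheckSketch {

  static void checkStreamers(int failedStreamers, int numParityBlocks)
      throws IOException {
    if (failedStreamers > numParityBlocks) {
      throw new IOException("Failed: the number of failed blocks = "
          + failedStreamers + " > the number of parity blocks = "
          + numParityBlocks);
    }
  }

  public static void main(String[] args) {
    try {
      // Values from the log above: 4 failed streamers, 3 parity blocks.
      checkStreamers(4, 3);
    } catch (IOException e) {
      System.out.println(e.getMessage());
    }
  }
}
{code}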






Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86

2017-09-03 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/

No changes




-1 overall


The following subsystems voted -1:
findbugs unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

FindBugs :

   
   module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager

   Hard coded reference to an absolute pathname in org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.runtime.DockerLinuxContainerRuntime.launchContainer(ContainerRuntimeContext) At DockerLinuxContainerRuntime.java:[line 490]

Failed junit tests :

   hadoop.hdfs.TestLeaseRecoveryStriped 
   hadoop.hdfs.TestClientProtocolForPipelineRecovery 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure130 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090 
   hadoop.hdfs.TestModTime 
   hadoop.hdfs.TestReadStripedFileWithDecoding 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure010 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure120 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070 
   hadoop.hdfs.TestDFSStripedOutputStreamWithFailure 
   
hadoop.yarn.server.resourcemanager.scheduler.capacity.TestContainerAllocation 
   hadoop.yarn.server.TestContainerManagerSecurity 
   hadoop.yarn.client.cli.TestLogsCLI 
   hadoop.mapreduce.v2.hs.webapp.TestHSWebApp 
   hadoop.yarn.sls.TestReservationSystemInvariants 
   hadoop.yarn.sls.TestSLSRunner 

Timed out junit tests :

   org.apache.hadoop.hdfs.TestWriteReadStripedFile 
   org.apache.hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean 
  

   cc:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/diff-compile-javac-root.txt
  [292K]

   checkstyle:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/diff-checkstyle-root.txt
  [17M]

   pylint:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/diff-patch-pylint.txt
  [20K]

   shellcheck:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/diff-patch-shellcheck.txt
  [20K]

   shelldocs:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/diff-patch-shelldocs.txt
  [12K]

   whitespace:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/whitespace-eol.txt
  [11M]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/whitespace-tabs.txt
  [1.2M]

   findbugs:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager-warnings.html
  [8.0K]

   javadoc:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-javadoc-root.txt
  [2.0M]

   unit:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [1.4M]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
  [64K]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-tests.txt
  [12K]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-client.txt
  [16K]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-hs.txt
  [16K]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/512/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt
  [20K]

Powered by Apache Yetus 0.6.0-SNAPSHOT   http://yetus.apache.org
