[jira] [Created] (HADOOP-17564) Fix typo in UnixShellGuide.html

2021-03-03 Thread Takanobu Asanuma (Jira)
Takanobu Asanuma created HADOOP-17564:
-

 Summary: Fix typo in UnixShellGuide.html
 Key: HADOOP-17564
 URL: https://issues.apache.org/jira/browse/HADOOP-17564
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Takanobu Asanuma


The file name of hadoop-user-functions.sh.examples should be 
hadoop-user-functions.sh.example in UnixShellGuide.html.

This is reported by [~aref.kh] in HADOOP-17561.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-17563) Update Bouncy Castle to 1.68

2021-03-03 Thread Takanobu Asanuma (Jira)
Takanobu Asanuma created HADOOP-17563:
-

 Summary: Update Bouncy Castle to 1.68
 Key: HADOOP-17563
 URL: https://issues.apache.org/jira/browse/HADOOP-17563
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Takanobu Asanuma
Assignee: Takanobu Asanuma


Bouncy Castle 1.60 has a hash collision vulnerability. Let's update to 1.68.

https://www.sourceclear.com/vulnerability-database/security/hash-collision/java/sid-6009
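For reference, this kind of version bump is typically a one-line dependency change. A hypothetical Maven fragment is below; the actual artifact coordinates and whether the version is managed through a property in Hadoop's pom.xml may differ:

```xml
<!-- Hypothetical pom.xml fragment; Hadoop may manage this version
     via a <properties> entry rather than inline. -->
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-jdk15on</artifactId>
  <version>1.68</version>
</dependency>
```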






Apache Hadoop qbt Report: branch-2.10+JDK7 on Linux/x86_64

2021-03-03 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/226/

No changes




-1 overall


The following subsystems voted -1:
docker


Powered by Apache Yetus   https://yetus.apache.org


Apache Hadoop qbt Report: trunk+JDK11 on Linux/x86_64

2021-03-03 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/

[Mar 1, 2021 11:36:41 AM] (Stephen O'Donnell) HDFS-14013. Skip any credentials 
stored in HDFS when starting ZKFC. Contributed by Stephen O'Donnell
[Mar 1, 2021 3:52:59 PM] (noreply) HDFS-15854. Make some parameters 
configurable for SlowDiskTracker and SlowPeerTracker (#2718)
[Mar 2, 2021 1:17:23 AM] (Konstantin Shvachko) HDFS-15849. ExpiredHeartbeats 
metric should be of Type.COUNTER. Contributed by Qi Zhu.
[Mar 2, 2021 5:16:11 AM] (noreply) HDFS-15856: Make write pipeline retry times 
configurable. (#2721). Contributed by Qi Zhu
[Mar 2, 2021 9:47:31 PM] (Eric Badger) [MAPREDUCE-7234] ClientHSSecurityInfo 
class is in wrong META-INF file.
[Mar 3, 2021 2:41:05 AM] (noreply) HADOOP-17560. Fix some spelling errors 
(#2730)




-1 overall


The following subsystems voted -1:
blanks mvnsite pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/fair-scheduler-invalid.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/yarn-site-with-invalid-allocation-file-ref.xml
 

Failed junit tests :

   hadoop.hdfs.server.namenode.TestAddOverReplicatedStripedBlocks 
   hadoop.hdfs.server.namenode.TestReconstructStripedBlocks 
   
hadoop.hdfs.server.namenode.sps.TestStoragePolicySatisfierWithStripedFile 
   hadoop.hdfs.server.namenode.TestPersistentStoragePolicySatisfier 
   hadoop.hdfs.server.datanode.fsdataset.impl.TestWriteToReplica 
   hadoop.hdfs.server.namenode.ha.TestPipelinesFailover 
   hadoop.hdfs.server.namenode.ha.TestRetryCacheWithHA 
   hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem 
   hadoop.yarn.client.TestRMFailoverProxyProvider 
   hadoop.yarn.client.TestNoHaRMFailoverProxyProvider 
   hadoop.fs.contract.router.web.TestRouterWebHDFSContractCreate 
   
hadoop.yarn.server.timelineservice.reader.TestTimelineReaderWebServicesHBaseStorage
 
   hadoop.yarn.server.router.clientrm.TestFederationClientInterceptor 
   
hadoop.yarn.server.timelineservice.documentstore.TestDocumentStoreCollectionCreator
 
   
hadoop.yarn.server.timelineservice.documentstore.TestDocumentStoreTimelineReaderImpl
 
   
hadoop.yarn.server.timelineservice.documentstore.TestDocumentStoreTimelineWriterImpl
 
   
hadoop.yarn.server.timelineservice.documentstore.writer.cosmosdb.TestCosmosDBDocumentStoreWriter
 
   
hadoop.yarn.server.timelineservice.documentstore.reader.cosmosdb.TestCosmosDBDocumentStoreReader
 
   hadoop.tools.dynamometer.TestDynamometerInfra 
   hadoop.tools.dynamometer.TestDynamometerInfra 
  

   cc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/results-compile-cc-root.txt
 [116K]

   javac:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/results-compile-javac-root.txt
 [392K]

   blanks:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/blanks-eol.txt
 [13M]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/blanks-tabs.txt
 [2.0M]

   checkstyle:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/results-checkstyle-root.txt
 [16M]

   mvnsite:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/patch-mvnsite-root.txt
 [496K]

   pathlen:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/results-pathlen.txt
 [16K]

   pylint:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/results-pylint.txt
 [20K]

   shellcheck:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/results-shellcheck.txt
 [28K]

   xml:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/132/artifact/out/xml.txt
 [24K]

   javadoc:

  
https://ci-hadoo

Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86_64

2021-03-03 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/

[Mar 2, 2021 5:16:11 AM] (noreply) HDFS-15856: Make write pipeline retry times 
configurable. (#2721). Contributed by Qi Zhu
[Mar 2, 2021 9:47:31 PM] (Eric Badger) [MAPREDUCE-7234] ClientHSSecurityInfo 
class is in wrong META-INF file.




-1 overall


The following subsystems voted -1:
blanks pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/fair-scheduler-invalid.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/yarn-site-with-invalid-allocation-file-ref.xml
 

Failed junit tests :

   hadoop.hdfs.TestReconstructStripedFileWithRandomECPolicy 
   hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks 
   hadoop.mapreduce.v2.hs.TestJobHistoryParsing 
   hadoop.yarn.server.router.clientrm.TestFederationClientInterceptor 
   hadoop.tools.dynamometer.TestDynamometerInfra 
   hadoop.tools.dynamometer.TestDynamometerInfra 
   hadoop.yarn.sls.appmaster.TestAMSimulator 
  

   cc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-compile-cc-root.txt
 [116K]

   javac:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-compile-javac-root.txt
 [368K]

   blanks:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/blanks-eol.txt
 [13M]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/blanks-tabs.txt
 [2.0M]

   checkstyle:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-checkstyle-root.txt
 [16M]

   pathlen:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-pathlen.txt
 [16K]

   pylint:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-pylint.txt
 [20K]

   shellcheck:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-shellcheck.txt
 [28K]

   xml:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/xml.txt
 [24K]

   javadoc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/results-javadoc-javadoc-root.txt
 [1.1M]

   unit:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
 [328K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-hs.txt
 [16K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-router.txt
 [16K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/patch-unit-hadoop-tools_hadoop-dynamometer_hadoop-dynamometer-infra.txt
 [8.0K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/patch-unit-hadoop-tools_hadoop-dynamometer.txt
 [24K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/435/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt
 [12K]

Powered by Apache Yetus 0.13.0   https://yetus.apache.org


[jira] [Created] (HADOOP-17562) Provide mechanism for explicitly specifying the compression codec for input files

2021-03-03 Thread Nicholas Chammas (Jira)
Nicholas Chammas created HADOOP-17562:
-

 Summary: Provide mechanism for explicitly specifying the 
compression codec for input files
 Key: HADOOP-17562
 URL: https://issues.apache.org/jira/browse/HADOOP-17562
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Nicholas Chammas


I come to you via SPARK-29280.

I am looking for the file _input_ equivalents of the following settings:
{code:java}
mapreduce.output.fileoutputformat.compress
mapreduce.map.output.compress{code}
Right now, I understand that when reading a file, Hadoop infers the codec to use 
from the file's extension.

However, in some cases the files may have the incorrect extension or no 
extension. There are links to some examples from SPARK-29280.

Ideally, you should be able to explicitly specify the codec to use to read 
those files. I don't believe that's possible today. Instead, the current 
workaround appears to be to [create a custom codec 
class|https://stackoverflow.com/a/17152167/877069] and override the 
getDefaultExtension method to specify the extension to expect.
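As a sketch of that workaround (not an endorsed API; it assumes gzip-compressed data hiding behind a non-standard extension, and the class name and ".log" extension here are invented for illustration):

```java
// Hypothetical sketch of the workaround described above: make Hadoop's
// extension-based codec lookup treat files ending in ".log" as gzip.
// Class name and extension are illustrative, not from the original report.
import org.apache.hadoop.io.compress.GzipCodec;

public class GzipLogCodec extends GzipCodec {
  @Override
  public String getDefaultExtension() {
    // CompressionCodecFactory maps this extension to this codec when
    // choosing a codec for an input file.
    return ".log";
  }
}
```

The custom class would then need to be registered (e.g. via the io.compression.codecs configuration property) so that CompressionCodecFactory considers it, which underlines how indirect this workaround is compared to an explicit input-codec setting.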

Does it make sense to offer an explicit way to select the compression codec for 
file input, mirroring how things work for file output?






[jira] [Resolved] (HADOOP-17561) Where is hadoop-user-functions.sh.examples ?

2021-03-03 Thread Takanobu Asanuma (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17561?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Takanobu Asanuma resolved HADOOP-17561.
---
Resolution: Invalid

> Where is hadoop-user-functions.sh.examples ?
> 
>
> Key: HADOOP-17561
> URL: https://issues.apache.org/jira/browse/HADOOP-17561
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.1.3
>Reporter: Aref Khandan
>Priority: Critical
>
> On the UnixShellGuide page 
> [https://hadoop.apache.org/docs/r3.1.3/hadoop-project-dist/hadoop-common/UnixShellGuide.html]
> it is mentioned that 
> {{Examples of function replacement are in the 
> ??{{hadoop-user-functions.sh.examples}}?? file.}}
> I've searched through the whole Hadoop directory and source code, but there is no 
> trace of this file except:
> [hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md]
> which only mentions ??Examples of function replacement are in the 
> `hadoop-user-functions.sh.examples` file.??
>  






[jira] [Created] (HADOOP-17561) Where is hadoop-user-functions.sh.examples ?

2021-03-03 Thread Aref Khandan (Jira)
Aref Khandan created HADOOP-17561:
-

 Summary: Where is hadoop-user-functions.sh.examples ?
 Key: HADOOP-17561
 URL: https://issues.apache.org/jira/browse/HADOOP-17561
 Project: Hadoop Common
  Issue Type: Improvement
  Components: common
Affects Versions: 3.1.3
Reporter: Aref Khandan


On the UnixShellGuide page 
[https://hadoop.apache.org/docs/r3.1.3/hadoop-project-dist/hadoop-common/UnixShellGuide.html]

it is mentioned that 

{{Examples of function replacement are in the 
??{{hadoop-user-functions.sh.examples}}?? file.}}

I've searched through the whole Hadoop directory and source code, but there is no 
trace of this file except:

[hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md]

which only mentions ??Examples of function replacement are in the 
`hadoop-user-functions.sh.examples` file.??
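For context, the "function replacement" the guide refers to works because the Hadoop shell scripts source a user-provided file after defining their own functions, and bash keeps the last definition of a function. A minimal illustration follows; hadoop_error is a real hadoop-functions.sh function name, but both bodies below are invented for this sketch:

```shell
#!/usr/bin/env bash
# Shell function replacement in miniature: bash keeps the *last*
# definition of a function, so a user file sourced after the defaults
# overrides them. Both function bodies are invented for this sketch.

# Default, as hadoop-functions.sh might define it:
hadoop_error() { echo "ERROR: $*" 1>&2; }

# Replacement, as a user's hadoop-user-functions.sh could define it:
hadoop_error() { echo "myhadoop ERROR: $*" 1>&2; }

hadoop_error "demo message"   # the replacement definition wins
```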

 


