[jira] [Resolved] (HADOOP-18135) Produce Windows binaries of Hadoop

2024-04-09 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18135?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18135.
-
Fix Version/s: 3.5.0
   Resolution: Fixed

Merged PR [https://github.com/apache/hadoop/pull/6673] to trunk.

> Produce Windows binaries of Hadoop
> --
>
> Key: HADOOP-18135
> URL: https://issues.apache.org/jira/browse/HADOOP-18135
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.5.0
>
>
> We currently only provide Linux libraries and binaries. We need to provide 
> the same for Windows. We need to port the [create-release 
> script|https://github.com/apache/hadoop/blob/5f9932acc4fa2b36a3005e587637c53f2da1618d/dev-support/bin/create-release]
>  to run on Windows and produce the Windows binaries.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-19127) Do not run unit tests on Windows pre-commit CI

2024-03-25 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-19127:

Description: 
We currently have a good number of unit tests failing on Windows. The author of 
a patch therefore cannot tell whether their PR caused a failure or whether the 
test was already failing.

Thus, we need to skip running the unit tests on Windows until the currently 
failing ones are fixed.

This lets us proceed with enabling the pre-commit check that guards against 
regressions in building Hadoop on Windows.

Please refer to this thread for more context - 
https://github.com/apache/hadoop/pull/5820#issuecomment-1871957975.

  was:
We currently have a good number of unit tests failing on Windows. The author of 
a patch won't be able to determine if it really was his/her PR that caused the 
failure or if it was failing historically.

Thus, we need to skip running the unit tests on Windows until we have fixed the 
ones that are failing currently.

This helps us to proceed with enabling the pre-commit that watches out for any 
regression against building Hadoop on Windows.


> Do not run unit tests on Windows pre-commit CI
> --
>
> Key: HADOOP-19127
> URL: https://issues.apache.org/jira/browse/HADOOP-19127
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build, test
>Affects Versions: 3.5.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
>
> We currently have a good number of unit tests failing on Windows. The author 
> of a patch therefore cannot tell whether their PR caused a failure or whether 
> the test was already failing.
> Thus, we need to skip running the unit tests on Windows until the currently 
> failing ones are fixed.
> This lets us proceed with enabling the pre-commit check that guards against 
> regressions in building Hadoop on Windows.
> Please refer to this thread for more context - 
> https://github.com/apache/hadoop/pull/5820#issuecomment-1871957975.






[jira] [Created] (HADOOP-19127) Do not run unit tests on Windows pre-commit CI

2024-03-25 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-19127:
---

 Summary: Do not run unit tests on Windows pre-commit CI
 Key: HADOOP-19127
 URL: https://issues.apache.org/jira/browse/HADOOP-19127
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, test
Affects Versions: 3.5.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


We currently have a good number of unit tests failing on Windows. The author of 
a patch therefore cannot tell whether their PR caused a failure or whether the 
test was already failing.

Thus, we need to skip running the unit tests on Windows until the currently 
failing ones are fixed.

This lets us proceed with enabling the pre-commit check that guards against 
regressions in building Hadoop on Windows.






[jira] [Resolved] (HADOOP-19126) Setup pre-commit CI for Windows

2024-03-25 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19126?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-19126.
-
Fix Version/s: 3.5.0
   Resolution: Duplicate

> Setup pre-commit CI for Windows
> ---
>
> Key: HADOOP-19126
> URL: https://issues.apache.org/jira/browse/HADOOP-19126
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: build
>Affects Versions: 3.5.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.5.0
>
>
> We need to set up a multi-branch pre-commit CI on Jenkins to validate PRs 
> against Windows prior to merging.






[jira] [Created] (HADOOP-19126) Setup pre-commit CI for Windows

2024-03-25 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-19126:
---

 Summary: Setup pre-commit CI for Windows
 Key: HADOOP-19126
 URL: https://issues.apache.org/jira/browse/HADOOP-19126
 Project: Hadoop Common
  Issue Type: New Feature
  Components: build
Affects Versions: 3.5.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


We need to set up a multi-branch pre-commit CI on Jenkins to validate PRs 
against Windows prior to merging.






[jira] [Created] (HADOOP-19125) Exclude some files from Apache RAT check

2024-03-25 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-19125:
---

 Summary: Exclude some files from Apache RAT check
 Key: HADOOP-19125
 URL: https://issues.apache.org/jira/browse/HADOOP-19125
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.5.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The following files cause the Apache RAT check to fail on Windows and must be 
excluded from the RAT check -
 # src/main/winutils/winutils.sln - Visual Studio solution file

 # src/test/resources/lz4/sequencefile - Binary file
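
As a minimal sketch, such exclusions could be expressed in the apache-rat-plugin 
configuration roughly as follows (illustrative only; the exact POM and placement 
used by the actual patch may differ) -

```xml
<plugin>
  <groupId>org.apache.rat</groupId>
  <artifactId>apache-rat-plugin</artifactId>
  <configuration>
    <excludes>
      <!-- Visual Studio solution file: carries no Apache license header -->
      <exclude>src/main/winutils/winutils.sln</exclude>
      <!-- Binary test fixture: cannot carry a license header at all -->
      <exclude>src/test/resources/lz4/sequencefile</exclude>
    </excludes>
  </configuration>
</plugin>
```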






[jira] [Updated] (HADOOP-18135) Produce Windows binaries of Hadoop

2024-01-05 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18135?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18135:

Description: We currently only provide Linux libraries and binaries. We 
need to provide the same for Windows. We need to port the [create-release 
script|https://github.com/apache/hadoop/blob/5f9932acc4fa2b36a3005e587637c53f2da1618d/dev-support/bin/create-release]
 to run on Windows and produce the Windows binaries.  (was: We currently only 
provide Linux libraries and binaries. We need to provide the same for Windows.)

> Produce Windows binaries of Hadoop
> --
>
> Key: HADOOP-18135
> URL: https://issues.apache.org/jira/browse/HADOOP-18135
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> We currently only provide Linux libraries and binaries. We need to provide 
> the same for Windows. We need to port the [create-release 
> script|https://github.com/apache/hadoop/blob/5f9932acc4fa2b36a3005e587637c53f2da1618d/dev-support/bin/create-release]
>  to run on Windows and produce the Windows binaries.






[jira] [Created] (HADOOP-19017) Setup pre-commit CI for Windows

2023-12-24 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-19017:
---

 Summary: Setup pre-commit CI for Windows
 Key: HADOOP-19017
 URL: https://issues.apache.org/jira/browse/HADOOP-19017
 Project: Hadoop Common
  Issue Type: New Feature
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


We need to set up a pre-commit CI for validating the Hadoop PRs against Windows.

As a side note, we already have the nightly Jenkins CI running for Hadoop on Windows - 
https://ci-hadoop.apache.org/view/Hadoop/job/hadoop-qbt-trunk-java8-win10-x86_64/.






[jira] [Updated] (HADOOP-19017) Setup pre-commit CI for Windows 10

2023-12-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19017?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-19017:

Summary: Setup pre-commit CI for Windows 10  (was: Setup pre-commit CI for 
Windows)

> Setup pre-commit CI for Windows 10
> --
>
> Key: HADOOP-19017
> URL: https://issues.apache.org/jira/browse/HADOOP-19017
> Project: Hadoop Common
>  Issue Type: New Feature
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: Jenkins
>
> We need to set up a pre-commit CI for validating the Hadoop PRs against 
> Windows.
> As a side note, we already have the nightly Jenkins CI running for Hadoop on 
> Windows - 
> https://ci-hadoop.apache.org/view/Hadoop/job/hadoop-qbt-trunk-java8-win10-x86_64/.






[jira] [Commented] (HADOOP-19016) Unable to build Hadoop in Windows Container due to missing of devenv

2023-12-24 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17800186#comment-17800186
 ] 

Gautham Banasandra commented on HADOOP-19016:
-

[~wekoms] Ah, my bad. You're right, the VS Build Tools don't provide the 
*devenv* tool. We thus need to bypass the invocation of *win-vs-upgrade.cmd*, 
which happens when you specify the following Maven argument -
{code}
-Duse.platformToolsetVersion=v142
{code}

In fact, you'll need to pass all these arguments to Maven, as mentioned in the 
build instructions - 
https://github.com/apache/hadoop/blob/415e9bdfbdeebded520e0233bcb91a487411a94b/BUILDING.txt#L644-L651
{code}
> set classpath=
> set PROTOBUF_HOME=C:\vcpkg\installed\x64-windows
> mvn clean package -Dhttps.protocols=TLSv1.2 -DskipTests -DskipDocs 
> -Pnative-win,dist^
-Drequire.openssl -Drequire.test.libhadoop -Pyarn-ui 
-Dshell-executable=C:\Git\bin\bash.exe^
-Dtar -Dopenssl.prefix=C:\vcpkg\installed\x64-windows^
-Dcmake.prefix.path=C:\vcpkg\installed\x64-windows^
-Dwindows.cmake.toolchain.file=C:\vcpkg\scripts\buildsystems\vcpkg.cmake 
-Dwindows.cmake.build.type=RelWithDebInfo^
-Dwindows.build.hdfspp.dll=off -Dwindows.no.sasl=on 
-Duse.platformToolsetVersion=v142
{code}

> Unable to build Hadoop in Windows Container due to missing of devenv
> 
>
> Key: HADOOP-19016
> URL: https://issues.apache.org/jira/browse/HADOOP-19016
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: client-mounts, common
>Affects Versions: 3.3.4
> Environment: Can reproduce this on 2 of my computers.
> * Windows 11 22631.2861
> * Docker Desktop 4.26.1 (131620)
> * Docker version 24.0.7, build afdd53b
> * Tested Hadoop trunk commit: 77edca8f0a97668722a6d602aa4d08d1fff06172
> * Tested Hadoop 3.3.4 commit: a585a73c3e02ac62350c136643a5e7f6095a3dbb
>Reporter: wy
>Priority: Major
> Attachments: image-2023-12-22-17-12-45-278.png, 
> image-2023-12-22-17-14-49-935.png, image-2023-12-22-17-18-37-712.png, 
> image-2023-12-22-17-20-24-345.png, image-2023-12-22-20-08-59-918.png, 
> screenshot-1.png
>
>
> For Windows, 
> [Dockerfile|https://github.com/apache/hadoop/blob/77edca8f0a97668722a6d602aa4d08d1fff06172/dev-support/docker/Dockerfile_windows_10]
>  and [build 
> instructions|https://github.com/apache/hadoop/blob/trunk/BUILDING.txt] are 
> provided for building Hadoop. However, when the Maven build of the Hadoop 
> project starts in the container, it fails when calling `devenv` to upgrade VS 
> solutions:
> !image-2023-12-22-17-12-45-278.png!
> This is caused by 
> [win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/trunk/dev-support/bin/win-vs-upgrade.cmd].
>  The script checks whether the `devenv` command exists and, if it does not, 
> exits with an error.
> !image-2023-12-22-17-14-49-935.png!
> The script is called during building Hadoop Common project, set in win-native 
> profile of the 
> [POM|https://github.com/apache/hadoop/blob/77edca8f0a97668722a6d602aa4d08d1fff06172/hadoop-common-project/hadoop-common/pom.xml#L903C38-L903C38].
> !image-2023-12-22-17-18-37-712.png!
> But within the container the command is not available, so it will always fail 
> at this step.
> !image-2023-12-22-17-20-24-345.png!
> If we manually edit the file, removing the check and the call to devenv, the 
> build will still fail, because the current sln file in the repository is based 
> on VS 2010 while the installed VS tools are 2019 (v16), so the versions do 
> not match.
> !image-2023-12-22-20-08-59-918.png!
> I'm not sure if someone has successfully built Hadoop using this Dockerfile 
> before, but currently it doesn't seem to be possible to directly build it 
> just following BUILDING.txt without other change.






[jira] [Commented] (HADOOP-19016) Unable to build Hadoop in Windows Container due to missing of devenv

2023-12-22 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17800010#comment-17800010
 ] 

Gautham Banasandra commented on HADOOP-19016:
-

[~wekoms] The Dockerfile that you've mentioned in the description is the Linux 
one. You'll need to use the one for Windows - 
https://github.com/apache/hadoop/blob/77edca8f0a97668722a6d602aa4d08d1fff06172/dev-support/docker/Dockerfile_windows_10.

We're installing the Visual Studio 2019 Build Tools in these lines - 
https://github.com/apache/hadoop/blob/77edca8f0a97668722a6d602aa4d08d1fff06172/dev-support/docker/Dockerfile_windows_10#L29-L38.
 So, that would provide the *devenv* needed by win-vs-upgrade.cmd.

> Unable to build Hadoop in Windows Container due to missing of devenv
> 
>
> Key: HADOOP-19016
> URL: https://issues.apache.org/jira/browse/HADOOP-19016
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: client-mounts, common
>Affects Versions: 3.3.4
> Environment: Can reproduce this on 2 of my computers.
> * Windows 11 22631.2861
> * Docker Desktop 4.26.1 (131620)
> * Docker version 24.0.7, build afdd53b
> * Tested Hadoop trunk commit: 77edca8f0a97668722a6d602aa4d08d1fff06172
> * Tested Hadoop 3.3.4 commit: a585a73c3e02ac62350c136643a5e7f6095a3dbb
>Reporter: wy
>Priority: Major
> Attachments: image-2023-12-22-17-12-45-278.png, 
> image-2023-12-22-17-14-49-935.png, image-2023-12-22-17-18-37-712.png, 
> image-2023-12-22-17-20-24-345.png, image-2023-12-22-20-08-59-918.png
>
>
> For Windows, 
> [Dockerfile|https://github.com/apache/hadoop/blob/trunk/dev-support/docker/Dockerfile]
>  and [build 
> instructions|https://github.com/apache/hadoop/blob/trunk/BUILDING.txt] are 
> provided for building Hadoop. However, when the Maven build of the Hadoop 
> project starts in the container, it fails when calling `devenv` to upgrade VS 
> solutions:
> !image-2023-12-22-17-12-45-278.png!
> This is caused by 
> [win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/trunk/dev-support/bin/win-vs-upgrade.cmd].
>  The script checks whether the `devenv` command exists and, if it does not, 
> exits with an error.
> !image-2023-12-22-17-14-49-935.png!
> The script is called during building Hadoop Common project, set in win-native 
> profile of the 
> [POM|https://github.com/apache/hadoop/blob/77edca8f0a97668722a6d602aa4d08d1fff06172/hadoop-common-project/hadoop-common/pom.xml#L903C38-L903C38].
> !image-2023-12-22-17-18-37-712.png!
> But within the container the command is not available, so it will always fail 
> at this step.
> !image-2023-12-22-17-20-24-345.png!
> If we manually edit the file, removing the check and the call to devenv, the 
> build will still fail, because the current sln file in the repository is based 
> on VS 2010 while the installed VS tools are 2019 (v16), so the versions do 
> not match.
> !image-2023-12-22-20-08-59-918.png!
> I'm not sure if someone has successfully built Hadoop using this Dockerfile 
> before, but currently it doesn't seem to be possible to directly build it 
> just following BUILDING.txt without other change.






[jira] [Created] (HADOOP-18834) Install strings utility for git bash on Windows

2023-07-30 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18834:
---

 Summary: Install strings utility for git bash on Windows
 Key: HADOOP-18834
 URL: https://issues.apache.org/jira/browse/HADOOP-18834
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra
 Fix For: 3.4.0


We get the following error while building Hadoop on Windows 10 -

{code}
[2023-07-28T07:16:22.389Z] 

[2023-07-28T07:16:22.389Z] 

[2023-07-28T07:16:22.389Z]  Determining needed tests
[2023-07-28T07:16:22.389Z] 

[2023-07-28T07:16:22.389Z] 

[2023-07-28T07:16:22.389Z] 
[2023-07-28T07:16:22.389Z] 
[2023-07-28T07:16:22.389Z] (Depending upon input size and number of plug-ins, 
this may take a while)
[2023-07-28T07:20:59.610Z] /c/out/precommit/plugins.d/maven.sh: line 275: 
strings: command not found
{code}

We need to install the strings utility for git bash on Windows to fix this.
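
For illustration only - plain Git Bash ships no package manager, so one hedged 
way to obtain *strings* is via the binutils package in an MSYS2-based 
environment (the actual Hadoop change may install it through a different 
mechanism) -

```shell
# Hedged sketch: assumes an MSYS2 shell where pacman is available;
# binutils is the package that provides the strings utility.
if ! command -v strings >/dev/null 2>&1; then
  pacman -S --noconfirm binutils
fi
strings --version   # sanity check that the tool is now on PATH
```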






[jira] [Updated] (HADOOP-18833) Install bats for building Hadoop on Windows

2023-07-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18833:

Description: 
We get the following error while building Hadoop on Windows (logs attached -  
[^archive.zip] ) -

{code}
[INFO] --- maven-antrun-plugin:1.8:run (common-test-bats-driver) @ 
hadoop-common ---
[INFO] Executing tasks

main:
 [exec] 
 [exec] 
 [exec] ERROR: bats not installed. Skipping bash tests.
 [exec] ERROR: Please install bats as soon as possible.
 [exec] 
{code}

We need to install bats to fix this.

  was:
We get the following error while building Hadoop on Windows -

{code}
[INFO] --- maven-antrun-plugin:1.8:run (common-test-bats-driver) @ 
hadoop-common ---
[INFO] Executing tasks

main:
 [exec] 
 [exec] 
 [exec] ERROR: bats not installed. Skipping bash tests.
 [exec] ERROR: Please install bats as soon as possible.
 [exec] 
{code}

We need to install bats to fix this.


> Install bats for building Hadoop on Windows
> ---
>
> Key: HADOOP-18833
> URL: https://issues.apache.org/jira/browse/HADOOP-18833
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.4.0
>
> Attachments: archive.zip
>
>
> We get the following error while building Hadoop on Windows (logs attached -  
> [^archive.zip] ) -
> {code}
> [INFO] --- maven-antrun-plugin:1.8:run (common-test-bats-driver) @ 
> hadoop-common ---
> [INFO] Executing tasks
> main:
>  [exec] 
>  [exec] 
>  [exec] ERROR: bats not installed. Skipping bash tests.
>  [exec] ERROR: Please install bats as soon as possible.
>  [exec] 
> {code}
> We need to install bats to fix this.






[jira] [Updated] (HADOOP-18833) Install bats for building Hadoop on Windows

2023-07-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18833:

Attachment: archive.zip

> Install bats for building Hadoop on Windows
> ---
>
> Key: HADOOP-18833
> URL: https://issues.apache.org/jira/browse/HADOOP-18833
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.4.0
>
> Attachments: archive.zip
>
>
> We get the following error while building Hadoop on Windows -
> {code}
> [INFO] --- maven-antrun-plugin:1.8:run (common-test-bats-driver) @ 
> hadoop-common ---
> [INFO] Executing tasks
> main:
>  [exec] 
>  [exec] 
>  [exec] ERROR: bats not installed. Skipping bash tests.
>  [exec] ERROR: Please install bats as soon as possible.
>  [exec] 
> {code}
> We need to install bats to fix this.






[jira] [Created] (HADOOP-18833) Install bats for building Hadoop on Windows

2023-07-30 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18833:
---

 Summary: Install bats for building Hadoop on Windows
 Key: HADOOP-18833
 URL: https://issues.apache.org/jira/browse/HADOOP-18833
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra
 Fix For: 3.4.0


We get the following error while building Hadoop on Windows -

{code}
[INFO] --- maven-antrun-plugin:1.8:run (common-test-bats-driver) @ 
hadoop-common ---
[INFO] Executing tasks

main:
 [exec] 
 [exec] 
 [exec] ERROR: bats not installed. Skipping bash tests.
 [exec] ERROR: Please install bats as soon as possible.
 [exec] 
{code}

We need to install bats to fix this.
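
As an illustrative sketch (not necessarily how the Hadoop patch installs it), 
bats-core can be installed from source using its upstream install script -

```shell
# Hedged sketch: install bats-core from its upstream repository.
git clone https://github.com/bats-core/bats-core.git
cd bats-core
./install.sh /usr/local   # places the bats executable on PATH
bats --version            # sanity check
```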






[jira] [Updated] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18751:

Description: 
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message, the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle of the incorrectly concatenated path -
H:\hadoop-common-project\hadoop-common\target\site\H {color:red}*:*{color} 
\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.

  was:
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H {color:red}*:*{color} 
\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.


> Fix incorrect output path in javadoc build phase
> 
>
> Key: HADOOP-18751
> URL: https://issues.apache.org/jira/browse/HADOOP-18751
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: All
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> The javadoc build phase fails with the following error -
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
> (default-cli) on project hadoop-common: An error has occurred in Javadoc 
> report generation: Unable to write 'options' temporary file for command 
> execution: 
> H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
>  (The filename, directory name, or volume label syntax is incorrect) -> [Help 
> 1]
> {code}
> As called out by the error message the path 
> *H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
>  is invalid.
> The culprit being - 
> https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-pr

[jira] [Updated] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18751:

Description: 
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message, the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.

  was:
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# 1. In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# 2. In Windows, it yields an incorrect path and thus fails since there's a 
colon ( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.


> Fix incorrect output path in javadoc build phase
> 
>
> Key: HADOOP-18751
> URL: https://issues.apache.org/jira/browse/HADOOP-18751
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: All
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> The javadoc build phase fails with the following error -
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
> (default-cli) on project hadoop-common: An error has occurred in Javadoc 
> report generation: Unable to write 'options' temporary file for command 
> execution: 
> H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
>  (The filename, directory name, or volume label syntax is incorrect) -> [Help 
> 1]
> {code}
> As called out by the error message the path 
> *H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
>  is invalid.
> The culprit being - 
> https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109
> {code}
> ${project.build.directory}/site
> ${pr

[jira] [Updated] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18751:

Description: 
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H *:* 
\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.

  was:
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.


> Fix incorrect output path in javadoc build phase
> 
>
> Key: HADOOP-18751
> URL: https://issues.apache.org/jira/browse/HADOOP-18751
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: All
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> The javadoc build phase fails with the following error -
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
> (default-cli) on project hadoop-common: An error has occurred in Javadoc 
> report generation: Unable to write 'options' temporary file for command 
> execution: 
> H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
>  (The filename, directory name, or volume label syntax is incorrect) -> [Help 
> 1]
> {code}
> As called out by the error message the path 
> *H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
>  is invalid.
> The culprit being - 
> https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109
> {code}
> ${project.build.directory}/site
> ${proje

[jira] [Updated] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18751:

Description: 
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# 1. In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# 2. In Windows, it yields an incorrect path and thus fails since there's a 
colon ( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.

  was:
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
1. In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
2. In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.


> Fix incorrect output path in javadoc build phase
> 
>
> Key: HADOOP-18751
> URL: https://issues.apache.org/jira/browse/HADOOP-18751
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: All
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> The javadoc build phase fails with the following error -
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
> (default-cli) on project hadoop-common: An error has occurred in Javadoc 
> report generation: Unable to write 'options' temporary file for command 
> execution: 
> H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
>  (The filename, directory name, or volume label syntax is incorrect) -> [Help 
> 1]
> {code}
> As called out by the error message the path 
> *H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
>  is invalid.
> The culprit being - 
> https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109
> {code}
> ${project.build.directory}/site
> ${

[jira] [Updated] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18751:

Description: 
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H {color:red}*:*{color} 
\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.

  was:
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
# In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
# In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H *:* 
\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.


> Fix incorrect output path in javadoc build phase
> 
>
> Key: HADOOP-18751
> URL: https://issues.apache.org/jira/browse/HADOOP-18751
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: All
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> The javadoc build phase fails with the following error -
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
> (default-cli) on project hadoop-common: An error has occurred in Javadoc 
> report generation: Unable to write 'options' temporary file for command 
> execution: 
> H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
>  (The filename, directory name, or volume label syntax is incorrect) -> [Help 
> 1]
> {code}
> As called out by the error message the path 
> *H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
>  is invalid.
> The culprit being - 
> https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109
> {code}
> ${project.build.dire

[jira] [Updated] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18751:

Description: 
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
1. In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
2. In Windows, it yields an incorrect path and thus fails since there's a colon 
( : ) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.

  was:
The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
1. In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
2. In Windows, it yields an incorrect path and thus fails since there's a colon 
(:) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.


> Fix incorrect output path in javadoc build phase
> 
>
> Key: HADOOP-18751
> URL: https://issues.apache.org/jira/browse/HADOOP-18751
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: All
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> The javadoc build phase fails with the following error -
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
> (default-cli) on project hadoop-common: An error has occurred in Javadoc 
> report generation: Unable to write 'options' temporary file for command 
> execution: 
> H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
>  (The filename, directory name, or volume label syntax is incorrect) -> [Help 
> 1]
> {code}
> As called out by the error message the path 
> *H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
>  is invalid.
> The culprit being - 
> https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109
> {code}
> ${project.build.directory}/site
> ${projec

[jira] [Created] (HADOOP-18751) Fix incorrect output path in javadoc build phase

2023-05-24 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18751:
---

 Summary: Fix incorrect output path in javadoc build phase
 Key: HADOOP-18751
 URL: https://issues.apache.org/jira/browse/HADOOP-18751
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
 Environment: All
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The javadoc build phase fails with the following error -

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:javadoc-no-fork 
(default-cli) on project hadoop-common: An error has occurred in Javadoc report 
generation: Unable to write 'options' temporary file for command execution: 
H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options
 (The filename, directory name, or volume label syntax is incorrect) -> [Help 1]
{code}

As called out by the error message the path 
*H:\hadoop-common-project\hadoop-common\target\site\H:\hadoop-common-project\hadoop-common\target\api\options*
 is invalid.

The culprit being - 
https://github.com/apache/hadoop/blob/e9740cb17aef157a615dc36ae08cd224ce1672f0/hadoop-project-dist/pom.xml#L109

{code}
<reportOutputDirectory>${project.build.directory}/site</reportOutputDirectory>
<destDir>${project.build.directory}/api</destDir>
{code}

As per the [docs from 
maven-javadoc-plugin|https://maven.apache.org/plugins/maven-javadoc-plugin/examples/output-configuration.html],
 *destDir* attribute's value gets appended to that of *reportOutputDirectory*. 
This implies that *destDir* must be a relative path, although not called out in 
the documentation. Since this isn't the case here,
1. In Linux, this yields an unintended path (albeit a valid one) and doesn't 
fail.
2. In Windows, it yields an incorrect path and thus fails since there's a colon 
(:) for the drive letter in the middle -
H:\hadoop-common-project\hadoop-common\target\site\H*:*\hadoop-common-project\hadoop-common\target\api\options

Thus, fixing this would fix the build failure on Windows and put the docs in 
the appropriate directory in Linux.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-18746) Install Python 3 for Windows 10 docker image

2023-05-21 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18746.
-
   Fix Version/s: 3.4.0
Target Version/s: 3.4.0
  Resolution: Fixed

Merged PR https://github.com/apache/hadoop/pull/5679 to trunk.

> Install Python 3 for Windows 10 docker image
> 
>
> Key: HADOOP-18746
> URL: https://issues.apache.org/jira/browse/HADOOP-18746
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> Currently, mvnsite build phase fails due to the following error -
> {code}
> [INFO] --< org.apache.hadoop:hadoop-common 
> >---
> [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT
> [11/114]
> [INFO] [ jar 
> ]-
> [INFO] 
> [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common ---
> [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target
> [INFO] Deleting 
> C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = 
> [UnixShellAPI.md], excludes = [])
> [INFO] Deleting 
> C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = 
> [configuration.xsl, core-default.xml], excludes = [])
> [INFO] 
> [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common ---
> tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to 
> 'test-patch.sh': No such file or directory
> tar: Exiting with failure status due to previous errors
> /usr/bin/env: 'python3': No such file or directory
> {code}
> Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker 
> image to fix this.






[jira] [Created] (HADOOP-18746) Install Python 3 for Windows 10 docker image

2023-05-21 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18746:
---

 Summary: Install Python 3 for Windows 10 docker image
 Key: HADOOP-18746
 URL: https://issues.apache.org/jira/browse/HADOOP-18746
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Currently, mvnsite build phase fails due to the following error -

{code}
[INFO] --< org.apache.hadoop:hadoop-common >---
[INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT[11/114]
[INFO] [ jar ]-
[INFO] 
[INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common ---
[INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target
[INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown 
(includes = [UnixShellAPI.md], excludes = [])
[INFO] Deleting 
C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = 
[configuration.xsl, core-default.xml], excludes = [])
[INFO] 
[INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common ---
tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to 
'test-patch.sh': No such file or directory
tar: Exiting with failure status due to previous errors
/usr/bin/env: 'python3': No such file or directory
{code}

Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker image 
to fix this.
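The shape of the required change can be sketched as a Dockerfile fragment. This is a hypothetical illustration, assuming the Windows 10 builder image uses Chocolatey for package installation; the actual dev-support Dockerfile may install Python differently. Note that because the failing script is invoked via {{/usr/bin/env python3}}, the interpreter must also be reachable on the PATH under the name *python3*, not just *python*.

```dockerfile
# Hypothetical fragment for the Windows 10 Hadoop builder image.
# Assumes Chocolatey is already installed in the base image.
RUN choco install python3 --yes --no-progress
```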






[jira] [Updated] (HADOOP-18746) Install Python 3 for Windows 10 docker image

2023-05-21 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18746:

Language: powershell Docker  (was: powershell)

> Install Python 3 for Windows 10 docker image
> 
>
> Key: HADOOP-18746
> URL: https://issues.apache.org/jira/browse/HADOOP-18746
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> Currently, mvnsite build phase fails due to the following error -
> {code}
> [INFO] --< org.apache.hadoop:hadoop-common 
> >---
> [INFO] Building Apache Hadoop Common 3.4.0-SNAPSHOT
> [11/114]
> [INFO] [ jar 
> ]-
> [INFO] 
> [INFO] --- maven-clean-plugin:3.1.0:clean (default-clean) @ hadoop-common ---
> [INFO] Deleting C:\hadoop\hadoop-common-project\hadoop-common\target
> [INFO] Deleting 
> C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown (includes = 
> [UnixShellAPI.md], excludes = [])
> [INFO] Deleting 
> C:\hadoop\hadoop-common-project\hadoop-common\src\site\resources (includes = 
> [configuration.xsl, core-default.xml], excludes = [])
> [INFO] 
> [INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common ---
> tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to 
> 'test-patch.sh': No such file or directory
> tar: Exiting with failure status due to previous errors
> /usr/bin/env: 'python3': No such file or directory
> {code}
> Thus, we need to install Python 3 in the Windows 10 Hadoop builder docker 
> image to fix this.






[jira] [Created] (HADOOP-18734) Create qbt.sh symlink on Windows

2023-05-06 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18734:
---

 Summary: Create qbt.sh symlink on Windows
 Key: HADOOP-18734
 URL: https://issues.apache.org/jira/browse/HADOOP-18734
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The hadoop-common project fails when mvnsite is built while running the 
shelldocs plugin on Windows 10 -

{code}
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec 
(shelldocs) on project hadoop-common: Command execution failed. Process exited 
with an error: 1 (Exit value: 1) -> [Help 1]
{code}

This being the reason -

{code}
[INFO] --- exec-maven-plugin:1.3.1:exec (shelldocs) @ hadoop-common ---
tar: apache-yetus-0.14.0/lib/precommit/qbt.sh: Cannot create symlink to 
'test-patch.sh': No such file or directory
tar: Exiting with failure status due to previous errors
ERROR: apache-yetus-0.14.0-bin.tar.gz is corrupt. Investigate and then remove 
/c/hadoop/patchprocess to try again.
{code}

The apache-yetus-0.14.0 tarball contains a symlink, *qbt.sh*. Extracting the tarball fails to create this symlink because, on Windows, symlink creation is restricted to administrators unless Developer Mode is enabled.

The solution here is to use the *ln* command to create the symlink ourselves and then move it to the required target location.
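The workaround can be sketched as follows. The paths are illustrative stand-ins, not the exact ones used by the Hadoop build: a dummy *test-patch.sh* target is created so that the explicit *ln* invocation can be demonstrated in isolation.

```shell
# Sketch of the workaround: create the qbt.sh symlink explicitly with ln
# instead of relying on tar to create it during extraction.
set -e
workdir=$(mktemp -d)
precommit="$workdir/apache-yetus-0.14.0/lib/precommit"
mkdir -p "$precommit"
touch "$precommit/test-patch.sh"          # the symlink's target
ln -s test-patch.sh "$precommit/qbt.sh"   # create the link ourselves
```

After this, the link can be moved (together with its target directory) to wherever the build expects it.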






[jira] [Created] (HADOOP-18729) Fix mvnsite on Windows 10

2023-05-04 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18729:
---

 Summary: Fix mvnsite on Windows 10
 Key: HADOOP-18729
 URL: https://issues.apache.org/jira/browse/HADOOP-18729
 Project: Hadoop Common
  Issue Type: Bug
  Components: build, site
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The mvnsite step fails to build on Windows 10 due to the following error -

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec 
(shelldocs) on project hadoop-common: Command execution failed. Cannot run 
program 
"C:\hadoop\hadoop-common-project\hadoop-common\..\..\dev-support\bin\shelldocs" 
(in directory 
"C:\hadoop\hadoop-common-project\hadoop-common\src\site\markdown"): 
CreateProcess error=193, %1 is not a valid Win32 application -> [Help 1]

shelldocs is a bash script, which Windows can't execute natively. Thus, we need 
to run it through bash on Windows 10.
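The fix can be sketched as invoking the script through an explicit bash interpreter rather than relying on the shebang; `demo-shelldocs` below is a hypothetical stand-in for the real dev-support/bin/shelldocs script:

```shell
# Windows cannot execute a shebang script natively, but it can run the
# same script by passing it to bash explicitly.
cat > /tmp/demo-shelldocs <<'EOF'
#!/usr/bin/env bash
echo "shelldocs ran"
EOF
# No execute bit or shebang handling needed: bash reads the file itself.
OUTPUT="$(bash /tmp/demo-shelldocs)"
echo "$OUTPUT"
```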






[jira] [Resolved] (HADOOP-18134) Setup Jenkins nightly CI for Windows 10

2023-05-03 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18134.
-
Fix Version/s: 3.4.0
   Resolution: Fixed

Merged PR https://github.com/apache/hadoop/pull/5062 to trunk.

> Setup Jenkins nightly CI for Windows 10
> ---
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> Need to run the Jenkins Nightly CI for Windows 10 environment so that we 
> catch any breaking changes for Hadoop on the Windows 10 platform. Need to get 
> Yetus to run on Windows 10 against the Hadoop codebase.






[jira] [Updated] (HADOOP-18725) Avoid cross-platform build for irrelevant Dockerfile changes

2023-05-01 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18725?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18725:

Affects Version/s: 3.4.0
   (was: 3.3.5)

> Avoid cross-platform build for irrelevant Dockerfile changes
> 
>
> Key: HADOOP-18725
> URL: https://issues.apache.org/jira/browse/HADOOP-18725
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Centos 7, Centos 8, Debian 10, Ubuntu Focal, Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> Currently, when one of the Dockerfiles (located at 
> https://github.com/apache/hadoop/tree/trunk/dev-support/docker) changes, all 
> the platform builds are run.
> For example, a change to Dockerfile_debian_10 would trigger a run for Centos 
> 7, which isn't relevant.
> This leads to unnecessary delays in PR validation. We should thus run only 
> the build for the platform whose corresponding Dockerfile changed.






[jira] [Updated] (HADOOP-18725) Avoid cross-platform build for irrelevant Dockerfile changes

2023-04-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18725?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18725:

Summary: Avoid cross-platform build for irrelevant Dockerfile changes  
(was: Avoid cross-platform build for irrelevant docker changes)

> Avoid cross-platform build for irrelevant Dockerfile changes
> 
>
> Key: HADOOP-18725
> URL: https://issues.apache.org/jira/browse/HADOOP-18725
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.3.5
> Environment: Centos 7, Centos 8, Debian 10, Ubuntu Focal, Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
>
> Currently, when one of the Dockerfiles (located at 
> https://github.com/apache/hadoop/tree/trunk/dev-support/docker) changes, all 
> the platform builds are run.
> For example, a change to Dockerfile_debian_10 would trigger a run for Centos 
> 7, which isn't relevant.
> This leads to unnecessary delays in PR validation. We should thus run only 
> the build for the platform whose corresponding Dockerfile changed.






[jira] [Updated] (HADOOP-18725) Avoid cross-platform build for irrelevant docker changes

2023-04-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18725?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18725:

Summary: Avoid cross-platform build for irrelevant docker changes  (was: 
Avoid cross platform build for irrelevant docker changes)

> Avoid cross-platform build for irrelevant docker changes
> 
>
> Key: HADOOP-18725
> URL: https://issues.apache.org/jira/browse/HADOOP-18725
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.3.5
> Environment: Centos 7, Centos 8, Debian 10, Ubuntu Focal, Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> Currently, when one of the Dockerfiles (located at 
> https://github.com/apache/hadoop/tree/trunk/dev-support/docker) changes, all 
> the platform builds are run.
> For example, a change to Dockerfile_debian_10 would trigger a run for Centos 
> 7, which isn't relevant.
> This leads to unnecessary delays in PR validation. We should thus run only 
> the build for the platform whose corresponding Dockerfile changed.






[jira] [Updated] (HADOOP-18725) Avoid cross platform build for irrelevant docker changes

2023-04-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18725?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18725:

Issue Type: Improvement  (was: Bug)

> Avoid cross platform build for irrelevant docker changes
> 
>
> Key: HADOOP-18725
> URL: https://issues.apache.org/jira/browse/HADOOP-18725
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.3.5
> Environment: Centos 7, Centos 8, Debian 10, Ubuntu Focal, Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> Currently, when one of the Dockerfiles (located at 
> https://github.com/apache/hadoop/tree/trunk/dev-support/docker) changes, all 
> the platform builds are run.
> For example, a change to Dockerfile_debian_10 would trigger a run for Centos 
> 7, which isn't relevant.
> This leads to unnecessary delays in PR validation. We should thus run only 
> the build for the platform whose corresponding Dockerfile changed.






[jira] [Created] (HADOOP-18725) Avoid cross platform build for irrelevant docker changes

2023-04-30 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18725:
---

 Summary: Avoid cross platform build for irrelevant docker changes
 Key: HADOOP-18725
 URL: https://issues.apache.org/jira/browse/HADOOP-18725
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.3.5
 Environment: Centos 7, Centos 8, Debian 10, Ubuntu Focal, Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Currently, when one of the Dockerfiles (located at 
https://github.com/apache/hadoop/tree/trunk/dev-support/docker) changes, all 
the platform builds are run.
For example, a change to Dockerfile_debian_10 would trigger a run for Centos 7, 
which isn't relevant.
This leads to unnecessary delays in PR validation. We should thus run only the 
build for the platform whose corresponding Dockerfile changed.
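The intended gating can be sketched as follows; the variable names and file list are assumptions for illustration, since the real CI wires change detection through Jenkins and Yetus:

```shell
# Run a platform build only when that platform's Dockerfile is among
# the changed files. CHANGED_FILES is a stand-in for the CI's real
# change detection (e.g. a git diff against the target branch).
PLATFORM="debian_10"
CHANGED_FILES="dev-support/docker/Dockerfile_debian_10"
if printf '%s\n' "$CHANGED_FILES" | grep -qx "dev-support/docker/Dockerfile_${PLATFORM}"; then
  DECISION="build"
else
  DECISION="skip"
fi
echo "${DECISION} ${PLATFORM}"
```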






[jira] [Updated] (HADOOP-18134) Setup Jenkins nightly CI for Windows 10

2023-04-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18134:

Description: Need to run the Jenkins Nightly CI for Windows 10 environment 
so that we catch any breaking changes for Hadoop on the Windows 10 platform. 
Need to get Yetus to run on Windows 10 against the Hadoop codebase.  (was: 
Need to run the Jenkins Nightly CI for Windows 10 environment so that we catch 
any breaking changes for Hadoop on the Windows 10 platform.)

> Setup Jenkins nightly CI for Windows 10
> ---
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> Need to run the Jenkins Nightly CI for Windows 10 environment so that we 
> catch any breaking changes for Hadoop on the Windows 10 platform. Need to get 
> Yetus to run on Windows 10 against the Hadoop codebase.






[jira] [Updated] (HADOOP-18134) Setup Jenkins nightly CI for Windows 10

2023-04-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18134:

Description: Need to run the Jenkins Nightly CI for Windows 10 environment 
so that we catch any breaking changes for Hadoop on the Windows 10 platform.  
(was: Need to run the Jenkins Precommit CI for Windows 10 environment so that 
we catch any breaking changes prior to merging them.)

> Setup Jenkins nightly CI for Windows 10
> ---
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> Need to run the Jenkins Nightly CI for Windows 10 environment so that we 
> catch any breaking changes for Hadoop on the Windows 10 platform.






[jira] [Updated] (HADOOP-18134) Setup Jenkins nightly CI for Windows 10

2023-04-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18134:

Summary: Setup Jenkins nightly CI for Windows 10  (was: Run CI for Windows 
10)

> Setup Jenkins nightly CI for Windows 10
> ---
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> Need to run the Jenkins Precommit CI for Windows 10 environment so that we 
> catch any breaking changes prior to merging them.






[jira] [Created] (HADOOP-18509) Convert build instructions file to markdown

2022-10-26 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18509:
---

 Summary: Convert build instructions file to markdown
 Key: HADOOP-18509
 URL: https://issues.apache.org/jira/browse/HADOOP-18509
 Project: Hadoop Common
  Issue Type: Improvement
  Components: documentation
Affects Versions: 3.4.0
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Need to convert https://github.com/apache/hadoop/blob/trunk/BUILDING.txt to 
markdown.






[jira] [Updated] (HADOOP-18506) Update build instructions for Windows using VS2019

2022-10-22 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18506:

Summary: Update build instructions for Windows using VS2019  (was: Change 
build instructions for Windows using VS2019)

> Update build instructions for Windows using VS2019
> --
>
> Key: HADOOP-18506
> URL: https://issues.apache.org/jira/browse/HADOOP-18506
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build, documentation
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
>
> With HADOOP-18133, we're finally able to build Hadoop on Windows using Visual 
> Studio 2019. We now need to update the documentation with the latest 
> instructions.






[jira] [Updated] (HADOOP-18506) Change build instructions for Windows using VS2019

2022-10-22 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18506:

Component/s: documentation

> Change build instructions for Windows using VS2019
> --
>
> Key: HADOOP-18506
> URL: https://issues.apache.org/jira/browse/HADOOP-18506
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build, documentation
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
>
> With HADOOP-18133, we're finally able to build Hadoop on Windows using Visual 
> Studio 2019. We now need to update the documentation with the latest 
> instructions.






[jira] [Created] (HADOOP-18506) Change build instructions for Windows using VS2019

2022-10-22 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18506:
---

 Summary: Change build instructions for Windows using VS2019
 Key: HADOOP-18506
 URL: https://issues.apache.org/jira/browse/HADOOP-18506
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


With HADOOP-18133, we're finally able to build Hadoop on Windows using Visual 
Studio 2019. We now need to update the documentation with the latest 
instructions.






[jira] [Created] (HADOOP-18483) Exclude Dockerfile_windows_10 from hadolint

2022-10-07 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18483:
---

 Summary: Exclude Dockerfile_windows_10 from hadolint
 Key: HADOOP-18483
 URL: https://issues.apache.org/jira/browse/HADOOP-18483
 Project: Hadoop Common
  Issue Type: Improvement
  Components: common
Affects Versions: 3.3.4
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


HADOOP-18133 tries to add a Dockerfile for building Hadoop on Windows 10. 
However, hadolint fails to run on *Dockerfile_windows_10* since the version of 
hadolint (1.11.1) used in Hadoop CI doesn't support parsing of the Windows 
command syntax.

HADOOP-18449 tries to upgrade the version of hadolint to the latest (2.10.0). 
However, it runs into some GPG issues on Centos 8.

Thus, we're going to exclude Dockerfile_windows_10 from hadolint checks for the 
time being. There's a bug in Yetus that prevents exclusion of the file if the 
exclusion rule and the Dockerfile are added in the same PR - 
https://github.com/apache/yetus/pull/289#issuecomment-1263813381. This PR 
therefore adds the exclusion rule first; the Dockerfile will be added in 
another PR.






[jira] [Created] (HADOOP-18449) Upgrade hadolint to 2.10.0

2022-09-09 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18449:
---

 Summary: Upgrade hadolint to 2.10.0
 Key: HADOOP-18449
 URL: https://issues.apache.org/jira/browse/HADOOP-18449
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Affects Versions: 3.4.0
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The current version of hadolint (1.11.1) is only suitable for linting Linux 
commands in Dockerfiles; it fails to recognize Windows command syntax.
HADOOP-18133 adds a Dockerfile for Windows. Thus, it's essential to upgrade 
hadolint to 2.10.0, which can recognize Windows command syntax.






[jira] [Resolved] (HADOOP-18428) Parameterize platform toolset version

2022-08-30 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18428.
-
Fix Version/s: 3.4.0
   Resolution: Fixed

Merged PR https://github.com/apache/hadoop/pull/4815 to trunk.

> Parameterize platform toolset version
> -
>
> Key: HADOOP-18428
> URL: https://issues.apache.org/jira/browse/HADOOP-18428
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> The *winutils*, *libwinutils* and *native* project structures are currently 
> defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
> parameter is the *PlatformToolsetVersion*. This gets added by the build 
> system by running 
> [dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
>  This essentially runs the following command to detect the 
> PlatformToolsetVersion of the currently installed Visual Studio and uses the 
> same for compilation - 
> https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38
> {code}
> devenv %%f /upgrade
> {code}
> However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
> Build Tools are available (and not the full IDE). The Visual Studio 2019 
> Build Tools distribution doesn't contain *devenv* and thus, the above command 
> fails to run stating that it couldn't find devenv.
> To fix this issue, we need the ability to specify the PlatformToolsetVersion 
> as a Maven option, at which point the *win-vs-upgrade.cmd* won't run and 
> would use the speicified PlatformToolsetVersion against MSBuild.






[jira] [Updated] (HADOOP-18428) Parameterize platform toolset version

2022-08-28 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18428:

Description: 
The *winutils*, *libwinutils* and *native* project structures are currently 
defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
parameter is the *PlatformToolsetVersion*. This gets added by the build system 
by running 
[dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
 This essentially runs the following command to detect the 
PlatformToolsetVersion of the currently installed Visual Studio and uses the 
same for compilation - 
https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38

{code}
devenv %%f /upgrade
{code}

However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
Build Tools are available (and not the full IDE). The Visual Studio 2019 Build 
Tools distribution doesn't contain *devenv* and thus, the above command fails 
to run stating that it couldn't find devenv.

To fix this issue, we need the ability to specify the PlatformToolsetVersion as 
a Maven option, at which point the *win-vs-upgrade.cmd* won't run and would use 
the specified PlatformToolsetVersion against MSBuild.

  was:
The *winutils*, *libwinutils* and *native* project structures are currently 
defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
parameter is the *PlatformToolsetVersion*. This gets added by the build system 
by running 
[dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
 This essentially runs the following command to detect the 
PlatformToolsetVersion of the currently installed Visual Studio and uses the 
same for compilation - 
https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38

{code}
devenv %%f /upgrade
{code}

However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
Build Tools are available (and not the full IDE). The Visual Studio 2019 Build 
Tools distribution doesn't contain *devenv* and thus, the above command fails 
to run stating that it couldn't find devenv.

To fix this issue, we need the ability to specify the PlatformToolsetVersion as 
a Maven option, at which point the win-vs-upgrade.cmd won't run and would use 
the specified PlatformToolsetVersion against MSBuild.


> Parameterize platform toolset version
> -
>
> Key: HADOOP-18428
> URL: https://issues.apache.org/jira/browse/HADOOP-18428
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> The *winutils*, *libwinutils* and *native* project structures are currently 
> defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
> parameter is the *PlatformToolsetVersion*. This gets added by the build 
> system by running 
> [dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
>  This essentially runs the following command to detect the 
> PlatformToolsetVersion of the currently installed Visual Studio and uses the 
> same for compilation - 
> https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38
> {code}
> devenv %%f /upgrade
> {code}
> However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
> Build Tools are available (and not the full IDE). The Visual Studio 2019 
> Build Tools distribution doesn't contain *devenv* and thus, the above command 
> fails to run stating that it couldn't find devenv.
> To fix this issue, we need the ability to specify the PlatformToolsetVersion 
> as a Maven option, at which point the *win-vs-upgrade.cmd* won't run and 
> would use the specified PlatformToolsetVersion against MSBuild.






[jira] [Updated] (HADOOP-18428) Parameterize platform toolset version

2022-08-28 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18428:

Description: 
The *winutils*, *libwinutils* and *native* project structures are currently 
defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
parameter is the *PlatformToolsetVersion*. This gets added by the build system 
by running 
[dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
 This essentially runs the following command to detect the 
PlatformToolsetVersion of the currently installed Visual Studio and uses the 
same for compilation - 
https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38

{code}
devenv %%f /upgrade
{code}

However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
Build Tools are available (and not the full IDE). The Visual Studio 2019 Build 
Tools distribution doesn't contain *devenv* and thus, the above command fails 
to run stating that it couldn't find devenv.

To fix this issue, we need the ability to specify the PlatformToolsetVersion as 
a Maven option, at which point the win-vs-upgrade.cmd won't run and would use 
the specified PlatformToolsetVersion against MSBuild.

  was:
The *winutils*, *libwinutils* and *native* project structures are currently 
defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
parameter is the *PlatformToolsetVersion*. This gets added by the build system 
by running 
[dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
 This essentially runs the following command to detect the 
PlatformToolsetVersion of the currently installed Visual Studio and uses the 
same for compilation - 
https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38

{code}
devenv %%f /upgrade
{code}

However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
Build Tools are available (and not the full IDE). The Visual Studio 2019 Build 
Tools distribution doesn't contain devenv and thus, the above command fails to 
run stating that it couldn't find devenv.

To fix this issue, we need the ability to specify the PlatformToolsetVersion as 
a Maven option, at which point the win-vs-upgrade.cmd won't run and would use 
the specified PlatformToolsetVersion against MSBuild.


> Parameterize platform toolset version
> -
>
> Key: HADOOP-18428
> URL: https://issues.apache.org/jira/browse/HADOOP-18428
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> The *winutils*, *libwinutils* and *native* project structures are currently 
> defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
> parameter is the *PlatformToolsetVersion*. This gets added by the build 
> system by running 
> [dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
>  This essentially runs the following command to detect the 
> PlatformToolsetVersion of the currently installed Visual Studio and uses the 
> same for compilation - 
> https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38
> {code}
> devenv %%f /upgrade
> {code}
> However, when building with *Dockerfile_windows_10*, only Visual Studio 2019 
> Build Tools are available (and not the full IDE). The Visual Studio 2019 
> Build Tools distribution doesn't contain *devenv* and thus, the above command 
> fails to run stating that it couldn't find devenv.
> To fix this issue, we need the ability to specify the PlatformToolsetVersion 
> as a Maven option, at which point the win-vs-upgrade.cmd won't run and would 
> use the specified PlatformToolsetVersion against MSBuild.






[jira] [Created] (HADOOP-18428) Parameterize platform toolset version

2022-08-28 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18428:
---

 Summary: Parameterize platform toolset version
 Key: HADOOP-18428
 URL: https://issues.apache.org/jira/browse/HADOOP-18428
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The *winutils*, *libwinutils* and *native* project structures are currently 
defined in *.vcxproj* and *.sln* files. For building on Windows, a key 
parameter is the *PlatformToolsetVersion*. This gets added by the build system 
by running 
[dev-support/bin/win-vs-upgrade.cmd|https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd].
 This essentially runs the following command to detect the 
PlatformToolsetVersion of the currently installed Visual Studio and uses the 
same for compilation - 
https://github.com/apache/hadoop/blob/c60a900583d6a8d0494980f4bbbf4f95438b741b/dev-support/bin/win-vs-upgrade.cmd#L38

{code}
devenv %%f /upgrade
{code}

However, when building with *Dockerfile_windows_10*, only the Visual Studio 2019 
Build Tools are available (and not the full IDE). The Build Tools distribution 
doesn't contain devenv, so the above command fails, stating that it couldn't 
find devenv.

To fix this issue, we need the ability to specify the PlatformToolsetVersion as 
a Maven option, at which point win-vs-upgrade.cmd won't run and the specified 
PlatformToolsetVersion will be passed to MSBuild.
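As a sketch of the proposed flow (the Maven profile and property names below are illustrative, not committed flags), the toolset version would bypass win-vs-upgrade.cmd and be forwarded straight to MSBuild, whose /p:PlatformToolset property override it ultimately maps to:

```shell
# Hypothetical Maven invocation; profile and property names are illustrative.
mvn install -Pnative-win -Duse.platformToolsetVersion=v142

# Internally this amounts to an MSBuild property override, e.g.:
msbuild winutils.sln /p:PlatformToolset=v142 /p:Configuration=Release
```

With the property set explicitly, there is no need to run devenv at all, so the Build Tools-only Docker image can compile the solution unchanged.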



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-18357) Retarget solution file to VS2019

2022-07-28 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18357.
-
Resolution: Abandoned

Abandoning the PR, this is the reason - 
https://github.com/apache/hadoop/pull/4616#discussion_r932136279.

> Retarget solution file to VS2019
> 
>
> Key: HADOOP-18357
> URL: https://issues.apache.org/jira/browse/HADOOP-18357
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: common
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>  Labels: libhdfscpp, pull-request-available
>  Time Spent: 3h
>  Remaining Estimate: 0h
>
> The Visual Studio version used by the winutils and native components in Hadoop 
> Common is quite old. We need to retarget the solution and vcxproj files to 
> use the latest version (Visual Studio 2019 as of this writing).



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18357) Retarget solution file to VS2019

2022-07-23 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18357:
---

 Summary: Retarget solution file to VS2019
 Key: HADOOP-18357
 URL: https://issues.apache.org/jira/browse/HADOOP-18357
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The Visual Studio version used by the winutils and native components in Hadoop 
Common is quite old. We need to retarget the solution and vcxproj files to use 
the latest version (Visual Studio 2019 as of this writing).



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-17196) Fix C/C++ standard warnings

2022-06-30 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17561073#comment-17561073
 ] 

Gautham Banasandra edited comment on HADOOP-17196 at 6/30/22 2:34 PM:
--

[~iwasakims] could you add the *-std* flags only for Linux? You can use the 
following snippet -

{code}
if (UNIX)
# Add the -std flag.
else (UNIX)
# retain whatever is current.
endif (UNIX)
{code}

With this, there won't be a regression on Windows, while addressing the build 
issue with Centos 7.
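A more complete sketch of the same idea (target and file names are illustrative) is to let CMake pick the right flag per compiler instead of hard-coding -std at all:

```cmake
cmake_minimum_required(VERSION 3.1)
project(hdfs_read_example C)

add_executable(hdfs_read hdfs_read.c)

# CMake emits -std=gnu99 for GCC/Clang and omits the flag for MSVC,
# avoiding the D9002 "ignoring unknown option" warning on Windows.
set_target_properties(hdfs_read PROPERTIES
  C_STANDARD 99
  C_STANDARD_REQUIRED ON
  C_EXTENSIONS ON   # gnu99 (GNU extensions) rather than strict c99
)
```

This keeps the Centos 7 build on the GNU standard it needs without regressing Windows.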


was (Author: gautham):
[~iwasakims] could you add the flags only for Linux? You can use the following 
snippet -

{code}
if (UNIX)
# Add the -std flag.
else (UNIX)
# retain whatever is current.
endif (UNIX)
{code}

> Fix C/C++ standard warnings
> ---
>
> Key: HADOOP-17196
> URL: https://issues.apache.org/jira/browse/HADOOP-17196
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.1.3
> Environment: Windows 10 Pro 64-bit
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.4.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> The C/C++ language standard is not specified in a compiler-agnostic manner. Even 
> though it's as straightforward as passing *-std* as a compiler argument, not 
> all the values are supported by all the compilers. For example, compilation 
> with the Visual C++ compiler on Windows with the *-std=gnu99* flag causes the 
> following warning -
> {code:java}
> cl : command line warning D9002: ignoring unknown option '-std=gnu99' 
> [Z:\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\main\native\libhdfs-examples\hdfs_read.vcxproj]
>  {code}
> Thus, we need to use the appropriate flags provided by CMake to specify the 
> C/C++ standards in a compiler-friendly way.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-17196) Fix C/C++ standard warnings

2022-06-30 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17561073#comment-17561073
 ] 

Gautham Banasandra commented on HADOOP-17196:
-

[~iwasakims] could you add the flags only for Linux? You can use the following 
snippet -

{code}
if (UNIX)
# Add the -std flag.
else (UNIX)
# retain whatever is current.
endif (UNIX)
{code}

> Fix C/C++ standard warnings
> ---
>
> Key: HADOOP-17196
> URL: https://issues.apache.org/jira/browse/HADOOP-17196
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.1.3
> Environment: Windows 10 Pro 64-bit
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.4.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> The C/C++ language standard is not specified in a compiler-agnostic manner. Even 
> though it's as straightforward as passing *-std* as a compiler argument, not 
> all the values are supported by all the compilers. For example, compilation 
> with the Visual C++ compiler on Windows with the *-std=gnu99* flag causes the 
> following warning -
> {code:java}
> cl : command line warning D9002: ignoring unknown option '-std=gnu99' 
> [Z:\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\main\native\libhdfs-examples\hdfs_read.vcxproj]
>  {code}
> Thus, we need to use the appropriate flags provided by CMake to specify the 
> C/C++ standards in a compiler-friendly way.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-17196) Fix C/C++ standard warnings

2022-06-29 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17560586#comment-17560586
 ] 

Gautham Banasandra edited comment on HADOOP-17196 at 6/29/22 5:27 PM:
--

[~iwasakims] The right fix here would be to upgrade to GCC 9 on Centos 7. Here 
are the steps -
1. Run these commands - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L37-L40
2. yum install devtoolset-9
3. Run this command - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L44-L45
4. Set these environment variables - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L48-L61

It would be the responsibility of the downstream projects to align with the 
dependencies of Hadoop, not the other way around. It would be detrimental to 
the evolution of the Hadoop project otherwise.
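The steps above can be condensed into a shell sketch (package names assume CentOS Software Collections; run as root):

```shell
# Enable Software Collections, which provide devtoolset-9 (GCC 9) on Centos 7.
yum install -y centos-release-scl
yum install -y devtoolset-9

# Put GCC 9 on the PATH for the current shell and subsequent build steps.
source /opt/rh/devtoolset-9/enable
gcc --version   # should now report gcc 9.x
```

The `source .../enable` step is equivalent to exporting the environment variables referenced in the Dockerfile links above.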


was (Author: gautham):
[~iwasakims] here's how you can upgrade to GCC 9 on Centos 7 -
1. Run these commands - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L37-L40
2. yum install devtoolset-9
3. Run this command - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L44-L45
4. Set these environment variables - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L48-L61

It would be the responsibility of the downstream projects to align with the 
dependencies of Hadoop, not the other way around. It would be detrimental to 
the evolution of the Hadoop project otherwise.

> Fix C/C++ standard warnings
> ---
>
> Key: HADOOP-17196
> URL: https://issues.apache.org/jira/browse/HADOOP-17196
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.1.3
> Environment: Windows 10 Pro 64-bit
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.2.2, 3.3.1, 3.4.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> The C/C++ language standard is not specified in a compiler-agnostic manner. Even 
> though it's as straightforward as passing *-std* as a compiler argument, not 
> all the values are supported by all the compilers. For example, compilation 
> with the Visual C++ compiler on Windows with the *-std=gnu99* flag causes the 
> following warning -
> {code:java}
> cl : command line warning D9002: ignoring unknown option '-std=gnu99' 
> [Z:\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\main\native\libhdfs-examples\hdfs_read.vcxproj]
>  {code}
> Thus, we need to use the appropriate flags provided by CMake to specify the 
> C/C++ standards in a compiler-friendly way.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-17196) Fix C/C++ standard warnings

2022-06-29 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17560586#comment-17560586
 ] 

Gautham Banasandra commented on HADOOP-17196:
-

[~iwasakims] here's how you can upgrade to GCC 9 on Centos 7 -
1. Run these commands - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L37-L40
2. yum install devtoolset-9
3. Run this command - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L44-L45
4. Set these environment variables - 
https://github.com/apache/hadoop/blob/2d133a54ac91bb961805915045b6ced2d06801ec/dev-support/docker/Dockerfile_centos_7#L48-L61

It would be the responsibility of the downstream projects to align with the 
dependencies of Hadoop, not the other way around. It would be detrimental to 
the evolution of the Hadoop project otherwise.

> Fix C/C++ standard warnings
> ---
>
> Key: HADOOP-17196
> URL: https://issues.apache.org/jira/browse/HADOOP-17196
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.1.3
> Environment: Windows 10 Pro 64-bit
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
> Fix For: 3.2.2, 3.3.1, 3.4.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> The C/C++ language standard is not specified in a compiler-agnostic manner. Even 
> though it's as straightforward as passing *-std* as a compiler argument, not 
> all the values are supported by all the compilers. For example, compilation 
> with the Visual C++ compiler on Windows with the *-std=gnu99* flag causes the 
> following warning -
> {code:java}
> cl : command line warning D9002: ignoring unknown option '-std=gnu99' 
> [Z:\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\main\native\libhdfs-examples\hdfs_read.vcxproj]
>  {code}
> Thus, we need to use the appropriate flags provided by CMake to specify the 
> C/C++ standards in a compiler-friendly way.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-11804) Shaded Hadoop client artifacts and minicluster

2022-06-21 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-11804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-11804:

Release Note: 
The `hadoop-client` Maven artifact available in 2.x releases pulls
Hadoop's transitive dependencies onto a Hadoop application's classpath.
This can be problematic if the versions of these transitive dependencies
conflict with the versions used by the application.

HADOOP-11804 adds new `hadoop-client-api` and
`hadoop-client-runtime` artifacts that shade Hadoop's dependencies
into a single jar. This avoids leaking Hadoop's dependencies onto the
application's classpath.
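As a sketch, an application would consume the shaded artifacts like this (the version shown is illustrative):

```xml
<!-- Compile against the shaded client API only. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-api</artifactId>
  <version>3.3.6</version>
</dependency>
<!-- Shaded implementation classes, needed only at runtime. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-runtime</artifactId>
  <version>3.3.6</version>
  <scope>runtime</scope>
</dependency>
```

Because the runtime artifact is `runtime`-scoped, none of Hadoop's relocated dependencies leak into the application's compile-time classpath.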

  was:


The `hadoop-client` Maven artifact available in 2.x releases pulls
Hadoop's transitive dependencies onto a Hadoop application's classpath.
This can be problematic if the versions of these transitive dependencies
conflict with the versions used by the application.

[HADOOP-11804](https://issues.apache.org/jira/browse/HADOOP-11804) adds
new `hadoop-client-api` and `hadoop-client-runtime` artifacts that
shade Hadoop's dependencies into a single jar. This avoids leaking
Hadoop's dependencies onto the application's classpath.


> Shaded Hadoop client artifacts and minicluster
> --
>
> Key: HADOOP-11804
> URL: https://issues.apache.org/jira/browse/HADOOP-11804
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: build
>Reporter: Sean Busbey
>Assignee: Sean Busbey
>Priority: Major
> Fix For: 3.0.0-alpha2
>
> Attachments: HADOOP-11804.1.patch, HADOOP-11804.10.patch, 
> HADOOP-11804.11.patch, HADOOP-11804.12.patch, HADOOP-11804.13.patch, 
> HADOOP-11804.14.patch, HADOOP-11804.2.patch, HADOOP-11804.3.patch, 
> HADOOP-11804.4.patch, HADOOP-11804.5.patch, HADOOP-11804.6.patch, 
> HADOOP-11804.7.patch, HADOOP-11804.8.patch, HADOOP-11804.9.patch, 
> hadoop-11804-client-test.tar.gz
>
>
> make a hadoop-client-api and hadoop-client-runtime that i.e. HBase can use to 
> talk with a Hadoop cluster without seeing any of the implementation 
> dependencies.
> see proposal on parent for details.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-17740) Set locale for Centos 7 and 8

2022-06-06 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17740?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-17740:

Description: The locale is set for Ubuntu Focal - 
https://github.com/apache/hadoop/blob/a234d00c1ce57427202d4c9587f891ec0164d10c/dev-support/docker/Dockerfile#L52-L54.
 The locale needs to be set for Centos 7 and 8 as well.  (was: The locale needs 
to be set for Centos 7 and 8.)

> Set locale for Centos 7 and 8
> -
>
> Key: HADOOP-17740
> URL: https://issues.apache.org/jira/browse/HADOOP-17740
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: Centos 7, Centos 8
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Major
>
> The locale is set for Ubuntu Focal - 
> https://github.com/apache/hadoop/blob/a234d00c1ce57427202d4c9587f891ec0164d10c/dev-support/docker/Dockerfile#L52-L54.
>  The locale needs to be set for Centos 7 and 8 as well.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-18268) Install Maven from Apache archives

2022-06-03 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-18268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17547914#comment-17547914
 ] 

Gautham Banasandra commented on HADOOP-18268:
-

My bad, I forgot to change the title before merging. Thanks for noticing this 
[~aajisaka].

> Install Maven from Apache archives
> --
>
> Key: HADOOP-18268
> URL: https://issues.apache.org/jira/browse/HADOOP-18268
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> The Jenkins CI for Hadoop is failing to build since it's unable to download 
> and install maven -
> {code}
> 22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
> 22:38:13  #11 
> sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
> 22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
> assign requested address
> 22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
> pkg-resolver/install-maven.sh centos:7]: exit code: 7
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4370/4/console
> We need to switch to using Maven from Apache archives to prevent such issues.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18274) Use CMake 3.19.0 in Debian 10

2022-06-02 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18274:

Description: 
HDFS Native Client fails to build on Debian 10 due to the following error -
{code}
[WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
(FetchContent_MakeAvailable):
[WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
[WARNING] 
[WARNING] 
[WARNING] -- Configuring incomplete, errors occurred!
{code}
Jenkins run - 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt

The cause of this issue is that the version of CMake on Debian 10 (which is 
installed through apt) is 3.13, while *FetchContent_MakeAvailable* was [introduced 
in CMake 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].

Thus, we upgrade CMake on Debian 10 by installing it through the 
[install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
 script from pkg-resolver (which installs CMake 3.19.0) instead of through apt.

  was:
HDFS Native Client fails to build on Debian 10 due to the following error -
{code}
[WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
(FetchContent_MakeAvailable):
[WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
[WARNING] 
[WARNING] 
[WARNING] -- Configuring incomplete, errors occurred!
{code}
Jenkins run - 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt

The cause of this issue is that the version of CMake on Debian 10 (which is 
installed through apt) is 3.13, while *FetchContent_MakeAvailable* was [introduced 
in CMake 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].

Thus, we upgrade CMake on Debian 10 by installing it through 
[install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
 from pkg-resolver (which installs CMake 3.19.0) instead of through apt.


> Use CMake 3.19.0 in Debian 10
> -
>
> Key: HADOOP-18274
> URL: https://issues.apache.org/jira/browse/HADOOP-18274
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>
> HDFS Native Client fails to build on Debian 10 due to the following error -
> {code}
> [WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
> (FetchContent_MakeAvailable):
> [WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
> [WARNING] 
> [WARNING] 
> [WARNING] -- Configuring incomplete, errors occurred!
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt
> The cause of this issue is that the version of CMake on Debian 10 (which is 
> installed through apt) is 3.13, while *FetchContent_MakeAvailable* was 
> [introduced in CMake 
> 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].
> Thus, we upgrade CMake on Debian 10 by installing it through the 
> [install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
>  script from pkg-resolver (which installs CMake 3.19.0) instead of through apt.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-18268) Install Maven from Apache archives

2022-06-02 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18268?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18268.
-
Fix Version/s: 3.4.0
   Resolution: Fixed

Merged PR https://github.com/apache/hadoop/pull/4373 to trunk.

> Install Maven from Apache archives
> --
>
> Key: HADOOP-18268
> URL: https://issues.apache.org/jira/browse/HADOOP-18268
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> The Jenkins CI for Hadoop is failing to build since it's unable to download 
> and install maven -
> {code}
> 22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
> 22:38:13  #11 
> sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
> 22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
> assign requested address
> 22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
> pkg-resolver/install-maven.sh centos:7]: exit code: 7
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4370/4/console
> We need to switch to using Maven from Apache archives to prevent such issues.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-18268) Install Maven from Apache archives

2022-06-02 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18268?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-18268 started by Gautham Banasandra.
---
> Install Maven from Apache archives
> --
>
> Key: HADOOP-18268
> URL: https://issues.apache.org/jira/browse/HADOOP-18268
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>  Labels: pull-request-available
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> The Jenkins CI for Hadoop is failing to build since it's unable to download 
> and install maven -
> {code}
> 22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
> 22:38:13  #11 
> sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
> 22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
> assign requested address
> 22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
> pkg-resolver/install-maven.sh centos:7]: exit code: 7
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4370/4/console
> We need to switch to using Maven from Apache archives to prevent such issues.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-18274) Use CMake 3.19.0 in Debian 10

2022-06-02 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-18274 started by Gautham Banasandra.
---
> Use CMake 3.19.0 in Debian 10
> -
>
> Key: HADOOP-18274
> URL: https://issues.apache.org/jira/browse/HADOOP-18274
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>
> HDFS Native Client fails to build on Debian 10 due to the following error -
> {code}
> [WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
> (FetchContent_MakeAvailable):
> [WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
> [WARNING] 
> [WARNING] 
> [WARNING] -- Configuring incomplete, errors occurred!
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt
> The cause of this issue is that the version of CMake on Debian 10 (which is 
> installed through apt) is 3.13, while *FetchContent_MakeAvailable* was 
> [introduced in CMake 
> 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].
> Thus, we upgrade CMake on Debian 10 by installing it through 
> [install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
>  from pkg-resolver (which installs CMake 3.19.0) instead of through apt.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-18274) Use CMake 3.19.0 in Debian 10

2022-06-02 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18274:

Description: 
HDFS Native Client fails to build on Debian 10 due to the following error -
{code}
[WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
(FetchContent_MakeAvailable):
[WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
[WARNING] 
[WARNING] 
[WARNING] -- Configuring incomplete, errors occurred!
{code}
Jenkins run - 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt

The cause of this issue is that the version of CMake on Debian 10 (which is 
installed through apt) is 3.13, while *FetchContent_MakeAvailable* was [introduced 
in CMake 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].

Thus, we upgrade CMake on Debian 10 by installing it through 
[install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
 from pkg-resolver (which installs CMake 3.19.0) instead of through apt.

  was:
HDFS Native Client fails to build on Debian 10 due to the following error -
{code}
[WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
(FetchContent_MakeAvailable):
[WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
[WARNING] 
[WARNING] 
[WARNING] -- Configuring incomplete, errors occurred!
{code}

Jenkins run - 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt

The cause of this issue is that the version of CMake on Debian 10 (which is 
installed through apt) is 3.13, while *FetchContent_MakeAvailable* was [introduced 
in CMake 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].

Thus, we upgrade CMake on Debian 10 by installing it through 
[install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
 from pkg-resolver (which installs CMake 3.19.0) instead of through apt.


> Use CMake 3.19.0 in Debian 10
> -
>
> Key: HADOOP-18274
> URL: https://issues.apache.org/jira/browse/HADOOP-18274
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>
> HDFS Native Client fails to build on Debian 10 due to the following error -
> {code}
> [WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
> (FetchContent_MakeAvailable):
> [WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
> [WARNING] 
> [WARNING] 
> [WARNING] -- Configuring incomplete, errors occurred!
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt
> The cause of this issue is that the version of CMake on Debian 10 (which is 
> installed through apt) is 3.13, while *FetchContent_MakeAvailable* was 
> [introduced in CMake 
> 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].
> Thus, we upgrade CMake on Debian 10 by installing it through 
> [install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
>  from pkg-resolver (which installs CMake 3.19.0) instead of through apt.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18274) Use CMake 3.19.0 in Debian 10

2022-06-02 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18274:
---

 Summary: Use CMake 3.19.0 in Debian 10
 Key: HADOOP-18274
 URL: https://issues.apache.org/jira/browse/HADOOP-18274
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


HDFS Native Client fails to build on Debian 10 due to the following error -
{code}
[WARNING] CMake Error at main/native/libhdfspp/CMakeLists.txt:68 
(FetchContent_MakeAvailable):
[WARNING]   Unknown CMake command "FetchContent_MakeAvailable".
[WARNING] 
[WARNING] 
[WARNING] -- Configuring incomplete, errors occurred!
{code}

Jenkins run - 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4371/2/artifact/out/branch-compile-hadoop-hdfs-project_hadoop-hdfs-native-client.txt

The cause of this issue is that the version of CMake on Debian 10 (which is 
installed through apt) is 3.13, while *FetchContent_MakeAvailable* was [introduced 
in CMake 3.14|https://cmake.org/cmake/help/v3.14/module/FetchContent.html].

Thus, we upgrade CMake on Debian 10 by installing it through 
[install-cmake.sh|https://github.com/apache/hadoop/blob/34a973a90ef89b633c9b5c13a79aa1ac11c92eb5/dev-support/docker/pkg-resolver/install-cmake.sh]
 from pkg-resolver (which installs CMake 3.19.0) instead of through apt.
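As a minimal sketch of the API involved (the fetched dependency here is illustrative; libhdfspp declares its own), *FetchContent_MakeAvailable* requires CMake 3.14 or newer:

```cmake
# FetchContent_MakeAvailable was introduced in CMake 3.14; CMake 3.13 from
# Debian 10's apt repositories fails with "Unknown CMake command".
cmake_minimum_required(VERSION 3.14)
project(fetch_demo)

include(FetchContent)
FetchContent_Declare(
  googletest  # illustrative dependency, not what libhdfspp actually fetches
  GIT_REPOSITORY https://github.com/google/googletest.git
  GIT_TAG v1.14.0
)
# Downloads the dependency and adds its targets to this build.
FetchContent_MakeAvailable(googletest)
```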



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18268) Install Maven from Apache archives

2022-05-29 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18268:
---

 Summary: Install Maven from Apache archives
 Key: HADOOP-18268
 URL: https://issues.apache.org/jira/browse/HADOOP-18268
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


The Jenkins CI for Hadoop is failing to build since it's unable to download and 
install Maven -
{code}
22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
22:38:13  #11 
sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
assign requested address
22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
pkg-resolver/install-maven.sh centos:7]: exit code: 7
{code}

We need to switch to using Maven from Apache archives to prevent such issues.
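Apache keeps every Maven release at a stable path under archive.apache.org, so an install script can derive the download URL from the version alone. A hedged sketch (the helper name and the 3.8.6 version are illustrative, not taken from the actual install-maven.sh):

```shell
# Build the Apache archive URL for a Maven 3.x binary release; the archive
# layout is stable, unlike round-robin mirror hosts that may be unreachable.
maven_archive_url() {
  local version="$1"
  echo "https://archive.apache.org/dist/maven/maven-3/${version}/binaries/apache-maven-${version}-bin.tar.gz"
}

# Usage sketch: curl -fSL "$(maven_archive_url 3.8.6)" -o /tmp/maven.tar.gz
```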






[jira] [Updated] (HADOOP-18268) Install Maven from Apache archives

2022-05-29 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18268?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18268:

Description: 
The Jenkins CI for Hadoop is failing to build since it's unable to download and 
install Maven -
{code}
22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
22:38:13  #11 
sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
assign requested address
22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
pkg-resolver/install-maven.sh centos:7]: exit code: 7
{code}

Jenkins run - 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4370/4/console
We need to switch to using Maven from Apache archives to prevent such issues.

  was:
The Jenkins CI for Hadoop is failing to build since it's unable to download and 
install Maven -
{code}
22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
22:38:13  #11 
sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
assign requested address
22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
pkg-resolver/install-maven.sh centos:7]: exit code: 7
{code}

We need to switch to using Maven from Apache archives to prevent such issues.


> Install Maven from Apache archives
> --
>
> Key: HADOOP-18268
> URL: https://issues.apache.org/jira/browse/HADOOP-18268
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>
> The Jenkins CI for Hadoop is failing to build since it's unable to download 
> and install Maven -
> {code}
> 22:38:13  #11 [ 7/16] RUN pkg-resolver/install-maven.sh centos:7
> 22:38:13  #11 
> sha256:8b1823a6197611693af5daa2888f195db76ae5e9d0765f799becc7e7d5f7b019
> 22:40:25  #11 131.5 curl: (7) Failed to connect to 2403:8940:3:1::f: Cannot 
> assign requested address
> 22:40:25  #11 ERROR: executor failed running [/bin/bash --login -c 
> pkg-resolver/install-maven.sh centos:7]: exit code: 7
> {code}
> Jenkins run - 
> https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4370/4/console
> We need to switch to using Maven from Apache archives to prevent such issues.






[jira] [Assigned] (HADOOP-18219) Fix shadedclient test failure

2022-05-01 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18219?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra reassigned HADOOP-18219:
---

Assignee: PJ Fanning  (was: Gautham Banasandra)

> Fix shadedclient test failure
> -
>
> Key: HADOOP-18219
> URL: https://issues.apache.org/jira/browse/HADOOP-18219
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.4.0
> Environment: Debian 10
>Reporter: Gautham Banasandra
>Assignee: PJ Fanning
>Priority: Blocker
>
> Two of the shaded client tests are failing on Debian 10 ever since this 
> commit - 
> https://github.com/apache/hadoop/commit/63187083cc3b9bb1c1e90e692e271958561f9cc8.
>  The failures are as follows -
> 1st test failure -
> {code}
> [INFO] Running org.apache.hadoop.example.ITUseMiniCluster
> [ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 
> 18.315 s <<< FAILURE! - in org.apache.hadoop.example.ITUseMiniCluster
> [ERROR] useWebHDFS(org.apache.hadoop.example.ITUseMiniCluster)  Time elapsed: 
> 12.048 s  <<< ERROR!
> org.apache.hadoop.yarn.exceptions.YarnRuntimeException: 
> org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
>   at 
> org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:384)
>   at 
> org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:129)
>   at 
> org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:500)
>   at 
> org.apache.hadoop.service.AbstractService.start(AbstractService.java:195)
>   at 
> org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:123)
>   at 
> org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:333)
>   at 
> org.apache.hadoop.service.AbstractService.start(AbstractService.java:195)
>   at 
> org.apache.hadoop.example.ITUseMiniCluster.clusterUp(ITUseMiniCluster.java:84)
>   at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>   at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>   at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>   at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>   at 
> org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
>   at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
>   at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>   at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>   at 
> org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>   at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>   at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>   at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>   at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>   at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>   at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>   at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>   at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>   at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>   at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>   at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
>   at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
>   at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
>   at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> Caused by: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http 
> server
>   at org.apache.hadoop.yarn.weba

[jira] [Created] (HADOOP-18219) Fix shadedclient test failure

2022-05-01 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18219:
---

 Summary: Fix shadedclient test failure
 Key: HADOOP-18219
 URL: https://issues.apache.org/jira/browse/HADOOP-18219
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 3.4.0
 Environment: Debian 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Two of the shaded client tests are failing on Debian 10 ever since this commit 
- 
https://github.com/apache/hadoop/commit/63187083cc3b9bb1c1e90e692e271958561f9cc8.
 The failures are as follows -

1st test failure -
{code}
[INFO] Running org.apache.hadoop.example.ITUseMiniCluster
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 18.315 
s <<< FAILURE! - in org.apache.hadoop.example.ITUseMiniCluster
[ERROR] useWebHDFS(org.apache.hadoop.example.ITUseMiniCluster)  Time elapsed: 
12.048 s  <<< ERROR!
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: 
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
at 
org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:384)
at 
org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:129)
at 
org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:500)
at 
org.apache.hadoop.service.AbstractService.start(AbstractService.java:195)
at 
org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:123)
at 
org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:333)
at 
org.apache.hadoop.service.AbstractService.start(AbstractService.java:195)
at 
org.apache.hadoop.example.ITUseMiniCluster.clusterUp(ITUseMiniCluster.java:84)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at 
org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at 
org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
at 
org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
at 
org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
Caused by: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http 
server
at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:479)
at 
org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1443)
at 
org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1552)
at 
org.apache.hadoop.service.AbstractS

[jira] [Updated] (HADOOP-18155) Refactor tests in TestFileUtil

2022-03-10 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18155:

Fix Version/s: 3.4.0

> Refactor tests in TestFileUtil
> --
>
> Key: HADOOP-18155
> URL: https://issues.apache.org/jira/browse/HADOOP-18155
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Trivial
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> We need to ensure that we check the results of file operations whenever we 
> invoke *mkdir*, *deleteFile*, etc., and assert them right there before 
> proceeding. Also, we need to ensure that some of the relevant FileSystem 
> APIs don't return null.






[jira] [Resolved] (HADOOP-18155) Refactor tests in TestFileUtil

2022-03-10 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18155.
-
Resolution: Fixed

Merged PR https://github.com/apache/hadoop/pull/4053 to trunk.

> Refactor tests in TestFileUtil
> --
>
> Key: HADOOP-18155
> URL: https://issues.apache.org/jira/browse/HADOOP-18155
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Trivial
>  Labels: pull-request-available
>  Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> We need to ensure that we check the results of file operations whenever we 
> invoke *mkdir*, *deleteFile*, etc., and assert them right there before 
> proceeding. Also, we need to ensure that some of the relevant FileSystem 
> APIs don't return null.






[jira] [Updated] (HADOOP-18155) Refactor tests in TestFileUtil

2022-03-07 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18155:

Description: We need to ensure that we check the results of file operations 
whenever we invoke *mkdir*, *deleteFile*, etc., and assert them right there 
before proceeding. Also, we need to ensure that some of the relevant 
FileSystem APIs don't return null.  (was: We need to ensure that we check the 
results of file operations whenever we invoke *mkdir*, *deleteFile* etc. and 
assert them right there before proceeding on.)
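The check-immediately pattern described above (the actual tests are Java/JUnit in TestFileUtil) can be illustrated with a small shell analogue; `checked_mkdir` is a hypothetical helper, not Hadoop code:

```shell
# Check the result of a file operation at the call site instead of assuming
# success and failing obscurely later.
checked_mkdir() {
  if ! mkdir -p "$1" 2>/dev/null; then
    echo "mkdir failed for: $1" >&2
    return 1
  fi
}
```

The point is that every `mkdir`/delete call is asserted where it happens, so a failing operation is reported at its source rather than surfacing as an unrelated assertion later in the test.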

> Refactor tests in TestFileUtil
> --
>
> Key: HADOOP-18155
> URL: https://issues.apache.org/jira/browse/HADOOP-18155
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: common
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Trivial
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> We need to ensure that we check the results of file operations whenever we 
> invoke *mkdir*, *deleteFile*, etc., and assert them right there before 
> proceeding. Also, we need to ensure that some of the relevant FileSystem 
> APIs don't return null.






[jira] [Created] (HADOOP-18155) Refactor tests in TestFileUtil

2022-03-07 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18155:
---

 Summary: Refactor tests in TestFileUtil
 Key: HADOOP-18155
 URL: https://issues.apache.org/jira/browse/HADOOP-18155
 Project: Hadoop Common
  Issue Type: Improvement
  Components: common
Affects Versions: 3.4.0
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


We need to ensure that we check the results of file operations whenever we 
invoke *mkdir*, *deleteFile*, etc., and assert them right there before 
proceeding.






[jira] [Resolved] (HADOOP-18151) Switch the baseurl for Centos 8

2022-03-04 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-18151.
-
Fix Version/s: 3.4.0
   Resolution: Fixed

Merged PR https://github.com/apache/hadoop/pull/4047 to trunk.

> Switch the baseurl for Centos 8
> ---
>
> Key: HADOOP-18151
> URL: https://issues.apache.org/jira/browse/HADOOP-18151
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
> Environment: Centos 8
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 2h
>  Remaining Estimate: 0h
>
> Centos 8 has reached its End-of-Life and thus its packages are no longer 
> accessible from mirror.centos.org. We need to switch the *baseurl* to 
> vault.centos.org, where the packages are archived.
> Please see https://www.centos.org/centos-linux-eol/ for more details.






[jira] [Created] (HADOOP-18151) Switch the baseurl for Centos 8

2022-03-03 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18151:
---

 Summary: Switch the baseurl for Centos 8
 Key: HADOOP-18151
 URL: https://issues.apache.org/jira/browse/HADOOP-18151
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
 Environment: Centos 8
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Centos 8 has reached its End-of-Life and thus its packages are no longer 
accessible from mirror.centos.org. We need to switch the *baseurl* to 
vault.centos.org, where the packages are archived.

Please see https://www.centos.org/centos-linux-eol/ for more details.
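The switch amounts to rewriting the repo definitions. A sketch of the commonly used sed-based fix, emitting to stdout rather than editing in place; exact repo file names and contents vary by image:

```shell
# Rewrite a yum .repo definition so an EOL CentOS 8 host pulls packages from
# vault.centos.org: disable mirrorlist lines and enable a vault baseurl.
switch_to_vault() {
  sed -e 's|^mirrorlist=|#mirrorlist=|' \
      -e 's|^#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|' \
      "$1"
}

# Usage sketch: switch_to_vault /etc/yum.repos.d/CentOS-Linux-BaseOS.repo
```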






[jira] [Created] (HADOOP-18135) Produce Windows binaries of Hadoop

2022-02-20 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18135:
---

 Summary: Produce Windows binaries of Hadoop
 Key: HADOOP-18135
 URL: https://issues.apache.org/jira/browse/HADOOP-18135
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


We currently only provide Linux libraries and binaries. We need to provide the 
same for Windows.






[jira] [Updated] (HADOOP-18134) Run CI for Windows 10

2022-02-20 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18134:

Description: Need to run the Jenkins Precommit CI for Windows 10 
environment so that we catch any breaking changes prior to merging them.  (was: 
Need to run the Jenkins Precommit CI for Windows 10 environment so that we 
catch any breaking changes.)

> Run CI for Windows 10
> -
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> Need to run the Jenkins Precommit CI for Windows 10 environment so that we 
> catch any breaking changes prior to merging them.






[jira] [Updated] (HADOOP-18134) Run CI for Windows 10

2022-02-20 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18134:

Description: Need to run the Jenkins Precommit CI for Windows 10 
environment so that we catch any breaking changes.  (was: Need to write a 
Dockerfile for Windows 10 that creates a Docker image for building Hadoop on 
the Windows 10 platform.)

> Run CI for Windows 10
> -
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> Need to run the Jenkins Precommit CI for Windows 10 environment so that we 
> catch any breaking changes.






[jira] [Updated] (HADOOP-18134) Run CI for Windows 10

2022-02-20 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18134?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18134:

Language: Jenkins  (was: Docker)

> Run CI for Windows 10
> -
>
> Key: HADOOP-18134
> URL: https://issues.apache.org/jira/browse/HADOOP-18134
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.4.0
> Environment: Windows 10
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>
> Need to write a Dockerfile for Windows 10 that creates a Docker image for 
> building Hadoop on the Windows 10 platform.






[jira] [Created] (HADOOP-18134) Run CI for Windows 10

2022-02-20 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18134:
---

 Summary: Run CI for Windows 10
 Key: HADOOP-18134
 URL: https://issues.apache.org/jira/browse/HADOOP-18134
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Need to write a Dockerfile for Windows 10 that creates a Docker image for 
building Hadoop on the Windows 10 platform.






[jira] [Created] (HADOOP-18133) Add Dockerfile for Windows 10

2022-02-20 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18133:
---

 Summary: Add Dockerfile for Windows 10
 Key: HADOOP-18133
 URL: https://issues.apache.org/jira/browse/HADOOP-18133
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Affects Versions: 3.4.0
 Environment: Windows 10
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra


Need to write a Dockerfile for Windows 10 that creates a Docker image for 
building Hadoop on the Windows 10 platform.






[jira] [Commented] (HADOOP-13941) hadoop-client-minicluster build error creating shaded jar duplicate entry

2021-12-30 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17467116#comment-17467116
 ] 

Gautham Banasandra commented on HADOOP-13941:
-

I got this issue when I ran -
{code:bash}
$ mvn package -Dhttps.protocols=TLSv1.2 -DskipTests -Pnative,dist 
-Drequire.fuse -Drequire.openssl -Drequire.snappy -Drequire.valgrind 
-Drequire.zstd -Drequire.test.libhadoop -Pyarn-ui -Dtar
{code}

It went away with -
{code:bash}
$ mvn clean package -Dhttps.protocols=TLSv1.2 -DskipTests -Pnative,dist 
-Drequire.fuse -Drequire.openssl -Drequire.snappy -Drequire.valgrind 
-Drequire.zstd -Drequire.test.libhadoop -Pyarn-ui -Dtar
{code}

> hadoop-client-minicluster build error creating shaded jar duplicate entry
> -
>
> Key: HADOOP-13941
> URL: https://issues.apache.org/jira/browse/HADOOP-13941
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.0.0-alpha2
> Environment: CentOS Linux release 7.2.1511 (Core) 
> $ mvn -version
> Apache Maven 3.0.5 (Red Hat 3.0.5-17)
> Maven home: /usr/share/maven
> Java version: 1.8.0_111, vendor: Oracle Corporation
> Java home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-2.b15.el7_3.x86_64/jre
> Default locale: en_US, platform encoding: UTF-8
> OS name: "linux", version: "3.10.0-327.36.3.el7.x86_64", arch: "amd64", 
> family: "unix"
>Reporter: John Zhuge
>Priority: Major
>
> {noformat}
> $ mvn clean package install -Pdist -Dtar -DskipTests -Dmaven.javadoc.skip
> ...
> [INFO] Apache Hadoop Client Test Minicluster . FAILURE [5:57.062s]
> [INFO] Apache Hadoop Client Packaging Invariants for Test  SKIPPED
> [INFO] Apache Hadoop Client Packaging Integration Tests .. SKIPPED
> [INFO] Apache Hadoop Distribution  SKIPPED
> [INFO] Apache Hadoop Client Modules .. SKIPPED
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time: 15:13.535s
> [INFO] Finished at: Fri Dec 23 11:31:59 PST 2016
> [INFO] Final Memory: 324M/1824M
> [INFO] 
> 
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-shade-plugin:2.4.3:shade (default) on project 
> hadoop-client-minicluster: Error creating shaded jar: duplicate entry: 
> META-INF/services/org.apache.hadoop.shaded.org.eclipse.jetty.http.HttpFieldPreEncoder
>  -> [Help 1]
> {noformat}






[jira] [Commented] (HADOOP-17509) Parallelize building of dependencies

2021-12-03 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17453061#comment-17453061
 ] 

Gautham Banasandra commented on HADOOP-17509:
-

[~smeng] these changes are present in trunk as well. Please see -
# 
https://github.com/apache/hadoop/blob/trunk/dev-support/docker/pkg-resolver/install-protobuf.sh#L50
# 
https://github.com/apache/hadoop/blob/trunk/dev-support/docker/pkg-resolver/install-intel-isa-l.sh#L51

I refactored the Dockerfile when I wrote [pkg-resolver 
module|https://github.com/apache/hadoop/tree/trunk/dev-support/docker/pkg-resolver].

> Parallelize building of dependencies
> 
>
> Key: HADOOP-17509
> URL: https://issues.apache.org/jira/browse/HADOOP-17509
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.3.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 3.3.1, 3.4.0
>
>  Time Spent: 3h 10m
>  Remaining Estimate: 0h
>
> Need to use make -j$(nproc) to parallelize the building of the Protocol Buffers 
> and Intel ISA-L dependencies.
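A hedged sketch of that flag with a portable fallback; `build_jobs` is an illustrative helper, not part of the Hadoop build scripts:

```shell
# Pick one make job per online CPU; fall back to getconf, then to 1, so the
# build still works where nproc is unavailable.
build_jobs() {
  nproc 2>/dev/null || getconf _NPROCESSORS_ONLN 2>/dev/null || echo 1
}

# Usage sketch inside a dependency build step:
#   make -j"$(build_jobs)" && make install
```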






[jira] [Updated] (HADOOP-18007) Use platform specific endpoints for CI report

2021-11-12 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18007:

Description: 
Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in [Centos 7's 
hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Please note that this issue applies to all the links that contain "out/" that 
appear in the Yetus summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.





To fix this issue, we need to do the following two things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
 |

We need a way to change the URL to /artifact/*centos-7*/out/, 
/artifact/*ubuntu-focal*/out/ and so on.
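
One hedged way to get such per-platform endpoints, sketched in the same Jenkinsfile syntax (PLATFORM is an assumed variable holding e.g. "centos-7" or "ubuntu-focal"): stage the Yetus output under a platform-named directory before archiving, so the archived path itself becomes /artifact/<platform>/out/.

{code}
// Illustrative sketch only: copy out/ under a platform-named
// directory so each platform's artifacts get distinct URLs.
sh "mkdir -p ${env.PLATFORM} && cp -r out ${env.PLATFORM}/"
archiveArtifacts artifacts: "${env.PLATFORM}/out/**"
{code}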

  was:
Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in 
[Centos 7's hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Note that this issue applies to every link containing "out/" in the Yetus 
summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.



To fix this issue, we need to do the following two things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*a

[jira] [Updated] (HADOOP-18007) Use platform specific endpoints for CI report

2021-11-12 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18007:

Description: 
Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in 
[Centos 7's hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Note that this issue applies to every link containing "out/" in the Yetus 
summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.



To fix this issue, we need to do the following two things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
 |

We need a way to change the URL to /artifact/*centos-7*/out/, 
/artifact/*ubuntu-focal*/out/ and so on.

  was:
Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in 
[Centos 7's hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Note that this issue applies to every link containing "out/" in the Yetus 
summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.

To fix this issue, we need to do 2 things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/patch-unit-

[jira] [Updated] (HADOOP-18007) Use platform specific endpoints for CI report

2021-11-12 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-18007:

Description: 
Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in 
[Centos 7's hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Note that this issue applies to every link containing "out/" in the Yetus 
summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.

To fix this issue, we need to do 2 things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
 |

We need a way to change the URL to /artifact/*centos-7*/out/, 
/artifact/*ubuntu-focal*/out/ and so on.

  was:
Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in 
[Centos 7's hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Note that this issue applies to every link containing "out/" in the Yetus 
summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.

To fix this issue, we need to do 2 things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/patch-unit-hadoop-yarn-projec

[jira] [Created] (HADOOP-18007) Use platform specific endpoints for CI report

2021-11-12 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-18007:
---

 Summary: Use platform specific endpoints for CI report
 Key: HADOOP-18007
 URL: https://issues.apache.org/jira/browse/HADOOP-18007
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
 Environment: Centos 7, Centos 8, Debian 10, Ubuntu Focal
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra
 Attachments: image-2021-11-12-21-13-18-997.png

Consider the Github comment by hadoop-yetus for different platform runs -

||Platform||hadoop-yetus report||
|Centos 7|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110|
|Centos 8|https://github.com/apache/hadoop/pull/3563#issuecomment-947239385|
|Debian 10|https://github.com/apache/hadoop/pull/3563#issuecomment-947340406|
|Ubuntu Focal|https://github.com/apache/hadoop/pull/3563#issuecomment-947464004|

 
Notice that the Docker subsystem points to the same file for all the platforms -

!image-2021-11-12-21-13-18-997.png!

To illustrate the issue: clicking the link to the Dockerfile in 
[Centos 7's hadoop-yetus 
report|https://github.com/apache/hadoop/pull/3563#issuecomment-947151110] 
navigates to Ubuntu Focal's Dockerfile (since Ubuntu Focal was the most 
recently run platform for that pipeline run).
Note that this issue applies to every link containing "out/" in the Yetus 
summary.

As a side note, this issue doesn't happen when the Yetus run fails. Consider 
https://github.com/apache/hadoop/pull/3563#issuecomment-946215556. Here the 
Yetus run has failed for Centos 7. The link to Dockerfile points to Centos 7's 
Dockerfile since Centos 7 is the last platform that was run for that pipeline.

To fix this issue, we need to do 2 things -
*+Archive the artifacts even when the Yetus run is successful+*
Currently, we're archiving the artifacts only when the run fails - 
https://github.com/apache/hadoop/blob/bccf2f3ef4c8f09f010656f9061a4e323daf132b/dev-support/Jenkinsfile#L142-L148
 (Hence, we're able to navigate to the right Dockerfile when the Yetus run 
fails for a platform). We need to archive the artifacts even when the run 
succeeds so that we're able to navigate to it irrespective of success/failure 
of the Yetus run.

*+Expose different endpoints for each platform+*
At the end of Yetus's pre-commit run, we see the following table -

|| Subsystem || Report/Notes ||
|  cc  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/results-compile-cc-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt
 |
|  unit  | 
https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3650/1/*artifact/out*/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
 |

We need a way to change the URL to */artifact/centos-7/out/*, 
*/artifact/ubuntu-focal/out/* and so on.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-17986) Disable JIRA plugin for YETUS on Hadoop

2021-11-12 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17986?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-17986.
-
Resolution: Abandoned

Abandoning this due to 
https://github.com/apache/hadoop/pull/3608#issuecomment-962122351

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17986
> URL: https://issues.apache.org/jira/browse/HADOOP-17986
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.3.1
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Resolved] (HADOOP-17987) Disable JIRA plugin for YETUS on Hadoop

2021-11-12 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17987?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-17987.
-
Resolution: Abandoned

Abandoning this due to 
https://github.com/apache/hadoop/pull/3609#issuecomment-962121826.

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17987
> URL: https://issues.apache.org/jira/browse/HADOOP-17987
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.3.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Resolved] (HADOOP-17993) Disable JIRA plugin for YETUS on Hadoop

2021-11-12 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-17993.
-
Fix Version/s: 3.2.3
   Resolution: Fixed

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17993
> URL: https://issues.apache.org/jira/browse/HADOOP-17993
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.2.3
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.2.3
>
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Resolved] (HADOOP-17992) Disable JIRA plugin for YETUS on Hadoop

2021-11-10 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-17992.
-
Fix Version/s: 2.10.2
   Resolution: Fixed

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17992
> URL: https://issues.apache.org/jira/browse/HADOOP-17992
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 2.10.2
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Resolved] (HADOOP-17988) Disable JIRA plugin for YETUS on Hadoop

2021-11-09 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-17988.
-
Fix Version/s: 3.3.3
   Resolution: Fixed

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17988
> URL: https://issues.apache.org/jira/browse/HADOOP-17988
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.3.3
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.3.3
>
>  Time Spent: 2h 20m
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Updated] (HADOOP-17988) Disable JIRA plugin for YETUS on Hadoop

2021-11-09 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-17988:

Affects Version/s: 3.3.3
   (was: 3.3.0)

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17988
> URL: https://issues.apache.org/jira/browse/HADOOP-17988
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.3.3
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
>  Time Spent: 2h 20m
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-17880) Build Hadoop on Centos 7

2021-11-08 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17440425#comment-17440425
 ] 

Gautham Banasandra commented on HADOOP-17880:
-

Thanks for the PR [~baizhendong]. I've merged your PR 3535 to branch-2.10.

> Build Hadoop on Centos 7
> 
>
> Key: HADOOP-17880
> URL: https://issues.apache.org/jira/browse/HADOOP-17880
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
> Environment: mac os x86_64
>Reporter: baizhendong
>Priority: Major
>  Labels: pull-request-available
> Fix For: 2.10.2
>
>  Time Spent: 13h 50m
>  Remaining Estimate: 0h
>
> Getting Hadoop to build on Centos 7 will greatly benefit the community. Here, 
> we aim to provide a Dockerfile that builds out the image with all the 
> dependencies needed to build Hadoop on Centos 7.
>  






[jira] [Resolved] (HADOOP-17880) Build Hadoop on Centos 7

2021-11-08 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17880?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra resolved HADOOP-17880.
-
Fix Version/s: 2.10.2
   Resolution: Fixed

> Build Hadoop on Centos 7
> 
>
> Key: HADOOP-17880
> URL: https://issues.apache.org/jira/browse/HADOOP-17880
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
> Environment: mac os x86_64
>Reporter: baizhendong
>Priority: Major
>  Labels: pull-request-available
> Fix For: 2.10.2
>
>  Time Spent: 13h 50m
>  Remaining Estimate: 0h
>
> Getting Hadoop to build on Centos 7 will greatly benefit the community. Here, 
> we aim to provide a Dockerfile that builds out the image with all the 
> dependencies needed to build Hadoop on Centos 7.
>  






[jira] [Comment Edited] (HADOOP-17985) Disable JIRA plugin for YETUS on Hadoop

2021-11-05 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17439410#comment-17439410
 ] 

Gautham Banasandra edited comment on HADOOP-17985 at 11/5/21, 6:27 PM:
---

Thanks [~aajisaka]. I've created the following backports, could you please 
review them -
# branch-2.10 - https://github.com/apache/hadoop/pull/3623
# branch-3.2 - https://github.com/apache/hadoop/pull/3624

I'll abandon the PRs for branches 3.3.1 and 3.3.0.


was (Author: gautham):
Thanks [~aajisaka]. I've created the backport for branch-2.10 - 
https://github.com/apache/hadoop/pull/3623. Could you please review it? I'll 
abandon the PRs for branches 3.3.1 and 3.3.0.

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17985
> URL: https://issues.apache.org/jira/browse/HADOOP-17985
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Updated] (HADOOP-17993) Disable JIRA plugin for YETUS on Hadoop

2021-11-05 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-17993:

Fix Version/s: (was: 3.4.0)

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17993
> URL: https://issues.apache.org/jira/browse/HADOOP-17993
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.2.3
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Created] (HADOOP-17993) Disable JIRA plugin for YETUS on Hadoop

2021-11-05 Thread Gautham Banasandra (Jira)
Gautham Banasandra created HADOOP-17993:
---

 Summary: Disable JIRA plugin for YETUS on Hadoop
 Key: HADOOP-17993
 URL: https://issues.apache.org/jira/browse/HADOOP-17993
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.4.0
Reporter: Gautham Banasandra
Assignee: Gautham Banasandra
 Fix For: 3.4.0


I’ve been noticing an issue with Jenkins CI where a file jira-json goes missing 
all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 (apache.org)
{code}
[2021-10-27T17:52:58.787Z] Processing: 
https://github.com/apache/hadoop/pull/3588
[2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
[2021-10-27T17:52:58.787Z] https://api.github.com/repos/apache/hadoop/pulls/3588
[2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
[2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
[2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
[2021-10-27T17:52:59.814Z] awk: cannot open 
/home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
 (No such file or directory)
[2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
issue status is not matched with "Patch Available".
[2021-10-27T17:52:59.814Z]
{code}
This causes the pipeline run to fail. I’ve seen this in my multiple attempts to 
re-run the CI on my PR –
 # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
(apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
 # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
(apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
 # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
(apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]

The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling the 
*jira* plugin to fix this issue.
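
The fix described above amounts to removing *jira* from the plugin list that YETUS test-patch is invoked with. As a hypothetical sketch only (the flag spelling and surrounding variables in Hadoop's actual Jenkinsfile may differ), YETUS supports a subtractive plugin list:

```shell
# Sketch: invoke test-patch with every plugin EXCEPT jira enabled, so the
# precommit run no longer tries to read the jira-json status file at all.
YETUS_ARGS=()
YETUS_ARGS+=("--plugins=all,-jira")          # subtractive plugin syntax
YETUS_ARGS+=("--patch-dir=${WORKSPACE}/out") # assumed workspace layout
"${YETUS_HOME}/bin/test-patch" "${YETUS_ARGS[@]}" "${GITHUB_PR_URL}"
```

With the jira plugin disabled, the "issue status is not matched with \"Patch Available\"" check is skipped entirely, which is the desired behavior since the hadoop-multibranch pipeline is driven by GitHub PRs, not ASF JIRA.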






[jira] [Updated] (HADOOP-17993) Disable JIRA plugin for YETUS on Hadoop

2021-11-05 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-17993:

Affects Version/s: (was: 3.4.0)
   3.2.3

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17993
> URL: https://issues.apache.org/jira/browse/HADOOP-17993
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.2.3
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Commented] (HADOOP-17985) Disable JIRA plugin for YETUS on Hadoop

2021-11-05 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17985?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17439410#comment-17439410
 ] 

Gautham Banasandra commented on HADOOP-17985:
-

Thanks [~aajisaka]. I've created the backport for branch-2.10 - 
https://github.com/apache/hadoop/pull/3623. Could you please review it? I'll 
abandon the PRs for branches 3.3.1 and 3.3.0.

> Disable JIRA plugin for YETUS on Hadoop
> ---
>
> Key: HADOOP-17985
> URL: https://issues.apache.org/jira/browse/HADOOP-17985
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.4.0
>Reporter: Gautham Banasandra
>Assignee: Gautham Banasandra
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> I’ve been noticing an issue with Jenkins CI where a file jira-json goes 
> missing all of a sudden – jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)
> {code}
> [2021-10-27T17:52:58.787Z] Processing: 
> https://github.com/apache/hadoop/pull/3588
> [2021-10-27T17:52:58.787Z] GITHUB PR #3588 is being downloaded from
> [2021-10-27T17:52:58.787Z] 
> https://api.github.com/repos/apache/hadoop/pulls/3588
> [2021-10-27T17:52:58.787Z] JSON data at Wed Oct 27 17:52:55 UTC 2021
> [2021-10-27T17:52:58.787Z] Patch data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:58.787Z] Diff data at Wed Oct 27 17:52:56 UTC 2021
> [2021-10-27T17:52:59.814Z] awk: cannot open 
> /home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-3588/centos-7/out/jira-json
>  (No such file or directory)
> [2021-10-27T17:52:59.814Z] ERROR: https://github.com/apache/hadoop/pull/3588 
> issue status is not matched with "Patch Available".
> [2021-10-27T17:52:59.814Z]
> {code}
> This causes the pipeline run to fail. I’ve seen this in my multiple attempts 
> to re-run the CI on my PR –
>  # After 45 minutes – [jenkins / hadoop-multibranch / PR-3588 / #1 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/1/pipeline/]
>  # After 1 minute – [jenkins / hadoop-multibranch / PR-3588 / #2 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/2/pipeline/]
>  # After 17 minutes – [jenkins / hadoop-multibranch / PR-3588 / #3 
> (apache.org)|https://ci-hadoop.apache.org/blue/organizations/jenkins/hadoop-multibranch/detail/PR-3588/3/pipeline/]
> The hadoop-multibranch pipeline doesn't use ASF JIRA, thus, we're disabling 
> the *jira* plugin to fix this issue.






[jira] [Comment Edited] (HADOOP-17880) Build Hadoop on Centos 7

2021-11-05 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17439408#comment-17439408
 ] 

Gautham Banasandra edited comment on HADOOP-17880 at 11/5/21, 6:05 PM:
---

[~baizhendong] I've taken the liberty to alter the title and description as per 
[~iwasakims]'s suggestions above. The PR looks good to me. I've approved it. 
I'll merge it tomorrow if there are no further comments.


was (Author: gautham):
[~baizhendong] I've taken the liberty to alter the title and description as per 
[~iwasakims]'s suggestions above.

> Build Hadoop on Centos 7
> 
>
> Key: HADOOP-17880
> URL: https://issues.apache.org/jira/browse/HADOOP-17880
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
> Environment: mac os x86_64
>Reporter: baizhendong
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 13h
>  Remaining Estimate: 0h
>
> Getting Hadoop to build on Centos 7 will greatly benefit the community. Here, 
> we aim to provide a Dockerfile that builds out the image with all the 
> dependencies needed to build Hadoop on Centos 7.
>  






[jira] [Commented] (HADOOP-17880) Build Hadoop on Centos 7

2021-11-05 Thread Gautham Banasandra (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17439408#comment-17439408
 ] 

Gautham Banasandra commented on HADOOP-17880:
-

[~baizhendong] I've taken the liberty to alter the title and description as per 
[~iwasakims]'s suggestions above.

> Build Hadoop on Centos 7
> 
>
> Key: HADOOP-17880
> URL: https://issues.apache.org/jira/browse/HADOOP-17880
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
> Environment: mac os x86_64
>Reporter: baizhendong
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 13h
>  Remaining Estimate: 0h
>
> Getting Hadoop to build on Centos 7 will greatly benefit the community. Here, 
> we aim to provide a Dockerfile that builds out the image with all the 
> dependencies needed to build Hadoop on Centos 7.
>  






[jira] [Updated] (HADOOP-17880) Build Hadoop on Centos 7

2021-11-05 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17880?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-17880:

Description: 
Getting Hadoop to build on Centos 7 will greatly benefit the community. Here, 
we aim to provide a Dockerfile that builds out the image with all the 
dependencies needed to build Hadoop on Centos 7.
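
As an illustration only (not the Dockerfile actually merged for this issue, which pins exact versions and installs many more dependencies), the build image would need roughly this CentOS 7 toolchain:

```shell
# Illustrative package set for building Hadoop natively on CentOS 7.
yum install -y epel-release
yum install -y gcc gcc-c++ make cmake git \
    java-1.8.0-openjdk-devel maven \
    openssl-devel zlib-devel snappy-devel
# Hadoop 2.10's native build also requires protobuf 2.5.0, which CentOS 7
# does not package; it has to be built from source inside the image.
```

Baking these steps into a Dockerfile gives contributors a reproducible build environment without hand-installing the toolchain.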

 

  was:
1. Currently, we build Hadoop 2.10.0 with docker-machine, which requires 
installing VirtualBox, whereas Hadoop 3.x builds with Docker alone.

2. Besides this, the Docker image's dependencies are out of date, and some of 
them are unavailable, for example JDK 7.

3. However, building Hadoop 2.10.0 with the Hadoop 3.x build script without 
modification does not work: the protocol buffer version is not 2.5.0, and the 
native build fails.


> Build Hadoop on Centos 7
> 
>
> Key: HADOOP-17880
> URL: https://issues.apache.org/jira/browse/HADOOP-17880
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
> Environment: mac os x86_64
>Reporter: baizhendong
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 13h
>  Remaining Estimate: 0h
>
> Getting Hadoop to build on Centos 7 will greatly benefit the community. Here, 
> we aim to provide a Dockerfile that builds out the image with all the 
> dependencies needed to build Hadoop on Centos 7.
>  






[jira] [Updated] (HADOOP-17880) Build Hadoop on Centos 7

2021-11-05 Thread Gautham Banasandra (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17880?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gautham Banasandra updated HADOOP-17880:

Summary: Build Hadoop on Centos 7  (was: Build 2.10.x with docker)

> Build Hadoop on Centos 7
> 
>
> Key: HADOOP-17880
> URL: https://issues.apache.org/jira/browse/HADOOP-17880
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.10.2
> Environment: mac os x86_64
>Reporter: baizhendong
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 13h
>  Remaining Estimate: 0h
>
> 1. Currently, we build Hadoop 2.10.0 with docker-machine, which requires 
> installing VirtualBox, whereas Hadoop 3.x builds with Docker alone.
> 2. Besides this, the Docker image's dependencies are out of date, and some 
> of them are unavailable, for example JDK 7.
> 3. However, building Hadoop 2.10.0 with the Hadoop 3.x build script without 
> modification does not work: the protocol buffer version is not 2.5.0, and 
> the native build fails.





