Re: [NOTICE] Building trunk needs protoc 3.7.1

2019-09-20 Thread Vinayakumar B
@Wangda Tan  ,
Sorry for the confusion. HADOOP-13363 is the umbrella jira tracking the
multiple stages of the protobuf upgrade as subtasks (jar upgrade, Docker
update, plugin upgrade, shading, etc.).
Right now, the first task, the jar upgrade, is done. So the protoc executable
needs to be updated in build environments.

@张铎(Duo Zhang)  ,
Sorry for the inconvenience. Yes, indeed the plugin update before the jar
upgrade was possible. Sorry I missed it.

The plugin update needs to be done for the whole project, for which precommit
Jenkins will need more time to complete end-to-end runs.
So the plugin update is planned in stages as further subtasks. It could be
done in 2-3 days.

-Vinay

On Sat, 21 Sep 2019, 5:55 am 张铎(Duo Zhang),  wrote:

> I think this one is already in place so we have to upgrade...
>
> https://issues.apache.org/jira/browse/HADOOP-16557
>
> Wangda Tan wrote on Sat, Sep 21, 2019 at 7:19 AM:
>
> > Hi Vinay,
> >
> > A bit confused: I saw that HADOOP-13363 is still pending. Do we need to
> > upgrade the protobuf version to 3.7.1 now, or once HADOOP-13363 is completed?
> >
> > Thanks,
> > Wangda
> >
> > On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B 
> > wrote:
> >
> > > Hi All,
> > >
> > > A very long-pending task, the protobuf upgrade, is happening in
> > > HADOOP-13363. As part of that, the protobuf version is upgraded to 3.7.1.
> > >
> > > Please update your build environments to have protobuf 3.7.1.
> > >
> > > BUILDING.txt has been updated with the latest instructions.
> > >
> > > This prerequisite to update the protoc dependency manually is required
> > > until 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to
> > > dynamically resolve the required protoc executable.
> > >
> > > The Dockerfile is being updated to have 3.7.1 as the default protoc for
> > > test environments.
> > >
> > > Thanks,
> > > -Vinay
> > >
> >
>


Apache Hadoop qbt Report: branch2+JDK7 on Linux/x86

2019-09-20 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/451/

[Sep 20, 2019 4:59:14 PM] (ekrogen) HADOOP-16581. Revise ValueQueue to 
correctly replenish queues that go

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org

Re: [NOTICE] Building trunk needs protoc 3.7.1

2019-09-20 Thread Duo Zhang
I think this one is already in place so we have to upgrade...

https://issues.apache.org/jira/browse/HADOOP-16557

Wangda Tan wrote on Sat, Sep 21, 2019 at 7:19 AM:

> Hi Vinay,
>
> A bit confused: I saw that HADOOP-13363 is still pending. Do we need to
> upgrade the protobuf version to 3.7.1 now, or once HADOOP-13363 is completed?
>
> Thanks,
> Wangda
>
> On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B 
> wrote:
>
> > Hi All,
> >
> > A very long-pending task, the protobuf upgrade, is happening in
> > HADOOP-13363. As part of that, the protobuf version is upgraded to 3.7.1.
> >
> > Please update your build environments to have protobuf 3.7.1.
> >
> > BUILDING.txt has been updated with the latest instructions.
> >
> > This prerequisite to update the protoc dependency manually is required
> > until 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to
> > dynamically resolve the required protoc executable.
> >
> > The Dockerfile is being updated to have 3.7.1 as the default protoc for
> > test environments.
> >
> > Thanks,
> > -Vinay
> >
>


Re: [NOTICE] Building trunk needs protoc 3.7.1

2019-09-20 Thread Duo Zhang
As suggested before, please consider switching to maven protobuf plugin
first before upgrading the protobuf dependency.

https://www.xolstice.org/protobuf-maven-plugin/examples/protoc-artifact.html


This plugin will download the protoc binary automatically so we do not need
to have developers install protoc manually on their machine.
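For illustration, a minimal sketch of what such a plugin configuration could look like in a module pom.xml, following the xolstice protoc-artifact example linked above. The plugin version and the `${os.detected.classifier}` property (which requires the os-maven-plugin build extension) are assumptions for this sketch, not taken from the Hadoop build:

```xml
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <version>0.6.1</version>
  <configuration>
    <!-- Resolve the protoc binary as a Maven artifact instead of
         requiring a locally installed protoc on the PATH. -->
    <protocArtifact>com.google.protobuf:protoc:3.7.1:exe:${os.detected.classifier}</protocArtifact>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With a configuration along these lines, each developer's build downloads the matching protoc binary automatically, which is the behavior described above.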

Wangda Tan wrote on Sat, Sep 21, 2019 at 7:19 AM:

> Hi Vinay,
>
> A bit confused: I saw that HADOOP-13363 is still pending. Do we need to
> upgrade the protobuf version to 3.7.1 now, or once HADOOP-13363 is completed?
>
> Thanks,
> Wangda
>
> On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B 
> wrote:
>
> > Hi All,
> >
> > A very long-pending task, the protobuf upgrade, is happening in
> > HADOOP-13363. As part of that, the protobuf version is upgraded to 3.7.1.
> >
> > Please update your build environments to have protobuf 3.7.1.
> >
> > BUILDING.txt has been updated with the latest instructions.
> >
> > This prerequisite to update the protoc dependency manually is required
> > until 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to
> > dynamically resolve the required protoc executable.
> >
> > The Dockerfile is being updated to have 3.7.1 as the default protoc for
> > test environments.
> >
> > Thanks,
> > -Vinay
> >
>


Re: [NOTICE] Building trunk needs protoc 3.7.1

2019-09-20 Thread Wangda Tan
Hi Vinay,

A bit confused: I saw that HADOOP-13363 is still pending. Do we need to
upgrade the protobuf version to 3.7.1 now, or once HADOOP-13363 is completed?

Thanks,
Wangda

On Fri, Sep 20, 2019 at 8:11 AM Vinayakumar B 
wrote:

> Hi All,
>
> A very long-pending task, the protobuf upgrade, is happening in
> HADOOP-13363. As part of that, the protobuf version is upgraded to 3.7.1.
>
> Please update your build environments to have protobuf 3.7.1.
>
> BUILDING.txt has been updated with the latest instructions.
>
> This prerequisite to update the protoc dependency manually is required until
> 'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to dynamically
> resolve the required protoc executable.
>
> The Dockerfile is being updated to have 3.7.1 as the default protoc for test
> environments.
>
> Thanks,
> -Vinay
>


[jira] [Created] (HADOOP-16591) S3A ITest*MRjob failures

2019-09-20 Thread Siddharth Seth (Jira)
Siddharth Seth created HADOOP-16591:
---

 Summary: S3A ITest*MRjob failures
 Key: HADOOP-16591
 URL: https://issues.apache.org/jira/browse/HADOOP-16591
 Project: Hadoop Common
  Issue Type: Test
  Components: fs/s3
Reporter: Siddharth Seth
Assignee: Siddharth Seth


The ITest*MRJob tests fail with a FileNotFoundException:
{code}
[ERROR]   
ITestMagicCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
[ERROR]   
ITestDirectoryCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
[ERROR]   
ITestPartitionCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
[ERROR]   
ITestStagingCommitMRJob>AbstractITCommitMRJob.testMRJob:146->AbstractFSContractTestBase.assertIsDirectory:327
 » FileNotFound
{code}
Details here: 
https://issues.apache.org/jira/browse/HADOOP-16207?focusedCommentId=16933718=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16933718

Creating a separate jira since HADOOP-16207 already has a patch that tries to
parallelize the test runs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)




[jira] [Created] (HADOOP-16590) IBM Java has deprecated OS login module classes and OS principal classes.

2019-09-20 Thread Nicholas Marion (Jira)
Nicholas Marion created HADOOP-16590:


 Summary: IBM Java has deprecated OS login module classes and OS 
principal classes.
 Key: HADOOP-16590
 URL: https://issues.apache.org/jira/browse/HADOOP-16590
 Project: Hadoop Common
  Issue Type: Bug
  Components: security
Reporter: Nicholas Marion


When building applications that rely on hadoop-common using IBM Java, errors 
such as {{Exception in thread "main" java.io.IOException: failure to login}} 
and {{Unable to find JAAS classes: com.ibm.security.auth.LinuxPrincipal}} can 
be seen.

IBM Java has deprecated the following OS login module classes:

```
com.ibm.security.auth.module.Win64LoginModule
com.ibm.security.auth.module.NTLoginModule
com.ibm.security.auth.module.AIX64LoginModule
com.ibm.security.auth.module.AIXLoginModule
com.ibm.security.auth.module.LinuxLoginModule
```

and replaced them with {{com.ibm.security.auth.module.JAASLoginModule}}.

IBM Java has deprecated the following OS principal classes:

```
com.ibm.security.auth.UsernamePrincipal
com.ibm.security.auth.NTUserPrincipal
com.ibm.security.auth.AIXPrincipal
com.ibm.security.auth.LinuxPrincipal
```

and replaced them with {{com.ibm.security.auth.UsernamePrincipal}}.

The older issue HADOOP-15765 reports the same problem.
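As a hedged illustration, the change amounts to swapping the platform-specific module for the unified one in the JAAS login configuration. The entry name `HadoopLogin` below is hypothetical; only the module class names come from the deprecation notice above:

```
/* Before: platform-specific login module (deprecated by IBM Java) */
HadoopLogin {
  com.ibm.security.auth.module.LinuxLoginModule required;
};

/* After: unified replacement module */
HadoopLogin {
  com.ibm.security.auth.module.JAASLoginModule required;
};
```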







Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86

2019-09-20 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/

[Sep 19, 2019 7:16:12 AM] (elek) HDDS-730. ozone fs cli prints hadoop fs in 
usage
[Sep 19, 2019 8:41:00 AM] (elek) HDDS-2147. Include dumpstream in test report
[Sep 19, 2019 9:18:16 AM] (elek) HDDS-2016. Add option to enforce GDPR in 
Bucket Create command
[Sep 19, 2019 9:59:43 AM] (elek) HDDS-2119. Use checkstyle.xml and 
suppressions.xml in hdds/ozone
[Sep 19, 2019 10:26:53 AM] (elek) HDDS-2148. Remove redundant code in 
CreateBucketHandler.java
[Sep 19, 2019 12:11:44 PM] (elek) HDDS-2141. Missing total number of operations
[Sep 19, 2019 1:23:35 PM] (kihwal) HADOOP-16582. LocalFileSystem's mkdirs() 
does not work as expected under
[Sep 19, 2019 3:00:05 PM] (stevel) HADOOP-16556. Fix some alerts raised by LGTM.
[Sep 19, 2019 4:41:55 PM] (aengineer) HDDS-2110. Arbitrary file can be 
downloaded with the help of
[Sep 19, 2019 4:50:21 PM] (aengineer) HDDS-2127. Detailed Tools doc not 
reachable
[Sep 19, 2019 5:58:33 PM] (bharat) HDDS-2151. Ozone client logs the entire 
request payload at DEBUG level
[Sep 19, 2019 6:00:10 PM] (inigoiri) HDFS-14609. RBF: Security should use 
common AuthenticationFilter.
[Sep 19, 2019 6:06:02 PM] (bharat) HDDS-1054. List Multipart uploads in a 
bucket (#1277)
[Sep 19, 2019 6:30:33 PM] (bharat) HDDS-2154. Fix Checkstyle issues (#1475)
[Sep 19, 2019 11:28:29 PM] (aengineer) HDDS-2101. Ozone filesystem provider 
doesn't exist (#1473)




-1 overall


The following subsystems voted -1:
asflicense findbugs hadolint pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 

FindBugs :

   
module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-mawo/hadoop-yarn-applications-mawo-core
 
   Class org.apache.hadoop.applications.mawo.server.common.TaskStatus 
implements Cloneable but does not define or use clone method At 
TaskStatus.java:does not define or use clone method At TaskStatus.java:[lines 
39-346] 
   Equals method for 
org.apache.hadoop.applications.mawo.server.worker.WorkerId assumes the argument 
is of type WorkerId At WorkerId.java:the argument is of type WorkerId At 
WorkerId.java:[line 114] 
   
org.apache.hadoop.applications.mawo.server.worker.WorkerId.equals(Object) does 
not check for null argument At WorkerId.java:null argument At 
WorkerId.java:[lines 114-115] 

Failed junit tests :

   hadoop.hdfs.TestReconstructStripedFile 
   hadoop.hdfs.server.datanode.TestNNHandlesBlockReportPerStorage 
   hadoop.hdfs.server.federation.router.TestRouterFaultTolerant 
   hadoop.mapreduce.v2.hs.TestJobHistoryParsing 
   hadoop.yarn.sls.TestSLSStreamAMSynth 
   hadoop.yarn.sls.appmaster.TestAMSimulator 
   hadoop.fs.adl.live.TestAdlSdkConfiguration 
  

   cc:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-compile-javac-root.txt
  [332K]

   checkstyle:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-checkstyle-root.txt
  [17M]

   hadolint:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-patch-hadolint.txt
  [8.0K]

   pathlen:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-patch-pylint.txt
  [220K]

   shellcheck:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-patch-shellcheck.txt
  [24K]

   shelldocs:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/diff-patch-shelldocs.txt
  [44K]

   whitespace:

   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/whitespace-eol.txt
  [9.6M]
   
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1265/artifact/out/whitespace-tabs.txt
  [1.1M]

   xml:

   

[NOTICE] Building trunk needs protoc 3.7.1

2019-09-20 Thread Vinayakumar B
Hi All,

A very long-pending task, the protobuf upgrade, is happening in HADOOP-13363.
As part of that, the protobuf version is upgraded to 3.7.1.

Please update your build environments to have protobuf 3.7.1.

BUILDING.txt has been updated with the latest instructions.

This prerequisite to update the protoc dependency manually is required until
'hadoop-maven-plugin' is replaced with 'protobuf-maven-plugin' to dynamically
resolve the required protoc executable.

The Dockerfile is being updated to have 3.7.1 as the default protoc for test
environments.
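For a Linux x86_64 machine, updating protoc could look like the following sketch. The install prefix is illustrative, and the download URL follows the protocolbuffers/protobuf GitHub release naming for 3.7.1 (check BUILDING.txt for the authoritative instructions):

```shell
# Download the protoc 3.7.1 release binary (URL pattern per the
# protocolbuffers/protobuf GitHub releases page).
curl -LO https://github.com/protocolbuffers/protobuf/releases/download/v3.7.1/protoc-3.7.1-linux-x86_64.zip

# Unpack into a local prefix and put its bin directory on the PATH.
unzip protoc-3.7.1-linux-x86_64.zip -d "$HOME/protobuf-3.7.1"
export PATH="$HOME/protobuf-3.7.1/bin:$PATH"

# Verify the version the Hadoop build will pick up.
protoc --version   # expect: libprotoc 3.7.1
```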

Thanks,
-Vinay


Apache Hadoop qbt Report: branch2+JDK7 on Linux/x86

2019-09-20 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/

[Sep 19, 2019 1:27:23 PM] (kihwal) HADOOP-16582. LocalFileSystem's mkdirs() 
does not work as expected under
[Sep 19, 2019 8:21:42 PM] (ericp) YARN-7817. Add Resource reference to RM's 
NodeInfo object so REST API
[Sep 19, 2019 8:25:31 PM] (ericp) YARN-7860. Fix UT failure 
TestRMWebServiceAppsNodelabel#testAppsRunning.
[Sep 19, 2019 10:27:30 PM] (jhung) YARN-7410. Cleanup FixedValueResource to 
avoid dependency to




-1 overall


The following subsystems voted -1:
asflicense findbugs hadolint pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/empty-configuration.xml
 
   hadoop-tools/hadoop-azure/src/config/checkstyle-suppressions.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/public/crossdomain.xml 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/src/main/webapp/public/crossdomain.xml
 

FindBugs :

   
module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client
 
   Boxed value is unboxed and then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result,
 byte[], byte[], KeyConverter, ValueConverter, boolean) At 
ColumnRWHelper.java:then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result,
 byte[], byte[], KeyConverter, ValueConverter, boolean) At 
ColumnRWHelper.java:[line 335] 

Failed junit tests :

   hadoop.util.TestDiskCheckerWithDiskIo 
   hadoop.hdfs.TestDecommission 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   hadoop.registry.secure.TestSecureLogins 
   hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 
  

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-compile-cc-root-jdk1.7.0_95.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-compile-javac-root-jdk1.7.0_95.txt
  [328K]

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-compile-cc-root-jdk1.8.0_222.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-compile-javac-root-jdk1.8.0_222.txt
  [308K]

   checkstyle:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-checkstyle-root.txt
  [16M]

   hadolint:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   pathlen:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-patch-pylint.txt
  [24K]

   shellcheck:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-patch-shellcheck.txt
  [72K]

   shelldocs:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-patch-shelldocs.txt
  [8.0K]

   whitespace:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/whitespace-eol.txt
  [12M]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/whitespace-tabs.txt
  [1.3M]

   xml:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/xml.txt
  [12K]

   findbugs:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-timelineservice-hbase_hadoop-yarn-server-timelineservice-hbase-client-warnings.html
  [8.0K]

   javadoc:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-javadoc-javadoc-root-jdk1.7.0_95.txt
  [16K]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/diff-javadoc-javadoc-root-jdk1.8.0_222.txt
  [1.1M]

   unit:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [164K]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/450/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [232K]
   

[jira] [Created] (HADOOP-16589) [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-20 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16589:
--

 Summary: [pb-upgrade] Update docker image to make 3.7.1 protoc as 
default
 Key: HADOOP-16589
 URL: https://issues.apache.org/jira/browse/HADOOP-16589
 Project: Hadoop Common
  Issue Type: Sub-task
Reporter: Vinayakumar B


Right now, the docker image contains both protoc 2.5.0 and protoc 3.7.1.

2.5.0 is the default protoc on the PATH.

After HADOOP-16557, the protoc version expected on the PATH is 3.7.1.







Re: [DISCUSS] Separate Hadoop Core trunk and Hadoop Ozone trunk source tree

2019-09-20 Thread Peter Bacsko
+1 (non-binding)

On Fri, Sep 20, 2019 at 8:01 AM Rakesh Radhakrishnan 
wrote:

> +1
>
> Rakesh
>
> On Fri, Sep 20, 2019 at 12:29 AM Aaron Fabbri  wrote:
>
> > +1 (binding)
> >
> > Thanks to the Ozone folks for their efforts at maintaining good
> separation
> > with HDFS and common. I took a lot of heat for the unpopular opinion that
> > they should  be separate, so I am glad the process has worked out well
> for
> > both codebases. It looks like my concerns were addressed and I appreciate
> > it.  It is cool to see the evolution here.
> >
> > Aaron
> >
> >
> > On Thu, Sep 19, 2019 at 3:37 AM Steve Loughran
>  > >
> > wrote:
> >
> > > in that case,
> > >
> > > +1 from me (binding)
> > >
> > > On Wed, Sep 18, 2019 at 4:33 PM Elek, Marton  wrote:
> > >
> > > >  > one thing to consider here as you are giving up your ability to
> make
> > > >  > changes in hadoop-* modules, including hadoop-common, and their
> > > >  > dependencies, in sync with your own code. That goes for filesystem
> > > > contract
> > > >  > tests.
> > > >  >
> > > >  > are you happy with that?
> > > >
> > > >
> > > > Yes. I think we can live with it.
> > > >
> > > > Fortunately, the Hadoop parts which are used by Ozone (security +
> > > > rpc) are stable enough; we haven't needed bigger changes so far
> > > > (small patches are already included in 3.1/3.2).
> > > >
> > > > I think it's better to use released Hadoop bits in Ozone anyway, and
> > > > worst (best?) case we can try to do more frequent patch releases from
> > > > Hadoop (if required).
> > > >
> > > >
> > > > m.
> > > >
> > > >
> > > >
> > >
> >
>


[jira] [Created] (HADOOP-16588) Update commons-beanutils version to 1.9.4 in branch-2

2019-09-20 Thread Wei-Chiu Chuang (Jira)
Wei-Chiu Chuang created HADOOP-16588:


 Summary: Update commons-beanutils version to 1.9.4 in branch-2
 Key: HADOOP-16588
 URL: https://issues.apache.org/jira/browse/HADOOP-16588
 Project: Hadoop Common
  Issue Type: Task
Reporter: Wei-Chiu Chuang
 Attachments: HADOOP-16588.branch-2.001.patch

Similar to HADOOP-16542, but we need to do it differently.
In branch-2, we pull in commons-beanutils transitively through
commons-configuration 1.6 --> commons-digester 1.8:

{noformat}
[INFO] +- commons-configuration:commons-configuration:jar:1.6:compile
[INFO] |  +- commons-digester:commons-digester:jar:1.8:compile
[INFO] |  |  \- commons-beanutils:commons-beanutils:jar:1.7.0:compile
[INFO] |  \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile
{noformat}

I have a patch to update the version of the transitive dependency.
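For illustration, one common way to force a transitive dependency to a specific version in Maven is a dependencyManagement pin in the parent pom. Whether the attached patch uses this mechanism or adjusts the dependency tree differently is not shown here:

```xml
<!-- Illustrative sketch only: pin the transitive commons-beanutils
     version so it overrides the 1.7.0/1.8.0 versions pulled in via
     commons-configuration and commons-digester. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-beanutils</groupId>
      <artifactId>commons-beanutils</artifactId>
      <version>1.9.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```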







Re: [DISCUSS] Separate Hadoop Core trunk and Hadoop Ozone trunk source tree

2019-09-20 Thread Rakesh Radhakrishnan
+1

Rakesh

On Fri, Sep 20, 2019 at 12:29 AM Aaron Fabbri  wrote:

> +1 (binding)
>
> Thanks to the Ozone folks for their efforts at maintaining good separation
> with HDFS and common. I took a lot of heat for the unpopular opinion that
> they should  be separate, so I am glad the process has worked out well for
> both codebases. It looks like my concerns were addressed and I appreciate
> it.  It is cool to see the evolution here.
>
> Aaron
>
>
> On Thu, Sep 19, 2019 at 3:37 AM Steve Loughran  >
> wrote:
>
> > in that case,
> >
> > +1 from me (binding)
> >
> > On Wed, Sep 18, 2019 at 4:33 PM Elek, Marton  wrote:
> >
> > >  > one thing to consider here as you are giving up your ability to make
> > >  > changes in hadoop-* modules, including hadoop-common, and their
> > >  > dependencies, in sync with your own code. That goes for filesystem
> > > contract
> > >  > tests.
> > >  >
> > >  > are you happy with that?
> > >
> > >
> > > Yes. I think we can live with it.
> > >
> > > Fortunately, the Hadoop parts which are used by Ozone (security + rpc)
> > > are stable enough; we haven't needed bigger changes so far (small
> > > patches are already included in 3.1/3.2).
> > >
> > > I think it's better to use released Hadoop bits in Ozone anyway, and
> > > worst (best?) case we can try to do more frequent patch releases from
> > > Hadoop (if required).
> > >
> > >
> > > m.
> > >
> > >
> > >
> >
>