I looked at the history[1] of the Hadoop project pom and found that
HADOOP-16213[2] appears to be the cause of this change. So it seems
we need to bump the Guava version if we want to work with newer
versions of Hadoop 3.
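For reference, the linkage failure can be reproduced without Guava at
all. The sketch below uses a hypothetical OldGuava class standing in
for Guava 14's Preconditions, which only has the varargs overload;
Guava 20+ added the non-varargs checkArgument(boolean, String, Object)
that Hadoop 3 is compiled against, with descriptor
(ZLjava/lang/String;Ljava/lang/Object;)V. Bytecode referencing that
exact descriptor cannot be resolved against the older jar:

```java
public class DescriptorDemo {

    // Hypothetical stand-in for Guava 14's Preconditions: it only
    // declares the varargs overload, which compiles to the descriptor
    // (ZLjava/lang/String;[Ljava/lang/Object;)V.
    static class OldGuava {
        static void checkArgument(boolean ok, String template, Object... args) {
            if (!ok) throw new IllegalArgumentException(template);
        }
    }

    // Reports whether the exact non-varargs overload
    // checkArgument(boolean, String, Object) exists on the class.
    static boolean hasNonVarargsOverload(Class<?> c) {
        try {
            c.getDeclaredMethod("checkArgument",
                boolean.class, String.class, Object.class);
            return true;
        } catch (NoSuchMethodException e) {
            // At JVM link time this same mismatch surfaces as the
            // NoSuchMethodError seen in the quoted stack trace.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasNonVarargsOverload(OldGuava.class)); // prints "false"
    }
}
```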

One of the goals of 1.9 (and I think 1.10) is to be a bridge version
between Hadoop 2 and 3.  We need to determine whether there is a good
way to achieve this goal, and whether it is still desired.  If Hadoop
2 is still using an older version of Guava, then maybe we could make
Accumulo's 1.x source build against both the new and old versions of
Guava and have the hadoop 3 profile use the newer version.  I am not
sure if this is possible.
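As a rough sketch of that idea (the property name, profile id, and
versions below are assumptions patterned after common Maven layouts,
not verified against the actual 1.9 pom):

```xml
<!-- Default (Hadoop 2) build keeps the old Guava. -->
<properties>
  <guava.version>14.0</guava.version>
</properties>
<profiles>
  <profile>
    <id>hadoop-3</id>
    <activation>
      <property>
        <name>hadoop.profile</name>
        <value>3</value>
      </property>
    </activation>
    <!-- Profile override: newer Guava only when building for Hadoop 3. -->
    <properties>
      <guava.version>27.0-jre</guava.version>
    </properties>
  </profile>
</profiles>
```

This only helps if the 1.x source itself compiles cleanly against both
Guava versions, which is the open question above.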

The Modernizer and Java 8 issues are a separate matter.

[1]: 
https://github.com/apache/hadoop/commits/release-3.1.3-RC0/hadoop-project/pom.xml
[2]: https://issues.apache.org/jira/browse/HADOOP-16213

On Wed, Nov 20, 2019 at 10:51 PM Arvind Shyamsundar
<arvin...@microsoft.com.invalid> wrote:
>
> Hello!
> Per this issue(https://github.com/apache/accumulo/issues/569) building 1.9.x 
> with Hadoop 3 support needs hadoop.profile=3. So I checked out current 1.9 
> branch and built with -Dhadoop.profile=3. When I deployed this "custom" 
> Accumulo build with Hadoop 3.1.3, accumulo init failed:
>
> Caused by: java.lang.NoSuchMethodError: 
> com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
>         at 
> org.apache.hadoop.conf.Configuration.setInt(Configuration.java:1515)
>         at 
> org.apache.hadoop.hdfs.server.namenode.ha.AbstractNNFailoverProxyProvider.<init>(AbstractNNFailoverProxyProvider.java:70)
>         at 
> org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.<init>(ConfiguredFailoverProxyProvider.java:44)
>
> This is related to Guava. The version of Guava that is used by Hadoop 3.1.3 
> is 27.0-jre while Accumulo 1.9 still depends on (and includes) Guava 14.0. So I 
> set about to build 1.9 with Guava 27.0-jre. I had to set the compiler version 
> to 1.8. As Christopher had mentioned in the 1.10 thread, I also ran into 
> problems with modernizer. Without disabling modernizer, the refactor involved 
> looks non-trivial. I also had issues with outdated interfaces in 
> DataoutputHasher.java, CloseWriteAheadLogReferences.java, 
> RemoveCompleteReplicationRecords.java but those were relatively easy fixes. 
> FWIW, I pushed my changes here: 
> https://github.com/apache/accumulo/compare/master...arvindshmicrosoft:temp-1.9-guava27.
>
> So my question is: are these known issues with the current 1.9 branch and 
> Hadoop? Do we want to support Hadoop 3.1 / 3.2 with Accumulo 1.10?
>
> Thank you.
>
> - Arvind.