[
https://issues.apache.org/jira/browse/PHOENIX-7261?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Nihal Jain updated PHOENIX-7261:
--------------------------------
Affects Version/s: (was: 5.2.0)
> Align mockito version with Hadoop and HBase in Phoenix QueryServer
> ------------------------------------------------------------------
>
> Key: PHOENIX-7261
> URL: https://issues.apache.org/jira/browse/PHOENIX-7261
> Project: Phoenix
> Issue Type: Bug
> Reporter: Nihal Jain
> Assignee: Nihal Jain
> Priority: Major
> Fix For: 5.2.0
>
>
> As mentioned in PHOENIX-6769 by [~stoty]
> {quote}There is a well known incompatibility between old versions of
> mockito-all and mockito-core and newer versions. It manifests as
> IncompatibleClassChangeErrors and other linkage problems. The Hadoop
> minicluster in versions 3.x embed mockito classes in the minicluster.
> To avoid potential problems it would be best to align Phoenix use of mockito
> (mockito-core) with downstreamers. HBase uses mockito-core 2.28.2 on
> branch-2.4 and branch-2.5. (Phoenix is on 1.10.19.) I checked Hadoop
> branch-3.3 and it's also on 2.28.2.
> I recently opened a PR for OMID-226 to fix the same concern in phoenix-omid.
> {quote}
>
> The goal is to update mockito to 4.11.0, the same version HBase uses on branch-3. The same was done for Phoenix in PHOENIX-6769.
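> For illustration, a minimal sketch of what the alignment might look like in the Maven build; the {{mockito.version}} property name and the dependencyManagement placement here are assumptions for the sketch, not the actual Phoenix QueryServer pom:
> {code:xml}
> <!-- Hypothetical sketch: manage a single mockito-core version, aligned with Hadoop 3.3.x / HBase branch-3 -->
> <properties>
>   <mockito.version>4.11.0</mockito.version>
> </properties>
>
> <dependencyManagement>
>   <dependencies>
>     <dependency>
>       <groupId>org.mockito</groupId>
>       <artifactId>mockito-core</artifactId>
>       <version>${mockito.version}</version>
>       <scope>test</scope>
>     </dependency>
>   </dependencies>
> </dependencyManagement>
> {code}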
> Additional context on why I want this:
> # Currently we are working on building Phoenix, PQS and Omid with Hadoop 3.3.6, and it seems we fail to even start a minicluster with the mockito version that is bundled in the code, failing with the following error:
> {code:java}
> [ERROR] org.apache.phoenix.tool.ParameterizedPhoenixCanaryToolIT.phoenixCanaryToolTest[ParameterizedPhoenixCanaryToolIT_isPositiveTestType=false,isNamespaceEnabled=false,resultSinkOption=org.apache.phoenix.tool.PhoenixCanaryTool$StdOutSink] -- Time elapsed: 4.234 s <<< ERROR!
> java.lang.RuntimeException: java.lang.IncompatibleClassChangeError: class org.apache.hadoop.hdfs.server.namenode.NameNodeAdapter$2 can not implement org.mockito.ArgumentMatcher, because it is not an interface (org.mockito.ArgumentMatcher is in unnamed module of loader 'app')
> at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:551)
> at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:450)
> at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:436)
> at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:518)
> at org.apache.phoenix.tool.ParameterizedPhoenixCanaryToolIT.setup(ParameterizedPhoenixCanaryToolIT.java:115)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
> at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base/java.lang.reflect.Method.invoke(Method.java:568)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
> at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
> at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
> at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
> at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
> at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
> at org.junit.runners.Suite.runChild(Suite.java:128)
> at org.junit.runners.Suite.runChild(Suite.java:27)
> at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
> at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
> at org.apache.phoenix.SystemExitRule$1.evaluate(SystemExitRule.java:40)
> at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
> at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:316)
> at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:240)
> at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:214)
> at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:155)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:385)
> at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162)
> at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:507)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:495)
> Caused by: java.lang.IncompatibleClassChangeError: class org.apache.hadoop.hdfs.server.namenode.NameNodeAdapter$2 can not implement org.mockito.ArgumentMatcher, because it is not an interface (org.mockito.ArgumentMatcher is in unnamed module of loader 'app')
> at java.base/java.lang.ClassLoader.defineClass1(Native Method)
> at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012)
> at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
> at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862)
> at java.base/jdk.internal.loader.BuiltinClassLoader$4.run(BuiltinClassLoader.java:773)
> at java.base/jdk.internal.loader.BuiltinClassLoader$4.run(BuiltinClassLoader.java:768)
> at java.base/java.security.AccessController.doPrivileged(AccessController.java:318)
> at java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:781)
> at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681)
> at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639)
> at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
> at org.apache.hadoop.hdfs.MiniDFSCluster.isNameNodeUp(MiniDFSCluster.java:2666)
> at org.apache.hadoop.hdfs.MiniDFSCluster.isClusterUp(MiniDFSCluster.java:2680)
> at org.apache.hadoop.hdfs.MiniDFSCluster.waitClusterUp(MiniDFSCluster.java:1510)
> at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:989)
> at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:878)
> at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:689)
> at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:669)
> at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1141)
> at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1106)
> at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:545)
> ... 45 more
> {code}
> # Also, as per [https://github.com/mockito/mockito/blob/release/2.x/doc/release-notes/official.md#2191], Java 11 compatibility for mockito was only introduced in 2.19+.
>
> Hence I would like to align it with Hadoop/HBase (a quick way to verify the resolved version is sketched below).
> I also plan to work on OMID-260 and to backport PHOENIX-6769 to branch-5.1.
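> For completeness, an illustrative way to confirm which mockito version ends up on the test classpath after the bump, using the standard maven-dependency-plugin goal (this command is a sketch, not something from the ticket):
> {code}
> # Illustrative check: list the mockito artifacts the build resolves
> mvn dependency:tree -Dincludes=org.mockito
> {code}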
--
This message was sent by Atlassian Jira
(v8.20.10#820010)