hi, without excluding the transitive dependencies, the environment will not even start:

java.lang.IncompatibleClassChangeError: Implementing class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:112)
    at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:48)
    at org.apache.flink.table.planner.delegation.BlinkPlannerFactory.create(BlinkPlannerFactory.java:50)
    at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:130)
    at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:111)
    at com.akulaku.data.flink.ParserDataTest.parserDataTest(ParserDataTest.java:24)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
    at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
    at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)

On Mon, Jul 20, 2020 at 11:48 AM, Rui Li <lirui.fu...@gmail.com> wrote:

> What exactly is the conflict you are running into now? The hive connector
> itself also excludes many transitive dependencies when depending on hive;
> only then do its UTs and ITs run normally. You can also check our pom to
> see which dependencies are excluded:
>
> https://github.com/apache/flink/blob/release-1.11.0/flink-connectors/flink-connector-hive/pom.xml
>
> On Fri, Jul 17, 2020 at 5:32 PM Dream-底限 <zhan...@akulaku.com> wrote:
>
> > hi
> > I am using the user-defined dependencies, not the bundled dependency
> > jar; the bundled jar has to be downloaded separately.....
> >
> > On Fri, Jul 17, 2020 at 5:24 PM, Dream-底限 <zhan...@akulaku.com> wrote:
> >
> > > With 1.9 and 1.10, after excluding some transitive dependencies it
> > > ran both in IDEA and as an uber jar on the cluster; without the
> > > exclusions it would not run in IDEA.
> > > With 1.11 I have only tested locally so far: without excluding the
> > > transitive dependencies it will not run in IDEA, and I have not tried
> > > the cluster environment yet. But I think running directly in IDEA is
> > > something many people need, so perhaps the documentation could be
> > > improved.
> > >
> > > On Fri, Jul 17, 2020 at 5:16 PM, Jingsong Li <jingsongl...@gmail.com> wrote:
> > >
> > > > Can the bundled jar solve it?
> > > >
> > > > [1]
> > > > https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar
> > > >
> > > > Best,
> > > > Jingsong
> > > >
> > > > On Fri, Jul 17, 2020 at 5:14 PM Dream-底限 <zhan...@akulaku.com> wrote:
> > > >
> > > > > hi:
> > > > > Could someone tell me which transitive dependency of the hive
> > > > > connector dependencies below is causing the jar conflict? From 1.9
> > > > > through 1.11, every time I add the dependencies in Maven according
> > > > > to the official documentation I hit a dependency conflict.... When
> > > > > 1.9 was first released the documentation did declare dependency
> > > > > exclusions for these artifacts, but the docs were changed later.
> > > > >
> > > > > // Flink's Hive connector. Contains flink-hadoop-compatibility and
> > > > > flink-orc jars
> > > > > flink-connector-hive_2.11-1.11.0.jar
> > > > > // Hive dependencies
> > > > > hive-exec-2.3.4.jar
> > > >
> > > > --
> > > > Best, Jingsong Lee
>
> --
> Best regards!
> Rui Li
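For anyone landing on this thread with the same error: below is a minimal sketch of the exclusion approach discussed above, assuming a Maven project on Flink 1.11.0 with Scala 2.11 and Hive 2.3.4. The specific exclusions shown (calcite and guava) are illustrative assumptions, not a verified fix; compare the exclusions in flink-connector-hive's own pom (linked earlier in the thread) and the output of `mvn dependency:tree` to find the actual offenders in your build.

```xml
<!-- Sketch only: the exclusions below (calcite, guava) are illustrative
     assumptions, not a verified fix. Run `mvn dependency:tree` and compare
     against flink-connector-hive's own pom to identify the real conflicts. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-hive_2.11</artifactId>
  <version>1.11.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>2.3.4</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.calcite</groupId>
      <artifactId>*</artifactId>
    </exclusion>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

As background, `IncompatibleClassChangeError: Implementing class` generally means a class was compiled against one shape of a type (for example, an interface) but a binary-incompatible shape (a class) was found at runtime, which is consistent with two incompatible versions of the same library ending up on the classpath together.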