Hi,

with Flink 1.10 we changed the behaviour on the client side so that it also
uses the child-first classloader [1]. Because of that, you might have some
conflicting dependencies bundled in your user code jar which don't play
well with what is on the system classpath of your client. If the
problematic dependency originates from flink-table-planner-blink, then
setting it to provided makes sense.
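
For reference, a minimal sketch of that dependency declaration (the version and Scala suffix below are assumptions; match them to your own build):

```xml
<!-- Sketch: mark the Blink planner as provided so it is NOT bundled into
     the fat jar; the classpath of the client/cluster supplies it instead.
     Version 1.10.0 and Scala suffix _2.11 are assumptions. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-planner-blink_2.11</artifactId>
  <version>1.10.0</version>
  <scope>provided</scope>
</dependency>
```

You can check whether your fat jar still bundles a conflicting compiler with something like `jar tf your-job.jar | grep -i janino`.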

Please also take a look at this issue if you are using Hive [2].

[1] https://issues.apache.org/jira/browse/FLINK-13749
[2] https://issues.apache.org/jira/browse/FLINK-14849
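
If changing the packaging is not an option, another workaround is to switch the resolution order back via configuration; a sketch (flink-conf.yaml):

```yaml
# Sketch: restore parent-first class resolution so classes on the system
# classpath win over classes bundled in the user jar.
classloader.resolve-order: parent-first
```

Note this changes classloading globally, so marking the conflicting dependency as provided is usually the cleaner fix.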

Cheers,
Till

On Fri, Feb 28, 2020 at 10:01 AM LakeShen <shenleifight...@gmail.com> wrote:

> I have solved this problem. I set the flink-table-planner-blink Maven
> scope to provided.
>
> kant kodali <kanth...@gmail.com> wrote on Fri, Feb 28, 2020 at 3:32 PM:
>
> > Same problem!
> >
> > On Thu, Feb 27, 2020 at 11:10 PM LakeShen <shenleifight...@gmail.com>
> > wrote:
> >
> >> Hi community,
> >> I am using Flink 1.10 to run a Flink job; the cluster type is YARN. I
> >> use the command line to submit my Flink job, like this:
> >>
> >> flink run  -m yarn-cluster  --allowNonRestoredState  -c xxx.xxx.xx
> >>  flink-stream-xxx.jar
> >>
> >> But an exception is thrown; the exception info is:
> >>
> >> *org.apache.flink.client.program.ProgramInvocationException: The main
> >> method caused an error: Unable to instantiate java compiler*
> >> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:335)
> >> at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
> >> at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
> >> at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:664)
> >> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
> >> at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:895)
> >> at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:968)
> >> at java.security.AccessController.doPrivileged(Native Method)
> >> at javax.security.auth.Subject.doAs(Subject.java:422)
> >> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
> >> at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:968)
> >> Caused by: java.lang.IllegalStateException: Unable to instantiate java compiler
> >> at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:434)
> >> at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:375)
> >> at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.lambda$static$0(JaninoRelMetadataProvider.java:109)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:149)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.get(LocalCache.java:3953)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957)
> >> at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875)
> >> at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:475)
> >> at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:488)
> >> at org.apache.calcite.rel.metadata.RelMetadataQuery.revise(RelMetadataQuery.java:193)
> >> at org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:797)
> >> at org.apache.calcite.rel.rules.ReduceExpressionsRule$ProjectReduceExpressionsRule.onMatch(ReduceExpressionsRule.java:298)
> >> at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:319)
> >> at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:560)
> >> at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:419)
> >> at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:256)
> >> at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:127)
> >> at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:215)
> >> at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:202)
> >> at org.apache.flink.table.planner.plan.optimize.program.FlinkHepProgram.optimize(FlinkHepProgram.scala:69)
> >> at org.apache.flink.table.planner.plan.optimize.program.FlinkHepRuleSetProgram.optimize(FlinkHepRuleSetProgram.scala:87)
> >> at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:62)
> >> at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:58)
> >> at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
> >> at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
> >> at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> >> at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> >> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> >> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> >> at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
> >> at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104)
> >> at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram.optimize(FlinkChainedProgram.scala:57)
> >> at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.optimizeTree(StreamCommonSubGraphBasedOptimizer.scala:170)
> >> at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.doOptimize(StreamCommonSubGraphBasedOptimizer.scala:90)
> >> at org.apache.flink.table.planner.plan.optimize.CommonSubGraphBasedOptimizer.optimize(CommonSubGraphBasedOptimizer.scala:77)
> >> at org.apache.flink.table.planner.delegation.PlannerBase.optimize(PlannerBase.scala:248)
> >> at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:151)
> >> at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:686)
> >> at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlUpdate(TableEnvironmentImpl.java:496)
> >> at com.youzan.bigdata.FlinkStreamSQLDDLJob.lambda$main$0(FlinkStreamSQLDDLJob.java:60)
> >> at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1380)
> >> at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
> >> at com.youzan.bigdata.FlinkStreamSQLDDLJob.main(FlinkStreamSQLDDLJob.java:58)
> >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> at java.lang.reflect.Method.invoke(Method.java:498)
> >> at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
> >> ... 11 more
> >> *Caused by: java.lang.ClassCastException:
> >> org.codehaus.janino.CompilerFactory cannot be cast to
> >> org.codehaus.commons.compiler.ICompilerFactory*
> >> at org.codehaus.commons.compiler.CompilerFactoryFactory.getCompilerFactory(CompilerFactoryFactory.java:129)
> >> at org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory(CompilerFactoryFactory.java:79)
> >> at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:432)
> >> ... 62 more
> >>
> >
>
