Hi Avi,

I actually wanted to see the output of gradle dependencies. But okay, I
went ahead and checked your sources, and as suspected, your build is using
Scala 2.12.12:
org.scala-lang:scala-library:2.12.7 -> 2.12.12

Please use a strict version constraint [1].
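
For example, something along these lines in the dependencies block of your
build.gradle should pin it (a minimal, untested sketch; adjust the
configuration name to whatever your project uses):

dependencies {
    // Keep scala-library at exactly 2.12.7, even if a transitive
    // dependency asks for a newer 2.12.x.
    implementation('org.scala-lang:scala-library') {
        version {
            strictly '2.12.7'
        }
    }
}

Afterwards, gradle dependencies should show scala-library resolving to
2.12.7 again.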

[1]
https://docs.gradle.org/current/userguide/dependency_downgrade_and_exclude.html

On Thu, Jan 14, 2021 at 4:14 PM Avi Levi <a...@neosec.com> wrote:

> Hi,
> Did all that, including upgrading the Gradle version, and got the same
> result. Please find a sample project here:
> <https://github.com/123avi/sample-flink-gradle>
>
> Thank you
> Avi
>
> On Thu, Jan 14, 2021 at 3:25 PM Arvid Heise <ar...@ververica.com> wrote:
>
>> Hi Avi,
>>
>> could you run
>> gradle dependencies
>> and report back to me?
>>
>> Also, did you make sure to run
>> gradle clean
>> before? The Gradle version you are using is ancient, so I'm not sure
>> whether it's picking up the change correctly.
>>
>> On Thu, Jan 14, 2021 at 10:55 AM Avi Levi <a...@neosec.com> wrote:
>>
>>> No, I don't. I actually also tried with 2.12.7 and got the same result.
>>>
>>> On Thu, Jan 14, 2021 at 11:07 AM Arvid Heise <ar...@ververica.com>
>>> wrote:
>>>
>>>> Hi Avi,
>>>>
>>>> apparently the maximum Scala version that Flink supports is 2.12.7
>>>> [1]. Do you have a specific reason to use a higher version?
>>>>
>>>> [1] https://issues.apache.org/jira/browse/FLINK-12461
>>>>
>>>> On Thu, Jan 14, 2021 at 5:11 AM Avi Levi <a...@neosec.com> wrote:
>>>>
>>>>> Hi Arvid,
>>>>> Please find the full build.gradle file attached.
>>>>>
>>>>> On Tue, Jan 12, 2021 at 8:18 PM Arvid Heise <ar...@ververica.com>
>>>>> wrote:
>>>>>
>>>>>> Can you post the full dependency list from sbt/Maven/Gradle, whichever you use?
>>>>>>
>>>>>> On Tue, Jan 12, 2021 at 3:54 AM Avi Levi <a...@neosec.com> wrote:
>>>>>>
>>>>>>> Hi Arvid,
>>>>>>> using :
>>>>>>>
>>>>>>> flinkVersion = '1.12.0'
>>>>>>> scalaBinaryVersion = '2.12'
>>>>>>>
>>>>>>> I simplified the example to this (same exception):
>>>>>>>
>>>>>>> object Flinktest extends App {
>>>>>>>   private val env = StreamExecutionEnvironment.getExecutionEnvironment
>>>>>>>
>>>>>>>   env.fromElements("A", "B", "c")
>>>>>>>     .windowAll(TumblingEventTimeWindows.of(Time.seconds(5)))
>>>>>>>     .process(new ProcessAllWindowFunction[String, List[String], TimeWindow] {
>>>>>>>       override def process(context: Context, elements: Iterable[String], out: Collector[List[String]]): Unit = {
>>>>>>>         out.collect(elements.toList)
>>>>>>>       }
>>>>>>>     })
>>>>>>>     .print()
>>>>>>>
>>>>>>>   env.execute("Sample")
>>>>>>> }
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Jan 5, 2021 at 1:53 PM Arvid Heise <ar...@ververica.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi Avi,
>>>>>>>>
>>>>>>>> without being a Scala guy, I'm guessing that you are mixing Scala
>>>>>>>> versions. Could you check that your user code uses the same Scala
>>>>>>>> version as the Flink (1.11 or 1.12) artifacts you depend on? I have
>>>>>>>> also heard of issues with different patch versions of Scala, so make
>>>>>>>> sure to use the exact same version (e.g. 2.11.12).
>>>>>>>>
>>>>>>>> On Mon, Dec 28, 2020 at 3:54 PM Avi Levi <a...@neosec.com> wrote:
>>>>>>>>
>>>>>>>>> I am trying to aggregate all records in a time window. This is my
>>>>>>>>> ProcessAllWindowFunction:
>>>>>>>>>
>>>>>>>>> case class SimpleAggregate(elms: List[String])
>>>>>>>>>
>>>>>>>>> class AggregateLogs extends ProcessAllWindowFunction[String, SimpleAggregate, TimeWindow] {
>>>>>>>>>   override def process(context: Context, elements: Iterable[String], out: Collector[SimpleAggregate]): Unit = {
>>>>>>>>>     val es: List[String] = elements.toList
>>>>>>>>>     val record = SimpleAggregate(es)
>>>>>>>>>     out.collect(record)
>>>>>>>>>   }
>>>>>>>>> }
>>>>>>>>>
>>>>>>>>> But I am getting this exception. Why?
>>>>>>>>>
>>>>>>>>> Exception in thread "main"
>>>>>>>>> java.util.concurrent.ExecutionException: 
>>>>>>>>> scala.tools.reflect.ToolBoxError:
>>>>>>>>> reflective compilation has failed: cannot initialize the compiler due 
>>>>>>>>> to
>>>>>>>>> java.lang.BootstrapMethodError: java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:137)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2348)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2320)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4739)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.api.scala.typeutils.TraversableSerializer$.compileCbf(TraversableSerializer.scala:184)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.api.scala.typeutils.TraversableSerializer.compileCbf(TraversableSerializer.scala:51)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.api.scala.typeutils.TraversableSerializer.<init>(TraversableSerializer.scala:41)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$$anon$1$$anon$2$$anon$3.<init>(HandleFinancialJob.scala:52)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$$anon$1$$anon$2.createSerializer(HandleFinancialJob.scala:52)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$$anon$1$$anon$2.createSerializer(HandleFinancialJob.scala:52)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$$anon$1.$anonfun$createSerializer$1(HandleFinancialJob.scala:52)
>>>>>>>>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$$anon$1.createSerializer(HandleFinancialJob.scala:52)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.graph.StreamGraph.createSerializer(StreamGraph.java:864)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.graph.StreamGraph.addOperator(StreamGraph.java:308)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.graph.StreamGraph.addOperator(StreamGraph.java:293)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.graph.StreamGraphGenerator.transformOneInputTransform(StreamGraphGenerator.java:680)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.graph.StreamGraphGenerator.transform(StreamGraphGenerator.java:253)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.graph.StreamGraphGenerator.generate(StreamGraphGenerator.java:212)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:1863)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:1848)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1699)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.scala:699)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$.delayedEndpoint$com$neosec$handlefinancial$HandleFinancialJob$1(HandleFinancialJob.scala:60)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$delayedInit$body.apply(HandleFinancialJob.scala:20)
>>>>>>>>> at scala.Function0.apply$mcV$sp(Function0.scala:39)
>>>>>>>>> at scala.Function0.apply$mcV$sp$(Function0.scala:39)
>>>>>>>>> at
>>>>>>>>> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
>>>>>>>>> at scala.App.$anonfun$main$1$adapted(App.scala:80)
>>>>>>>>> at scala.collection.immutable.List.foreach(List.scala:431)
>>>>>>>>> at scala.App.main(App.scala:80)
>>>>>>>>> at scala.App.main$(App.scala:78)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob$.main(HandleFinancialJob.scala:20)
>>>>>>>>> at
>>>>>>>>> com.neosec.handlefinancial.HandleFinancialJob.main(HandleFinancialJob.scala)
>>>>>>>>> Caused by: scala.tools.reflect.ToolBoxError: reflective
>>>>>>>>> compilation has failed: cannot initialize the compiler due to
>>>>>>>>> java.lang.BootstrapMethodError: java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>> Caused by: scala.tools.reflect.ToolBoxError: reflective
>>>>>>>>> compilation has failed: cannot initialize the compiler due to
>>>>>>>>> java.lang.BootstrapMethodError: java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>>
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$api$.liftedTree1$1(ToolBoxFactory.scala:360)
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$api$.compiler$lzycompute(ToolBoxFactory.scala:346)
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$api$.compiler(ToolBoxFactory.scala:345)
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$.apply(ToolBoxFactory.scala:372)
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.parse(ToolBoxFactory.scala:429)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.api.scala.typeutils.TraversableSerializer$LazyRuntimeCompiler.compileCbfInternal(TraversableSerializer.scala:229)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.api.scala.typeutils.TraversableSerializer$LazyRuntimeCompiler.call(TraversableSerializer.scala:220)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.api.scala.typeutils.TraversableSerializer$LazyRuntimeCompiler.call(TraversableSerializer.scala:216)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4742)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
>>>>>>>>> at
>>>>>>>>> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
>>>>>>>>> ... 34 more
>>>>>>>>> Caused by: java.lang.BootstrapMethodError:
>>>>>>>>> java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>> at
>>>>>>>>> scala.tools.nsc.PhaseAssembly$DependencyGraph.compilerPhaseList(PhaseAssembly.scala:102)
>>>>>>>>> at
>>>>>>>>> scala.tools.nsc.PhaseAssembly.computePhaseAssembly(PhaseAssembly.scala:244)
>>>>>>>>> at
>>>>>>>>> scala.tools.nsc.PhaseAssembly.computePhaseAssembly$(PhaseAssembly.scala:216)
>>>>>>>>> Caused by: java.lang.BootstrapMethodError:
>>>>>>>>> java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>>
>>>>>>>>> at scala.tools.nsc.Global.computePhaseAssembly(Global.scala:44)
>>>>>>>>> at scala.tools.nsc.Global.computePhaseDescriptors(Global.scala:713)
>>>>>>>>> at
>>>>>>>>> scala.tools.nsc.Global.phaseDescriptors$lzycompute(Global.scala:717)
>>>>>>>>> at scala.tools.nsc.Global.phaseDescriptors(Global.scala:717)
>>>>>>>>> at scala.tools.nsc.Global$Run.<init>(Global.scala:1219)
>>>>>>>>> at scala.tools.reflect.ReflectSetup.$init$(ReflectSetup.scala:21)
>>>>>>>>> at scala.tools.reflect.ReflectGlobal.<init>(ReflectGlobal.scala:26)
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.<init>(ToolBoxFactory.scala:52)
>>>>>>>>> at
>>>>>>>>> scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$api$.liftedTree1$1(ToolBoxFactory.scala:350)
>>>>>>>>> ... 44 more
>>>>>>>>> Caused by: java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>> ... 56 more
>>>>>>>>> Caused by: java.lang.NoSuchMethodError:
>>>>>>>>> scala.collection.immutable.List.$anonfun$flatMap$1$adapted(Lscala/runtime/BooleanRef;Lscala/runtime/ObjectRef;Lscala/runtime/ObjectRef;Ljava/lang/Object;)Ljava/lang/Object;
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

-- 

Arvid Heise | Senior Java Developer

<https://www.ververica.com/>

Follow us @VervericaData

--

Join Flink Forward <https://flink-forward.org/> - The Apache Flink
Conference

Stream Processing | Event Driven | Real Time

--

Ververica GmbH | Invalidenstrasse 115, 10115 Berlin, Germany

--
Ververica GmbH
Registered at Amtsgericht Charlottenburg: HRB 158244 B
Managing Directors: Timothy Alexander Steinert, Yip Park Tung Jason, Ji
(Toni) Cheng
