I've found the problem!
It was indeed a local thingy!

$ cat ~/.mavenrc
MAVEN_OPTS='-XX:+TieredCompilation -XX:TieredStopAtLevel=1'

I added this some time ago to speed up the build. But it turns out a plain
assignment in ~/.mavenrc also overrides the MAVEN_OPTS environment variable...
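The reason is that the mvn launcher script sources ~/.mavenrc after the shell
environment is already set, so a plain assignment there clobbers whatever the
shell exported. A minimal sketch of the effect, using a throwaway file instead
of the real ~/.mavenrc (the -Xss128m / TieredStopAtLevel values are just
placeholders):

```shell
# Simulate what bin/mvn does: it sources ~/.mavenrc, so a plain
# assignment there replaces whatever MAVEN_OPTS the shell exported.
export MAVEN_OPTS='-Xss128m'
echo "MAVEN_OPTS='-XX:TieredStopAtLevel=1'" > /tmp/demo-mavenrc
. /tmp/demo-mavenrc
echo "after plain assignment: $MAVEN_OPTS"   # the shell's -Xss128m is gone

# Keeping the rc file but appending to the inherited value avoids the clobber:
export MAVEN_OPTS='-Xss128m'
echo 'MAVEN_OPTS="-XX:TieredStopAtLevel=1 ${MAVEN_OPTS}"' > /tmp/demo-mavenrc
. /tmp/demo-mavenrc
echo "after appending form: $MAVEN_OPTS"     # both sets of flags survive
```

So the appending form lets you keep the build-time optimization in ~/.mavenrc
without silently discarding options set in the environment.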

Now it fails with:

[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @
spark-catalyst_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file:
/home/martin/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.15__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin:
BasicArtifact(com.github.ghik,silencer-plugin_2.12.15,1.7.6,null)
[INFO] Compiling 372 Scala sources and 171 Java sources to
/home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes ...

[ERROR] [Error] : error writing
/home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes/org/apache/spark/sql/catalyst/analysis/Analyzer$ResolveGroupingAnalytics$$anonfun$org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveGroupingAnalytics$$replaceGroupingFunc$1.class:
java.nio.file.FileSystemException
/home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes/org/apache/spark/sql/catalyst/analysis/Analyzer$ResolveGroupingAnalytics$$anonfun$org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveGroupingAnalytics$$replaceGroupingFunc$1.class:
File name too long

But this is well documented:
https://spark.apache.org/docs/latest/building-spark.html#encrypted-filesystems
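For anyone else hitting this: as I recall, the workaround on that page (worth
verifying against the current docs) is to cap the generated class file name
length via scalac, since eCryptfs limits file names to roughly 143 characters.
A sketch of the fragment, which goes into the scala-maven-plugin args in the
relevant pom.xml:

```xml
<!-- Documented workaround for "File name too long" on encrypted filesystems:
     limit generated class file names to 128 characters. Verify the exact
     placement against building-spark.html before applying. -->
<arg>-Xmax-classfile-name</arg>
<arg>128</arg>
```

The other option is simply to clone and build under a non-encrypted path.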

All works now!
Thank you, Sean!


On Thu, Feb 10, 2022 at 10:13 PM Sean Owen <sro...@gmail.com> wrote:

> I think it's another occurrence where I had to change or set
> MAVEN_OPTS. I think this occurs in a way that this setting doesn't affect,
> though I don't quite understand it. Try the stack size in the test runner
> configs
>
> On Thu, Feb 10, 2022, 2:02 PM Martin Grigorov <mgrigo...@apache.org>
> wrote:
>
>> Hi Sean,
>>
>> On Thu, Feb 10, 2022 at 5:37 PM Sean Owen <sro...@gmail.com> wrote:
>>
>>> Yes, I've seen this; the JVM stack size needs to be increased. I'm not
>>> sure if it's environment-specific (you and I have hit it at least, and I
>>> think others have too), or whether we need to change our build script.
>>> In the pom.xml file, find "-Xss..." settings and make them something
>>> like "-Xss4m", see if that works.
>>>
>>
>> It is already a much bigger value - 128m (
>> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845
>> )
>> I've tried smaller and bigger values for all jvmArgs next to this one.
>> None helped!
>> I also have the feeling it is something in my environment that overrides
>> these values but so far I cannot identify anything.
>>
>>
>>
>>>
>>> On Thu, Feb 10, 2022 at 8:54 AM Martin Grigorov <mgrigo...@apache.org>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am not able to build Spark due to the following error:
>>>>
>>>> [ERROR] ## Exception when compiling 543 sources to
>>>> /home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes
>>>> java.lang.BootstrapMethodError: call site initialization exception
>>>> java.lang.invoke.CallSite.makeSite(CallSite.java:341)
>>>>
>>>> java.lang.invoke.MethodHandleNatives.linkCallSiteImpl(MethodHandleNatives.java:307)
>>>>
>>>> java.lang.invoke.MethodHandleNatives.linkCallSite(MethodHandleNatives.java:297)
>>>> scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2504)
>>>>
>>>> scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103(Typers.scala:5711)
>>>>
>>>> scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:500)
>>>> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5746)
>>>> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5781)
>>>> ...
>>>> Caused by: java.lang.StackOverflowError
>>>>     at java.lang.ref.Reference.<init> (Reference.java:303)
>>>>     at java.lang.ref.WeakReference.<init> (WeakReference.java:57)
>>>>     at
>>>> java.lang.invoke.MethodType$ConcurrentWeakInternSet$WeakEntry.<init>
>>>> (MethodType.java:1269)
>>>>     at java.lang.invoke.MethodType$ConcurrentWeakInternSet.get
>>>> (MethodType.java:1216)
>>>>     at java.lang.invoke.MethodType.makeImpl (MethodType.java:302)
>>>>     at java.lang.invoke.MethodType.dropParameterTypes
>>>> (MethodType.java:573)
>>>>     at java.lang.invoke.MethodType.replaceParameterTypes
>>>> (MethodType.java:467)
>>>>     at java.lang.invoke.MethodHandle.asSpreader (MethodHandle.java:875)
>>>>     at java.lang.invoke.Invokers.spreadInvoker (Invokers.java:158)
>>>>     at java.lang.invoke.CallSite.makeSite (CallSite.java:324)
>>>>     at java.lang.invoke.MethodHandleNatives.linkCallSiteImpl
>>>> (MethodHandleNatives.java:307)
>>>>     at java.lang.invoke.MethodHandleNatives.linkCallSite
>>>> (MethodHandleNatives.java:297)
>>>>     at scala.tools.nsc.typechecker.Typers$Typer.typedBlock
>>>> (Typers.scala:2504)
>>>>     at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103
>>>> (Typers.scala:5711)
>>>>     at
>>>> scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1
>>>> (Typers.scala:500)
>>>>     at scala.tools.nsc.typechecker.Typers$Typer.typed1
>>>> (Typers.scala:5746)
>>>>     at scala.tools.nsc.typechecker.Typers$Typer.typed
>>>> (Typers.scala:5781)
>>>>
>>>> I have played a lot with the scala-maven-plugin jvmArg settings at [1]
>>>> but so far nothing helps.
>>>> Same error for Scala 2.12 and 2.13.
>>>>
>>>> The command I use is: ./build/mvn install -Pkubernetes -DskipTests
>>>>
>>>> I need to create a distribution from master branch.
>>>>
>>>> Java: 1.8.0_312
>>>> Maven: 3.8.4
>>>> OS: Ubuntu 21.10
>>>>
>>>> Any hints?
>>>> Thank you!
>>>>
>>>> 1.
>>>> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845-L2849
>>>>
>>>
