Hi Gengliang,

Yay! Thank you! Java 11 with the following MAVEN_OPTS worked fine:
$ echo $MAVEN_OPTS
-Xss64m -Xmx4g -XX:ReservedCodeCacheSize=1g

$ ./build/mvn \
  -Pyarn,kubernetes,hadoop-cloud,hive,hive-thriftserver \
  -DskipTests \
  clean install
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 22:02 min
[INFO] Finished at: 2021-08-22T13:09:25+02:00
[INFO] ------------------------------------------------------------------------

Pozdrawiam,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski


On Sun, Aug 22, 2021 at 12:45 PM Jacek Laskowski <[email protected]> wrote:

> Hi Gengliang,
>
> With Java 8 the build worked fine. No other changes. I'm going to give
> Java 11 a try with the options you mentioned.
>
> $ java -version
> openjdk version "1.8.0_292"
> OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_292-b10)
> OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.292-b10, mixed mode)
>
> BTW, shouldn't the page [1] be updated to reflect this? This is what I
> followed.
>
> [1] https://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage
>
> Thanks
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://about.me/JacekLaskowski
> "The Internals Of" Online Books <https://books.japila.pl/>
> Follow me on https://twitter.com/jaceklaskowski
>
>
> On Sun, Aug 22, 2021 at 8:29 AM Gengliang Wang <[email protected]> wrote:
>
>> Hi Jacek,
>>
>> The current GitHub Actions CI for Spark includes a Java 11 build. The build
>> is successful with the options "-Xss64m -Xmx2g -XX:ReservedCodeCacheSize=1g":
>> https://github.com/apache/spark/blob/master/.github/workflows/build_and_test.yml#L506
>> The default Java stack size is small and we have to raise it for the Spark
>> build with the option "-Xss64m".
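
For anyone else hitting the same StackOverflowError, a minimal recap of the setup that ended up working in this thread (the heap size is what worked locally; the CI linked above uses -Xmx2g instead of -Xmx4g):

$ export MAVEN_OPTS="-Xss64m -Xmx4g -XX:ReservedCodeCacheSize=1g"
$ ./build/mvn \
  -Pyarn,kubernetes,hadoop-cloud,hive,hive-thriftserver \
  -DskipTests \
  clean install

The key addition over the previously documented settings is -Xss64m, which raises the JVM thread stack size so scalac does not overflow while compiling.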
>>
>> On Sat, Aug 21, 2021 at 9:33 PM Jacek Laskowski <[email protected]> wrote:
>>
>>> Hi,
>>>
>>> I've been building the tag and I'm facing the following StackOverflowError:
>>>
>>> Exception in thread "main" java.lang.StackOverflowError
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transform(ExtensionMethods.scala:275)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transform(ExtensionMethods.scala:133)
>>> at scala.reflect.api.Trees$Transformer.$anonfun$transformStats$1(Trees.scala:2597)
>>> at scala.reflect.api.Trees$Transformer.transformStats(Trees.scala:2595)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transformStats(ExtensionMethods.scala:280)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transformStats(ExtensionMethods.scala:133)
>>> at scala.reflect.internal.Trees.itransform(Trees.scala:1430)
>>> at scala.reflect.internal.Trees.itransform$(Trees.scala:1400)
>>> at scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
>>> at scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
>>> at scala.reflect.api.Trees$Transformer.transform(Trees.scala:2563)
>>> at scala.tools.nsc.transform.TypingTransformers$TypingTransformer.transform(TypingTransformers.scala:57)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transform(ExtensionMethods.scala:275)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transform(ExtensionMethods.scala:133)
>>> at scala.reflect.internal.Trees.itransform(Trees.scala:1409)
>>> at scala.reflect.internal.Trees.itransform$(Trees.scala:1400)
>>> at scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
>>> at scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
>>> at scala.reflect.api.Trees$Transformer.transform(Trees.scala:2563)
>>> at scala.tools.nsc.transform.TypingTransformers$TypingTransformer.transform(TypingTransformers.scala:57)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transform(ExtensionMethods.scala:275)
>>> at scala.tools.nsc.transform.ExtensionMethods$Extender.transform(ExtensionMethods.scala:133)
>>> ...
>>>
>>> The command I use:
>>>
>>> ./build/mvn \
>>>   -Pyarn,kubernetes,hadoop-cloud,hive,hive-thriftserver \
>>>   -DskipTests \
>>>   clean install
>>>
>>> $ java --version
>>> openjdk 11.0.11 2021-04-20
>>> OpenJDK Runtime Environment AdoptOpenJDK-11.0.11+9 (build 11.0.11+9)
>>> OpenJDK 64-Bit Server VM AdoptOpenJDK-11.0.11+9 (build 11.0.11+9, mixed mode)
>>>
>>> $ ./build/mvn -v
>>> Using `mvn` from path: /usr/local/bin/mvn
>>> Apache Maven 3.8.1 (05c21c65bdfed0f71a2f2ada8b84da59348c4c5d)
>>> Maven home: /usr/local/Cellar/maven/3.8.1/libexec
>>> Java version: 11.0.11, vendor: AdoptOpenJDK, runtime: /Users/jacek/.sdkman/candidates/java/11.0.11.hs-adpt
>>> Default locale: en_PL, platform encoding: UTF-8
>>> OS name: "mac os x", version: "11.5", arch: "x86_64", family: "mac"
>>>
>>> $ echo $MAVEN_OPTS
>>> -Xmx8g -XX:ReservedCodeCacheSize=1g
>>>
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> ----
>>> https://about.me/JacekLaskowski
>>> "The Internals Of" Online Books <https://books.japila.pl/>
>>> Follow me on https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Fri, Aug 20, 2021 at 7:05 PM Gengliang Wang <[email protected]> wrote:
>>>
>>>> Please vote on releasing the following candidate as Apache Spark
>>>> version 3.2.0.
>>>>
>>>> The vote is open until 11:59pm Pacific time Aug 25 and passes if a
>>>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 3.2.0
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>>
>>>> The tag to be voted on is v3.2.0-rc1 (commit 6bb3523d8e838bd2082fb90d7f3741339245c044):
>>>> https://github.com/apache/spark/tree/v3.2.0-rc1
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-bin/
>>>>
>>>> Signatures used for Spark RCs can be found in this file:
>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>
>>>> The staging repository for this release can be found at:
>>>> https://repository.apache.org/content/repositories/orgapachespark-1388
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-docs/
>>>>
>>>> The list of bug fixes going into 3.2.0 can be found at the following URL:
>>>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>>>>
>>>> This release is using the release script of the tag v3.2.0-rc1.
>>>>
>>>>
>>>> FAQ
>>>>
>>>> =========================
>>>> How can I help test this release?
>>>> =========================
>>>> If you are a Spark user, you can help us test this release by taking
>>>> an existing Spark workload and running it on this release candidate, then
>>>> reporting any regressions.
>>>>
>>>> If you're working in PySpark you can set up a virtual env, install
>>>> the current RC, and see if anything important breaks. In Java/Scala,
>>>> you can add the staging repository to your project's resolvers and test
>>>> with the RC (make sure to clean up the artifact cache before/after so
>>>> you don't end up building with an out-of-date RC going forward).
>>>>
>>>> ===========================================
>>>> What should happen to JIRA tickets still targeting 3.2.0?
>>>> ===========================================
>>>> The current list of open tickets targeted at 3.2.0 can be found at:
>>>> https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 3.2.0
>>>>
>>>> Committers should look at those and triage. Extremely important bug
>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>> be worked on immediately. Everything else please retarget to an
>>>> appropriate release.
>>>>
>>>> ==================
>>>> But my bug isn't fixed?
>>>> ==================
>>>> In order to make timely releases, we will typically not hold the
>>>> release unless the bug in question is a regression from the previous
>>>> release. That being said, if there is something which is a regression
>>>> that has not been correctly targeted please ping me or a committer to
>>>> help target the issue.
>>>>
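
As a concrete starting point for the PySpark testing route in the FAQ above, one possible sketch (the exact name of the pyspark tarball under the -bin/ directory is an assumption; check the directory listing first):

$ python3 -m venv spark-3.2.0-rc1-test
$ source spark-3.2.0-rc1-test/bin/activate
$ # assumed artifact name; verify against https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-bin/
$ pip install https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc1-bin/pyspark-3.2.0.tar.gz
$ python -c "import pyspark; print(pyspark.__version__)"

From there you can rerun an existing PySpark workload against the RC and report any regressions on this thread.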
