Looks like a bug in this garbage collector (ZGC). We should file a bug report; the stack trace might be helpful to them (a rough replay sketch follows at the bottom, below the quoted log):
Current CompileTask:
C2: 489037 23064 % 4  org.apache.lucene.index.RandomPostingsTester::verifyEnum @ 1415 (3617 bytes)

Stack: [0x00007f7be6564000,0x00007f7be6665000], sp=0x00007f7be665fb60, free space=1006k
Native frames: (J=compiled Java code, A=aot compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [libjvm.so+0xce71c9] PhaseIterGVN::transform_old(Node*)+0x159
V [libjvm.so+0xce3874] PhaseIterGVN::optimize()+0x134
V [libjvm.so+0x1027184] ZBarrierSetC2::insert_barriers_on_unsafe(PhaseIdealLoop*) const+0x364
V [libjvm.so+0x10283b8] ZBarrierSetC2::optimize_loops(PhaseIdealLoop*, LoopOptsMode, VectorSet&, Node_Stack&, Node_List&) const+0x38
V [libjvm.so+0xb21ac6] PhaseIdealLoop::build_and_optimize(LoopOptsMode)+0xad6
V [libjvm.so+0x638ecd] PhaseIdealLoop::optimize(PhaseIterGVN&, LoopOptsMode)+0x1dd
V [libjvm.so+0x63703f] Compile::Optimize()+0x83f
V [libjvm.so+0x63860a] Compile::Compile(ciEnv*, C2Compiler*, ciMethod*, int, bool, bool, bool, DirectiveSet*)+0xd2a
V [libjvm.so+0x55fadc] C2Compiler::compile_method(ciEnv*, ciMethod*, int, DirectiveSet*)+0xbc
V [libjvm.so+0x64229d] CompileBroker::invoke_compiler_on_method(CompileTask*)+0x3fd
V [libjvm.so+0x643c70] CompileBroker::compiler_thread_loop()+0x5d0
V [libjvm.so+0xf6b9fe] JavaThread::thread_main_inner()+0x1be
V [libjvm.so+0xf707fd] Thread::call_run()+0x10d
V [libjvm.so+0xc875b7] thread_native_entry(Thread*)+0xe7

On Sun, Dec 29, 2019 at 1:29 AM Mikhail Khludnev <[email protected]> wrote:

> Hi, Dev.
>
> This happens though December. What we supposed to do?
>
> [junit4] # SIGSEGV (0xb) at pc=0x00007f7e1c0e01c9, pid=30883, tid=30952
> [junit4] #
> [junit4] # JRE version: OpenJDK Runtime Environment (13.0.1+9) (build 13.0.1+9)
> [junit4] # Java VM: OpenJDK 64-Bit Server VM (13.0.1+9, mixed mode, tiered, z gc, linux-amd64)
> [junit4] # Problematic frame:
> [junit4] # V [libjvm.so+0xce71c9] PhaseIterGVN::transform_old(Node*)+0x159
>
> On Sat, Dec 28, 2019 at 1:09 PM Policeman Jenkins Server <[email protected]> wrote:
>
>> Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/1684/
>> Java: 64bit/jdk-13.0.1 -XX:+UseCompressedOops -XX:+UnlockExperimentalVMOptions -XX:+UseZGC
>>
>> All tests passed
>>
>> Build Log:
>> [...truncated 1396 lines...]
>> [junit4] JVM J0: stdout was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J0-20191228_095417_89118188758387783302429.sysout
>> [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
>> [junit4] #
>> [junit4] # A fatal error has been detected by the Java Runtime Environment:
>> [junit4] #
>> [junit4] # SIGSEGV (0xb) at pc=0x00007f7e1c0e01c9, pid=30883, tid=30952
>> [junit4] #
>> [junit4] # JRE version: OpenJDK Runtime Environment (13.0.1+9) (build 13.0.1+9)
>> [junit4] # Java VM: OpenJDK 64-Bit Server VM (13.0.1+9, mixed mode, tiered, z gc, linux-amd64)
>> [junit4] # Problematic frame:
>> [junit4] # V [libjvm.so+0xce71c9] PhaseIterGVN::transform_old(Node*)+0x159
>> [junit4] #
>> [junit4] # No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
>> [junit4] #
>> [junit4] # An error report file with more information is saved as:
>> [junit4] # /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/J0/hs_err_pid30883.log
>> [junit4] [thread 6488 also had an error]
>> [junit4] #
>> [junit4] # Compiler replay data is saved as:
>> [junit4] # /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/J0/replay_pid30883.log
>> [junit4] #
>> [junit4] # If you would like to submit a bug report, please visit:
>> [junit4] # https://github.com/AdoptOpenJDK/openjdk-build/issues
>> [junit4] #
>> [junit4] <<< JVM J0: EOF ----
>>
>> [...truncated 798 lines...]
>> [junit4] ERROR: JVM J0 ended with an exception, command line: /home/jenkins/tools/java/64bit/jdk-13.0.1/bin/java -XX:+UseCompressedOops -XX:+UnlockExperimentalVMOptions -XX:+UseZGC -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/heapdumps -ea -esa --illegal-access=deny -Dtests.prefix=tests -Dtests.seed=C474FBC796E01686 -Xmx512M -Dtests.iters= -Dtests.verbose=false -Dtests.infostream=false -Dtests.codec=random -Dtests.postingsformat=random -Dtests.docvaluesformat=random -Dtests.locale=random -Dtests.timezone=random -Dtests.directory=random -Dtests.linedocsfile=europarl.lines.txt.gz -Dtests.luceneMatchVersion=8.5.0 -Dtests.cleanthreads=perMethod -Djava.util.logging.config.file=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/tools/junit4/logging.properties -Dtests.nightly=false -Dtests.weekly=false -Dtests.monster=false -Dtests.slow=true -Dtests.asserts=true -Dtests.multiplier=3 -DtempDir=./temp -Djava.io.tmpdir=./temp -Dcommon.dir=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene -Dclover.db.dir=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/clover/db -Djava.security.policy=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/tools/junit4/tests.policy -Dtests.LUCENE_VERSION=8.5.0 -Djetty.testMode=1 -Djetty.insecurerandom=1 -Dsolr.directoryFactory=org.apache.solr.core.MockDirectoryFactory -Djava.awt.headless=true -Djdk.map.althashing.threshold=0 -Dtests.src.home=/home/jenkins/workspace/Lucene-Solr-8.x-Linux -Djava.security.egd=file:/dev/./urandom -Djunit4.childvm.cwd=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/J0 -Djunit4.tempDir=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp -Djunit4.childvm.id=0 -Djunit4.childvm.count=3 -Dfile.encoding=US-ASCII -Djava.security.manager=org.apache.lucene.util.TestSecurityManager -Dtests.filterstacks=true -Dtests.leaveTemporary=false -Dtests.badapples=false -classpath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/codecs/classes/java:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/classes/java:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/test-framework/lib/hamcrest-core-1.3.jar:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/test-framework/lib/junit-4.12.jar:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/test-framework/lib/randomizedtesting-runner-2.7.2.jar:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/classes/java9:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/classes/java:/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/classes/test:/home/jenkins/.ivy2/cache/com.carrotsearch.randomizedtesting/junit4-ant/jars/junit4-ant-2.7.2.jar com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe -eventsfile /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J0-20191228_095417_8917447844543437384674.events @/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J0-20191228_095417_891226391883297247733.suites -stdin
>> [junit4] ERROR: JVM J0 ended with an exception: Forked process returned with error code: 134. Very likely a JVM crash. See process stdout at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J0-20191228_095417_89118188758387783302429.sysout
>> [junit4] at com.carrotsearch.ant.tasks.junit4.JUnit4.executeSlave(JUnit4.java:1542)
>> [junit4] at com.carrotsearch.ant.tasks.junit4.JUnit4.access$000(JUnit4.java:123)
>> [junit4] at com.carrotsearch.ant.tasks.junit4.JUnit4$2.call(JUnit4.java:997)
>> [junit4] at com.carrotsearch.ant.tasks.junit4.JUnit4$2.call(JUnit4.java:994)
>> [junit4] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
>> [junit4] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>> [junit4] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>> [junit4] at java.base/java.lang.Thread.run(Thread.java:830)
>>
>> BUILD FAILED
>> /home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
>> /home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:578: The following error occurred while executing this line:
>> /home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:59: The following error occurred while executing this line:
>> /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:50: The following error occurred while executing this line:
>> /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:1590: The following error occurred while executing this line:
>> /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:1117: At least one slave process threw an exception, first: Forked process returned with error code: 134. Very likely a JVM crash. See process stdout at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J0-20191228_095417_89118188758387783302429.sysout
>>
>> Total time: 15 minutes 31 seconds
>> Build step 'Invoke Ant' marked build as failure
>> Archiving artifacts
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> [WARNINGS] Skipping publisher since build result is FAILURE
>> Recording test results
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> Email was triggered for: Failure - Any
>> Sending email for trigger: Failure - Any
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>> Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: [email protected]
>> For additional commands, e-mail: [email protected]
>
>
>
> --
> Sincerely yours
> Mikhail Khludnev
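P.S. A rough replay sketch, in case someone wants to take this upstream: the quoted log already names the two artifacts JDK maintainers usually ask for, hs_err_pid30883.log and replay_pid30883.log under lucene/build/core/test/J0/. The command below uses HotSpot's standard diagnostic compiler-replay options; treat it as an untested sketch, and fill in the classpath placeholder from the J0 command line in the quoted log.

    # Try to re-run just the C2 compilation that crashed, on the same JDK 13.0.1 with ZGC enabled.
    /home/jenkins/tools/java/64bit/jdk-13.0.1/bin/java \
      -cp <the same -classpath value as in the J0 command line above> \
      -XX:+UseCompressedOops -XX:+UnlockExperimentalVMOptions -XX:+UseZGC \
      -XX:+UnlockDiagnosticVMOptions -XX:+ReplayCompiles \
      -XX:ReplayDataFile=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/J0/replay_pid30883.log \
      -version   # -version is only a dummy so the launcher has something to run

If that segfaults in PhaseIterGVN::transform_old again, attaching the hs_err and replay files to the report should be enough; re-running the same seed without -XX:+UseZGC would also confirm whether it is really the ZGC/C2 combination rather than the test itself.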
