srowen commented on a change in pull request #25423: [SPARK-28701][test-hadoop3.2][test-java11][k8s] adding java11 support for pull request builds
URL: https://github.com/apache/spark/pull/25423#discussion_r317650935
 
 

 ##########
 File path: dev/run-tests.py
 ##########
 @@ -404,6 +404,12 @@ def run_scala_tests(build_tool, hadoop_version, test_modules, excluded_tags):
     if excluded_tags:
         test_profiles += ['-Dtest.exclude.tags=' + ",".join(excluded_tags)]
 
+    # set up java11 env if this is a pull request build with 'test-java11' in the title
+    if "test-java11" in os.environ["ghprbPullTitle"].lower():
+        os.environ["JAVA_HOME"] = "/usr/java/jdk-11.0.1"
+        os.environ["PATH"] = "%s/bin:%s" % (os.environ["JAVA_HOME"], os.environ["PATH"])
+        test_profiles += ['-Djava.version=11']
 
 Review comment:
   It should use Java 11 as long as the PATH provides Java 11 and the test harness that runs the Python tests picks up that environment too. At least I don't know how else one would tell PySpark which JVM to use!
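
   For reference, here is a minimal sketch of the resolution order a JVM launcher typically follows: prefer `JAVA_HOME` if set, otherwise take the first `java` on `PATH`. This is illustrative only, not Spark's actual launcher code, but it is why exporting `JAVA_HOME` and prepending it to `PATH` as in the diff above should be enough for the Python tests as well:

```python
# Illustrative only: the typical resolution order for the java binary,
# not Spark's actual launcher logic.
import os
import shutil

def resolve_java():
    java_home = os.environ.get("JAVA_HOME")
    if java_home:
        # An explicit JAVA_HOME wins.
        return os.path.join(java_home, "bin", "java")
    # Otherwise fall back to whatever `java` is first on PATH.
    return shutil.which("java")
```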
   
   In fact I'm pretty sure the test failure here shows that it is using JDK 11. From JPMML: `java.lang.ClassNotFoundException: com.sun.xml.internal.bind.v2.ContextFactory`. That class belongs to the JAXB implementation that was bundled with the JDK through Java 8 and removed in JDK 11 (JEP 320), so this failure would be caused by the JDK 11 changes. However, I don't get why all the other non-Python tests don't fail.
   
   Given the weird problem in https://github.com/apache/spark/pull/24651, I am wondering if we have some subtle classpath issues with how the PySpark tests are run.
   
   This one, however, might be more directly solvable by figuring out what is suggesting the use of this old Sun JAXB implementation. I'll start digging around META-INF.
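
   Something like the following could help with that digging (the jar directory below is only a placeholder, so point it at whatever classpath the failing test actually uses). It scans each jar for the resources that JAXB's `ContextFinder` consults, `META-INF/services/javax.xml.bind.JAXBContext` and `jaxb.properties`, and prints any that name the old `com.sun.xml.internal` factory:

```python
# Hedged sketch: find jars whose JAXB lookup resources point at the
# com.sun.xml.internal.* factory that no longer exists on JDK 11.
import glob
import zipfile

JARS = "assembly/target/scala-2.12/jars/*.jar"  # placeholder path, adjust as needed

for jar in glob.glob(JARS):
    with zipfile.ZipFile(jar) as zf:
        for name in zf.namelist():
            if name == "META-INF/services/javax.xml.bind.JAXBContext" or name.endswith("jaxb.properties"):
                content = zf.read(name).decode("utf-8", "replace")
                if "com.sun.xml.internal" in content:
                    print(jar, name, content.strip())
```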
