[jira] [Assigned] (SPARK-34007) Downgrade scala-maven-plugin to 4.3.0
[ https://issues.apache.org/jira/browse/SPARK-34007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-34007:
------------------------------------

    Assignee: Hyukjin Kwon

> Downgrade scala-maven-plugin to 4.3.0
> -------------------------------------
>
>                 Key: SPARK-34007
>                 URL: https://issues.apache.org/jira/browse/SPARK-34007
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 3.1.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Blocker
>
> After we upgraded scala-maven-plugin to 4.4.0 at SPARK-33512, the docker release script fails as below:
> {code}
> [INFO] Compiling 21 Scala sources and 3 Java sources to /opt/spark-rm/output/spark-3.1.0-bin-hadoop2.7/resource-managers/yarn/target/scala-2.12/test-classes ...
> [ERROR] ## Exception when compiling 24 sources to /opt/spark-rm/output/spark-3.1.0-bin-hadoop2.7/resource-managers/yarn/target/scala-2.12/test-classes
> java.lang.SecurityException: class "javax.servlet.SessionCookieConfig"'s signer information does not match signer information of other classes in the same package
> java.lang.ClassLoader.checkCerts(ClassLoader.java:891)
> java.lang.ClassLoader.preDefineClass(ClassLoader.java:661)
> java.lang.ClassLoader.defineClass(ClassLoader.java:754)
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
> java.net.URLClassLoader.access$100(URLClassLoader.java:74)
> java.net.URLClassLoader$1.run(URLClassLoader.java:369)
> java.net.URLClassLoader$1.run(URLClassLoader.java:363)
> java.security.AccessController.doPrivileged(Native Method)
> java.net.URLClassLoader.findClass(URLClassLoader.java:362)
> java.lang.ClassLoader.loadClass(ClassLoader.java:418)
> java.lang.ClassLoader.loadClass(ClassLoader.java:351)
> java.lang.Class.getDeclaredMethods0(Native Method)
> java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
> java.lang.Class.privateGetPublicMethods(Class.java:2902)
> java.lang.Class.getMethods(Class.java:1615)
> sbt.internal.inc.ClassToAPI$.toDefinitions0(ClassToAPI.scala:170)
> sbt.internal.inc.ClassToAPI$.$anonfun$toDefinitions$1(ClassToAPI.scala:123)
> scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
> sbt.internal.inc.ClassToAPI$.toDefinitions(ClassToAPI.scala:123)
> sbt.internal.inc.ClassToAPI$.$anonfun$process$1(ClassToAPI.scala:33)
> scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
> scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
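Per the issue title, the fix is to pin scala-maven-plugin back to 4.3.0 in the Maven build. A minimal sketch of that change follows; the surrounding pom structure is illustrative and the exact declaration in Spark's root pom.xml may differ:

{code:xml}
<!-- Sketch only: pin scala-maven-plugin to 4.3.0, reverting the 4.4.0
     upgrade from SPARK-33512 that triggers the SecurityException when
     sbt's incremental compiler scans signed javax.servlet jars. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.3.0</version>
</plugin>
{code}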
[jira] [Assigned] (SPARK-34007) Downgrade scala-maven-plugin to 4.3.0
[ https://issues.apache.org/jira/browse/SPARK-34007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-34007:
------------------------------------

    Assignee: Apache Spark
[jira] [Assigned] (SPARK-34007) Downgrade scala-maven-plugin to 4.3.0
[ https://issues.apache.org/jira/browse/SPARK-34007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-34007:
------------------------------------

    Assignee: (was: Apache Spark)