[ https://issues.apache.org/jira/browse/SPARK-8768?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619681#comment-14619681 ]

Patrick Wendell edited comment on SPARK-8768 at 7/9/15 1:04 AM:
----------------------------------------------------------------

So it turns out that build/mvn still uses the system Maven even if it downloads 
the newer version (this was the original design). Is it possible that this is 
why it's breaking?

It might be nice to modify that script to add a flag like --force that makes it 
always use the downloaded Maven.
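
For illustration, a rough sketch of what that could look like (the variable names and the path to the downloaded Maven below are assumptions, not the actual contents of build/mvn):

{code}
# Hypothetical sketch only -- not the real build/mvn.
FORCE_MVN=0
if [ "$1" = "--force" ]; then
  FORCE_MVN=1
  shift
fi

# Keep today's behavior (prefer the system mvn) unless --force was passed
# or no mvn is on the PATH.
if [ "$FORCE_MVN" -eq 1 ] || ! command -v mvn > /dev/null; then
  MVN_BIN="$(dirname "$0")/apache-maven/bin/mvn"   # downloaded copy; path is an assumption
else
  MVN_BIN="$(command -v mvn)"
fi

exec "$MVN_BIN" "$@"
{code}

Keeping the system Maven as the default preserves the original design; --force would only matter on machines where the system Maven is too old.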


was (Author: pwendell):
So it turns out that build/mvn still uses the system maven even if it downloads 
the newer version (this was the original design). Is it possible that is why 
it's breaking?

> SparkSubmitSuite fails on Hadoop 1.x builds due to java.lang.VerifyError in 
> Akka Protobuf
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-8768
>                 URL: https://issues.apache.org/jira/browse/SPARK-8768
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.5.0
>            Reporter: Josh Rosen
>            Priority: Blocker
>
> The end-to-end SparkSubmitSuite tests ("launch simple application with 
> spark-submit", "include jars passed in through --jars", and "include jars 
> passed in through --packages") are currently failing for the pre-YARN Hadoop 
> builds.
> I managed to reproduce one of the Jenkins failures locally:
> {code}
> build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 -Phive -Phive-thriftserver -Pkinesis-asl test -DwildcardSuites=org.apache.spark.deploy.SparkSubmitSuite -Dtest=none
> {code}
> Here's the output from unit-tests.log:
> {code}
> ===== TEST OUTPUT FOR o.a.s.deploy.SparkSubmitSuite: 'launch simple application with spark-submit' =====
> 15/07/01 13:39:58.964 redirect stderr for command ./bin/spark-submit INFO Utils: SLF4J: Class path contains multiple SLF4J bindings.
> 15/07/01 13:39:58.964 redirect stderr for command ./bin/spark-submit INFO Utils: SLF4J: Found binding in [jar:file:/Users/joshrosen/Documents/spark-2/assembly/target/scala-2.10/spark-assembly-1.5.0-SNAPSHOT-hadoop1.2.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 15/07/01 13:39:58.965 redirect stderr for command ./bin/spark-submit INFO Utils: SLF4J: Found binding in [jar:file:/Users/joshrosen/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 15/07/01 13:39:58.965 redirect stderr for command ./bin/spark-submit INFO Utils: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 15/07/01 13:39:58.965 redirect stderr for command ./bin/spark-submit INFO Utils: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 15/07/01 13:39:58.966 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:39:58 INFO SparkContext: Running Spark version 1.5.0-SNAPSHOT
> 15/07/01 13:39:59.334 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:39:59 INFO SecurityManager: Changing view acls to: joshrosen
> 15/07/01 13:39:59.335 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:39:59 INFO SecurityManager: Changing modify acls to: joshrosen
> 15/07/01 13:39:59.335 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:39:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(joshrosen); users with modify permissions: Set(joshrosen)
> 15/07/01 13:39:59.898 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:39:59 INFO Slf4jLogger: Slf4jLogger started
> 15/07/01 13:39:59.934 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:39:59 INFO Remoting: Starting remoting
> 15/07/01 13:40:00.009 redirect stderr for command ./bin/spark-submit INFO Utils: 15/07/01 13:40:00 ERROR ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.remote.default-remote-dispatcher-5] shutting down ActorSystem [sparkDriver]
> 15/07/01 13:40:00.009 redirect stderr for command ./bin/spark-submit INFO Utils: java.lang.VerifyError: class akka.remote.WireFormats$AkkaControlMessage overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
> 15/07/01 13:40:00.009 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.lang.ClassLoader.defineClass1(Native Method)
> 15/07/01 13:40:00.009 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> 15/07/01 13:40:00.009 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.security.AccessController.doPrivileged(Native Method)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at akka.remote.transport.AkkaPduProtobufCodec$.constructControlMessagePdu(AkkaPduCodec.scala:231)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at akka.remote.transport.AkkaPduProtobufCodec$.<init>(AkkaPduCodec.scala:153)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at akka.remote.transport.AkkaPduProtobufCodec$.<clinit>(AkkaPduCodec.scala)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:733)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:703)
> 15/07/01 13:40:00.010 redirect stderr for command ./bin/spark-submit INFO Utils:    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
> 15/07/01 13:40:00.011 redirect stderr for command ./bin/spark-submit INFO Utils:    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> 15/07/01 13:40:00.011 redirect stderr for command ./bin/spark-submit INFO Utils:    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> 15/07/01 13:40:00.011 redirect stderr for command ./bin/spark-submit INFO Utils:    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> {code}
> A Google search reveals that we've run into this problem before:
> https://mail-archives.apache.org/mod_mbox/spark-reviews/201407.mbox/%3cgit-pr-1685-sp...@git.apache.org%3E
> This may be caused by Protobuf version conflicts.
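> A quick way to see which Protobuf version actually ends up on the classpath for this profile (a sketch, assuming the standard maven-dependency-plugin is available) is something like:
> {code}
> build/mvn -Phadoop-1 -Dhadoop.version=1.2.1 dependency:tree -Dincludes=com.google.protobuf:protobuf-java
> {code}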


