RE: Spark 1.1.0 - spark-submit failed

2015-01-21 Thread ey-chih chow
Thanks for the help.  I added the following dependency to my POM file and the
problem went away:

<dependency> <!-- default Netty -->
  <groupId>io.netty</groupId>
  <artifactId>netty</artifactId>
  <version>3.6.6.Final</version>
</dependency>
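
In case it helps anyone else hitting the same conflict, a quick way to check
which netty version Maven actually resolves (using the standard dependency
plugin) is something like:

  mvn dependency:tree -Dincludes=io.netty,org.jboss.netty

With the explicit dependency above, 3.6.6.Final should win over any older
netty pulled in transitively.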
Ey-Chih
Date: Tue, 20 Jan 2015 16:57:20 -0800
Subject: Re: Spark 1.1.0 - spark-submit failed
From: yuzhih...@gmail.com
To: eyc...@hotmail.com
CC: user@spark.apache.org

Please check which netty jar(s) are on the classpath.
NioWorkerPool(Executor workerExecutor, int workerCount) was added in netty 3.5.4

Cheers

Re: Spark 1.1.0 - spark-submit failed

2015-01-20 Thread Ted Yu
Please check which netty jar(s) are on the classpath.

NioWorkerPool(Executor workerExecutor, int workerCount) was added in netty 3.5.4.
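
If you want to confirm where that class is coming from at runtime, a minimal
sketch (run it with the same classpath as the driver; NettyCheck is just a
throwaway name) would be:

  // Print the jar that provides NioWorkerPool on this classpath.
  object NettyCheck {
    def main(args: Array[String]): Unit = {
      val cls = Class.forName("org.jboss.netty.channel.socket.nio.NioWorkerPool")
      // getCodeSource can be null for bootstrap classes; netty ships in a jar,
      // so this should print the URL of the jar that was actually loaded
      println(cls.getProtectionDomain.getCodeSource.getLocation)
    }
  }

If that prints your application jar rather than the Spark assembly, an older
netty is likely being bundled into your build.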

Cheers

On Tue, Jan 20, 2015 at 4:15 PM, ey-chih chow eyc...@hotmail.com wrote:

 Hi,

 I issued the following command in an EC2 cluster launched using spark-ec2:

 ~/spark/bin/spark-submit --class com.crowdstar.cluster.etl.ParseAndClean
 --master spark://ec2-54-185-107-113.us-west-2.compute.amazonaws.com:7077
 --deploy-mode cluster --total-executor-cores 4
 file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar
 /ETL/input/2015/01/10/12/10Jan2015.avro
 file:///tmp/etl-admin/vertica/VERTICA.avdl
 file:///tmp/etl-admin/vertica/extras.json
 file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar

 The command failed with the following error logs in the Spark UI.  Is there
 any suggestion on how to fix the problem?  Thanks.

 Ey-Chih Chow

 ==

 Launch Command: /usr/lib/jvm/java-1.7.0/bin/java -cp
 /root/spark/work/driver-20150120200843-/spark-etl-0.0.1-SNAPSHOT.jar:/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/lib/spark-assembly-1.1.0-hadoop1.0.4.jar:/root/spark/lib/datanucleus-api-jdo-3.2.1.jar:/root/spark/lib/datanucleus-core-3.2.2.jar:/root/spark/lib/datanucleus-rdbms-3.2.1.jar
 -XX:MaxPermSize=128m
 -Dspark.executor.extraLibraryPath=/root/ephemeral-hdfs/lib/native/
 -Dspark.executor.memory=13000m -Dspark.akka.askTimeout=10
 -Dspark.cores.max=4
 -Dspark.app.name=com.crowdstar.cluster.etl.ParseAndClean
 -Dspark.jars=file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar
 -Dspark.executor.extraClassPath=/root/ephemeral-hdfs/conf
 -Dspark.master=spark://ec2-54-203-58-2.us-west-2.compute.amazonaws.com:7077
 -Dakka.loglevel=WARNING -Xms512M -Xmx512M
 org.apache.spark.deploy.worker.DriverWrapper
 akka.tcp://sparkwor...@ip-10-33-140-157.us-west-2.compute.internal:47585/user/Worker
 com.crowdstar.cluster.etl.ParseAndClean
 /ETL/input/2015/01/10/12/10Jan2015.avro
 file:///tmp/etl-admin/vertica/VERTICA.avdl
 file:///tmp/etl-admin/vertica/extras.json
 file:///tmp/etl-admin/jar/spark-etl-0.0.1-SNAPSHOT.jar
 

 SLF4J: Class path contains multiple SLF4J bindings.
 SLF4J: Found binding in
 [jar:file:/root/spark/work/driver-20150120200843-/spark-etl-0.0.1-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in
 [jar:file:/root/spark/lib/spark-assembly-1.1.0-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
 SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
 15/01/20 20:08:45 INFO spark.SecurityManager: Changing view acls to: root,
 15/01/20 20:08:45 INFO spark.SecurityManager: Changing modify acls to: root,
 15/01/20 20:08:45 INFO spark.SecurityManager: SecurityManager:
 authentication disabled; ui acls disabled; users with view permissions:
 Set(root, ); users with modify permissions: Set(root, )
 15/01/20 20:08:45 INFO slf4j.Slf4jLogger: Slf4jLogger started
 15/01/20 20:08:45 ERROR actor.ActorSystemImpl: Uncaught fatal error from
 thread [Driver-akka.actor.default-dispatcher-3] shutting down ActorSystem
 [Driver]
 java.lang.NoSuchMethodError:
 org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V
 at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:282)
 at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:239)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
 at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
 at scala.util.Try$.apply(Try.scala:161)
 at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
 at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
 at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
 at scala.util.Success.flatMap(Try.scala:200)
 at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
 at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)
 at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)
 at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
 at scala.collection.Iterator$class.foreach(Iterator.scala:727)
 at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
 at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)