Could you clarify why you need a 10 GB Akka frame size?
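
For context, the 2047 MB cap mentioned below is consistent with 32-bit Int
overflow: my understanding (an assumption, not a quote of the Spark source) is
that Spark 1.x converts the MB value to bytes as `frameSize * 1024 * 1024`
using a JVM Int, so anything above 2047 wraps around and trips Akka's
"at least 32000 bytes" check. A minimal Python sketch simulating the JVM's
32-bit arithmetic:

```python
# Sketch (assumption): Spark converts spark.akka.frameSize (MB) to bytes as a
# 32-bit JVM Int, i.e. frameSize * 1024 * 1024 with wrap-around on overflow.
# Python ints don't overflow, so we simulate the JVM behaviour explicitly.

def jvm_int32(n):
    """Wrap an arbitrary integer the way a JVM 32-bit Int would."""
    n &= 0xFFFFFFFF
    return n - (1 << 32) if n >= (1 << 31) else n

def frame_size_bytes(frame_size_mb):
    """Frame limit in bytes as a (possibly overflowed) 32-bit Int."""
    return jvm_int32(frame_size_mb * 1024 * 1024)

print(frame_size_bytes(2047))   # 2146435072 -- largest setting that stays positive
print(frame_size_bytes(12000))  # -301989888 -- overflows, fails the 32000-byte check
```

This would explain why 12000, 15000, and 20000 all fail at startup while 2047
is the documented maximum.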

Best Regards,
Shixiong Zhu

2015-02-05 9:20 GMT+08:00 Shixiong Zhu <zsxw...@gmail.com>:

> The unit of "spark.akka.frameSize" is MB. The max value is 2047.
>
> Best Regards,
> Shixiong Zhu
>
> 2015-02-05 1:16 GMT+08:00 sahanbull <sa...@skimlinks.com>:
>
>> I am trying to run a Spark application with
>>
>> -Dspark.executor.memory=30g -Dspark.kryoserializer.buffer.max.mb=2000
>> -Dspark.akka.frameSize=10000
>>
>> and the job fails because one or more of the Akka frames are larger than
>> 10000 MB (around 12000).
>>
>> When I change -Dspark.akka.frameSize=10000 to 12000, 15000, or 20000 and
>> run:
>>
>> ./spark/bin/spark-submit  --driver-memory 30g --executor-memory 30g
>> mySparkCode.py
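
An equivalent way to pass these settings is spark-submit's --conf flag; a
sketch under the assumption that the same script is used and that
spark.akka.frameSize is kept within its 2047 MB maximum:

```shell
# Sketch (assumption): same job, settings passed via --conf instead of -D
# system properties, with spark.akka.frameSize at its 2047 MB maximum.
./spark/bin/spark-submit \
  --driver-memory 30g \
  --executor-memory 30g \
  --conf spark.akka.frameSize=2047 \
  --conf spark.kryoserializer.buffer.max.mb=2000 \
  mySparkCode.py
```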
>>
>> I get an error at startup:
>>
>>
>> ERROR OneForOneStrategy: Cannot instantiate transport
>> [akka.remote.transport.netty.NettyTransport]. Make sure it extends
>> [akka.remote.transport.Transport] and has constructor with
>> [akka.actor.ExtendedActorSystem] and [com.typesafe.config.Config] parameters
>> java.lang.IllegalArgumentException: Cannot instantiate transport
>> [akka.remote.transport.netty.NettyTransport]. Make sure it extends
>> [akka.remote.transport.Transport] and has constructor with
>> [akka.actor.ExtendedActorSystem] and [com.typesafe.config.Config] parameters
>>         at akka.remote.EndpointManager$$anonfun$8$$anonfun$3.applyOrElse(Remoting.scala:620)
>>         at akka.remote.EndpointManager$$anonfun$8$$anonfun$3.applyOrElse(Remoting.scala:618)
>>         at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
>>         at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
>>         at scala.util.Try$.apply(Try.scala:161)
>>         at scala.util.Failure.recover(Try.scala:185)
>>         at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)
>>         at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)
>>         at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
>>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>         at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>>         at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>>         at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
>>         at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:610)
>>         at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:450)
>>         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
>>         at akka.actor.ActorCell.invoke(ActorCell.scala:456)
>>         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
>>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>> Caused by: java.lang.IllegalArgumentException: requirement failed: Setting
>> 'maximum-frame-size' must be at least 32000 bytes
>>         at scala.Predef$.require(Predef.scala:233)
>>         at akka.util.Helpers$Requiring$.requiring$extension1(Helpers.scala:104)
>>         at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>>         at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>>         at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>>         at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
>>         at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>         at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
>>         at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
>>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
>>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>>         at py4j.Gateway.invoke(Gateway.java:214)
>>         at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
>>         at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
>>         at py4j.GatewayConnection.run(GatewayConnection.java:207)
>>         at java.lang.Thread.run(Thread.java:745)
>>
>> Can anyone give me a clue about what's going wrong here?
>>
>> I am running Spark 1.1.0 on r3.2xlarge EC2 instances that come with 60 GB
>> of RAM.
>>
>> Many thanks
>> Sahan
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Problem-with-changing-the-akka-framesize-parameter-tp21497.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
>
