Re: Error in invoking a custom StandaloneRecoveryModeFactory in java env (Spark v1.3.0)

2015-07-05 Thread Niranda Perera
Hi Josh,

I tried the Spark 1.4.0 upgrade.

Here is the class I'm trying to use:

package org.wso2.carbon.analytics.spark.core.util.master

import akka.serialization.Serialization
import org.apache.spark.SparkConf
import org.apache.spark.deploy.master._

class AnalyticsRecoveryModeFactoryScala(conf: SparkConf, serializer: Serialization)
  extends StandaloneRecoveryModeFactory(conf, serializer) {

  override def createPersistenceEngine(): PersistenceEngine =
    new AnalyticsPersistenceEngine(conf, serializer)

  override def createLeaderElectionAgent(master: LeaderElectable): LeaderElectionAgent =
    new AnalyticsLeaderElectionAgent(master)
}

object AnalyticsRecoveryModeFactoryScala {

}
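
For completeness, the master is pointed at this factory through the standalone
recovery properties, roughly along these lines (the factory property name is the
one Master.scala reads; the class name and the way the properties are passed are
just illustrative here):

  spark.deploy.recoveryMode           CUSTOM
  spark.deploy.recoveryMode.factory   org.wso2.carbon.analytics.spark.core.util.master.AnalyticsRecoveryModeFactoryScala

(e.g. passed to the master JVM as -D system properties via SPARK_MASTER_OPTS)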

When I invoke this factory from the master, I get an error similar to the one before:

[2015-07-05 17:06:55,384] ERROR {akka.actor.OneForOneStrategy} -
org.wso2.carbon.analytics.spark.core.util.master.AnalyticsRecoveryModeFactory.<init>(org.apache.spark.SparkConf, akka.serialization.Serialization)
akka.actor.ActorInitializationException: exception during creation
at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
at akka.actor.ActorCell.create(ActorCell.scala:596)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoSuchMethodException: org.wso2.carbon.analytics.spark.core.util.master.AnalyticsRecoveryModeFactory.<init>(org.apache.spark.SparkConf, akka.serialization.Serialization)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getConstructor(Class.java:1718)
at org.apache.spark.deploy.master.Master.preStart(Master.scala:168)
at akka.actor.Actor$class.aroundPreStart(Actor.scala:470)
at org.apache.spark.deploy.master.Master.aroundPreStart(Master.scala:52)
at akka.actor.ActorCell.create(ActorCell.scala:580)
... 9 more
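
For what it's worth, the constructor lookup the master does at that point can be
reproduced in isolation like this (just a sketch; the class name and parameter
types are taken from the error message above, and spark, akka, and my jar need to
be on the classpath):

  val clazz = Class.forName(
    "org.wso2.carbon.analytics.spark.core.util.master.AnalyticsRecoveryModeFactory")
  // throws the NoSuchMethodException above if the class that actually gets loaded
  // does not declare a (SparkConf, Serialization) constructor
  val ctor = clazz.getConstructor(
    classOf[org.apache.spark.SparkConf], classOf[akka.serialization.Serialization])
  println(ctor)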


What could be the reason for this?

rgds

On Thu, Jun 25, 2015 at 11:42 AM, Niranda Perera wrote:

> thanks Josh.
>
> this looks very similar to my problem.
>
> On Thu, Jun 25, 2015 at 11:32 AM, Josh Rosen  wrote:
>
>> This sounds like https://issues.apache.org/jira/browse/SPARK-7436, which
>> has been fixed in Spark 1.4+ and in branch-1.3 (for Spark 1.3.2).
>>
>> On Wed, Jun 24, 2015 at 10:57 PM, Niranda Perera <
>> niranda.per...@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I'm trying to implement a custom StandaloneRecoveryModeFactory in the
>>> Java environment. Please find the implementation here [1]. I'm new to Scala,
>>> hence I'm trying to stay in the Java environment as much as possible.
>>>
>>> when I start a master with the recovery mode set to "CUSTOM" (via the
>>> spark.deploy.recoveryMode and spark.deploy.recoveryMode.factory properties),
>>> I encounter a NoSuchMethodException for my custom class's constructor.
>>> It has the following constructor:
>>>
>>>   public AnalyticsStandaloneRecoveryModeFactory(SparkConf conf, Serialization serializer)
>>>
>>> but from the Master, it looks for a constructor of the form
>>> org.wso2.carbon.analytics.spark.core.util.master.AnalyticsStandaloneRecoveryModeFactory.<init>(org.apache.spark.SparkConf,
>>> akka.serialization.Serialization$)
>>>
>>> I see in the Spark source code for the Master that it uses reflection to
>>> load and instantiate the custom recovery mode factory class.
>>>
>>> case "CUSTOM" =>
>>>   val clazz = Class.forName(conf.get("spark.deploy.recoveryMode.factory"))
>>>   val factory = clazz.getConstructor(conf.getClass, Serialization.getClass)
>>>     .newInstance(conf, SerializationExtension(context.system))
>>>     .asInstanceOf[StandaloneRecoveryModeFactory]
>>>   (factory.createPersistenceEngine(), factory.createLeaderElectionAgent(this))
>>>
>>> here, Serialization.getClass refers to akka.serialization.Serialization$ (the
>>> class of the companion object), whereas my custom class's constructor
>>> accepts an akka.serialization.Serialization object.
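>>>
>>> to make the mismatch concrete, this is just a sketch (not Spark code) that can
>>> be run as a Scala script with akka-actor on the classpath:
>>>
>>> import akka.serialization.Serialization
>>>
>>> // the class of the companion object, i.e. what Master passes to getConstructor
>>> println(Serialization.getClass.getName)   // akka.serialization.Serialization$
>>> // the class my constructor parameter is actually declared with
>>> println(classOf[Serialization].getName)   // akka.serialization.Serialization
>>>
>>> so getConstructor(conf.getClass, Serialization.getClass) can only match a
>>> constructor whose second parameter is Serialization$, never one declared with
>>> Serialization.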
>>>
>>> so I would like to know:
>>> 1. does this happen because I'm using this from the Java environment?
>>> 2. what is the workaround for this?
>>>
>>> thanks
>>>
>>> Please find the full stack trace of the error below.
>>>
>>> [2015-06-25 10:59:01,095] ERROR {akka.actor.OneForOneStrategy} -
>>> org.wso2.carbon.analytics.spark.core.util.master.AnalyticsStandaloneRecoveryModeFactory.<init>(org.apache.spark.SparkConf, akka.serialization.Serialization$)
>>> akka.actor.ActorInitializationException: exception during creation
>>> at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
>>> at akka.actor.ActorCell.create(ActorCell.scala:59

Re: Error in invoking a custom StandaloneRecoveryModeFactory in java env (Spark v1.3.0)

2015-07-05 Thread Niranda Perera
Hi,

Sorry, this was a class loading issue on my side. I have sorted it out.

Sorry if I caused any inconvenience.

Rgds

Niranda Perera
+94 71 554 8430