In any case,

I am running the same version of Spark standalone on the cluster as the jobserver (I compiled the master branch as opposed to the jobserver branch; not sure if this matters). I then changed application.conf to set spark://master_ip:7077 as the master.
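For reference, the relevant part of my application.conf looks roughly like this (a sketch only; the exact key names depend on the jobserver branch you built, and master_ip is a placeholder):

```
# Sketch of the master setting; key names may differ slightly depending
# on the jobserver branch. master_ip is a placeholder for the real host.
spark {
  master = "spark://master_ip:7077"
}
```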

When I run the jobserver with re-start, it starts up just fine. However, when I compile the test wordcount job and curl it as per README.md in jobserver/, this is what I get in the master's logs. Any ideas?

Thanks!
Ognen

14/02/24 14:58:32 ERROR OneForOneStrategy: Error while decoding incoming Akka PDU of length: 31
akka.remote.transport.AkkaProtocolException: Error while decoding incoming Akka PDU of length: 31
Caused by: akka.remote.transport.PduCodecException: Decoding of control PDU failed, invalid format, unexpected: [ASSOCIATE]
    at akka.remote.transport.AkkaPduProtobufCodec$.decodeControlPdu(AkkaPduCodec.scala:220)
    at akka.remote.transport.AkkaPduProtobufCodec$.decodePdu(AkkaPduCodec.scala:170)
    at akka.remote.transport.ProtocolStateActor.akka$remote$transport$ProtocolStateActor$$decodePdu(AkkaProtocolTransport.scala:513)
    at akka.remote.transport.ProtocolStateActor$$anonfun$4.applyOrElse(AkkaProtocolTransport.scala:320)
    at akka.remote.transport.ProtocolStateActor$$anonfun$4.applyOrElse(AkkaProtocolTransport.scala:292)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at akka.actor.FSM$class.processEvent(FSM.scala:595)
    at akka.remote.transport.ProtocolStateActor.processEvent(AkkaProtocolTransport.scala:220)
    at akka.actor.FSM$class.akka$actor$FSM$$processMsg(FSM.scala:589)
    at akka.actor.FSM$$anonfun$receive$1.applyOrElse(FSM.scala:583)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
    at akka.actor.ActorCell.invoke(ActorCell.scala:456)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
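For what it's worth, a PduCodecException on [ASSOCIATE] is typically the symptom of the two JVMs speaking different Akka remote wire protocols, so one quick sanity check is to compare the akka-remote jar versions on the master's classpath and on the jobserver's. A rough sketch, where extract_akka_version is just a hypothetical helper and the jar names are illustrative:

```shell
#!/bin/sh
# Hypothetical helper: pull the version number off the end of a jar name,
# e.g. akka-remote_2.10-2.2.3.jar -> 2.2.3
extract_akka_version() {
  basename "$1" .jar | sed 's/.*-\([0-9][0-9.]*\)$/\1/'
}

# Illustrative jar names; in practice, ls the two classpaths and compare.
extract_akka_version "akka-remote_2.10-2.2.3.jar"   # -> 2.2.3
extract_akka_version "akka-remote_2.10-2.1.3.jar"   # -> 2.1.3
```

If the two sides disagree, the remote association fails before any Spark code runs.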

On 2/23/14, 10:46 PM, Ognen Duzlevski wrote:
Nick, thanks.

I checked out the code and, after briefly reading the docs and playing with it, I have a basic question :) - I have a standalone Spark cluster running on the same machine. I am guessing that in order for all this to work, the jobserver has to run separately and be pointed at the cluster via a setting in application.conf? If so, what version of Spark does it expect to see running so that it all "just works"? I have 0.9 running, and when pointed to it in application.conf it reported that a test job was started and running, but then reported that the cluster is dead.

Thanks!
Ognen

On 2/23/14, 2:57 PM, Nick Pentreath wrote:
Sure.

I've done it - rolled my own version before this PR came out. Evan's is definitely better than my effort :) but mine does the job so to speak.

I will be replacing my version with the "official" job server quite soon though.

I haven't upgraded mine to 0.9.0 yet, though, so I may also run into the same issues.

Feel free to ping me or Evan who wrote the job server PR with questions.
—
Sent from Mailbox <https://www.dropbox.com/mailbox> for iPhone


On Sun, Feb 23, 2014 at 10:49 PM, Ognen Duzlevski <og...@plainvanillagames.com> wrote:

    Nick,

    Thanks - I was already nudged in this direction ;) but was hoping
    to see if I can roll my own.

    Ognen

    On 2/23/14, 2:43 PM, Nick Pentreath wrote:
    You might want to try out the Spark job server PR:
    https://github.com/apache/incubator-spark/pull/222

    It could be used for your use case. You can write the relevant
    code, upload the jars, and run it by passing along the conf.

    You could then use a very lightweight Scalatra servlet on the
    frontend that forwards requests to this backend job server.


    On Sun, Feb 23, 2014 at 10:15 PM, Ognen Duzlevski
    <og...@nengoiksvelzud.com> wrote:

        On 2/23/14, 10:26 AM, Ognen Duzlevski wrote:

        Hello all,

        Perhaps too ambitiously ;) I have decided to try to roll
        my own Scalatra app that connects to a Spark cluster and
        executes a query when a certain URL is accessed - I am just
        trying to figure out how to get these things going.

        Is there anything I should pay particular attention to? I
        am running the Scalatra servlet on the same machine the
        Spark standalone cluster is running on; the Scalatra
        servlet is listening on port 8082.

        I have made some headway on this, I think. Unfortunately, I
        am too new to Scala, sbt, and the whole ecosystem, so what
        I say next may sound like rubbish ;). The problem seems
        to be that Scalatra is at version 2.2.2, which comes with
        Akka 2.1.3, while Spark 0.9 is at Akka 2.2.3. Apparently things
        have changed in Akka between the two versions. I am still
        unable to get this going. Even if I downgrade Spark to 0.8.1,
        it comes with Akka 2.0.5 and Scalatra will be ahead.

        Does anyone have an idea how to put the two together?
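One way to sidestep this particular clash, assuming the servlet does not use Scalatra's Akka integration, is to depend only on Scalatra's core module: to my understanding it is the optional scalatra-akka module that pulls in the older Akka 2.1.x line, while core Scalatra does not need Akka at all, leaving Spark's Akka 2.2.3 as the only Akka on the classpath. A build.sbt sketch, with illustrative coordinates (verify them against your setup):

```scala
// build.sbt sketch; coordinates are illustrative, verify for your build.
libraryDependencies ++= Seq(
  // Core Scalatra only; avoiding the scalatra-akka module keeps the
  // older Akka 2.1.x line off the classpath.
  "org.scalatra" %% "scalatra" % "2.2.2",
  // Spark 0.9 brings its own Akka 2.2.3, which is then the only Akka
  // the application sees.
  "org.apache.spark" %% "spark-core" % "0.9.0-incubating"
)
```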

        Thanks!
        Ognen



--
Some people, when confronted with a problem, think "I know, I'll use regular
expressions." Now they have two problems.
    -- Jamie Zawinski



