Hi Aniket,
Thanks for your reply.

I followed your advice and modified my code.
Here is the latest version:
https://github.com/TomoyaIgarashi/spark_cluster_sample/commit/ce7613c42d3adbe6ae44e264c11f3829460f3c35

As a result, it works correctly! Thank you very much.
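
In case it helps other readers: the change registers the jars on the application
classloader with the SparkContext, via the addClassPathJars workaround you
mentioned. A rough Scala sketch of such a helper (simplified; the exact code is
in the commit above) could look like this:

import java.net.URLClassLoader
import org.apache.spark.SparkContext

// Register every jar visible on the given classloader with the SparkContext
// so the standalone workers can load the Play application's classes.
def addClassPathJars(sc: SparkContext, classLoader: ClassLoader): Unit =
  classLoader match {
    case urlLoader: URLClassLoader =>
      urlLoader.getURLs
        .filter(_.getFile.endsWith(".jar"))      // ship only jar files
        .foreach(url => sc.addJar(url.getFile))  // distribute to executors
    case other =>
      // Walk up to the parent loader if this one does not expose its URLs
      Option(other.getParent).foreach(addClassPathJars(sc, _))
  }

// Called right after the SparkContext is created, as you suggested:
// addClassPathJars(sparkContext, this.getClass.getClassLoader)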

However, an "AssociationError" message appears at line 397 in the Play Framework
logs, as follows:
https://gist.github.com/TomoyaIgarashi/9688bdd5663af95ddd4d

Is this a problem?


2014-12-15 18:48 GMT+09:00 Aniket Bhatnagar <aniket.bhatna...@gmail.com>:
>
> Try the workaround (addClassPathJars(sparkContext,
> this.getClass.getClassLoader) discussed in
> http://mail-archives.apache.org/mod_mbox/spark-user/201412.mbox/%3CCAJOb8buD1B6tUtOfG8_Ok7F95C3=r-zqgffoqsqbjdxd427...@mail.gmail.com%3E
>
> Thanks,
> Aniket
>
>
> On Mon Dec 15 2014 at 07:43:24 Tomoya Igarashi <
> tomoya.igarashi.0...@gmail.com> wrote:
>
>> Hi all,
>>
>> I am trying to run a Spark job with Play Framework + Spark Master/Worker on
>> a single Mac.
>> When the job ran, I encountered a java.lang.ClassNotFoundException.
>> Could you tell me how to solve it?
>>
>> Here is my code on GitHub:
>> https://github.com/TomoyaIgarashi/spark_cluster_sample
>>
>> * Environment:
>> Mac OS X 10.9.5
>> Java 1.7.0_71
>> Play Framework 2.2.3
>> Spark 1.1.1
>>
>> * Setup history:
>> > cd ~
>> > git clone git@github.com:apache/spark.git
>> > cd spark
>> > git checkout -b v1.1.1 v1.1.1
>> > sbt/sbt assembly
>> > vi ~/.bashrc
>> export SPARK_HOME=/Users/tomoya/spark
>> > . ~/.bashrc
>> > hostname
>> Tomoya-Igarashis-MacBook-Air.local
>> > vi $SPARK_HOME/conf/slaves
>> Tomoya-Igarashis-MacBook-Air.local
>> > play new spark_cluster_sample
>> default name
>> type -> scala
>>
>> * Run history:
>> > $SPARK_HOME/sbin/start-all.sh
>> > jps
>> > which play
>> /Users/tomoya/play/play
>> > git clone https://github.com/TomoyaIgarashi/spark_cluster_sample
>> > cd spark_cluster_sample
>> > play run
>>
>> * Error trace:
>> Here is error trace in Gist.
>> https://gist.github.com/TomoyaIgarashi/4bd45ab3685a532f5511
>>
>> Regards
>>
>
