Hi,

On Fri, Oct 31, 2014 at 4:31 PM, lieyan <lie...@yahoo.com> wrote:
>
> The code is here:  LogReg.scala
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n17803/LogReg.scala
> >
>
> Then I click the "Run" button of the IDEA, and I get the following error
> message
> errlog.txt
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n17803/errlog.txt
> >
> .
> But when I export the jar file and use *spark-submit --class
> net.yanl.spark.LogReg log_reg.jar 150000*, the program works fine.
>

I have not used Spark's built-in cluster manager, so I don't know exactly
how application jar distribution works with it. However, it seems that when
you use spark-submit, it takes care of distributing your jar file to all the
cluster nodes, which is why that works fine. When you run the program from
your IDE, the jar is apparently not distributed, so some of your classes are
missing on the worker nodes and you run into ClassNotFoundExceptions. If you
change the master to "local[3]" instead of "spark://master.local:7077" and
run it from IDEA, does it work?
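A minimal sketch of the two workarounds described above, assuming the Spark 1.x SparkConf/SparkContext API and a hypothetical jar path (the object name, jar path, and sanity-check job are illustrative, not taken from the original LogReg.scala):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LogRegFromIde {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LogReg")
      // Option 1: run everything in the driver JVM with 3 threads,
      // so no jar distribution to workers is needed at all.
      .setMaster("local[3]")
      // Option 2: keep the real cluster master, but tell Spark which
      // jar to ship to the executors (path is an assumption):
      // .setMaster("spark://master.local:7077")
      // .setJars(Seq("target/log_reg.jar"))

    val sc = new SparkContext(conf)
    // Trivial sanity check that the context works.
    println(sc.parallelize(1 to 100).sum())
    sc.stop()
  }
}
```

With option 1 the IDE's "Run" button should just work; with option 2 the executors download the listed jar before running tasks, which is essentially what spark-submit arranges for you.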

Tobias
