Do you have Spark installed locally on the laptop running IntelliJ?  Are you
using the SparkLauncher class or your local spark-submit script?  A while
back, I was trying to submit a Spark job from my local workstation to a
remote cluster using the SparkLauncher class, but I didn't yet have
SPARK_HOME set or the spark-submit script on my local machine, so the
submit was failing.  In my case, though, the error explicitly said that the
SPARK_HOME environment variable was not set.
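For reference, here is a minimal sketch of what I was doing with SparkLauncher. All the paths, the master URL, and the class name are placeholders for your own setup; the point is that setSparkHome() sidesteps the missing SPARK_HOME error, and addJar() is one programmatic way to ship dependency jars to the executors (the thing spark-submit's --jars flag does for you):

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchExample {
    public static void main(String[] args) throws Exception {
        // All paths, hosts, and class names below are hypothetical placeholders.
        SparkAppHandle handle = new SparkLauncher()
            .setSparkHome("/opt/spark")               // avoids "SPARK_HOME is not set"
            .setAppResource("/path/to/my-app.jar")    // jar containing your job
            .setMainClass("com.example.MyJob")
            .setMaster("spark://master-host:7077")
            .addJar("/path/to/extra-dependency.jar")  // distributed to the executors
            .startApplication();                      // submits asynchronously

        // Poll until the application reaches a terminal state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Final state: " + handle.getState());
    }
}
```

This requires the spark-launcher artifact on your classpath and a local Spark distribution at the path you pass to setSparkHome(), since SparkLauncher shells out to spark-submit under the hood.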

On Wed, May 10, 2017 at 5:51 AM s t <serkan_...@hotmail.com> wrote:

> Hello,
>
> I am trying to run Spark code from my laptop with IntelliJ. I have a cluster
> of 2 worker nodes and a master. When I start the program from IntelliJ, it
> fails with errors about missing classes.
>
> I am aware that some jars need to be distributed to the workers, but I do
> not know if that is possible programmatically. spark-submit and Jupyter
> notebooks handle this, but IntelliJ does not.
>
> Can anyone give me some advice?
