Hello,

I am trying to run Spark code from my laptop with IntelliJ. I have a cluster 
with a master and 2 worker nodes. When I start the program from IntelliJ, it 
fails with errors about missing classes.

I am aware that some jars need to be distributed to the workers, but I do not 
know whether that can be done programmatically. spark-submit and the Jupyter 
notebook handle this for me, but IntelliJ does not.
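
For what it is worth, this is roughly what I have in mind when I say 
"programmatically"; the master URL and the jar path below are just 
placeholders for my setup, and I am not sure this is the right approach:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object IntelliJJarShipping {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("intellij-run")
      // placeholder master URL for my standalone cluster
      .setMaster("spark://master-host:7077")
      // ship my application jar (and any extra dependencies) to the
      // executors, which spark-submit would otherwise handle for me
      .setJars(Seq("target/scala-2.12/myapp-assembly-0.1.jar"))

    val spark = SparkSession.builder().config(conf).getOrCreate()
    // a trivial action just to check that the executors can load my classes
    spark.range(100).count()
    spark.stop()
  }
}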

Can anyone give me some advice?