I was trying to submit a Spark job from my local workstation to a remote
cluster using the SparkLauncher class, but I didn't actually have SPARK_HOME
set or the spark-submit script on my local machine yet, so the submit was
failing. I think the error I was getting was that the SPARK_HOME environment
variable was not set.
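SparkLauncher shells out to the spark-submit script under a local Spark distribution, so it needs either the SPARK_HOME environment variable or an explicit setSparkHome(...) call pointing at one. A minimal sketch of the environment-variable route, assuming a distribution unpacked at a hypothetical path:

```shell
# Hypothetical install path: adjust to wherever you unpacked the Spark distribution.
export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.7
export PATH="$SPARK_HOME/bin:$PATH"

# SparkLauncher invokes this script, so verify it resolves before submitting:
"$SPARK_HOME/bin/spark-submit" --version
```

Alternatively, calling setSparkHome on the SparkLauncher instance avoids depending on the environment of whatever process launches the JVM.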
Hello,
I am trying to run Spark code from my laptop with IntelliJ. I have a cluster of 2
worker nodes and a master. When I start the program from IntelliJ, it fails with
errors about missing classes.
I am aware that some jars need to be distributed to the workers, but I do not
know if that is possible.
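It is possible: when going through spark-submit, the --jars flag ships extra dependency jars to the executors alongside the application jar. A sketch, with hypothetical class name, jar paths, and master URL:

```shell
# --jars takes a comma-separated list of dependency jars to distribute
# to the workers; the application jar itself is the last argument.
# All names/paths below are placeholders for your own project.
"$SPARK_HOME/bin/spark-submit" \
  --master spark://master-host:7077 \
  --class com.example.Main \
  --jars /path/to/libA.jar,/path/to/libB.jar \
  target/myapp.jar
```

When launching directly from the IDE instead of via spark-submit, the equivalent is listing the jars on the SparkConf (setJars) or calling SparkContext.addJar, so the driver serves them to the executors.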
Hi,
I hope I am just missing a very simple point that is causing this kind of error:
http://stackoverflow.com/questions/43560807/pyspark-streaming-from-kinesis-kills-heap
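For a streaming receiver that exhausts the heap, the usual first steps are raising the driver/executor memory and throttling the receiver's ingest rate. A sketch with hypothetical sizes and script name, not a tuned configuration:

```shell
# Placeholder memory sizes and script name; tune for your workload.
"$SPARK_HOME/bin/spark-submit" \
  --driver-memory 4g \
  --executor-memory 4g \
  --conf spark.streaming.receiver.maxRate=1000 \
  --conf spark.streaming.backpressure.enabled=true \
  your_streaming_job.py
```

spark.streaming.backpressure.enabled lets Spark adapt the ingest rate dynamically, while maxRate puts a hard per-second cap on each receiver.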
Regards,
Serkan