Please follow the instructions given in the links below.
https://issues.apache.org/jira/browse/SPARK-6961
http://www.srccodes.com/p/article/39/error-util-shell-failed-locate-winutils-binary-hadoop-binary-path
Hello,
When I run a Spark job with spark-submit, it fails with the exception below on this line of code:
val webLogDF = webLogRec.toDF().select(ip, date, name)
I had a similar issue running from spark-shell, and then realized that I needed
import sqlContext.implicits._
Now my code has the following imports:
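For reference, in Spark 1.3/1.4 the `toDF()` conversion on an RDD only becomes available once the implicits of a concrete `SQLContext` instance are in scope. A minimal sketch of the pattern follows; the record type, field names, and sample data are assumptions based on the message, not the poster's actual code:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical record type; the field names are guesses from the select() call.
case class WebLog(ip: String, date: String, name: String)

object ToDFExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("toDF-example").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Without this import, webLogRec.toDF() does not compile:
    // the implicit RDD-to-DataFrame conversion lives on the SQLContext instance,
    // so it must be imported after sqlContext is created.
    import sqlContext.implicits._

    val webLogRec = sc.parallelize(
      Seq(WebLog("1.2.3.4", "2015-06-01", "index.html")))
    val webLogDF = webLogRec.toDF().select("ip", "date", "name")
    webLogDF.show()

    sc.stop()
  }
}
```

Note that `import sqlContext.implicits._` (plural, on an instance) is the correct form; `sqlContext.implicit._` will not compile.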
Hello,
I moved from 1.3.1 to 1.4.0 and started receiving
java.lang.OutOfMemoryError: PermGen space when I use spark-shell.
The same Scala code works fine in the 1.3.1 spark-shell, and I was loading the same set of
external JARs and had the same imports in 1.3.1.
I tried increasing the perm size to 256m. I still got
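One common way to raise the PermGen limit for the spark-shell driver JVM is to pass the HotSpot option at launch. This is a sketch, not a guaranteed fix for the 1.4.0 issue; the 512m value is an arbitrary assumption, chosen only to be larger than the 256m already tried:

```shell
# Raise the driver JVM's PermGen cap when starting the shell
# (-XX:MaxPermSize applies to Java 7 and earlier; it is removed in Java 8).
spark-shell --driver-java-options "-XX:MaxPermSize=512m"

# Alternatively, set it in the environment before launching spark-shell:
export SPARK_SUBMIT_OPTS="-XX:MaxPermSize=512m"
spark-shell
```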