I don't think this is the problem, but you'd also want to set
-Dhadoop.version= to match your deployment version if you're building
for a particular version, just to be safest.
I don't recall seeing that particular error before. It indicates to me
that the SparkContext is null. Is this
Sean, you are exactly right, as I learned from parsing your earlier reply
more carefully -- sorry I didn't do that the first time.
Setting hadoop.version was indeed the solution:
./make-distribution.sh --tgz -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver
-Dhadoop.version=2.5.0-cdh5.3.2
Thanks
Yes, I did, with these arguments: --tgz -Pyarn -Phadoop-2.4 -Phive
-Phive-thriftserver
To be more specific about what is not working: when I launch spark-shell
--master yarn, I get this error immediately after launch, and I can't
tell what's wrong from looking at the source.
java.lang.NullPointerException
at
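For anyone hitting the same symptom, a minimal sketch of launching the shell against YARN (the /etc/hadoop/conf path below is an assumption; point HADOOP_CONF_DIR at wherever your cluster's configuration actually lives):

```shell
# spark-shell needs the cluster's Hadoop/YARN configuration to talk to the
# ResourceManager; an unset or wrong HADOOP_CONF_DIR is a common cause of
# startup failures like a null SparkContext.
export HADOOP_CONF_DIR=/etc/hadoop/conf   # assumed path; adjust for your cluster
./bin/spark-shell --master yarn
```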
OK, did you build with YARN support (-Pyarn)? And with the right
incantation of flags, like -Phadoop-2.4
-Dhadoop.version=2.5.0-cdh5.3.2 or similar?
On Tue, Mar 17, 2015 at 2:39 PM, Eric Friedman
eric.d.fried...@gmail.com wrote:
I did not find that the generic build worked. In fact I also haven't
gotten a build from source to work either, though that one might be a case
of PEBCAK. In the former case I got errors about the build not having YARN
support.
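One way to check whether a given distribution was built with YARN support (a sketch I'm adding, not something from the thread; the lib/spark-assembly-*.jar path assumes a Spark 1.x binary distribution) is to look for the YARN deploy classes in the assembly jar:

```shell
# If the build included -Pyarn, the assembly jar contains the
# org/apache/spark/deploy/yarn classes; if not, the grep finds nothing.
jar tf lib/spark-assembly-*.jar | grep -m1 'org/apache/spark/deploy/yarn' \
  && echo "YARN support present" \
  || echo "no YARN support in this build"
```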
On Sun, Mar 15, 2015 at 3:03 AM, Sean Owen so...@cloudera.com wrote:
I think (I hope) it's because the generic builds just work. Even
though these are of course distributed mostly verbatim in CDH5, with
tweaks to be compatible with other stuff at the edges, the stock
builds should be fine too. Same for HDP as I understand.
The CDH4 build may work on some builds of
Is there a reason why the prebuilt releases don't include current CDH distros
and YARN support?
Eric Friedman