Yes, I did, with these arguments: --tgz -Pyarn -Phadoop-2.4 -Phive
-Phive-thriftserver
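
Spelled out, that would be an invocation along these lines (assuming the
arguments were passed to make-distribution.sh at the top of the source
tree, which is what --tgz suggests; the script name is my assumption):

  # sketch of the build invocation implied by the arguments above
  ./make-distribution.sh --tgz -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver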

To be more specific about what is not working: when I launch spark-shell
--master yarn, I get the error below immediately after launch. Looking at
the source hasn't given me any idea of the cause.

java.lang.NullPointerException
	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:141)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:49)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
	at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1027)
	at $iwC$$iwC.<init>(<console>:9)
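
In case it helps to reproduce, the launch looks roughly like the sketch
below. Whether the YARN-side configuration is involved is only a guess on
my part, and the config path shown is the usual CDH location, not
necessarily what's on this cluster:

  # launch sketch; --master yarn needs HADOOP_CONF_DIR (or YARN_CONF_DIR)
  # pointing at the cluster config (path shown is an assumption)
  export HADOOP_CONF_DIR=/etc/hadoop/conf
  ./bin/spark-shell --master yarn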

On Tue, Mar 17, 2015 at 7:43 AM, Sean Owen <so...@cloudera.com> wrote:

> OK, did you build with YARN support (-Pyarn)? and the right
> incantation of flags like "-Phadoop-2.4
> -Dhadoop.version=2.5.0-cdh5.3.2" or similar?
>
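> In full, that incantation would be something like this (the -DskipTests
> flag and the "clean package" goals are assumed, not part of the original
> suggestion):
>
>   mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.0-cdh5.3.2 \
>     -DskipTests clean package
>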
> On Tue, Mar 17, 2015 at 2:39 PM, Eric Friedman
> <eric.d.fried...@gmail.com> wrote:
> > I did not find that the generic build worked. In fact, I haven't gotten
> > a build from source to work either, though that might be a case of
> > PEBCAK. With the generic build, I got errors about it not having YARN
> > support.
> >
> > On Sun, Mar 15, 2015 at 3:03 AM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> I think (I hope) it's because the generic builds "just work". Even
> >> though these are of course distributed mostly verbatim in CDH5, with
> >> tweaks to be compatible with other stuff at the edges, the stock
> >> builds should be fine too. Same for HDP as I understand.
> >>
> >> The CDH4 build may work on some builds of CDH4, but I think it is
> >> lurking there as a "Hadoop 2.0.x plus a certain YARN beta" build. I'd
> >> prefer to rename it that way, myself, since it doesn't actually work
> >> with all of CDH4 anyway.
> >>
> >> Are the MapR builds there because the stock Hadoop build doesn't work
> >> on MapR? That would actually surprise me, but then, why are these two
> >> builds distributed?
> >>
> >>
> >> On Sun, Mar 15, 2015 at 6:22 AM, Eric Friedman
> >> <eric.d.fried...@gmail.com> wrote:
> >> > Is there a reason why the prebuilt releases don't include current CDH
> >> > distros and YARN support?
> >> >
> >> > ----
> >> > Eric Friedman
> >> >
> >
> >
>
