Disclaimer: this seems more of a CDH question; I'd suggest sending
these to the CDH mailing list in the future.

CDH 5.2 actually ships Spark 1.1. It comes with SparkSQL built in, but
it does not include the thrift server because of incompatibilities
with the CDH version of Hive. To use Hive support, you'll still need
to manually add the Hive jars to your application's classpath.
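
For reference, here's a rough sketch of what that can look like. The
parcel path and names below are just placeholders, so adjust them for
your install, and make sure hive-site.xml is on the classpath too so
HiveContext can find your metastore:

    // Launch with something along these lines (paths are examples only):
    //   spark-submit --class HiveExample \
    //     --driver-class-path '/opt/cloudera/parcels/CDH/lib/hive/lib/*' \
    //     --conf spark.executor.extraClassPath='/opt/cloudera/parcels/CDH/lib/hive/lib/*' \
    //     hive-example.jar
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object HiveExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("hive-example"))
        // HiveContext reads hive-site.xml from the classpath and talks to
        // the metastore configured there.
        val hive = new HiveContext(sc)
        hive.sql("SHOW TABLES").collect().foreach(println)
        sc.stop()
      }
    }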

CDH 5.3 (Spark 1.2) does include the thrift server, if you want to use
it, but it has the same limitation: you still have to mess with the
classpath.
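
If you do go the thrift server route, clients just talk to it over the
HiveServer2 protocol. A rough Scala sketch, assuming the server is
running on the default localhost:10000 and the Hive JDBC driver is on
the client's classpath:

    import java.sql.DriverManager

    object ThriftClientExample {
      def main(args: Array[String]): Unit = {
        // The Spark SQL thrift server speaks the HiveServer2 wire protocol,
        // so the regular Hive JDBC driver works against it.
        Class.forName("org.apache.hive.jdbc.HiveDriver")
        val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default")
        try {
          val rs = conn.createStatement().executeQuery("SHOW TABLES")
          while (rs.next()) println(rs.getString(1))
        } finally {
          conn.close()
        }
      }
    }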

Also, SparkSQL is currently unsupported in CDH, so YMMV.

Now, if you're trying to use an Apache release of Spark against CDH,
you may run into other issues (like different Hive versions causing
problems). So be careful when doing that.


On Thu, Jan 8, 2015 at 2:24 PM, Abhi Basu <9000r...@gmail.com> wrote:
> I am working with CDH5.2 (Spark 1.0.0) and wondering which version of Spark
> comes with SparkSQL by default. Also, will SparkSQL come enabled to access
> the Hive Metastore? Is there an easier way to enable Hive support without
> having to build the code with various switches?
>
> Thanks,
>
> Abhi
>
> --
> Abhi Basu



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
