Are you running Spark on YARN, Mesos, or Standalone? For all of them, you can
make the Hive dependency part of your application itself, and then you can
manage this pretty easily.
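
For example, here is a minimal sketch of that approach with sbt and Spark
1.6 (the version numbers, build layout, and the "HiveJob" name are my own
illustration, not something established in this thread):

  // build.sbt -- ship spark-hive inside the application jar instead of
  // relying on it being present in the Spark assembly; spark-core and
  // spark-sql stay "provided" because the cluster supplies them
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-sql"  % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-hive" % "1.6.1"  // bundled with the app
  )

  // In the job itself, create a HiveContext as usual:
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.hive.HiveContext

  object HiveJob {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("hive-example"))
      val hive = new HiveContext(sc)
      hive.sql("SHOW TABLES").show()
    }
  }

If you build a fat jar (e.g. with sbt-assembly), the Hive classes travel
with your application, so the Spark build itself doesn't need the Hive
profile.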


On Wed, Jun 15, 2016 at 2:35 AM, Rostyslav Sotnychenko <
r.sotnyche...@gmail.com> wrote:

> Hello!
>
> I have a question regarding Hive and Spark.
>
> As far as I know, in order to use Hive-on-Spark one needs to compile Spark
> without the Hive profile, but that means it won't be possible to access
> Hive from normal Spark jobs.
>
> How is the community going to address this issue? By making two different
> spark-assembly jars, or something else?
>
>
> Thanks,
> Rostyslav
>
