Thanks, Marcelo and Patrick - I don't know how I missed that ticket in my
Jira search earlier. Is anybody working on the sub-issues yet, or is there
a design doc I should look at before taking a stab?
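
To make sure I understand the shape of the multi-version problem, here is a
minimal sketch of the kind of reflection-based shim I would imagine (purely
illustrative - the object and method names are placeholders, not Spark's
actual code):

    import java.lang.reflect.Method

    // Illustrative only: resolve a Hive method reflectively, so a single
    // build can call whichever overload the Hive jars on the classpath
    // actually provide.
    object HiveShimSketch {
      def findMethod(cls: Class[_], name: String, args: Class[_]*): Method =
        cls.getMethod(name, args: _*)

      def main(args: Array[String]): Unit = {
        // Uses a JDK class so the sketch runs standalone; in Spark this
        // would target the org.apache.hadoop.hive.* classes whose
        // signatures changed between 0.13 and 1.1.0.
        val m = findMethod(classOf[String], "substring", classOf[Int])
        println(m.invoke("hive-1.1.0", Int.box(5))) // prints "1.1.0"
      }
    }

If SPARK-6906 already settles on a different approach, please ignore the
sketch - I'll defer to whatever design the sub-issues describe.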

Regards,
Punya

On Mon, Apr 27, 2015 at 3:56 PM Patrick Wendell <pwend...@gmail.com> wrote:

> Hey Punya,
>
> There is some ongoing work to help make Hive upgrades more manageable
> and allow us to support multiple versions of Hive. Once we do that, it
> will be much easier for us to upgrade.
>
> https://issues.apache.org/jira/browse/SPARK-6906
>
> - Patrick
>
> On Mon, Apr 27, 2015 at 12:47 PM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
> > That's a lot more complicated than you might think.
> >
> > We've done some basic work to get HiveContext to compile against Hive
> > 1.1.0. Here's the code:
> >
> > https://github.com/cloudera/spark/commit/00e2c7e35d4ac236bcfbcd3d2805b483060255ec
> >
> > We didn't send that upstream because it only solves half of the
> > problem; the hive-thriftserver is disabled in our CDH build because it
> > uses a lot of Hive APIs that have been removed in 1.1.0, so even
> > getting it to compile is really complicated.
> >
> > If there's interest in getting the HiveContext part fixed up, I can
> > send a PR for that code. But at this time I don't really have plans to
> > look at the thrift server.
> >
> >
> > On Mon, Apr 27, 2015 at 11:58 AM, Punyashloka Biswal
> > <punya.bis...@gmail.com> wrote:
> >> Dear Spark devs,
> >>
> >> Is there a plan for staying up-to-date with current (and future)
> >> versions of Hive? Spark currently supports version 0.13 (June 2014),
> >> but the latest version of Hive is 1.1.0 (March 2015). I don't see any
> >> Jira tickets about updating beyond 0.13, so I was wondering if this
> >> was intentional or just that nobody had started work on it yet.
> >>
> >> I'd be happy to work on a PR for the upgrade if one of the core
> >> developers can tell me what pitfalls to watch out for.
> >>
> >> Punya
> >
> >
> >
> > --
> > Marcelo
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> > For additional commands, e-mail: dev-h...@spark.apache.org
> >
>
