Dear Spark devs,

Is there a plan for staying up to date with current (and future) versions of Hive? Spark currently supports version 0.13 (June 2014), but the latest version of Hive is 1.1.0 (March 2015). I don't see any Jira tickets about updating beyond 0.13, so I was wondering whether this was intentional or just that nobody had started work on it yet.
I'd be happy to work on a PR for the upgrade if one of the core developers can tell me what pitfalls to watch out for.

Punya