I haven't received a response to the following message, which I posted last
week. Maybe my message rambled too much. Here is an attempt to pose my
question more succinctly:

Q: Does anyone know of any reason why we can't upgrade Hive's Derby version
to 10.12.1.1, the new version being vetted by the Derby community right
now?

Thanks,
-Rick

> I am following the Hive build instructions here:
> https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-InstallationandConfiguration
>
> I noticed that Hive development seems to be using an old version of Derby:
> 10.10.2.0. Is there some defect in the most recent Derby version
> (10.11.1.1) which prevents Hive from upgrading to 10.11.1.1? The only
> Hive-tagged Derby bug which I can find is
> https://issues.apache.org/jira/browse/DERBY-6358. That issue doesn't seem
> to be version-specific and it mentions a resolved Hive issue:
> https://issues.apache.org/jira/browse/HIVE-8739.
>
> Staying with 10.10.2.0 makes sense if you need to run on some ancient JVMs:
> Java SE 5 or Java ME CDC/Foundation Profile 1.1. Hadoop, however, requires
> at least Java 6 according to
> https://wiki.apache.org/hadoop/HadoopJavaVersions.
>
> Note that the Derby community expects to release version 10.12.1.1 soon:
> https://wiki.apache.org/db-derby/DerbyTenTwelveOneRelease. This might be a
> good opportunity for Hive to upgrade to a more capable version of Derby.
>
> I mention this because the Derby version used by Hive ends up on the
> classpath used by downstream projects (like Spark). That makes it awkward
> for downstream projects to use more current Derby versions. Do you know of
> any reason that downstream projects shouldn't override the Derby version
> currently preferred by Hive?
>
> Thanks,
> -Rick
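
For what it's worth, a downstream project that builds with Maven could pin its own Derby version ahead of the one Hive pulls in transitively. This is only a sketch, assuming the project uses Maven and depends on the standard org.apache.derby:derby artifact; the version number shown is illustrative:

```xml
<!-- Hypothetical pom.xml fragment: a dependencyManagement entry takes
     precedence over transitive version choices, so the newer Derby
     would win over the 10.10.2.0 that Hive brings in. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.derby</groupId>
      <artifactId>derby</artifactId>
      <version>10.11.1.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Whether that is actually safe at runtime is exactly the question above, since Hive's metastore would then run against a Derby it wasn't tested with.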
