Yeah, that's what I feared. Unfortunately, upgrading a very large production
cluster isn't a cheap way to find out what else is broken.

Perhaps I can create an RCFile table and sidestep Parquet for now.
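If that route works out, the DDL would be something along these lines (table and column names here are placeholders, not from an actual schema):

```sql
-- Hypothetical table; names and schema are illustrative only.
CREATE TABLE events_rcfile (
  event_id BIGINT,
  payload  STRING
)
STORED AS RCFILE;

-- Copy the data out of the existing Parquet-backed table:
INSERT OVERWRITE TABLE events_rcfile
SELECT event_id, payload FROM events;
```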



> On Aug 10, 2014, at 1:45 PM, Sean Owen <so...@cloudera.com> wrote:
> 
> Hm, I was thinking that the issue is that Spark has to use a forked
> hive-exec since hive-exec unfortunately includes a bunch of
> dependencies it shouldn't. It forked Hive 0.12.0:
> http://mvnrepository.com/artifact/org.spark-project.hive/hive-exec/0.12.0
> 
> ... and then I was thinking maybe CDH wasn't able to harmonize it. But
> 5.1 did harmonize this dependency, it appears.
> https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/spark/spark-hive_2.10/1.0.0-cdh5.1.0/
> 
> Looking back at 5.0.3, which shipped Spark 0.9.0 ... I don't think the Spark
> Hive module existed in 0.9.0?
> https://github.com/apache/spark/tree/branch-0.9/sql/hive So yeah using
> it with newer Spark 1.0 code probably means you'd have to build Spark
> to pull in the CDH version of hive-exec (or else keep it out of the
> custom assembly and make sure only the CDH version is on the classpath
> at runtime) and hope you don't actually have to shade it like Spark
> does to get it to work.
> 
> I think support will suggest this 1.0 Hive code is supported and works
> in the version that ships 1.0!
> 
> (Sorry for extending a thread apparently about CDH but I think the
> issue is actually more broadly applicable.)
> 
> 
> On Sun, Aug 10, 2014 at 9:20 PM, Eric Friedman
> <eric.d.fried...@gmail.com> wrote:
>> Hi Sean,
>> 
>> Thanks for the reply.  I'm on CDH 5.0.3 and upgrading the whole cluster to
>> 5.1.0 will eventually happen but not immediately.
>> 
>> I've tried running the CDH spark-1.0 release and also building it from
>> source.  This, unfortunately, goes into a whole other rathole of
>> dependencies.  :-(
>> 
>> Eric
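
For what it's worth, the build Sean describes above — compiling Spark against the CDH Hadoop artifacts so the cluster's hive-exec ends up on the classpath — would look roughly like this (the version string is illustrative and not verified against 5.0.3):

```shell
# Build Spark 1.0 against a CDH Hadoop release (version illustrative).
# -Phive pulls in the Spark Hive module; hadoop.version should match the cluster.
mvn -Pyarn -Phive \
    -Dhadoop.version=2.3.0-cdh5.0.3 \
    -DskipTests clean package
```

Whether the resulting assembly avoids the shading problems Sean mentions would still need to be tested.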
