can someone provide the correct settings for spark 1.6.1 to work with cdh 5
(hive 1.1.0)?

in particular the settings for:
spark.sql.hive.version
spark.sql.hive.metastore.jars

also it would be helpful to know if your spark jar includes hadoop
dependencies or not.
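(a quick way to answer the hadoop-bundled question for any build is to list the
assembly jar and look for hadoop classes. sketch below — the demo jar it builds
is just a stand-in so the commands are runnable end-to-end; in practice you
would point JAR at your real spark-assembly-*.jar.)

```shell
# stand-in "assembly" jar so this sketch runs anywhere; replace with the
# path to your actual spark-assembly jar
JAR=demo-assembly.jar
mkdir -p demo/org/apache/hadoop/fs
: > demo/org/apache/hadoop/fs/FileSystem.class
(cd demo && python3 -m zipfile -c "../$JAR" org)

# the actual check: a hadoop-included build lists hadoop classes,
# a hadoop-provided build does not
if python3 -m zipfile -l "$JAR" | grep -q 'org/apache/hadoop/fs/FileSystem.class'; then
  echo "hadoop bundled"
else
  echo "hadoop-provided build"
fi
```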

i realize it works (or at least seems to work) if you simply set
spark.sql.hive.version to 1.2.1 and spark.sql.hive.metastore.jars to
builtin, but i find it somewhat unsatisfactory to rely on that happy
coincidence.
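(for reference, the combination i would expect to need is roughly the below —
untested on my side, paths are examples, and note the documented knob in the
spark 1.6 docs is spark.sql.hive.metastore.version rather than
spark.sql.hive.version:)

```
# spark-defaults.conf sketch: point spark sql at the cluster's hive 1.1.0
# client jars instead of the builtin 1.2.1 ones
spark.sql.hive.metastore.version  1.1.0

# spark.sql.hive.metastore.jars takes "builtin", "maven", or a classpath;
# either let spark download matching jars:
spark.sql.hive.metastore.jars     maven
# ... or point at the CDH-installed hive client classpath (example path):
# spark.sql.hive.metastore.jars   /opt/cloudera/parcels/CDH/lib/hive/lib/*:/opt/cloudera/parcels/CDH/lib/hadoop/client/*
```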


On Sat, Mar 12, 2016 at 7:09 PM, Timur Shenkao <t...@timshenkao.su> wrote:

> I had a similar issue with CDH 5.5.3, not only with Spark 1.6 but with
> beeline as well. I resolved it by installing & running a hiveserver2 role
> instance on the same server where the metastore is.
>
> On Tue, Feb 9, 2016 at 10:58 PM, Koert Kuipers <ko...@tresata.com> wrote:
>
>> has anyone successfully connected to hive metastore using spark 1.6.0? i
>> am having no luck. worked fine with spark 1.5.1 for me. i am on cdh 5.5 and
>> launching spark with yarn.
>>
>> this is what i see in logs:
>> 16/02/09 14:49:12 INFO hive.metastore: Trying to connect to metastore
>> with URI thrift://metastore.mycompany.com:9083
>> 16/02/09 14:49:12 INFO hive.metastore: Connected to metastore.
>>
>> and then a little later:
>>
>> 16/02/09 14:49:34 INFO hive.HiveContext: Initializing execution hive,
>> version 1.2.1
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Inspected Hadoop version:
>> 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 INFO client.ClientWrapper: Loaded
>> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.4.4
>> 16/02/09 14:49:34 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>> 16/02/09 14:49:35 INFO metastore.HiveMetaStore: 0: Opening raw store with
>> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
>> 16/02/09 14:49:35 INFO metastore.ObjectStore: ObjectStore, initialize
>> called
>> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
>> hive.metastore.integral.jdo.pushdown unknown - will be ignored
>> 16/02/09 14:49:35 INFO DataNucleus.Persistence: Property
>> datanucleus.cache.level2 unknown - will be ignored
>> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
>> present in CLASSPATH (or one of dependencies)
>> 16/02/09 14:49:35 WARN DataNucleus.Connection: BoneCP specified but not
>> present in CLASSPATH (or one of dependencies)
>> 16/02/09 14:49:37 WARN conf.HiveConf: HiveConf of name
>> hive.server2.enable.impersonation does not exist
>> 16/02/09 14:49:37 INFO metastore.ObjectStore: Setting MetaStore object
>> pin classes with
>> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
>> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:38 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO DataNucleus.Datastore: The class
>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>> "embedded-only" so does not have its own datastore table.
>> 16/02/09 14:49:40 INFO metastore.MetaStoreDirectSql: Using direct SQL,
>> underlying DB is DERBY
>> 16/02/09 14:49:40 INFO metastore.ObjectStore: Initialized ObjectStore
>> java.lang.RuntimeException: java.lang.RuntimeException: Unable to
>> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>   at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>>   at
>> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
>>   at
>> org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
>>   at
>> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
>>   at
>> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
>>   at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:440)
>>   at
>> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:272)
>>   at
>> org.apache.spark.sql.SQLContext$$anonfun$4.apply(SQLContext.scala:271)
>>   at scala.collection.Iterator$class.foreach(Iterator.scala:742)
>>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
>>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>>   at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:271)
>>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:97)
>>   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>   at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>   at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>   at org.apache.spark.repl.Main$.createSQLContext(Main.scala:89)
>>   ... 47 elided
>> Caused by: java.lang.RuntimeException: Unable to instantiate
>> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>>   at
>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
>>   at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
>>   at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
>>   at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
>>   at
>> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
>>   at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
>>   at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
>>   ... 66 more
>> Caused by: java.lang.reflect.InvocationTargetException:
>> org.apache.hadoop.hive.metastore.api.MetaException: Version information not
>> found in metastore.
>>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>   at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>   at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>   at
>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
>>   ... 72 more
>> Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Version
>> information not found in metastore.
>>   at
>> org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6664)
>>   at
>> org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:6645)
>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>   at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>   at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>   at java.lang.reflect.Method.invoke(Method.java:606)
>>   at
>> org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
>>   at com.sun.proxy.$Proxy26.verifySchema(Unknown Source)
>>   at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
>>   at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
>>   at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
>>   at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
>>   at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
>>   at
>> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
>>   at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
>>   at
>> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
>>   ... 77 more
>>
>> i seem to get this exact same error no matter what i set
>> spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars to. i
>> tried building different spark 1.6.0 jars (with hadoop provided, with
>> hadoop included), but to no effect.
>>
>> any ideas?
>>
>>
>
