I see

15/12/16 00:06:13 INFO metastore: Trying to connect to metastore with URI
thrift://remoteNode:9083
15/12/16 00:06:14 INFO metastore: Connected to metastore.

Looks like you were connected to your remote metastore.
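For anyone hitting this thread later: a quick sanity check from spark-shell that the HiveContext really reaches the remote metastore (database/table names below are placeholders, not from the original logs) might look like:

```scala
// In Spark 1.6's spark-shell, sqlContext is a HiveContext when Spark was
// built with Hive support (-Phive). Listing databases should return what
// the remote metastore holds; a freshly created local Derby metastore
// would typically show only "default".
sqlContext.sql("SHOW DATABASES").show()

// Querying a table known to exist only in the remote metastore is a
// stronger check ("some_db.some_table" is a placeholder):
sqlContext.sql("SELECT COUNT(*) FROM some_db.some_table").show()
```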

On Tue, Dec 15, 2015 at 3:31 PM, syepes <sye...@gmail.com> wrote:

> Hello,
>
> I am testing out the 1.6 branch (#08aa3b4) and I have just noticed that
> spark-shell "HiveContext" is no longer able to connect to my remote
> metastore.
> Using the same build options and configuration files with 1.5 (#0fdf554) it
> works.
>
> Does anyone know if there have been any major changes to this component or
> any new config that's needed to make this work?
>
> spark-shell:
> ------------------------------
> ...
> 15/12/16 00:06:06 INFO Persistence: Property
> hive.metastore.integral.jdo.pushdown unknown - will be ignored
> 15/12/16 00:06:06 INFO Persistence: Property datanucleus.cache.level2
> unknown - will be ignored
> 15/12/16 00:06:06 WARN Connection: BoneCP specified but not present in
> CLASSPATH (or one of dependencies)
> 15/12/16 00:06:06 WARN Connection: BoneCP specified but not present in
> CLASSPATH (or one of dependencies)
> 15/12/16 00:06:08 INFO ObjectStore: Setting MetaStore object pin classes
> with
>
> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
> 15/12/16 00:06:09 INFO Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 15/12/16 00:06:09 INFO Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> "embedded-only" so does not have its own datastore table.
> 15/12/16 00:06:11 INFO Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> "embedded-only" so does not have its own datastore table.
> 15/12/16 00:06:11 INFO Datastore: The class
> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> "embedded-only" so does not have its own datastore table.
> 15/12/16 00:06:11 INFO MetaStoreDirectSql: Using direct SQL, underlying DB
> is DERBY
> 15/12/16 00:06:11 INFO ObjectStore: Initialized ObjectStore
> 15/12/16 00:06:11 WARN ObjectStore: Version information not found in
> metastore. hive.metastore.schema.verification is not enabled so recording
> the schema version 1.2.0
> 15/12/16 00:06:11 WARN ObjectStore: Failed to get database default,
> returning NoSuchObjectException
> 15/12/16 00:06:11 INFO HiveMetaStore: Added admin role in metastore
> ..
> ..
> 15/12/16 00:06:12 INFO HiveContext: Initializing HiveMetastoreConnection
> version 1.2.1 using Spark classes.
> 15/12/16 00:06:12 INFO ClientWrapper: Inspected Hadoop version: 2.7.1
> 15/12/16 00:06:12 INFO ClientWrapper: Loaded
> org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1
> 15/12/16 00:06:13 INFO metastore: Trying to connect to metastore with URI
> thrift://remoteNode:9083
> 15/12/16 00:06:14 INFO metastore: Connected to metastore.
> 15/12/16 00:06:14 INFO SessionState: Created local directory:
> /tmp/c3a4afbb-e4cf-4d20-85a0-01a53074efc8_resources
> 15/12/16 00:06:14 INFO SessionState: Created HDFS directory:
> /tmp/hive/syepes/c3a4afbb-e4cf-4d20-85a0-01a53074efc8
> 15/12/16 00:06:14 INFO SessionState: Created local directory:
> /tmp/root/c3a4afbb-e4cf-4d20-85a0-01a53074efc8
> 15/12/16 00:06:14 INFO SessionState: Created HDFS directory:
> /tmp/hive/syepes/c3a4afbb-e4cf-4d20-85a0-01a53074efc8/_tmp_space.db
> 15/12/16 00:06:14 INFO SparkILoop: Created sql context (with Hive
> support)..
> SQL context available as sqlContext.
> ..
> ------------------------------
>
> hive-site.xml
> ---
> <configuration>
>   <property>
>     <name>hive.metastore.uris</name>
>     <value>thrift://remoteNode:9083</value>
>   </property>
> </configuration>
> ---
>
>
> Regards and thanks in advance for any info,
> Sebastian
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-1-6-H-ive-remote-metastore-not-working-tp15634.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
>
