Hi,

Please find attached my Spark configuration files.

Regards,
Sandeep.v

On Mon, Mar 16, 2015 at 12:58 PM, sandeep vura <sandeepv...@gmail.com>
wrote:

> In which location exactly should I specify the classpath?
>
> Thanks,
>
>
> On Mon, Mar 16, 2015 at 12:52 PM, Cheng, Hao <hao.ch...@intel.com> wrote:
>
>> It doesn't take effect if you just put the jar files under the
>> lib-managed/jars folder; you need to put them on the classpath explicitly.
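>>
>> For example (jar path and version purely illustrative), one way to do
>> that is via the extraClassPath settings in conf/spark-defaults.conf:
>>
>>   spark.driver.extraClassPath   /path/to/mysql-connector-java-5.1.34.jar
>>   spark.executor.extraClassPath /path/to/mysql-connector-java-5.1.34.jar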
>>
>>
>>
>> *From:* sandeep vura [mailto:sandeepv...@gmail.com]
>> *Sent:* Monday, March 16, 2015 2:21 PM
>> *To:* Cheng, Hao
>> *Cc:* fightf...@163.com; Ted Yu; user
>>
>> *Subject:* Re: Re: Unable to instantiate
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>
>>
>>
>> I have already added the mysql-connector-xx.jar file in the
>> spark/lib-managed/jars directory.
>>
>>
>>
>> Regards,
>> Sandeep.v
>>
>>
>>
>> On Mon, Mar 16, 2015 at 11:48 AM, Cheng, Hao <hao.ch...@intel.com> wrote:
>>
>> Or you need to specify the jars either in the configuration or on the command line:
>>
>>
>>
>> bin/spark-sql --jars  mysql-connector-xx.jar
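>>
>> If --jars alone doesn't make the driver visible to the metastore
>> connection, a common fallback (jar path illustrative) is to put it on the
>> driver's classpath at launch:
>>
>> bin/spark-sql --driver-class-path /path/to/mysql-connector-java-5.1.34.jar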
>>
>>
>>
>> *From:* fightf...@163.com [mailto:fightf...@163.com]
>> *Sent:* Monday, March 16, 2015 2:04 PM
>> *To:* sandeep vura; Ted Yu
>> *Cc:* user
>> *Subject:* Re: Re: Unable to instantiate
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>
>>
>>
>> Hi, Sandeep
>>
>>
>>
>> From your error log I can see that the JDBC driver was not found on your
>> classpath. Do you have your MySQL JDBC jar correctly configured on the
>> specific classpath? Can you establish a Hive JDBC connection using the
>> URL jdbc:hive2://localhost:10000 ?
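>>
>> One quick way to test that (assuming HiveServer2 is running and the
>> beeline client that ships with Hive is on your PATH) would be:
>>
>> beeline -u jdbc:hive2://localhost:10000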
>>
>>
>>
>> Thanks,
>>
>> Sun.
>>
>>
>>   ------------------------------
>>
>> fightf...@163.com
>>
>>
>>
>> *From:* sandeep vura <sandeepv...@gmail.com>
>>
>> *Date:* 2015-03-16 14:13
>>
>> *To:* Ted Yu <yuzhih...@gmail.com>
>>
>> *CC:* user@spark.apache.org
>>
>> *Subject:* Re: Unable to instantiate
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>
>> Hi Ted,
>>
>>
>>
>> Did you find any solution?
>>
>>
>>
>> Thanks
>>
>> Sandeep
>>
>>
>>
>> On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura <sandeepv...@gmail.com>
>> wrote:
>>
>>    Hi Ted,
>>
>>
>>
>> I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration
>> files attached below.
>>
>>
>>
>> ------------------------------------
>>
>> ERROR IN SPARK
>> ------------------------------------
>>
>> ...Exception: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
>>         at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:101)
>>         at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:622)
>>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
>>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
>>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
>>         at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
>>         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
>>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
>>         ... 9 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
>>         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
>>         ... 14 more
>> Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
>> NestedThrowables:
>> java.lang.reflect.InvocationTargetException
>>         at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
>>         at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
>>         at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>>         at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:622)
>>         at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>>         at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>>         at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>>         at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>>         at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
>>         at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
>>         at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
>>         at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
>>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>         at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
>>         at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
>>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
>>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
>>         ... 19 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
>>         at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>         at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
>>         at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
>>         at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
>>         at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
>>         at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>         at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>>         at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>>         at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>>         at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>>         ... 48 more
>> Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
>>         at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)
>>         at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
>>         at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
>>         ... 66 more
>> Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
>>         at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
>>         at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
>>         at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
>>         ... 68 more
>>
>>
>>
>> Thanks
>>
>> Sandeep.v
>>
>>
>>
>> On Mon, Mar 16, 2015 at 10:32 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> Can you provide more information?
>> Such as:
>> - the version of Spark you're using
>> - the command line
>>
>> Thanks
>>
>>
>>
>>
>> > On Mar 15, 2015, at 9:51 PM, sandeep vura <sandeepv...@gmail.com>
>> wrote:
>> >
>> > Hi Sparkers,
>> >
>> >
>> >
>> > I couldn't run spark-sql on Spark. Please find the following error:
>> >
>> > Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>> >
>> >
>> > Regards,
>> > Sandeep.v
>>
>>
>>
>>
>>
>>
>>
>
>

Attachment: spark-env.sh
Description: Bourne shell script

Attachment: spark-defaults.conf
Description: Binary data

<configuration>
<property>
     <name>javax.jdo.option.ConnectionURL</name>
     <value>jdbc:mysql://localhost:3306/metastore</value>
     <description>metadata is stored in a MySQL server</description>
</property>
<property>
     <name>javax.jdo.option.ConnectionDriverName</name>
     <value>com.mysql.jdbc.Driver</value>
     <description>MySQL JDBC driver class</description>
</property>
<property>
     <name>javax.jdo.option.ConnectionUserName</name>
     <value>hiveuser</value>
     <description>user name for connecting to mysql server </description>
</property>
<property>
     <name>javax.jdo.option.ConnectionPassword</name>
     <value>hivepassword</value>
     <description>password for connecting to mysql server </description>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
<property>
    <name>hive.metastore.local</name>
    <value>true</value>
     <description>controls whether to connect to a remote metastore server or open a new metastore server in the Hive Client JVM</description>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<!--
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://localhost:9083</value>
  <description>IP address (or fully-qualified domain name) and port of the metastore host</description>
</property>
-->
</configuration>
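
A minimal, standalone connectivity check for the metastore database
described in the hive-site.xml above (a sketch only: it assumes the
jdbc:mysql://localhost:3306/metastore URL and the hiveuser/hivepassword
credentials from that file, and must be run with the MySQL connector jar
on the classpath):

// MetastoreCheck.scala -- verifies the driver class loads and the DB answers.
import java.sql.DriverManager

object MetastoreCheck {
  def main(args: Array[String]): Unit = {
    // Throws ClassNotFoundException if the connector jar is missing from
    // the classpath -- the same root cause as the DataNucleus error above.
    Class.forName("com.mysql.jdbc.Driver")
    val conn = DriverManager.getConnection(
      "jdbc:mysql://localhost:3306/metastore", "hiveuser", "hivepassword")
    try println("Metastore DB reachable: " + !conn.isClosed)
    finally conn.close()
  }
}

If this fails with ClassNotFoundException, the jar is not on the classpath;
if it fails with an access-denied error, the MySQL grants for hiveuser need
fixing instead.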