Unsubscribe

2023-05-01 Thread sandeep vura
-- 
Sandeep V


Re: adding jars - hive on spark cdh 5.4.3

2016-01-10 Thread sandeep vura
Upgrade to CDH 5.5 for Spark; it should work there.

On Sat, Jan 9, 2016 at 12:17 AM, Ophir Etzion  wrote:

> It didn't work, assuming I did the right thing.
> In the properties you can see
>
> {"key":"hive.aux.jars.path","value":"file:///data/loko/foursquare.web-hiverc/current/hadoop-hive-serde.jar,file:///data/loko/foursquare.web-hiverc/current/hadoop-hive-udf.jar","isFinal":false,"resource":"programatically"}
> which includes the jar that has the class I need but I still get
>
> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.foursquare.hadoop.hive.io.HiveThriftSequenceFileInputFormat
>
>
>
> On Fri, Jan 8, 2016 at 12:24 PM, Edward Capriolo 
> wrote:
>
>> You cannot 'ADD JAR' input formats and SerDes. They need to be part of
>> your auxlib.
>>
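A minimal sketch of the auxlib route described above, assuming a CDH-style Hive layout; only the jar names come from this thread, the directory and restart step are illustrative:

    # copy the SerDe / InputFormat jars into Hive's aux directory on every node
    cp hadoop-hive-serde.jar hadoop-hive-udf.jar /usr/lib/hive/auxlib/
    # then restart the Hive services so the jars are on the classpath at startup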
>> On Fri, Jan 8, 2016 at 12:19 PM, Ophir Etzion 
>> wrote:
>>
>>> I tried now. still getting
>>>
>>> 16/01/08 16:37:34 ERROR exec.Utilities: Failed to load plan: hdfs://hadoop-alidoro-nn-vip/tmp/hive/hive/c2af9882-38a9-42b0-8d17-3f56708383e8/hive_2016-01-08_16-36-41_370_3307331506800215903-3/-mr-10004/3c90a796-47fc-4541-bbec-b196c40aefab/map.xml: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.foursquare.hadoop.hive.io.HiveThriftSequenceFileInputFormat
>>> Serialization trace:
>>> inputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
>>> aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
>>> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.foursquare.hadoop.hive.io.HiveThriftSequenceFileInputFormat
>>>
>>>
>>> HiveThriftSequenceFileInputFormat is in one of the jars I'm trying to add.
>>>
>>>
>>> On Thu, Jan 7, 2016 at 9:58 PM, Prem Sure  wrote:
>>>
 Did you try the --jars property in spark-submit? If your jar is of huge
 size, you can pre-load the jar on all executors in a commonly available
 directory to avoid network IO.
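A minimal sketch of the --jars route; the main class and paths are placeholders, not taken from this thread:

    spark-submit --class com.example.Main \
      --jars /path/to/hadoop-hive-serde.jar,/path/to/hadoop-hive-udf.jar \
      /path/to/app.jar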

 On Thu, Jan 7, 2016 at 4:03 PM, Ophir Etzion 
 wrote:

> I'm trying to add jars before running a query using Hive on Spark on
> CDH 5.4.3.
> I've tried applying the patch in
> https://issues.apache.org/jira/browse/HIVE-12045 (manually, as the
> patch is against a different Hive version) but still haven't succeeded.
>
> Did anyone manage to do ADD JAR successfully with CDH?
>
> Thanks,
> Ophir
>


>>>
>>
>


Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
Thanks a lot, Akhil.

On Mon, Jul 6, 2015 at 12:57 PM, sandeep vura sandeepv...@gmail.com wrote:

 It Works !!!

 On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 OK, let me try.


 On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 It's complaining about a missing JDBC driver. Add it to your driver classpath, like:

 ./bin/spark-sql --driver-class-path
 /home/akhld/sigmoid/spark/lib/mysql-connector-java-5.1.32-bin.jar


 Thanks
 Best Regards


Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
It Works !!!

On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura sandeepv...@gmail.com wrote:

 OK, let me try.


 On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 It's complaining about a missing JDBC driver. Add it to your driver classpath, like:

 ./bin/spark-sql --driver-class-path
 /home/akhld/sigmoid/spark/lib/mysql-connector-java-5.1.32-bin.jar


 Thanks
 Best Regards


Unable to start spark-sql

2015-07-06 Thread sandeep vura
Hi Sparkers,

I am unable to start the spark-sql service; please check the error
below.

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:101)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
        ... 9 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
        ... 14 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:397)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
        at

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
OK, let me try.


On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 It's complaining about a missing JDBC driver. Add it to your driver classpath, like:

 ./bin/spark-sql --driver-class-path
 /home/akhld/sigmoid/spark/lib/mysql-connector-java-5.1.32-bin.jar


 Thanks
 Best Regards


How to run spark programs in eclipse like mapreduce

2015-04-20 Thread sandeep vura
Hi Sparkers,

I have written code in Python in Eclipse; now that code should execute on a
Spark cluster the way MapReduce jobs run on a Hadoop cluster. Can anyone
please help me with instructions?

Regards,
Sandeep.v
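A minimal sketch of submitting such a Python script to a cluster with spark-submit; the master URL and script path are illustrative assumptions, not taken from this thread:

    ./bin/spark-submit --master spark://master-host:7077 /path/to/my_script.py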


Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
Hi Sparkers,

I am trying to load data in spark with the following command

sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

Getting the exception below:

Server IPC version 9 cannot communicate with client version 4

Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.


Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
Where do I export MAVEN_OPTS: in spark-env.sh or hadoop-env.sh?

I am running the command below in the spark/yarn directory, where the
pom.xml file is available:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

Please correct me if I am wrong.
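For reference, VERSION above is a placeholder that has to be replaced with a real Hadoop version; a sketch assuming the Hadoop 2.4.1 reported later in this thread, run from the top-level Spark directory:

    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package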




On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
wrote:

 Looks like you have to build Spark against the matching Hadoop version;
 otherwise you will hit the exception you mentioned. You could follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html



Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
The build failed with the following errors.

I executed the following command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean
package


[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:11:59.461s
[INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
[INFO] Final Memory: 30M/440M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve
dependencies for project org.apache.spark:spark-core_2.10:jar:1.2.1: Could not
find artifact org.apache.hadoop:hadoop-client:jar:VERSION in central
(https://repo1.maven.org/maven2) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the
command
[ERROR]   mvn <goals> -rf :spark-core_2.10


On Wed, Mar 25, 2015 at 3:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 Just run:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package

 Thanks
 Best Regards



Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
I am using Hadoop 2.4; should I mention -Dhadoop.version=2.2?

$ hadoop version
Hadoop 2.4.1
Subversion http://svn.apache.org/repos/asf/hadoop/common -r 1604318
Compiled by jenkins on 2014-06-21T05:43Z
Compiled with protoc 2.5.0
From source with checksum bb7ac0a3c73dc131f4844b873c74b630
This command was run using
/home/hadoop24/hadoop-2.4.1/share/hadoop/common/hadoop-common-2.4.1.jar




On Wed, Mar 25, 2015 at 5:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 -Dhadoop.version=2.2


 Thanks
 Best Regards



Re: Errors in SPARK

2015-03-24 Thread sandeep vura
Hi Denny,

Still facing the same issue. Please find the following errors.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext =
org.apache.spark.sql.hive.HiveContext@4e4f880c

scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

Cheers,
Sandeep.v

On Wed, Mar 25, 2015 at 11:10 AM, sandeep vura sandeepv...@gmail.com
wrote:

 No, I am just running the ./spark-shell command in a terminal. I will try
 the above command.

 On Wed, Mar 25, 2015 at 11:09 AM, Denny Lee denny.g@gmail.com wrote:

 Did you include the MySQL connector jar so that spark-shell / Hive can
 connect to the metastore?

 For example, when I run my spark-shell instance in standalone mode, I use:
 ./spark-shell --master spark://servername:7077 --driver-class-path
 /lib/mysql-connector-java-5.1.27.jar



 On Fri, Mar 13, 2015 at 8:31 AM sandeep vura sandeepv...@gmail.com
 wrote:

 Hi Sparkers,

 Can anyone please check the error below and give a solution? I am
 using Hive 0.13 and Spark 1.2.1.

 Step 1: I have installed Hive 0.13 with a local metastore (MySQL database)
 Step 2: Hive is running without any errors and I am able to create tables
 and load data into Hive tables
 Step 3: copied hive-site.xml into the spark/conf directory
 Step 4: copied core-site.xml into the spark/conf directory
 Step 5: started the Spark shell (see the consolidated sketch below)
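A consolidated sketch of those steps with illustrative paths, adding the MySQL driver flag suggested earlier in this thread; $HIVE_HOME, $HADOOP_HOME and the connector path are assumptions:

    cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/
    cp $HADOOP_HOME/etc/hadoop/core-site.xml $SPARK_HOME/conf/
    $SPARK_HOME/bin/spark-shell --driver-class-path /path/to/mysql-connector-java-5.1.27.jar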

 Please check the error below for clarification.

 scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
 sqlContext: org.apache.spark.sql.hive.HiveContext =
 org.apache.spark.sql.hive.HiveContext@2821ec0c

 scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
 java.lang.RuntimeException: java.lang.RuntimeException: Unable to
 instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
         at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
         at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
         at scala.Option.orElse(Option.scala:257)
         at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
         at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
         at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
         at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
         at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:55)

 Regards,
 Sandeep.v





Re: Errors in SPARK

2015-03-24 Thread sandeep vura
No, I am just running the ./spark-shell command in a terminal. I will try
the above command.


Re: About the env of Spark1.2

2015-03-21 Thread sandeep vura
If you are using 127.0.0.1, check /etc/hosts and either fix or create a
127.0.1.1 entry that names the machine's hostname (like the localhost
entry), as in the sketch below.
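For example, a sketch of /etc/hosts entries for this host; the hostname is taken from the trace below, the exact layout is illustrative:

    127.0.0.1   localhost
    127.0.1.1   dhcp-10-35-14-100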

On Sat, Mar 21, 2015 at 9:57 AM, Ted Yu yuzhih...@gmail.com wrote:

 Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or
 service not known

 Can you check your DNS ?

 Cheers

 On Fri, Mar 20, 2015 at 8:54 PM, tangzilu zilu.t...@hotmail.com wrote:

 Hi All:
 I recently started to deploy Spark 1.2 in my VirtualBox Linux.
 But when I run the command ./spark-shell from /opt/spark-1.2.1/bin, I get
 this result:

 [root@dhcp-10-35-14-100 bin]# ./spark-shell
 Using Spark's default log4j profile:
 org/apache/spark/log4j-defaults.properties
 15/03/20 13:56:06 INFO SecurityManager: Changing view acls to: root
 15/03/20 13:56:06 INFO SecurityManager: Changing modify acls to: root
 15/03/20 13:56:06 INFO SecurityManager: SecurityManager: authentication
 disabled; ui acls disabled; users with view permissions: Set(root); users
 with modify permissions: Set(root)
 15/03/20 13:56:06 INFO HttpServer: Starting HTTP Server
 15/03/20 13:56:06 INFO Utils: Successfully started service 'HTTP class
 server' on port 47691.
 Welcome to
       ____              __
      / __/__  ___ _____/ /__
     _\ \/ _ \/ _ `/ __/  '_/
    /___/ .__/\_,_/_/ /_/\_\   version 1.2.1
       /_/

 Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
 Type in expressions to have them evaluated.
 Type :help for more information.
 java.net.UnknownHostException: dhcp-10-35-14-100: dhcp-10-35-14-100: Name
 or service not known
         at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
         at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:710)
         at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:702)
         at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:702)
         at org.apache.spark.HttpServer.uri(HttpServer.scala:158)
         at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:982)
         at $iwC$$iwC.<init>(<console>:9)
         at $iwC.<init>(<console>:18)
         at <init>(<console>:20)
         at .<init>(<console>:24)
         at .<clinit>(<console>)
         at .<init>(<console>:7)
         at .<clinit>(<console>)
         at $print(<console>)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:60)
         at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:945)
         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:60)
         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
         at org.apache.spark.repl.Main$.main(Main.scala:31)
         at org.apache.spark.repl.Main.main(Main.scala)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at
 

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
Which location exactly should I specify on the classpath?

Thanks,


On Mon, Mar 16, 2015 at 12:52 PM, Cheng, Hao hao.ch...@intel.com wrote:

  It doesn't take effect if you just put jar files under the
 lib-managed/jars folder; you need to put them on the classpath explicitly.



 From: sandeep vura [mailto:sandeepv...@gmail.com]
 Sent: Monday, March 16, 2015 2:21 PM
 To: Cheng, Hao
 Cc: fightf...@163.com; Ted Yu; user

 Subject: Re: Re: Unable to instantiate
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient



 I have already added the mysql-connector-xx.jar file in the
 spark/lib-managed/jars directory.



 Regards,
 Sandeep.v



 On Mon, Mar 16, 2015 at 11:48 AM, Cheng, Hao hao.ch...@intel.com wrote:

  Or you need to specify the jars either in configuration or



 bin/spark-sql --jars mysql-connector-xx.jar



 From: fightf...@163.com [mailto:fightf...@163.com]
 Sent: Monday, March 16, 2015 2:04 PM
 To: sandeep vura; Ted Yu
 Cc: user
 Subject: Re: Re: Unable to instantiate
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient



 Hi, Sandeep



 From your error log I can see that the JDBC driver is not found on your
 classpath. Did you configure your MySQL JDBC jar correctly on that
 classpath? Can you establish a Hive JDBC connection using the URL
 jdbc:hive2://localhost:1 ?



 Thanks,

 Sun.
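A sketch of that connectivity check with Beeline; the port was truncated in the archive, so the HiveServer2 default 10000 here is an assumption:

    beeline -u jdbc:hive2://localhost:10000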


 --
 fightf...@163.com

 From: sandeep vura sandeepv...@gmail.com
 Date: 2015-03-16 14:13
 To: Ted Yu yuzhih...@gmail.com
 CC: user@spark.apache.org
 Subject: Re: Unable to instantiate
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient

 Hi Ted,

 Did you find any solution?

 Thanks,
 Sandeep



 On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura sandeepv...@gmail.com
 wrote:

Hi Ted,



 I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration
 files attached below.



 


Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
Hi Fightfate,

I have attached my hive-site.xml file in the previous mail; please check
the configuration once. In Hive I am able to create tables and also to
load data into Hive tables.

Please find the attached file.

Regards,
Sandeep.v

On Mon, Mar 16, 2015 at 11:34 AM, fightf...@163.com fightf...@163.com
wrote:

 Hi, Sandeep

 From your error log I can see that the JDBC driver is not found on your
 classpath. Did you configure your MySQL JDBC jar correctly on that
 classpath? Can you establish a Hive JDBC connection using the URL
 jdbc:hive2://localhost:1 ?

 Thanks,
 Sun.

 --
 fightf...@163.com



Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
I have already added the mysql-connector-xx.jar file in the
spark/lib-managed/jars directory.

Regards,
Sandeep.v

On Mon, Mar 16, 2015 at 11:48 AM, Cheng, Hao hao.ch...@intel.com wrote:

  Or you need to specify the jars either in configuration or



 bin/spark-sql --jars mysql-connector-xx.jar




Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
Hi Sparkers,



I couldn't run spark-sql on Spark. Please find the following error:

 Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient


Regards,
Sandeep.v


Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:397)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
        ... 19 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
        at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
        at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
        at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
        ... 48 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
        ... 66 more
Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
        at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
        at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
        ... 68 more

Thanks
Sandeep.v

On Mon, Mar 16, 2015 at 10:32 AM, Ted Yu yuzhih...@gmail.com wrote:

 Can you provide more information? Such as:
 Version of Spark you're using
 Command line

 Thanks



  On Mar 15, 2015, at 9:51 PM, sandeep vura sandeepv...@gmail.com wrote:
 
  Hi Sparkers,
 
 
 
  I couldn't run spark-sql on Spark. Please find the following error:
 
   Unable to instantiate
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient
 
 
  Regards,
  Sandeep.v

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost/metastore_db</value>
    <description>metadata is stored in a MySQL server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>MySQL JDBC driver class</description>

Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
Hi Ted,

Did you find any solution?

Thanks
Sandeep

On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura sandeepv...@gmail.com
wrote:

 Hi Ted,

 I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration
 files attached below.

 
 ERROR IN SPARK
 
 n: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
 at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
 at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:101)
 at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:622)
 at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
 at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
 at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
 at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
 at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
 at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
 at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
 ... 9 more
 Caused by: java.lang.reflect.InvocationTargetException
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:534)
 at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
 ... 14 more
 Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
 NestedThrowables:
 java.lang.reflect.InvocationTargetException
 at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
 at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
 at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
 at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:622)
 at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
 at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
 at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
 at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
 at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
 at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
 at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
 at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
 at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
 at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
 at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
 at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67

Errors in SPARK

2015-03-13 Thread sandeep vura
Hi Sparkers,

Can anyone please check the below error and give a solution for this? I am
using Hive version 0.13 and Spark 1.2.1.

Step 1: I have installed Hive 0.13 with a local metastore (MySQL database)
Step 2: Hive is running without any errors; I am able to create tables and
load data into Hive tables
Step 3: copied hive-site.xml into the spark/conf directory
Step 4: copied core-site.xml into the spark/conf directory
Step 5: started the Spark shell

Please check the below error for clarifications.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@2821ec0c

scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
at scala.Option.orElse(Option.scala:257)
at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:55)

Regards,
Sandeep.v


Re: Spark-SQL and Hive - is Hive required?

2015-03-06 Thread sandeep vura
Hi,

For creating a Hive table, do I need to add hive-site.xml to the spark/conf
directory?

On Fri, Mar 6, 2015 at 11:12 PM, Michael Armbrust mich...@databricks.com
wrote:

 It's not required, but even if you don't have hive installed you probably
 still want to use the HiveContext.  From earlier in that doc:

 In addition to the basic SQLContext, you can also create a HiveContext,
 which provides a superset of the functionality provided by the basic
 SQLContext. Additional features include the ability to write queries using
 the more complete HiveQL parser, access to HiveUDFs, and the ability to
 read data from Hive tables. To use a HiveContext, *you do not need to
 have an existing Hive setup*, and all of the data sources available to a
 SQLContext are still available. HiveContext is only packaged separately to
 avoid including all of Hive’s dependencies in the default Spark build. If
 these dependencies are not a problem for your application then using
 HiveContext is recommended for the 1.2 release of Spark. Future releases
 will focus on bringing SQLContext up to feature parity with a HiveContext.
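
 To illustrate that point, here is a minimal sketch against the 1.2-era API
 (the app name is made up; with no hive-site.xml present, Spark creates a
 local metastore_db in the current directory):

 import org.apache.spark.{SparkConf, SparkContext}
 import org.apache.spark.sql.hive.HiveContext

 val sc = new SparkContext(new SparkConf().setAppName("hivecontext-demo"))
 val hiveContext = new HiveContext(sc)

 // HiveQL works even though Hive itself is not installed.
 hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")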


 On Fri, Mar 6, 2015 at 7:22 AM, Yin Huai yh...@databricks.com wrote:

 Hi Edmon,

 No, you do not need to install Hive to use Spark SQL.

 Thanks,

 Yin

 On Fri, Mar 6, 2015 at 6:31 AM, Edmon Begoli ebeg...@gmail.com wrote:

  Does Spark-SQL require installation of Hive for it to run correctly or
 not?

 I could not tell from this statement:

 https://spark.apache.org/docs/latest/sql-programming-guide.html#compatibility-with-apache-hive

 Thank you,
 Edmon






Does anyone integrate HBASE on Spark

2015-03-04 Thread sandeep vura
Hi Sparkers,

How do I integrate HBase with Spark?

I'd appreciate any replies!

Regards,
Sandeep.v


Re: Unable to run hive queries inside spark

2015-02-27 Thread sandeep vura
Hi Kundan,

Sorry, I am also facing a similar issue today. How did you resolve this
issue?

Regards,
Sandeep.v

On Thu, Feb 26, 2015 at 2:25 AM, Michael Armbrust mich...@databricks.com
wrote:

 It looks like that is getting interpreted as a local path.  Are you
 missing a core-site.xml file to configure hdfs?

 On Tue, Feb 24, 2015 at 10:40 PM, kundan kumar iitr.kun...@gmail.com
 wrote:

 Hi Denny,

 yes the user has all the rights to HDFS. I am running all the spark
 operations with this user.

 and my hive-site.xml looks like this

  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>

 Do I need to do anything explicitly other than placing hive-site.xml in
 the spark/conf directory?

 Thanks !!



 On Wed, Feb 25, 2015 at 11:42 AM, Denny Lee denny.g@gmail.com
 wrote:

 The error message you have is:

 FAILED: Execution Error, return code 1 from 
 org.apache.hadoop.hive.ql.exec.DDLTask.
 MetaException(message:file:/user/hive/warehouse/src is not a directory
 or unable to create one)

 Could you verify that you (the user you are running under) has the
 rights to create the necessary folders within HDFS?
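
 For example (a sketch only; the user name is a placeholder), the warehouse
 directory can be created and handed to the querying user with:

 hdfs dfs -mkdir -p /user/hive/warehouse
 hdfs dfs -chown -R youruser /user/hive/warehouse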


 On Tue, Feb 24, 2015 at 9:06 PM kundan kumar iitr.kun...@gmail.com
 wrote:

 Hi ,

 I have placed my hive-site.xml inside spark/conf and I am trying to
 execute some hive queries given in the documentation.

 Can you please point out what I am doing wrong here?



 scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
 hiveContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@3340a4b8

 scala> hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
 warning: there were 1 deprecation warning(s); re-run with -deprecation
 for details
 15/02/25 10:30:59 INFO ParseDriver: Parsing command: CREATE TABLE IF
 NOT EXISTS src (key INT, value STRING)
 15/02/25 10:30:59 INFO ParseDriver: Parse Completed
 15/02/25 10:30:59 INFO HiveMetaStore: 0: Opening raw store with
 implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
 15/02/25 10:30:59 INFO ObjectStore: ObjectStore, initialize called
 15/02/25 10:30:59 INFO Persistence: Property datanucleus.cache.level2
 unknown - will be ignored
 15/02/25 10:30:59 INFO Persistence: Property
 hive.metastore.integral.jdo.pushdown unknown - will be ignored
 15/02/25 10:30:59 WARN Connection: BoneCP specified but not present in
 CLASSPATH (or one of dependencies)
 15/02/25 10:30:59 WARN Connection: BoneCP specified but not present in
 CLASSPATH (or one of dependencies)
 15/02/25 10:31:08 INFO ObjectStore: Setting MetaStore object pin
 classes with
 hive.metastore.cache.pinobjtypes=Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order
 15/02/25 10:31:08 INFO MetaStoreDirectSql: MySQL check failed, assuming
 we are not on mysql: Lexical error at line 1, column 5.  Encountered: @
 (64), after : .
 15/02/25 10:31:09 INFO Datastore: The class
 org.apache.hadoop.hive.metastore.model.MFieldSchema is tagged as
 embedded-only so does not have its own datastore table.
 15/02/25 10:31:09 INFO Datastore: The class
 org.apache.hadoop.hive.metastore.model.MOrder is tagged as
 embedded-only so does not have its own datastore table.
 15/02/25 10:31:15 INFO Datastore: The class
 org.apache.hadoop.hive.metastore.model.MFieldSchema is tagged as
 embedded-only so does not have its own datastore table.
 15/02/25 10:31:15 INFO Datastore: The class
 org.apache.hadoop.hive.metastore.model.MOrder is tagged as
 embedded-only so does not have its own datastore table.
 15/02/25 10:31:17 INFO ObjectStore: Initialized ObjectStore
 15/02/25 10:31:17 WARN ObjectStore: Version information not found in
 metastore. hive.metastore.schema.verification is not enabled so recording
 the schema version 0.13.1aa
 15/02/25 10:31:18 INFO HiveMetaStore: Added admin role in metastore
 15/02/25 10:31:18 INFO HiveMetaStore: Added public role in metastore
 15/02/25 10:31:18 INFO HiveMetaStore: No user is added in admin role,
 since config is empty
 15/02/25 10:31:18 INFO SessionState: No Tez session required at this
 point. hive.execution.engine=mr.
 15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=Driver.run
 from=org.apache.hadoop.hive.ql.Driver
 15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=TimeToSubmit
 from=org.apache.hadoop.hive.ql.Driver
 15/02/25 10:31:18 INFO Driver: Concurrency mode is disabled, not
 creating a lock manager
 15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=compile
 from=org.apache.hadoop.hive.ql.Driver
 15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=parse
 from=org.apache.hadoop.hive.ql.Driver
 15/02/25 10:31:18 INFO ParseDriver: Parsing command: CREATE TABLE IF
 NOT EXISTS src (key INT, value STRING)
 15/02/25 10:31:18 INFO ParseDriver: Parse Completed
 15/02/25 10:31:18 INFO PerfLogger: /PERFLOG method=parse
 start=1424840478985 end=1424840478986 duration=1
 

Re: Errors in spark

2015-02-27 Thread sandeep vura
Hi yana,

I have removed hive-site.xml from the spark/conf directory but am still
getting the same errors. Is there any other way to work around this?

Regards,
Sandeep

On Fri, Feb 27, 2015 at 9:38 PM, Yana Kadiyska yana.kadiy...@gmail.com
wrote:

 I think you're mixing two things: the docs say "When not configured by
 the hive-site.xml, the context automatically creates metastore_db and
 warehouse in the current directory." AFAIK if you want a local
 metastore, you don't put hive-site.xml anywhere. You only need the file if
 you're going to point to an external metastore. If you're pointing to an
 external metastore, in my experience I've also had to copy core-site.xml
 into conf in order to specify this property: <name>fs.defaultFS</name>
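
 For illustration, a minimal core-site.xml along those lines (the NameNode
 host and port are placeholders):

 <configuration>
   <property>
     <name>fs.defaultFS</name>
     <value>hdfs://namenode-host:8020</value>
   </property>
 </configuration>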

 On Fri, Feb 27, 2015 at 10:39 AM, sandeep vura sandeepv...@gmail.com
 wrote:

 Hi Sparkers,

 I am using Hive version 0.13, copied hive-site.xml into spark/conf, and
 am using the default Derby local metastore.

 While creating a table in the Spark shell I am getting the following
 error. Can anyone please take a look and suggest a solution?

 sqlContext.sql("CREATE TABLE IF NOT EXISTS sandeep (key INT, value STRING)")
 15/02/27 23:06:13 ERROR RetryingHMSHandler:
 MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
 directory or unable to create one)
 at
 org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
 at
 org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:622)
 at
 org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
 at
 com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown Source)
 at
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
 at
 org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:622)
 at
 org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
 at com.sun.proxy.$Proxy13.createTable(Unknown Source)
 at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
 at
 org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
 at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
 at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
 at
 org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
 at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
 at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
 at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
 at
 org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
 at
 org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
 at
 org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
 at
 org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
 at
 org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
 at
 org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
 at $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
 at $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
 at $line9.$read$$iwC$$iwC.<init>(<console>:22)
 at $line9.$read$$iwC.<init>(<console>:24)
 at $line9.$read.<init>(<console>:26)
 at $line9.$read$.<init>(<console>:30)
 at $line9.$read$.<clinit>(<console>)
 at $line9.$eval$.<init>(<console>:7)
 at $line9.$eval$.<clinit>(<console>)
 at $line9.$eval.$print(<console>)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke

Errors in spark

2015-02-27 Thread sandeep vura
Hi Sparkers,

I am using Hive version 0.13, copied hive-site.xml into spark/conf, and am
using the default Derby local metastore.

While creating a table in the Spark shell I am getting the following error.
Can anyone please take a look and suggest a solution?

sqlContext.sql("CREATE TABLE IF NOT EXISTS sandeep (key INT, value STRING)")
15/02/27 23:06:13 ERROR RetryingHMSHandler:
MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
directory or unable to create one)
at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
at com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown
Source)
at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy13.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
at
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
at
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
at
org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
at
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
at
org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
at
org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
at $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
at $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
at $line9.$read$$iwC$$iwC.<init>(<console>:22)
at $line9.$read$$iwC.<init>(<console>:24)
at $line9.$read.<init>(<console>:26)
at $line9.$read$.<init>(<console>:30)
at $line9.$read$.<clinit>(<console>)
at $line9.$eval$.<init>(<console>:7)
at $line9.$eval$.<clinit>(<console>)
at $line9.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
at 

Re: Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
Oh, thanks for the clarification. I will try to downgrade Hive.

On Thu, Feb 26, 2015 at 9:44 PM, Cheng Lian lian.cs@gmail.com wrote:

  You are using a Hive version which is not supported by Spark SQL. Spark SQL
 1.1.x and prior versions only support Hive 0.12.0. Spark SQL 1.2.0 supports
 Hive 0.12.0 or Hive 0.13.1.


 On 2/27/15 12:12 AM, sandeep vura wrote:

 Hi Cheng,

 Thanks, the above issue has been resolved. I have configured a remote
 metastore, not a local metastore, in Hive.
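
 For reference, a remote metastore is typically what hive.metastore.uris
 points at in hive-site.xml; a sketch with a placeholder host and the
 default port:

 <property>
   <name>hive.metastore.uris</name>
   <value>thrift://metastore-host:9083</value>
 </property>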

 While creating a table in Spark SQL, another error appears on the
 terminal. The error is given below:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep_data/sales_pg.csv' INTO TABLE sandeep_test")
 15/02/26 21:49:24 ERROR Driver: FAILED: RuntimeException
 org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot
 communicate with client version 4
 java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException: Server
 IPC version 9 cannot communicate with client version 4
 at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:222)
 at
 org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:278)
 at
 org.apache.hadoop.hive.ql.Context.getExternalTmpPath(Context.java:344)
 at
 org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(LoadSemanticAnalyzer.java:243)
 at
 org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:422)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
 at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:975)
 at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1040)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
 at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
 at
 org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
 at
 org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
 at
 org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
 at
 org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
 at
 org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
 at $line13.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
 at $line13.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
 at $line13.$read$$iwC$$iwC.<init>(<console>:22)
 at $line13.$read$$iwC.<init>(<console>:24)
 at $line13.$read.<init>(<console>:26)
 at $line13.$read$.<init>(<console>:30)
 at $line13.$read$.<clinit>(<console>)
 at $line13.$eval$.<init>(<console>:7)
 at $line13.$eval$.<clinit>(<console>)
 at $line13.$eval.$print(<console>)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:622)
 at
 org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
 at
 org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
 at
 org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
 at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
 at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
 at
 org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
 at
 org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
 at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
 at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
 at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
 at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
 at
 org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
 at
 org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
 at
 org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
 at
 scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
 at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
 at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
 at org.apache.spark.repl.Main$.main(Main.scala:31)
 at org.apache.spark.repl.Main.main(Main.scala

Re: Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Regards,
Sandeep.v



On Thu, Feb 26, 2015 at 8:02 PM, Cheng Lian lian.cs@gmail.com wrote:

  Seems that you are running the Hive metastore over MySQL, but don't have
 the MySQL JDBC driver on the classpath:

 Caused by:
 org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException:
 The specified datastore driver (“com.mysql.jdbc.Driver”) was not found in
 the CLASSPATH. Please check your CLASSPATH specification, and the name of
 the driver.

 Cheng
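
 As an illustration of one way to fix that (a sketch; the jar path and
 connector version are placeholders), the driver jar can be put on both the
 driver and executor classpaths via conf/spark-defaults.conf:

 spark.driver.extraClassPath   /path/to/mysql-connector-java-5.1.32-bin.jar
 spark.executor.extraClassPath /path/to/mysql-connector-java-5.1.32-bin.jar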

 On 2/26/15 8:03 PM, sandeep vura wrote:

Hi Sparkers,

 I am trying to create a Hive table in Spark SQL but couldn't create it.
 Below are the errors generated so far.

 java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
 at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
 at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
 at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
 at scala.Option.orElse(Option.scala:257)
 at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
 at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
 at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
 at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
 at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:55)
 at org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:253)
 at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:253)
 at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:253)
 at org.apache.spark.sql.hive.HiveContext$$anon$3.<init>(HiveContext.scala:263)
 at org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:263)
 at org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:262)
 at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:411)
 at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:411)
 at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
 at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
 at $iwC$$iwC$$iwC.<init>(<console>:20)
 at $iwC$$iwC.<init>(<console>:22)
 at $iwC.<init>(<console>:24

Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
Hi Sparkers,

I am trying to create a Hive table in Spark SQL but couldn't create it. Below
are the errors generated so far.

java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at
org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
at
org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
at scala.Option.orElse(Option.scala:257)
at
org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
at
org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:55)
at org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:253)
at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:253)
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:253)
at org.apache.spark.sql.hive.HiveContext$$anon$3.<init>(HiveContext.scala:263)
at org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:263)
at org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:262)
at
org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:411)
at
org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:411)
at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
at $iwC$$iwC$$iwC.<init>(<console>:20)
at $iwC$$iwC.<init>(<console>:22)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
   

Re: How to integrate HBASE on Spark

2015-02-23 Thread sandeep vura
Hi Deepak,

Thanks for posting the link. It looks like it supports only Cloudera
distributions, as stated on GitHub.

We are using an Apache Hadoop multi-node cluster, not a Cloudera
distribution. Please confirm whether I can use it on an Apache Hadoop
cluster.

Regards,
Sandeep.v

On Mon, Feb 23, 2015 at 10:53 PM, Deepak Vohra dvohr...@yahoo.com.invalid
wrote:

 Or, use the SparkOnHBase lab.
 http://blog.cloudera.com/blog/2014/12/new-in-cloudera-labs-sparkonhbase/

  --
  From: Ted Yu yuzhih...@gmail.com
  To: Akhil Das ak...@sigmoidanalytics.com
  Cc: sandeep vura sandeepv...@gmail.com; user@spark.apache.org
  Sent: Monday, February 23, 2015 8:52 AM
  Subject: Re: How to integrate HBASE on Spark

 Installing hbase on hadoop cluster would allow hbase to utilize features
 provided by hdfs, such as short circuit read (See '90.2. Leveraging local
 data' under http://hbase.apache.org/book.html#perf.hdfs).

 Cheers
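
 For what it's worth, here is a minimal sketch of reading an HBase table
 into Spark from the shell (the table name is a placeholder; it assumes the
 HBase client jars and an hbase-site.xml are on the Spark classpath, and sc
 is the shell's SparkContext):

 import org.apache.hadoop.hbase.HBaseConfiguration
 import org.apache.hadoop.hbase.client.Result
 import org.apache.hadoop.hbase.io.ImmutableBytesWritable
 import org.apache.hadoop.hbase.mapreduce.TableInputFormat

 val hbaseConf = HBaseConfiguration.create()               // reads hbase-site.xml
 hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table") // placeholder table

 // Each record is a (row key, Result) pair scanned from the table.
 val hbaseRDD = sc.newAPIHadoopRDD(
   hbaseConf,
   classOf[TableInputFormat],
   classOf[ImmutableBytesWritable],
   classOf[Result])

 println("row count: " + hbaseRDD.count())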

 On Sun, Feb 22, 2015 at 11:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 If both clusters are on the same network, then I'd suggest installing it
 on the Hadoop cluster. If you install it on the Spark cluster itself, then
 HBase might take up a few CPU cycles and there's a chance for the job to
 lag.

 Thanks
 Best Regards

 On Mon, Feb 23, 2015 at 12:48 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 Hi

 I had installed Spark on a 3 node cluster. Spark services are up and
 running, but I want to integrate HBase with Spark.

 Do I need to install HBase on the Hadoop cluster or the Spark cluster?

 Please let me know asap.

 Regards,
 Sandeep.v








Re: How to integrate HBASE on Spark

2015-02-23 Thread sandeep vura
Hi Akhil,

I installed Spark on the Hadoop cluster itself. All of my clusters are on
the same network.

Thanks,
Sandeep.v

On Mon, Feb 23, 2015 at 1:08 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 If both clusters are on the same network, then I'd suggest installing it
 on the Hadoop cluster. If you install it on the Spark cluster itself, then
 HBase might take up a few CPU cycles and there's a chance for the job to
 lag.

 Thanks
 Best Regards

 On Mon, Feb 23, 2015 at 12:48 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 Hi

 I had installed Spark on a 3 node cluster. Spark services are up and
 running, but I want to integrate HBase with Spark.

 Do I need to install HBase on the Hadoop cluster or the Spark cluster?

 Please let me know asap.

 Regards,
 Sandeep.v





How to integrate HBASE on Spark

2015-02-22 Thread sandeep vura
Hi

I had installed Spark on a 3 node cluster. Spark services are up and
running, but I want to integrate HBase with Spark.

Do I need to install HBase on the Hadoop cluster or the Spark cluster?

Please let me know asap.

Regards,
Sandeep.v