Unsubscribe

2023-05-01 Thread sandeep vura
-- Sandeep V

Re: adding jars - hive on spark cdh 5.4.3

2016-01-10 Thread sandeep vura
Upgrade to CDH 5.5 for Spark; it should work. On Sat, Jan 9, 2016 at 12:17 AM, Ophir Etzion wrote: > It didn't work, assuming I did the right thing. > In the properties you could see > >

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
Thanks a lot, Akhil! On Mon, Jul 6, 2015 at 12:57 PM, sandeep vura sandeepv...@gmail.com wrote: It works!!! On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura sandeepv...@gmail.com wrote: OK, let me try. On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com wrote: Its

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
It works!!! On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura sandeepv...@gmail.com wrote: OK, let me try. On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com wrote: It's complaining about a missing JDBC driver. Add it to your driver classpath like: ./bin/spark-sql --driver-class-path
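
The command in this reply is truncated by the archive preview. A minimal sketch of its likely full form, assuming a MySQL-backed Hive metastore and reusing the connector jar path quoted later in this archive:

    ./bin/spark-sql --driver-class-path /lib/mysql-connector-java-5.1.27.jar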

Unable to start spark-sql

2015-07-06 Thread sandeep vura
Hi Sparkers, I am unable to start the spark-sql service; please check the error below. Exception in thread main java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient at

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
On Mon, Jul 6, 2015 at 11:42 AM, sandeep vura sandeepv...@gmail.com wrote: Hi Sparkers, I am unable to start the spark-sql service; please check the error below. Exception in thread main java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate

How to run spark programs in eclipse like mapreduce

2015-04-20 Thread sandeep vura
Hi Sparkers, I have written code in Python in Eclipse; now that code should execute on a Spark cluster the way MapReduce jobs run on a Hadoop cluster. Can anyone please help me with instructions? Regards, Sandeep.v
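
The preview does not show an answer. One common approach, sketched here with a hypothetical script name and the standalone master URL that appears elsewhere in this archive, is to submit the Python file to the cluster with spark-submit:

    ./bin/spark-submit --master spark://servername:7077 my_job.py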

Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
Hi Sparkers, I am trying to load data in Spark with the following command: sqlContext.sql(LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt ' INTO TABLE src); Getting the exception below: Server IPC version 9 cannot communicate with client version 4. Note: I am using Hadoop 2.2
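
Cleaned up, the statement above as it would presumably be entered in spark-shell (the path is the one quoted in the message):

    sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")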

Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
, Saisai Shao sai.sai.s...@gmail.com wrote: Looks like you have to build Spark against the matching Hadoop version, otherwise you will hit the exception mentioned. You could follow this doc: http://spark.apache.org/docs/latest/building-spark.html 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv

Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
: mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package Thanks Best Regards On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com wrote: Where do I export MAVEN_OPTS, in spark-env.sh or hadoop-env.sh? I am running the below command in spark/yarn
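
Putting the fragments together, a sketch of the build invocation being discussed; MAVEN_OPTS is exported in the shell before running Maven, and its values here are illustrative rather than taken from the thread:

    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M"
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package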

Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
, 2015 at 5:34 PM, sandeep vura sandeepv...@gmail.com wrote: Build failed with the following errors. I have executed the following command: mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package [INFO

Re: Errors in SPARK

2015-03-24 Thread sandeep vura
src (key INT, value STRING)) java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient Cheers, Sandeep.v On Wed, Mar 25, 2015 at 11:10 AM, sandeep vura sandeepv...@gmail.com wrote: No, I am just running ./spark-shell

Re: Errors in SPARK

2015-03-24 Thread sandeep vura
I run my spark-shell instance in standalone mode; I use: ./spark-shell --master spark://servername:7077 --driver-class-path /lib/mysql-connector-java-5.1.27.jar On Fri, Mar 13, 2015 at 8:31 AM sandeep vura sandeepv...@gmail.com wrote: Hi Sparkers, Can anyone please check the below error

Re: About the env of Spark1.2

2015-03-21 Thread sandeep vura
Make sure, if you are using 127.0.0.1, to check /etc/hosts and uncomment or create the 127.0.1.1 entry, naming it localhost. On Sat, Mar 21, 2015 at 9:57 AM, Ted Yu yuzhih...@gmail.com wrote: bq. Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or service not known Can you check
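
A sketch of the /etc/hosts entries being suggested; the hostname is taken from the exception quoted in the reply, and whether the 127.0.1.1 line should be kept, edited, or commented out depends on the setup:

    127.0.0.1    localhost
    127.0.1.1    dhcp-10-35-14-100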

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
:* sandeep vura [mailto:sandeepv...@gmail.com] *Sent:* Monday, March 16, 2015 2:21 PM *To:* Cheng, Hao *Cc:* fightf...@163.com; Ted Yu; user *Subject:* Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient I have already added mysql-connector-xx.jar file in spark

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
. -- fightf...@163.com *From:* sandeep vura sandeepv...@gmail.com *Date:* 2015-03-16 14:13 *To:* Ted Yu yuzhih...@gmail.com *CC:* user@spark.apache.org *Subject:* Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient Hi Ted, Did you

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
*From:* fightf...@163.com [mailto:fightf...@163.com] *Sent:* Monday, March 16, 2015 2:04 PM *To:* sandeep vura; Ted Yu *Cc:* user *Subject:* Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient Hi, Sandeep From your error log I can see that jdbc driver

Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
Hi Sparkers, I couldn't run spark-sql on Spark. Please find the following error: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient Regards, Sandeep.v

Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
information? Such as: the version of Spark you're using, and the command line. Thanks On Mar 15, 2015, at 9:51 PM, sandeep vura sandeepv...@gmail.com wrote: Hi Sparkers, I couldn't run spark-sql on Spark. Please find the following error: Unable to instantiate

Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
Hi Ted, Did you find any solution? Thanks, Sandeep On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura sandeepv...@gmail.com wrote: Hi Ted, I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration files attached below. ERROR IN SPARK

Errors in SPARK

2015-03-13 Thread sandeep vura
Hi Sparkers, Can anyone please check the error below and suggest a solution? I am using Hive version 0.13 and Spark 1.2.1. Step 1: I have installed Hive 0.13 with a local metastore (MySQL database). Step 2: Hive is running without any errors and is able to create tables and load data in Hive

Re: Spark-SQL and Hive - is Hive required?

2015-03-06 Thread sandeep vura
Hi, for creating a Hive table do I need to add hive-site.xml to the spark/conf directory? On Fri, Mar 6, 2015 at 11:12 PM, Michael Armbrust mich...@databricks.com wrote: It's not required, but even if you don't have Hive installed you probably still want to use the HiveContext. From earlier in
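
For reference, a minimal hive-site.xml sketch for the spark/conf directory, assuming a MySQL-backed metastore as in the other threads in this archive; the host, database name, and credentials are placeholders:

    <configuration>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://localhost:3306/metastore</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hiveuser</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hivepassword</value>
      </property>
    </configuration>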

Does anyone integrate HBASE on Spark

2015-03-04 Thread sandeep vura
Hi Sparkers, How do I integrate HBase with Spark? I'd appreciate any replies! Regards, Sandeep.v

Re: Unable to run hive queries inside spark

2015-02-27 Thread sandeep vura
Hi Kundan, Sorry, I am also facing a similar issue today. How did you resolve it? Regards, Sandeep.v On Thu, Feb 26, 2015 at 2:25 AM, Michael Armbrust mich...@databricks.com wrote: It looks like that is getting interpreted as a local path. Are you missing a core-site.xml file

Re: Errors in spark

2015-02-27 Thread sandeep vura
metastore, in my experience I've also had to copy core-site.xml into conf in order to specify this property: <name>fs.defaultFS</name> On Fri, Feb 27, 2015 at 10:39 AM, sandeep vura sandeepv...@gmail.com wrote: Hi Sparkers, I am using Hive 0.13 and copied hive-site.xml in spark
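
A sketch of the core-site.xml property being referred to; the NameNode host and port are placeholders:

    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://namenode-host:9000</value>
    </property>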

Errors in spark

2015-02-27 Thread sandeep vura
Hi Sparkers, I am using Hive 0.13, copied hive-site.xml into spark/conf, and am using the default Derby local metastore. While creating a table in spark-shell I get the following error. Can anyone please take a look and suggest a solution? sqlContext.sql(CREATE TABLE IF NOT EXISTS
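
The statement is cut off in the preview; judging from the table definition quoted in the "Errors in SPARK" thread above, it is presumably:

    sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")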

Re: Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
0.12.0 or Hive 0.13.1. On 2/27/15 12:12 AM, sandeep vura wrote: Hi Cheng, Thanks, the above issue has been resolved. I have configured a remote metastore, not a local metastore, in Hive. While creating a table in Spark SQL another error shows on the terminal; the error is given below

Re: Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
/15 8:03 PM, sandeep vura wrote: Hi Sparkers, I am trying to create a Hive table in Spark SQL but couldn't create it. Below are the errors generated so far. java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate

Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
Hi Sparkers, I am trying to create a Hive table in Spark SQL but couldn't create it. Below are the errors generated so far. java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient at

Re: How to integrate HBASE on Spark

2015-02-23 Thread sandeep vura
at 10:53 PM, Deepak Vohra dvohr...@yahoo.com.invalid wrote: Or, use the SparkOnHBase lab. http://blog.cloudera.com/blog/2014/12/new-in-cloudera-labs-sparkonhbase/ -- *From:* Ted Yu yuzhih...@gmail.com *To:* Akhil Das ak...@sigmoidanalytics.com *Cc:* sandeep vura

Re: How to integrate HBASE on Spark

2015-02-23 Thread sandeep vura
it on the Hadoop cluster. If you install it on the Spark cluster itself, then HBase might take up a few CPU cycles and there's a chance for the job to lag. Thanks Best Regards On Mon, Feb 23, 2015 at 12:48 PM, sandeep vura sandeepv...@gmail.com wrote: Hi I had installed spark on 3 node cluster

How to integrate HBASE on Spark

2015-02-22 Thread sandeep vura
Hi, I have installed Spark on a 3-node cluster. Spark services are up and running, but I want to integrate HBase with Spark. Do I need to install HBase on the Hadoop cluster or the Spark cluster? Please let me know. Regards, Sandeep.v
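
The previews show no code-level answer. Beyond the SparkOnHBase lab linked above, one common way to read an HBase table from spark-shell is the standard TableInputFormat; a sketch with a placeholder table name (it assumes the HBase client jars are on the spark-shell classpath):

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Result
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat

    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "my_table")  // placeholder table name
    val hbaseRDD = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat],
      classOf[ImmutableBytesWritable], classOf[Result])
    println(hbaseRDD.count())  // number of rows read from HBase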