[HELP:]Save Spark Dataframe in Phoenix Table

2016-04-07 Thread Divya Gehlot
Hi, I have a Hortonworks Hadoop cluster with the following configuration: Spark 1.5.2, HBase 1.1.x, Phoenix 4.4. I am able to connect to Phoenix through a JDBC connection and read Phoenix tables, but while writing data back to a Phoenix table I get the error below: org.apache.spark.sql.
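A minimal sketch of writing a DataFrame back through the phoenix-spark module rather than plain JDBC, assuming the phoenix-client jar is on the classpath and the target table already exists in Phoenix; the table name and ZooKeeper host below are illustrative placeholders, not taken from the thread:

```scala
// Hedged sketch: requires a running cluster with Phoenix and the
// phoenix-spark module available. Table/host names are hypothetical.
import org.apache.spark.sql.SaveMode

df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)                // phoenix-spark expects Overwrite
  .option("table", "OUTPUT_TABLE")         // assumed Phoenix table name
  .option("zkUrl", "zkhost:2181")          // ZooKeeper quorum of the cluster
  .save()
```

The DataFrame's column names must match the Phoenix table's columns (including the primary key) for the upsert to succeed.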

Re: [Query:]Table creation with column family in Phoenix

2016-03-14 Thread Divya Gehlot
Thanks, Harish, for responding to my query and for the explanation. On 12 March 2016 at 06:53, Harish Krishnan wrote: > Your scan query is returning all states/versions of your columns and column > families > > Thanks & Regards, > Harish.T.K > > On Thu, Mar 10, 2016

Re: [Query:]Table creation with column family in Phoenix

2016-03-14 Thread Divya Gehlot
; wrote: > > > Your scan query is returning all states/versions of your columns and > column > > families > > > > Thanks & Regards, > > Harish.T.K > > > > On Thu, Mar 10, 2016 at 11:54 PM, Divya Gehlot > > wrote: > > &

[Query:]Table creation with column family in Phoenix

2016-03-10 Thread Divya Gehlot
Hi, I created a table in Phoenix with three column families and inserted the values as shown below. Syntax: > CREATE TABLE TESTCF (MYKEY VARCHAR NOT NULL PRIMARY KEY, CF1.COL1 VARCHAR, > CF2.COL2 VARCHAR, CF3.COL3 VARCHAR) > UPSERT INTO TESTCF (MYKEY,CF1.COL1,CF2.COL2,CF3.COL3)values > ('Key2','
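For reference, the same multi-column-family pattern can be driven from the Phoenix JDBC driver; this is a hedged sketch, with an illustrative ZooKeeper host and row values (the original message's values are truncated above):

```scala
// Hedged sketch: needs the phoenix-client jar and a reachable cluster.
import java.sql.DriverManager

val conn = DriverManager.getConnection("jdbc:phoenix:zkhost:2181")
val stmt = conn.createStatement()
stmt.execute(
  """CREATE TABLE IF NOT EXISTS TESTCF (
    |  MYKEY VARCHAR NOT NULL PRIMARY KEY,
    |  CF1.COL1 VARCHAR,
    |  CF2.COL2 VARCHAR,
    |  CF3.COL3 VARCHAR)""".stripMargin)
stmt.executeUpdate(
  "UPSERT INTO TESTCF (MYKEY, CF1.COL1, CF2.COL2, CF3.COL3) " +
  "VALUES ('Key1', 'a', 'b', 'c')")        // illustrative row
conn.commit()   // Phoenix connections do not autocommit by default
conn.close()
```

As the reply in this thread notes, a raw HBase scan of such a table shows every column family and cell version; querying through Phoenix SQL returns only the latest logical row.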

Re: Spark on Hbase

2016-03-09 Thread Divya Gehlot
I agree with Talat. As I couldn't connect to HBase directly, I am connecting through Phoenix. If you are using the Hortonworks distribution, it comes with Phoenix. Thanks, Divya On Mar 10, 2016 3:04 AM, "Talat Uyarer" wrote: > Hi, > > Have you ever tried Apache phoenix ? They have spark solution[1]. I
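The read side of the Phoenix route mentioned here can be sketched as follows; a hedged example, with the table name and ZooKeeper host assumed for illustration:

```scala
// Hedged sketch: reads an HBase-backed Phoenix table into a DataFrame.
// Assumes spark-shell was started with the phoenix-client jar.
val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .option("table", "TESTCF")        // assumed Phoenix table name
  .option("zkUrl", "zkhost:2181")   // ZooKeeper quorum
  .load()

df.show()
```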

[ERROR]: Spark 1.5.2 + Hbase 1.1 + Hive 1.2 + HbaseIntegration

2016-02-29 Thread Divya Gehlot
Hi, I am getting an error when trying to access a Hive table (created through HBaseIntegration) from Spark. Steps I followed: *Hive table creation code*: CREATE EXTERNAL TABLE IF NOT EXISTS TEST(NAME STRING,AGE INT) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH
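The DDL in the message is cut off after WITH; a complete version of the same pattern, issued through Spark's HiveContext, might look like the sketch below. The column-family mapping and HBase table name are illustrative assumptions, not recovered from the thread:

```scala
// Hedged sketch: HBaseStorageHandler DDL via HiveContext. Requires the
// hive-hbase-handler and HBase client jars on Spark's classpath.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
hiveContext.sql(
  """CREATE EXTERNAL TABLE IF NOT EXISTS TEST (NAME STRING, AGE INT)
    |STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    |WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:age')
    |TBLPROPERTIES ('hbase.table.name' = 'TEST')""".stripMargin)
```

Note that `:key` maps the first Hive column to the HBase row key, so the mapping list must line up one-to-one with the Hive columns.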

[Error]: Spark 1.5.2 + HiveHbase Integration

2016-02-29 Thread Divya Gehlot
Hi, I am trying to access a Hive table that was created using HBaseIntegration. I am able to access the data in the Hive CLI, but when I try to access the table using Spark's HiveContext I get the following

[BEST PRACTICES]: Registering Hbase table as hive external table

2016-02-28 Thread Divya Gehlot
Hi, Has anyone worked on registering HBase tables as Hive external tables? I would like to know the best practices as well as the pros and cons. I would really appreciate references to good blogs, study materials, etc. If anybody has hands-on/production experience, could you please share some tips? Than

Re: [Error] : while registering Hbase table with hive

2016-02-28 Thread Divya Gehlot
Oh my, I found the mistake. It's a typo: there was no cf mapping for COLUMN7. Thanks, Ted. On 29 February 2016 at 11:45, Divya Gehlot wrote: > Hi Ted, > I am using > Hive - 1.2.1.2.3 > Hbase - 1.1.1.2.3 > > Using Hortonworks HDP 2.3.4 version > > > CREATE EXTERNAL TABLE TABLE_

Re: [Error] : while registering Hbase table with hive

2016-02-28 Thread Divya Gehlot
st1column=0:FREE_FIELD_7, timestamp=1456197984637, value=7 On 29 February 2016 at 11:26, Ted Yu wrote: > Can you give us some more information ? > > release of hbase > release of Hive > > code snippet for registering hbase table > > On Su

[Error] : while registering Hbase table with hive

2016-02-28 Thread Divya Gehlot
Hi, I am trying to register an HBase table with Hive and am getting the following error: Error while processing statement: FAILED: Execution Error, return code 1 > from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: > MetaException(message:org.apache.hadoop.hive.serde2.SerDeException E
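As the follow-up in this thread confirms, the usual cause of this SerDeException is a mismatch between the Hive column list and the `hbase.columns.mapping` entries. A hedged sketch of the rule, with illustrative table and column names:

```scala
// Hedged sketch: hbase.columns.mapping needs exactly one entry per Hive
// column, in declaration order, with :key for the row-key column.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
hiveContext.sql(
  """CREATE EXTERNAL TABLE IF NOT EXISTS T (ROWKEY STRING, COLUMN7 STRING)
    |STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    |WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:COLUMN7')
    |TBLPROPERTIES ('hbase.table.name' = 'T')""".stripMargin)
```

Two Hive columns, two mapping entries; dropping one (as happened in this thread with COLUMN7) triggers the SerDeException at DDL time.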

Fwd: [Vote] : Spark-csv 1.3 + Spark 1.5.2 - Error parsing null values except String data type

2016-02-23 Thread Divya Gehlot
Hi, Please vote if you have faced this issue. I am getting an error when parsing null values with Spark-csv. DataFile : name age alice 35 bob null peter 24 Code : spark-shell --packages com.databricks:spark-csv_2.10:1.3.0 --master yarn-client -i /TestDivya/Spark/Testnull.scala Testnull.scala > im
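One possible workaround, sketched below: spark-csv has a `nullValue` option (in the 1.3 line, if I recall correctly) that tells the parser to treat a literal token as SQL NULL, which avoids the cast failure when "null" appears in a non-string column. The file path is illustrative:

```scala
// Hedged sketch: assumes spark-csv 1.3.0 is on the classpath and that the
// nullValue option is supported by that release.
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .option("inferSchema", "true")
  .option("nullValue", "null")   // treat the literal string "null" as NULL
  .load("/TestDivya/Spark/data.csv")
```

An alternative, if the option is unavailable, is to read every column as string and cast afterwards with `df.withColumn(...)`.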

Re: Error : starting spark-shell with phoenix client jar

2016-02-18 Thread Divya Gehlot
Spark and Phoenix. > Phoenix uses 1.8.8 while Spark uses 1.9.13 > > One solution is to upgrade the jackson version in Phoenix. > See PHOENIX-2608 > > > On Thu, Feb 18, 2016 at 12:31 AM, Divya Gehlot > wrote: > > > Hi, > > I am getting following error while

Error : starting spark-shell with phoenix client jar

2016-02-18 Thread Divya Gehlot
Hi, I am getting the following error while starting spark-shell with the Phoenix client jar: spark-shell --jars /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --driver-class-path /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar --master yarn-client StackTrace : >

HBase : Transaction queries

2016-02-17 Thread Divya Gehlot
Hi, I am a newbie to HBase and recently went through the HBase documentation and some videos. I am now curious about how HBase handles transactions, such as rollback and commit. I would really appreciate any input and guidance. Thanks, Divya
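Background for the question above: stock HBase has no multi-row commit/rollback; it guarantees atomicity only within a single row, and conditional updates are expressed with `checkAndPut`. A hedged sketch against the HBase 1.1 client API, with table, family, and values chosen for illustration:

```scala
// Hedged sketch: optimistic, single-row conditional update in HBase.
// Requires a running cluster; names and values are hypothetical.
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes

val conn  = ConnectionFactory.createConnection(HBaseConfiguration.create())
val table = conn.getTable(TableName.valueOf("accounts"))

val put = new Put(Bytes.toBytes("row1"))
put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("balance"), Bytes.toBytes("90"))

// Apply the Put only if the current balance cell still equals "100";
// returns false (no rollback needed) if another writer got there first.
val applied = table.checkAndPut(
  Bytes.toBytes("row1"), Bytes.toBytes("cf"),
  Bytes.toBytes("balance"), Bytes.toBytes("100"), put)

table.close()
conn.close()
```

For true multi-row transactions on HBase, layers such as Apache Phoenix (with its transaction support) or Apache Tephra are the usual route.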