Re: Spark Plugin Information

2016-04-08 Thread Benjamin Kim
Hi Josh, I am using CDH 5.5.2 with HBase 1.0.0, Phoenix 4.5.2, and Spark 1.6.0. I looked up the error and found others hitting the same issue, which led me to ask the question. I’ll try to use the Phoenix 4.7.0 client jar and see what happens. The error I am getting is: java.sql.SQLException: ERROR 103 (08004): Unable to

Spark Plugin Information

2016-04-08 Thread Benjamin Kim
I want to know if there is an update/patch coming to Spark or the Spark plugin. I see that the Spark plugin does not work because HBase classes are missing from the Spark Assembly jar. So, when Spark does reflection, it does not look for HBase client classes in the Phoenix Plugin jar but only in
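
A workaround often suggested for this class of problem (my assumption here, not stated in the thread) is to put the Phoenix client jar on both the driver and executor classpaths explicitly, so the HBase client classes are visible even though they are not in the Spark Assembly jar. A minimal spark-defaults.conf sketch; the jar path below is a placeholder:

```
# spark-defaults.conf (jar path is a placeholder, adjust to your install)
spark.driver.extraClassPath    /opt/phoenix/phoenix-client-spark.jar
spark.executor.extraClassPath  /opt/phoenix/phoenix-client-spark.jar
```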

Re: Missing Rows In Table After Bulk Load

2016-04-08 Thread Steve Terrell
Are the primary keys in the .csv file all unique? (no rows overwriting other rows) On Fri, Apr 8, 2016 at 10:21 AM, Amit Shah wrote: > Hi, > > I am using phoenix 4.6 and hbase 1.0. After bulk loading 10 mil records > into a table using the psql.py utility, I tried
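
Steve's hypothesis can be illustrated with a small self-contained sketch: Phoenix (like HBase underneath it) upserts by primary key, so later rows that share a key overwrite earlier ones instead of accumulating. The numbers below are illustrative only, not from Amit's data set:

```python
# Model upsert-by-key semantics with a dict: 100 input rows,
# but only 5 distinct keys survive because same-key rows overwrite.
rows = [("key%d" % (i % 5), i) for i in range(100)]

table = {}
for key, value in rows:
    table[key] = value  # same key: overwrite, not append

print(len(rows))   # 100 rows submitted to the load
print(len(table))  # 5 rows visible afterwards
```

If the row count after load is far below the .csv line count, comparing `wc -l` on the file against a count of distinct primary-key columns is a quick way to confirm this.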

Missing Rows In Table After Bulk Load

2016-04-08 Thread Amit Shah
Hi, I am using phoenix 4.6 and hbase 1.0. After bulk loading 10 mil records into a table using the psql.py utility, I tried querying the table using the sqlline.py utility through a select count(*) query. I see only 0.1 million records. What could be missing? The psql.py logs are python

Re: [HELP:]Save Spark Dataframe in Phoenix Table

2016-04-08 Thread Josh Mahonin
Hi Divya, That's strange. Are you able to post a snippet of your code to look at? And are you sure that you're saving the dataframes as per the docs ( https://phoenix.apache.org/phoenix_spark.html)? Depending on your HDP version, it may or may not actually have phoenix-spark support.
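
For reference, a hedged sketch of the save pattern the linked docs describe, in pyspark for the Spark 1.6 era; the table name and zkUrl are placeholders, and this only runs against a live cluster with the phoenix-spark jar on the classpath:

```python
# Sketch only: requires a running Spark + Phoenix cluster and the
# phoenix-spark client jar on the driver/executor classpath.
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="phoenix-save-sketch")
sqlContext = SQLContext(sc)

df = sqlContext.createDataFrame([(1, "a"), (2, "b")], ["ID", "COL1"])

# The plugin saves DataFrames via the org.apache.phoenix.spark format;
# "OUTPUT_TABLE" and "phoenix-server:2181" are placeholder values.
df.write.format("org.apache.phoenix.spark") \
    .mode("overwrite") \
    .option("table", "OUTPUT_TABLE") \
    .option("zkUrl", "phoenix-server:2181") \
    .save()
```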