0: jdbc:phoenix:master> select count(1) from STORE_SALES;
+------------+
|  COUNT(1)  |
+------------+
java.lang.RuntimeException:
org.apache.phoenix.exception.PhoenixIOException:
For HDP 2.4.2 this is what we ended up with to get it to work:
/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-core-4.4.0.2.4.2.0-258.jar
/usr/hdp/2.4.2.0-258/phoenix/lib/phoenix-spark-4.4.0.2.4.2.0-258.jar
/usr/hdp/2.4.2.0-258/phoenix/lib/hbase-client.jar
Robert,
you should use the phoenix-4*-spark.jar located in the root Phoenix
directory.
Thanks,
Sergey
On Tue, Jul 5, 2016 at 8:06 AM, Josh Elser wrote:
> Looking into this on the HDP side. Please feel free to reach out via HDP
> channels instead of Apache channels.
>
Hi Vamsi,
The DataFrame has an underlying number of partitions associated with it,
which will be processed by however many workers you have in your Spark
cluster.
You can check the number of partitions with:
df.rdd.partitions.size
And you can alter the partitions using:
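(The message is cut off after the colon above. For reference, Spark's standard calls for changing the partition count are `df.repartition(n)`, which does a full shuffle, and `df.coalesce(n)`, which reduces the count without one. The way the partition count bounds write parallelism can be sketched in plain Python with no Spark dependency; all names below are illustrative, not Phoenix or Spark API.)

```python
# Plain-Python sketch (no Spark required) of why partition count bounds
# write parallelism: each "partition" is handled by its own task, so a
# DataFrame with N partitions can be written by up to N workers at once.
from concurrent.futures import ThreadPoolExecutor
import threading

def split_into_partitions(rows, n):
    """Round-robin rows into n partitions, like repartition(n)."""
    partitions = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        partitions[i % n].append(row)
    return partitions

def write_partition(partition, sink, lock):
    # Stand-in for the per-partition batch of upserts the connector issues.
    with lock:
        sink.extend(partition)
    return threading.current_thread().name

rows = list(range(100))
sink, lock = [], threading.Lock()
partitions = split_into_partitions(rows, 4)
with ThreadPoolExecutor(max_workers=4) as pool:
    workers = set(pool.map(lambda p: write_partition(p, sink, lock), partitions))
print(len(sink))        # 100 -> every row written
print(len(partitions))  # 4 partitions -> up to 4 concurrent writers
```

So a single-partition DataFrame is effectively written by one worker, and repartitioning before `save()` is the usual way to raise the parallelism.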
Thanks Rajeshbabu.
On Tue, Jul 5, 2016 at 5:59 AM rajeshb...@apache.org <
chrajeshbab...@gmail.com> wrote:
> Hi Vamsi,
>
> There is a bug with local indexes in 4.4.0 which is fixed in 4.7.0
> https://issues.apache.org/jira/browse/PHOENIX-2334
>
> Thanks,
> Rajeshbabu.
>
> On Tue, Jul 5, 2016 at
Team,
In the Phoenix-Spark plugin, is the DataFrame save operation single-threaded?
df.write \
.format("org.apache.phoenix.spark") \
.mode("overwrite") \
.option("table", "TABLE1") \
.option("zkUrl", "localhost:2181") \
.save()
Thanks,
Vamsi Attluri
--
Vamsi Attluri
Looking into this on the HDP side. Please feel free to reach out via HDP
channels instead of Apache channels.
Thanks for letting us know as well.
Josh Mahonin wrote:
Hi Robert,
I recommend following up with HDP on this issue.
The underlying problem is that the 'phoenix-spark-4.4.0.2.4.0.0-169.jar'
they've provided isn't actually a fat client JAR, it's missing many of the
required dependencies. They might be able to provide the correct JAR for
you, but you'd
Hi Vamsi,
There is a bug with local indexes in 4.4.0 which is fixed in 4.7.0
https://issues.apache.org/jira/browse/PHOENIX-2334
Thanks,
Rajeshbabu.
On Tue, Jul 5, 2016 at 6:21 PM, Vamsi Krishna
wrote:
> Team,
>
> I'm working on HDP 2.3.2 (Phoenix 4.4.0, HBase 1.1.2).
Team,
I'm working on HDP 2.3.2 (Phoenix 4.4.0, HBase 1.1.2).
When I use the '-it' option of CsvBulkLoadTool, neither the actual table nor
the local index table is loaded.
Command:
HADOOP_CLASSPATH=/usr/hdp/current/hbase-master/lib/hbase-protocol.jar:/etc/hbase/conf
yarn jar
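(The command above is cut off. For context, a typical CsvBulkLoadTool invocation that also loads an index uses the documented `--table`, `--index-table`, `--input`, and `--zookeeper` options, roughly as sketched below. The JAR path, table, index, and input names are placeholders, not the original command.)

```
HADOOP_CLASSPATH=/usr/hdp/current/hbase-master/lib/hbase-protocol.jar:/etc/hbase/conf \
yarn jar /usr/hdp/current/phoenix-client/phoenix-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  --table TABLE1 \
  --index-table LOCAL_IDX_TABLE1 \
  --input /data/table1.csv \
  --zookeeper localhost:2181
```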
I'm trying to use Phoenix on Spark, and can't get around this error:
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at
org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:82)
DETAILS:
1. I'm running HDP 2.4.0.0-169
2. Using
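(The snippet is cut off above, but the error itself points at a classpath problem: `NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration` means the HBase client classes are not visible to the Spark driver and executors. One common remedy, using Spark's real `extraClassPath` properties, is to point both classpaths at a Phoenix client JAR in spark-defaults.conf; the JAR path below is illustrative for an HDP layout, not taken from this thread.)

```
spark.driver.extraClassPath   /usr/hdp/current/phoenix-client/phoenix-client.jar
spark.executor.extraClassPath /usr/hdp/current/phoenix-client/phoenix-client.jar
```

The same effect can be had per-job with `--driver-class-path` and `spark.executor.extraClassPath` on the spark-submit command line.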