Hi Sean,

I figured out the problem. Adding these jars to the classpath.txt file in Spark's 
conf directory makes them get loaded first. This fixed it!
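
For reference, a rough sketch of the entries I appended to classpath.txt (the same 
jar paths as in the --jars option quoted below; the exact conf directory and parcel 
paths may differ on your install):

  /opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/lib/phoenix-spark-4.7.0-clabs-phoenix1.3.0.jar
  /opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-4.7.0-clabs-phoenix1.3.0-client.jar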

Thanks,
Ben


> On Jun 27, 2016, at 4:20 PM, Sean Busbey <bus...@apache.org> wrote:
> 
> Hi Ben!
> 
> For problems with the Cloudera Labs packaging of Apache Phoenix, you should 
> first seek help on the vendor-specific community forums, to ensure the issue 
> isn't specific to the vendor:
> 
> http://community.cloudera.com/t5/Cloudera-Labs/bd-p/ClouderaLabs
> 
> -busbey
> 
> On 2016-06-27 15:27 (-0500), Benjamin Kim <bbuil...@gmail.com> wrote: 
>> Anyone tried to save a DataFrame to a HBase table using Phoenix? I am able 
>> to load and read, but I can't save.
>> 
>>>> spark-shell --jars 
>>>> /opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/lib/phoenix-spark-4.7.0-clabs-phoenix1.3.0.jar,/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/phoenix-4.7.0-clabs-phoenix1.3.0-client.jar
>> 
>> import org.apache.spark.sql._
>> import org.apache.phoenix.spark._
>> 
>> val hbaseConnectionString = "<zookeeper-quorum>"
>> 
>> // Save to OUTPUT_TABLE
>> df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> 
>> "OUTPUT_TABLE",
>>  "zkUrl" -> hbaseConnectionString))
>> 
>> java.lang.ClassNotFoundException: Class 
>> org.apache.phoenix.mapreduce.PhoenixOutputFormat not found
>>      at 
>> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
>>      at 
>> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
>> 
>> Thanks,
>> Ben
