Reposting for the benefit of other users
---------- Forwarded message ----------
From: Divya Gehlot <divya.htco...@gmail.com>
Date: 8 April 2016 at 19:54
Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table
To: Josh Mahonin <jmaho...@gmail.com>


Hi Josh,
I am doing it the same way as described in the phoenix-spark documentation,
using the latest version of HDP, 2.3.4.
In case of a version mismatch or missing phoenix-spark support, it should
have thrown an error on read as well,
but reads are working fine, as expected.
Will surely pass on the code snippets once I log on to my system.
In the meantime, I would like to ask about the zkUrl parameter. If I build it
with HBaseConfiguration, passing the ZooKeeper quorum, znode, and port,
it throws an error: for example, in localhost:2181/hbase-unsecure
the localhost part gets replaced by the full quorum,
like quorum1,quorum2:2181/hbase-unsecure

I am just providing the IP address of my HBase master.

I feel I am not on the right track, so I asked for help.
How do I connect to Phoenix through Spark on a Hadoop cluster?
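Roughly, the save path I am attempting follows the phoenix-spark docs. A
sketch (the table name, columns, and quorum hosts are placeholders; the
/hbase-unsecure znode is the HDP default):

```scala
import org.apache.spark.sql.{SQLContext, SaveMode}
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of the save path from the phoenix-spark docs (Spark 1.5 API).
// OUTPUT_TABLE and the quorum hosts below are placeholders.
val sc = new SparkContext(new SparkConf().setAppName("phoenix-save"))
val sqlContext = new SQLContext(sc)

val df = sqlContext
  .createDataFrame(Seq((1L, "foo"), (2L, "bar")))
  .toDF("ID", "COL1")

// zkUrl is the Phoenix JDBC URL minus the "jdbc:phoenix:" prefix:
// quorum hosts, client port, then the znode, separated by colons.
// On HDP the znode is typically /hbase-unsecure.
df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite) // phoenix-spark only supports Overwrite
  .option("table", "OUTPUT_TABLE")
  .option("zkUrl", "quorum1,quorum2,quorum3:2181:/hbase-unsecure")
  .save()
```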
Thanks for the help.
Cheers,
Divya
On Apr 8, 2016 7:06 PM, "Josh Mahonin" <jmaho...@gmail.com> wrote:

> Hi Divya,
>
> That's strange. Are you able to post a snippet of your code to look at?
> And are you sure that you're saving the dataframes as per the docs (
> https://phoenix.apache.org/phoenix_spark.html)?
>
> Depending on your HDP version, it may or may not actually have
> phoenix-spark support. Double-check that your Spark configuration is setup
> with the right worker/driver classpath settings, and that the phoenix JARs
> contain the necessary phoenix-spark classes
> (e.g. org.apache.phoenix.spark.PhoenixRelation). If not, I suggest
> following up with Hortonworks.
>
> Josh
>
>
>
> On Fri, Apr 8, 2016 at 1:22 AM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
>
>> Hi,
>> I have a Hortonworks Hadoop cluster with the following configuration:
>> Spark 1.5.2
>> HBASE 1.1.x
>> Phoenix 4.4
>>
>> I am able to connect to Phoenix through a JDBC connection and can read
>> the Phoenix tables.
>> But while writing the data back to a Phoenix table,
>> I am getting the error below:
>>
>> org.apache.spark.sql.AnalysisException:
>> org.apache.phoenix.spark.DefaultSource does not allow user-specified
>> schemas.;
>>
>> Can anybody help resolve the above error, or suggest any other way of
>> saving Spark DataFrames to Phoenix?
>>
>> Would really appreciate the help.
>>
>> Thanks,
>> Divya
>>
>
>
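The classpath check described in the quoted reply can be sketched as follows
(the jar path is the usual HDP client-jar symlink and the config keys are
standard Spark settings; verify both on your own cluster):

```shell
# spark-defaults.conf -- put the Phoenix client jar on both classpaths
# (standard Spark settings; jar path is the usual HDP symlink):
#   spark.driver.extraClassPath   /usr/hdp/current/phoenix-client/phoenix-client.jar
#   spark.executor.extraClassPath /usr/hdp/current/phoenix-client/phoenix-client.jar

# Confirm the jar actually bundles the phoenix-spark classes:
jar tf /usr/hdp/current/phoenix-client/phoenix-client.jar \
  | grep org/apache/phoenix/spark/PhoenixRelation
```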
