>>>> lib/phoenix-server.jar,/usr/hdp/2.3.4.0-3485/hbase/lib/phoenix-client-4.4.0.jar
>>>> --packages com.databricks:spark-csv_2.10:1.4.0 --master yarn-client -i
>>>> /TestDivya/Spark/WriteToPheonix.scala
>>>>
>>>> Getting the below error:
>>>>
>>>> org.apache.spark.sql.AnalysisException: org.apache.phoenix.spark.DefaultSource
>>>> does not allow user-specified schemas.;
>>>>
>>>> Am I on the right track, or am I missing any properties?
>>>>
>>>> Because of this I am unable to proceed with Phoenix and have to find
>>>> alternate options.
>>>> Would really appreciate the help.
>>>>
-- Forwarded message --
From: Divya Gehlot <divya.htco...@gmail.com>
Date: 8 April 2016 at 19:54
Subject: Re: [HELP:]Save Spark Dataframe in Phoenix Table
To: Josh Mahonin <jmaho...@gmail.com>
Hi Josh,
I am doing it in the same manner as described in the Phoenix Spark documentation,
using the latest version of HDP.
Reposting for other users' benefit.
Hi Divya,
That's strange. Are you able to post a snippet of your code to look at? And
are you sure that you're saving the dataframes as per the docs (
https://phoenix.apache.org/phoenix_spark.html)?
Depending on your HDP version, it may or may not actually have
phoenix-spark support.
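For reference, the save path described on that page can be sketched as below, assuming Spark 1.5-era APIs. The table name "TABLE1", its columns, and the zkUrl value are placeholders, not values taken from this thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SaveMode}

// Minimal sketch of a phoenix-spark write, following
// https://phoenix.apache.org/phoenix_spark.html.
// "TABLE1", its columns, and "zookeeper-host:2181" are placeholder values.
object WriteToPhoenixSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-write"))
    val sqlContext = new SQLContext(sc)

    // Build a DataFrame whose column names match the Phoenix table's columns.
    val df = sqlContext
      .createDataFrame(Seq((1L, "foo"), (2L, "bar")))
      .toDF("ID", "COL1")

    // Write through the Phoenix data source. Note there is no .schema(...)
    // call: the Phoenix source derives the schema from the target table
    // itself and rejects a user-specified one.
    df.write
      .format("org.apache.phoenix.spark")
      .mode(SaveMode.Overwrite)
      .option("table", "TABLE1")
      .option("zkUrl", "zookeeper-host:2181")
      .save()
  }
}
```

Per the docs, the Phoenix source requires SaveMode.Overwrite for writes; the target table must already exist in Phoenix.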
Hi,
I have a Hortonworks Hadoop cluster with the below configuration:
Spark 1.5.2
HBase 1.1.x
Phoenix 4.4
I am able to connect to Phoenix through a JDBC connection and able to read
the Phoenix tables.
But while writing the data back to a Phoenix table,
I am getting the below error:
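For contrast, the JDBC read path described above (which does work on this cluster) can be sketched roughly as follows; the ZooKeeper quorum and table name are placeholder values:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Sketch of reading a Phoenix table over plain JDBC from Spark 1.5.
// The quorum host and "TABLE1" are placeholders for this cluster.
object ReadPhoenixViaJdbc {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-jdbc-read"))
    val sqlContext = new SQLContext(sc)

    // Spark's generic JDBC source with the Phoenix thick driver.
    val df = sqlContext.read
      .format("jdbc")
      .option("url", "jdbc:phoenix:zookeeper-host:2181")
      .option("dbtable", "TABLE1")
      .option("driver", "org.apache.phoenix.jdbc.PhoenixDriver")
      .load()

    df.show()
  }
}
```

Note that this generic JDBC path is read-oriented; writes back to Phoenix are what the phoenix-spark integration (and this thread) are about.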