[ 
https://issues.apache.org/jira/browse/PHOENIX-3729?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16008153#comment-16008153
 ] 

Josh Mahonin commented on PHOENIX-3729:
---------------------------------------

[~shekharreddy...@gmail.com] Can you confirm you are setting the 
{{spark.executor.extraClassPath}} and {{spark.driver.extraClassPath}} settings 
to point to the Phoenix client JAR, or are you using another method to make the 
JAR available to the Spark class loader?
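
For reference, those two properties are typically passed via {{spark-submit}} 
(or set once in {{conf/spark-defaults.conf}}); a minimal sketch, where the 
Phoenix client JAR path is only an assumption and will vary by installation:

```shell
# Sketch: make the Phoenix client JAR visible to both the driver and executors.
# The JAR path and application JAR below are placeholders; adjust to your setup.
PHOENIX_CLIENT_JAR=/usr/lib/phoenix/phoenix-client.jar

spark-submit \
  --conf spark.driver.extraClassPath="$PHOENIX_CLIENT_JAR" \
  --conf spark.executor.extraClassPath="$PHOENIX_CLIENT_JAR" \
  your-app.jar
```

Equivalently, the same two properties can be set once in 
{{conf/spark-defaults.conf}} so every job picks them up.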

I'm not sure what the ramifications of removing 'addClasspath' would be for 
other frameworks that also use the phoenix-client JAR. Would love to hear some 
other folks weigh in, if they have any thoughts or opinions.



> Phoenix 4.7 integration issue with Spark 2.1.0
> ----------------------------------------------
>
>                 Key: PHOENIX-3729
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3729
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.7.0, 4.9.0, 4.8.1, 4.10.0, 5.0
>            Reporter: shekhar reddy
>            Priority: Blocker
>
> I was using Phoenix 4.7 with Spark 1.6. After I upgraded Spark from 1.6 to 
> 2.1.0, Spark stopped working. I built Phoenix 4.7.0 against Spark 2.1.0 and 
> copied the latest JARs into /usr/lib/spark/jars, then started getting Jackson 
> JAR conflicts, as Spark 2.1.0 also ships different versions of the Jackson 
> JARs in the /usr/lib/spark/jars folder.
> I fixed this issue by updating phoenix-core/pom.xml to comment out the line 
> below, and Spark started working:
> <addClasspath>true</addClasspath>
> Please let me know how to push these changes into the Phoenix code if you 
> feel this approach is OK.
> I am happy to contribute!
> Thanks,
> Shekhar Reddy



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
