[ https://issues.apache.org/jira/browse/PHOENIX-3729?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16015807#comment-16015807 ]

Josh Mahonin commented on PHOENIX-3729:
---------------------------------------

[~shekharreddy...@gmail.com] The JAR copying and symlinking method you've 
described is tricky to get right and plays havoc with Spark's classloader.

You may have more luck removing those and simply setting the extraClassPath 
options as described in the documentation. A quick Google search also turned 
up a GitHub note from [~youngwookim] which seems to confirm that this 
configuration works with Spark 2.1.0.
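For reference, the documented approach amounts to two extraClassPath entries 
pointing at the Phoenix client JAR, e.g. in spark-defaults.conf (the JAR path 
and name below are placeholders; use the phoenix-*-client-spark JAR from your 
own install):

    # spark-defaults.conf -- put the Phoenix client JAR on both the driver and
    # executor classpaths (path and JAR name are placeholders for your install)
    spark.driver.extraClassPath=/usr/lib/phoenix/phoenix-client-spark.jar
    spark.executor.extraClassPath=/usr/lib/phoenix/phoenix-client-spark.jar

That way the JARs under /usr/lib/spark/jars stay untouched, so Spark's own 
Jackson versions are never shadowed.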

> Phoenix 4.7 integration issue with Spark 2.1.0
> ----------------------------------------------
>
>                 Key: PHOENIX-3729
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3729
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.7.0, 4.9.0, 4.8.1, 4.10.0, 5.0
>            Reporter: shekhar reddy
>            Priority: Blocker
>
> I was using Phoenix 4.7 with Spark 1.6. After I upgraded Spark from 1.6 to 
> 2.1.0, the integration stopped working. I built Phoenix 4.7.0 against Spark 
> 2.1.0 and copied the new JARs into /usr/lib/spark/jars, then started getting 
> Jackson JAR conflicts, since Spark 2.1.0 ships its own (different) versions 
> of the Jackson JARs in the /usr/lib/spark/jars folder.
> I fixed this issue by commenting out the line below in phoenix-core/pom.xml, 
> and Spark started working:
> <addClasspath>true</addClasspath>
> Please let me know how to push these changes into the Phoenix code if you 
> feel this approach is OK.
> I am happy to contribute!
> Thanks,
> Shekhar Reddy
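For context, the <addClasspath> element mentioned in the report above comes 
from the maven-jar-plugin manifest configuration; a rough sketch of that 
fragment (the actual structure in phoenix-core/pom.xml may differ) is:

    <!-- maven-jar-plugin archive configuration (sketch only;
         the real phoenix-core/pom.xml layout may differ) -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <!-- true makes Maven write a Class-Path entry into the JAR manifest -->
            <addClasspath>true</addClasspath>
          </manifest>
        </archive>
      </configuration>
    </plugin>

Commenting it out just omits the Class-Path entry from the built JAR's 
manifest; the documented extraClassPath approach sidesteps the conflict 
without patching the build.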



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
