I ran into a problem when I used "spark.executor.userClassPathFirst"
before, though I don't remember exactly what it was.
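For reference, those properties are set at submit time like this (a sketch; the jar path and app name are hypothetical, and note that in Spark 1.6 both userClassPathFirst properties are marked experimental, which may explain the trouble):

```shell
# Ask Spark to prefer user-supplied jars over its own bundled classes.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars /path/to/httpclient-4.5.2.jar \
  my-app.jar
```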

Alternatively, you can use --driver-class-path and "--conf
spark.executor.extraClassPath". Unfortunately, you may end up as
frustrated as I was when trying to make them work.
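As a sketch (paths and app name hypothetical), the submit-time variant looks like:

```shell
# Prepend the jar to the driver's classpath and to every executor's.
# extraClassPath is a plain classpath string on the worker machines,
# so the jar must already exist at that path on every node.
spark-submit \
  --driver-class-path /path/to/httpclient-4.5.2.jar \
  --conf spark.executor.extraClassPath=/path/to/httpclient-4.5.2.jar \
  my-app.jar
```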

The configuration differs depending on how you run Spark:
- standalone or YARN,
- as an application or in spark-shell.
It is hard to explain briefly, so I wrote two blog posts about it:
http://ben-tech.blogspot.com/2016/05/how-to-resolve-spark-cassandra.html
http://ben-tech.blogspot.com/2016/04/how-to-resolve-spark-cassandra.html

Hope those blogs help.
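One detail that trips people up: in client mode the driver JVM has already started by the time your SparkConf is read, so spark.driver.extraClassPath must be supplied at launch time, e.g. via conf/spark-defaults.conf (a sketch; paths hypothetical):

```properties
# conf/spark-defaults.conf -- picked up by spark-shell and spark-submit
spark.driver.extraClassPath    /path/to/httpclient-4.5.2.jar
spark.executor.extraClassPath  /path/to/httpclient-4.5.2.jar
```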

If you still have a class conflict problem, you can consider loading the
external library and its dependencies with a dedicated classloader, just
like spark-hive does to load a specific version of the Hive jars.
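A minimal child-first classloader sketch in Java (the class name and delegation logic are illustrative, not Spark's actual isolated classloader):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Hypothetical sketch: a child-first classloader that tries its own URLs
// before delegating to the parent, so a newer httpclient jar can shadow
// the version bundled in the Spark assembly.
public class ChildFirstClassLoader extends URLClassLoader {

    public ChildFirstClassLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                try {
                    // Reverse the usual parent-first delegation: look in our
                    // own jar URLs before asking the parent classloader.
                    c = findClass(name);
                } catch (ClassNotFoundException e) {
                    // Not in our jars (e.g. java.* classes): fall back to parent.
                    c = super.loadClass(name, false);
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}
```

You would construct it with the URLs of httpclient-4.5.2.jar and its dependencies, then load the external library's entry class through it reflectively.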

On Fri, Sep 9, 2016 at 2:53 PM, Colin Kincaid Williams <disc...@uw.edu>
wrote:

> My bad, gothos on IRC pointed me to the docs:
>
> http://jhz.name/2016/01/10/spark-classpath.html
>
> Thanks Gothos!
>
> On Fri, Sep 9, 2016 at 9:23 PM, Colin Kincaid Williams <disc...@uw.edu>
> wrote:
> > I'm using the spark shell v1.6.1. I have a classpath conflict, where I
> > have an external library ( not OSS either :( , can't rebuild it.)
> > using httpclient-4.5.2.jar. I use spark-shell --jars
> > file:/path/to/httpclient-4.5.2.jar
> >
> > However spark is using httpclient-4.3 internally. Then when I try to
> > use the external library I get
> >
> > getClass.getResource("/org/apache/http/conn/ssl/SSLConnectionSocketFactory.class");
> >
> > res5: java.net.URL =
> > jar:file:/opt/spark-1.6.1-bin-hadoop2.4/lib/spark-assembly-1.6.1-hadoop2.4.0.jar!/org/apache/http/conn/ssl/SSLConnectionSocketFactory.class
> >
> > How do I get spark-shell on 1.6.1 to allow me to use the external
> > httpclient-4.5.2.jar for my application, and ignore its internal
> > library? Or is this not possible?
>