Thanks a lot. I will give it a try!

On Monday, March 16, 2015, Adam Lewandowski <adam.lewandow...@gmail.com>
wrote:

> Prior to 1.3.0, Spark had 'spark.files.userClassPathFirst' for non-YARN
> apps. As of 1.3.0, use 'spark.executor.userClassPathFirst'.
>
> See
> https://mail-archives.apache.org/mod_mbox/spark-user/201503.mbox/%3CCALrvLxdWwSByxNvcZtTVo8BsNRR_7tbPzWdUiAV8Ps8H1oAayQ%40mail.gmail.com%3E
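>
> For example, with 1.3.0 this might look like the following (a sketch; the
> jar version, class, and app jar are placeholders, and
> 'spark.driver.userClassPathFirst' is only needed if the conflict also
> shows up on the driver):
>
>   spark-submit \
>     --conf spark.executor.userClassPathFirst=true \
>     --conf spark.driver.userClassPathFirst=true \
>     --jars httpclient-4.3.6.jar \
>     --class com.example.MyApp myapp.jar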
>
> On Fri, Mar 13, 2015 at 1:04 PM, Shuai Zheng <szheng.c...@gmail.com> wrote:
>
>> Hi All,
>>
>> I am running spark to deal with AWS.
>>
>> The latest version of the AWS SDK works with httpclient 4.3+. But the
>> spark-assembly-*.jar file packages an old httpclient version, which
>> causes a ClassNotFoundException for
>> org/apache/http/client/methods/HttpPatch.
>>
>> Even when I put the right httpclient jar on the classpath, it doesn't
>> help, because Spark always loads the class from its own assembly first.
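>>
>> (A quick way to confirm which jar wins, using a standard JVM trick in the
>> spark-shell:
>>
>>   scala> classOf[org.apache.http.client.methods.HttpGet].getProtectionDomain.getCodeSource.getLocation
>>
>> This prints the jar that actually supplied the class, e.g. the Spark
>> assembly rather than my own httpclient jar.)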
>>
>> I don’t know why Spark only provides one big assembly jar and doesn’t
>> allow us to customize the library loading order. I know I can just
>> rebuild Spark, but that is very troublesome, and it is not a general
>> long-term solution (I can’t rebuild the Spark jar every time I have a jar
>> conflict, since Spark is supposed to run as a shared cluster).
>>
>> In Hadoop, we have “mapreduce.job.user.classpath.first=true”. But
>> “spark.yarn.user.classpath.first” only works on YARN.
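>>
>> For example, on YARN that flag can be passed like this (a sketch; the app
>> jar, class, and httpclient version are placeholders):
>>
>>   spark-submit --master yarn \
>>     --conf spark.yarn.user.classpath.first=true \
>>     --jars httpclient-4.3.6.jar \
>>     --class com.example.MyApp myapp.jar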
>>
>> I am sure I am not the only one who faces this issue. Does anyone have a
>> more general solution?
>>
>> Regards,
>>
>> Shuai
>>
>
