Hi All,
I am running Spark against AWS. The latest AWS SDK works with httpclient 3.4+, but the spark-assembly-*-.jar bundles an old httpclient version, which gives me:

ClassNotFoundException for org/apache/http/client/methods/HttpPatch

Even when I put the right httpclient jar on the classpath, it doesn't help, because Spark always loads the class from its own assembly first. I don't understand why Spark only ships one big assembly jar that doesn't let us control the library loading order. I know I could rebuild Spark, but that is very troublesome and not a general long-term solution (I can't rebuild the Spark jar every time there is a jar conflict, since Spark is deployed across a cluster).

In Hadoop we have "mapreduce.job.user.classpath.first=true", but "spark.yarn.user.classpath.first" only works on YARN. I doubt I am the only one facing this issue. Does anyone have a more general solution?

Regards,
Shuai
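P.S. For reference, this is roughly how I am passing the newer httpclient jar today (class name, jar names, and paths below are placeholders for my actual build), and the ClassNotFoundException still occurs because the copy inside the assembly wins:

```shell
# Illustrative submit command; jar names and paths are placeholders.
# The --jars flag adds the newer httpclient to the classpath, but the
# old version packaged in spark-assembly is still picked up first.
spark-submit \
  --class com.example.MyAwsJob \
  --jars /path/to/httpclient-4.x.jar \
  my-aws-job.jar
```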