Hi All,

 

I am running Spark 1.2.1 with the AWS SDK. To make sure the AWS SDK is
compatible with httpclient 4.2 (which I assume Spark uses?), I have already
downgraded the SDK to version 1.9.0.

 

But even so, I still get an error:

 

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.http.impl.conn.DefaultClientConnectionOperator.<init>(Lorg/apache/http/conn/scheme/SchemeRegistry;Lorg/apache/http/conn/DnsResolver;)V
        at org.apache.http.impl.conn.PoolingClientConnectionManager.createConnectionOperator(PoolingClientConnectionManager.java:140)
        at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:114)
        at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:99)
        at com.amazonaws.http.ConnectionManagerFactory.createPoolingClientConnManager(ConnectionManagerFactory.java:29)
        at com.amazonaws.http.HttpClientFactory.createHttpClient(HttpClientFactory.java:102)
        at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:190)
        at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:119)
        at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:410)
        at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:392)
        at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:376)

 

When I searched the mailing list, it looks like the same issue as:

https://github.com/apache/spark/pull/2535

http://stackoverflow.com/questions/24788949/nosuchmethoderror-while-running-aws-s3-client-on-spark-while-javap-shows-otherwi

 

But I don't understand the solution mentioned there. The issue is caused by a
pre-packaged DefaultClientConnectionOperator in the Spark all-in-one jar
file which doesn't have that constructor.

 

I have some questions here:

 

How can we find out which exact versions Spark uses when it pre-packages
everything (this is really painful), and how can we override them?
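One way to answer the "which exact version" question is to ask the JVM itself: a small diagnostic (my own sketch, not from the referenced threads; the class name `WhichJar` is made up) that prints which jar a class was actually loaded from, and which constructors that class has. Run it inside the driver (or any JVM with the same classpath) to see whether Spark's assembly shadows the httpclient you bundled:

```java
import java.lang.reflect.Constructor;
import java.security.CodeSource;

public class WhichJar {

    // Returns the jar/path a class was loaded from, or "bootstrap classpath"
    // for core JDK classes (whose code source is null).
    static String locate(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classpath" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0]
                : "org.apache.http.impl.conn.DefaultClientConnectionOperator";
        Class<?> cls = Class.forName(name);
        System.out.println(name + " loaded from: " + locate(cls));
        // List the constructors actually present; the AWS SDK needs the
        // (SchemeRegistry, DnsResolver) variant from httpclient 4.2+.
        for (Constructor<?> c : cls.getDeclaredConstructors()) {
            System.out.println("  " + c);
        }
    }
}
```

If the printed location is the Spark assembly jar rather than your own httpclient jar, that confirms the shadowing.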

 

I have tried:

 

    val conf = new SparkConf()
      .set("spark.files.userClassPathFirst", "true")     // for non-YARN apps before Spark 1.3
      .set("spark.executor.userClassPathFirst", "true")  // for Spark 1.3.0

But it doesn't work.
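A workaround I've seen suggested (an assumption on my part, not confirmed by the threads above) is to sidestep the classpath ordering entirely by relocating httpclient's packages inside your own application jar with the maven-shade-plugin, so your AWS SDK calls your renamed copy instead of the one in the Spark assembly. A minimal sketch of the plugin configuration:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <!-- Rename our bundled httpclient so it cannot clash with the
               copy inside the Spark assembly jar. -->
          <relocation>
            <pattern>org.apache.http</pattern>
            <shadedPattern>shaded.org.apache.http</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The shade plugin rewrites the bytecode references in your jar as well, so the AWS SDK classes you package end up calling `shaded.org.apache.http.*`.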

 

This really creates a lot of issues for me (especially since we don't know
which versions Spark packages into its own jar, so we have to find out by
trial and error). Even Maven doesn't give enough information, because
httpclient does not appear as a Maven dependency (not even an indirect one,
after I used tools to resolve the whole dependency tree).

 

Regards,

 

Shuai
