Hi Sean,

OK, I'm about 90% sure of the cause of this problem: just another classic
dependency conflict:

    MyProject -> Selenium ->
        org.apache.httpcomponents:httpcore 4.3.1 (has ContentType)
    Spark -> Spark SQL Hive -> Hive -> Thrift ->
        org.apache.httpcomponents:httpcore 4.1.3 (has no ContentType)
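
For anyone hitting the same thing, a dependency tree query along these
lines should expose both versions and where they come from:

    mvn dependency:tree -Dverbose -Dincludes=org.apache.httpcomponents:httpcore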

I had generated an uber jar with Spark/Shark marked as 'provided', and it
does include the latest httpcore 4.3. But by default spark-submit loads
Spark's own uber jar first and the application's second, so unfortunately
my dependency was shadowed by the older one. I hope the class loading
order can be changed (which is very unlikely unless someone files a JIRA),
but in the worst case I can resort to the dumb way: manually renaming
packages with the maven-shade plugin.
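
If it comes to that, I'd expect the shade config to look roughly like this
(the 'myproject.shaded' prefix is just a placeholder for my own namespace):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <!-- Rewrite my copy of httpclient/httpcore classes (and all
                   references to them) into a private namespace, so Spark's
                   older httpcore 4.1.3 can no longer shadow them. -->
              <relocation>
                <pattern>org.apache.http</pattern>
                <shadedPattern>myproject.shaded.org.apache.http</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>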

That will be the plan for tomorrow. However, I'm wondering if there is a
'clean' solution? Like some plugin that automagically puts different
versions of a package under separate namespaces, or detects conflicts and
renames them to aliases?
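
For the detection half at least, I believe the maven-enforcer plugin's
dependencyConvergence rule can fail the build whenever two versions of the
same artifact collide (it won't rename anything, though); a minimal sketch:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>1.3.1</version>
      <executions>
        <execution>
          <id>enforce-convergence</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <!-- Fail the build if the dependency tree pulls in
                   conflicting versions of the same artifact. -->
              <dependencyConvergence/>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>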



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-throws-NoSuchFieldError-when-testing-on-cluster-mode-tp8064p8083.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.