hi, all
  I suggest that Spark not use the assembly jar as the default run-time
dependency (spark-submit/spark-class currently depend on the assembly jar); using
a directory of all third-party dependency jars such as hadoop/hive/hbase would be
more reasonable.

  1. The assembly jar packages all third-party jars into one big jar, so we need
to rebuild it whenever we want to update the version of some component (such as
hadoop).
  2. In our practice with Spark, we sometimes hit jar compatibility issues, and
such issues are hard to diagnose when all the dependencies are hidden inside a
single assembly jar.
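To illustrate the idea, here is a minimal sketch of how a launcher script could
compose the runtime classpath from a lib/ directory of individual jars instead of
one assembly jar. The directory layout, jar names, and variable names are all
hypothetical, not the actual Spark scripts; the fake lib/ directory is created in
a temp dir just so the sketch is self-contained and runnable.

```shell
#!/bin/sh
# Hypothetical sketch: build a classpath from individual dependency jars.
# Create a fake lib/ directory with empty jar files for demonstration.
LIB_DIR="$(mktemp -d)"
touch "$LIB_DIR/hadoop-client-2.4.0.jar" \
      "$LIB_DIR/hive-exec-0.13.1.jar" \
      "$LIB_DIR/hbase-client-0.98.7.jar"

# Join every jar in lib/ into a ':'-separated classpath.
CLASSPATH=""
for jar in "$LIB_DIR"/*.jar; do
  CLASSPATH="$CLASSPATH${CLASSPATH:+:}$jar"
done

# With this layout, upgrading hadoop is a file swap in lib/ (replace
# hadoop-client-2.4.0.jar with the new jar), not an assembly rebuild,
# and each jar on the classpath is visible for compatibility debugging.
echo "$CLASSPATH"
```

The same classpath variable could then be passed to the JVM via `java -cp
"$CLASSPATH" ...` in spark-class, which is the point of the proposal: each
third-party jar stays a separate, replaceable, inspectable file.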

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
