Hi Experts,
I would like to submit a Spark job while configuring an additional jar that lives on HDFS;
however, Hadoop gives me a warning about skipping the remote jar. Although I
can still get my final results on HDFS, the additional remote jar does not
take effect. I would appreciate any suggestions.
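For reference, the command I am running has roughly the following shape (the paths, class name, and jar names below are placeholders, not my actual job). My understanding is that in client mode spark-submit may print "Warning: Skip remote jar ..." because the launcher cannot put a remote jar on its own local classpath, while in cluster mode the executors can fetch it from HDFS:

```shell
# Sketch of the submit command; all names/paths are placeholders.
# --jars with an hdfs:// URI is expected to work in cluster mode;
# in client mode the launcher may skip the remote jar for its classpath.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --jars hdfs:///libs/extra-lib.jar \
  hdfs:///apps/my-app.jar
```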
Hi
According to the official documentation (quoted below), I think configuration set
through SparkSession.builder (i.e. directly on the SparkConf) takes the highest
priority, which means the options passed on the 'spark-submit' command line
would be overridden for any key set in both places. Please correct me if I am
wrong, many thanks.
Properties set directly on the SparkConf take highest precedence, then flags
passed to spark-submit or spark-shell, then options in the
spark-defaults.conf file.
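To make sure I read the precedence order correctly, here is a small pure-Python sketch (not actual Spark code, just an illustration of the documented layering): values from spark-defaults.conf are applied first, then spark-submit flags, then properties set directly on the SparkConf, so the last layer wins for any duplicated key:

```python
# Conceptual sketch of Spark's configuration precedence (illustration only):
# spark-defaults.conf  <  spark-submit flags  <  SparkConf / SparkSession.builder
def effective_conf(defaults, submit_flags, spark_conf):
    """Merge the three layers; later updates overwrite earlier ones."""
    merged = {}
    merged.update(defaults)      # lowest precedence: spark-defaults.conf
    merged.update(submit_flags)  # middle: flags passed to spark-submit
    merged.update(spark_conf)    # highest: set directly on the SparkConf
    return merged

conf = effective_conf(
    defaults={"spark.executor.memory": "1g"},
    submit_flags={"spark.executor.memory": "2g"},
    spark_conf={"spark.executor.memory": "4g"},
)
print(conf["spark.executor.memory"])  # -> "4g": the SparkConf value wins
```

So a key set only on the spark-submit command line still applies; it is overridden only when the application also sets the same key on the builder.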