GitHub user shanyu opened a pull request:

    https://github.com/apache/spark/pull/23009

    SPARK-26011: pyspark app with "spark.jars.packages" config does not work

    SparkSubmit determines whether an app is a pyspark app by the suffix of
    the primary resource, but Livy uses "spark-internal" as the primary
    resource when calling spark-submit, so args.isPython is set to false in
    SparkSubmit.scala.
    
    The fix is to resolve Maven coordinates not only when args.isPython is
    true, but also when the primary resource is spark-internal.
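    The condition change can be sketched roughly as follows. This is a
    standalone illustration, not the actual SparkSubmit code: the
    SparkInternal constant and the helper names below are invented for the
    sketch, and the real logic lives inside SparkSubmit's argument handling.

```scala
// Hypothetical sketch of the decision in SparkSubmit.scala that gates
// Maven-coordinate resolution for spark.jars.packages.
object ResolvePackagesSketch {

  // Placeholder primary resource that Livy passes to spark-submit.
  val SparkInternal = "spark-internal"

  // Before the fix: a pyspark app was detected only by the .py suffix of
  // the primary resource, so "spark-internal" made isPython false.
  def isPython(primaryResource: String): Boolean =
    primaryResource.endsWith(".py")

  // After the fix: also resolve Maven coordinates when the primary
  // resource is the internal placeholder.
  def shouldResolvePackages(primaryResource: String): Boolean =
    isPython(primaryResource) || primaryResource == SparkInternal
}
```

    With this shape, a Livy-submitted pyspark app (primary resource
    "spark-internal") takes the same packages-resolution path as a regular
    app submitted with a .py file.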
    
    Tested the patch with Livy submitting a pyspark app, and with
    spark-submit and pyspark, each with and without the packages config.
    
    Signed-off-by: Shanyu Zhao <shz...@microsoft.com>

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/shanyu/spark shanyu-26011

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/23009.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #23009
    
----
commit c8424aff80e33f9a3f5a7d19a04442c7dac701a4
Author: Shanyu Zhao <shzhao@...>
Date:   2018-11-12T02:57:01Z

    SPARK-26011: pyspark app with "spark.jars.packages" config does not work
    
    SparkSubmit determines whether an app is a pyspark app by the suffix of
    the primary resource, but Livy uses "spark-internal" as the primary
    resource when calling spark-submit, so args.isPython is set to false in
    SparkSubmit.scala.
    
    The fix is to resolve Maven coordinates not only when args.isPython is
    true, but also when the primary resource is spark-internal.
    
    Signed-off-by: Shanyu Zhao <shz...@microsoft.com>

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
