[ https://issues.apache.org/jira/browse/SPARK-32675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-32675.
-----------------------------------
    Fix Version/s: 3.1.0
       Resolution: Fixed

Issue resolved by pull request 29499
[https://github.com/apache/spark/pull/29499]

> --py-files option is appended without passing value for it
> ----------------------------------------------------------
>
>                 Key: SPARK-32675
>                 URL: https://issues.apache.org/jira/browse/SPARK-32675
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 3.0.0
>            Reporter: Farhan Khan
>            Assignee: Farhan Khan
>            Priority: Major
>             Fix For: 3.1.0
>
>
> When an application is submitted to a Mesos cluster in cluster mode via the
> REST Submission API, the --py-files option is appended unconditionally,
> without a value being passed for it. This causes even a simple Java-based
> SparkPi job to fail.
> The bug was introduced by SPARK-26466.
> Here is the example job submission:
> {code:bash}
> curl -X POST http://localhost:7077/v1/submissions/create \
>   --header "Content-Type:application/json" \
>   --data '{
>   "action": "CreateSubmissionRequest",
>   "appResource": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
>   "clientSparkVersion": "3.0.0",
>   "appArgs": ["30"],
>   "environmentVariables": {},
>   "mainClass": "org.apache.spark.examples.SparkPi",
>   "sparkProperties": {
>     "spark.jars": "file:///opt/spark-3.0.0-bin-3.2.0/examples/jars/spark-examples_2.12-3.0.0.jar",
>     "spark.driver.supervise": "false",
>     "spark.executor.memory": "512m",
>     "spark.driver.memory": "512m",
>     "spark.submit.deployMode": "cluster",
>     "spark.app.name": "SparkPi",
>     "spark.master": "mesos://localhost:5050"
>   }
> }'
> {code}
> The resulting driver log contains:
> {code:bash}
> 20/08/20 20:19:57 WARN DependencyUtils: Local jar /var/lib/mesos/slaves/e6779377-08ec-4765-9bfc-d27082fbcfa1-S0/frameworks/e6779377-08ec-4765-9bfc-d27082fbcfa1-0000/executors/driver-20200820201954-0002/runs/d9d734e8-a299-4d87-8f33-b134c65c422b/spark.driver.memory=512m does not exist, skipping.
> Error: Failed to load class org.apache.spark.examples.SparkPi.
> 20/08/20 20:19:57 INFO ShutdownHookManager: Shutdown hook called
> {code}
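The log above is consistent with the flag swallowing the next command-line token: with --py-files emitted but given no value, the token that follows it (here the spark.driver.memory=512m property) is consumed as the flag's value, and the driver then tries to resolve "spark.driver.memory=512m" as a local file. A minimal sketch of that failure mode (the helper names below are illustrative, not the actual MesosClusterScheduler code):

```python
# Hypothetical sketch of the bug; function names are illustrative only.

def build_driver_args_buggy(spark_properties, py_files=None):
    """Builds driver args, emitting --py-files unconditionally (the bug)."""
    args = ["--py-files"]  # bug: flag appended even when there is no value
    if py_files:
        args.append(py_files)
    for key, value in spark_properties.items():
        args.append(f"{key}={value}")
    return args

def build_driver_args_fixed(spark_properties, py_files=None):
    """Only emits --py-files when a value was actually supplied."""
    args = []
    if py_files:
        args += ["--py-files", py_files]
    for key, value in spark_properties.items():
        args.append(f"{key}={value}")
    return args

props = {"spark.driver.memory": "512m"}
# Buggy: the property token immediately follows the bare flag, so an
# argument parser would treat it as the --py-files value.
print(build_driver_args_buggy(props))   # ['--py-files', 'spark.driver.memory=512m']
print(build_driver_args_fixed(props))   # ['spark.driver.memory=512m']
```

Guarding the flag on the presence of a value, as in the fixed variant, is the shape of the change merged in the pull request referenced above.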



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
