[jira] [Resolved] (SPARK-33201) Mesos dispatcher service is not working due to empty --pyFiles conf in cluster mode which is the default

2020-10-20 Thread Amandeep (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amandeep resolved SPARK-33201.
--
Resolution: Duplicate

> Mesos dispatcher service is not working due to empty --pyFiles conf in 
> cluster mode which is the default
> 
>
> Key: SPARK-33201
> URL: https://issues.apache.org/jira/browse/SPARK-33201
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 3.0.0, 3.0.1
>Reporter: Amandeep
>Priority: Major
>
> In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
> defaults to an empty string, which causes spark-submit to use the wrong jar 
> name. This issue was introduced by [SPARK-26466 | 
> https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]
> For example:
> {quote}--total-executor-cores
>  2
>  --py-files
>  --conf
>  spark.driver.maxResultSize=15g
> {quote}
> Because --py-files has no value, the next argument, --conf, is consumed as 
> its value, and `spark.driver.maxResultSize=15g` is then treated as the 
> application jar name, which produces the error below:
> {quote}20/10/19 20:19:18 WARN DependencyUtils: Local jar 
> \{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
>  does not exist, skipping.
>  20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
> org.apache.spark.examples.SparkPi.
>  java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
> {quote}
> To reproduce the bug, run any of the Spark-provided examples on Mesos in 
> cluster mode:
> {code:java}
>  ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
> mesos://:7077 --deploy-mode cluster --conf 
> spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
> /examples/jars/spark-examples_2.12-3.0.1.jar 100
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-33201) Mesos dispatcher service is not working due to empty --pyFiles conf in cluster mode which is the default

2020-10-20 Thread Amandeep (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amandeep updated SPARK-33201:
-
Description: 
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by [SPARK-26466 | 
https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
{quote}--total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
{quote}
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
{quote}20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
{quote}
To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
{code:java}
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100
{code}
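
For illustration only, a minimal Scala sketch (hypothetical; not the actual 
MesosClusterScheduler code, and the pairwise scan is an assumption) of how an 
empty --py-files value shifts the argument pairing:
{code:scala}
// Hypothetical sketch: the flag list the dispatcher effectively emits once
// the empty --py-files value is dropped by shell word-splitting.
val args = Seq(
  "--total-executor-cores", "2",
  "--py-files", "",              // empty value disappears after word-splitting
  "--conf", "spark.driver.maxResultSize=15g"
).filter(_.nonEmpty)

// A pairwise flag/value scan now misaligns: "--conf" becomes the value of
// "--py-files", leaving the conf setting as a stray positional argument
// that spark-submit treats as the application jar.
args.grouped(2).foreach(pair => println(pair.mkString(" -> ")))
// --total-executor-cores -> 2
// --py-files -> --conf
// spark.driver.maxResultSize=15g
{code}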

  was:
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by [Git commit | 
https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
{quote}--total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
{quote}
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
{quote}20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
{quote}
To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
{code:java}
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100
{code}


> Mesos dispatcher service is not working due to empty --pyFiles conf in 
> cluster mode which is the default
> 
>
> Key: SPARK-33201
> URL: https://issues.apache.org/jira/browse/SPARK-33201
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 3.0.0, 3.0.1
>Reporter: Amandeep
>Priority: Major
>
> In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
> defaults to an empty string, which causes spark-submit to use the wrong jar 
> name. This issue was introduced by [SPARK-26466 | 
> https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]
> For example:
> {quote}--total-executor-cores
>  2
>  --py-files
>  --conf
>  spark.driver.maxResultSize=15g
> {quote}
> Because --py-files has no value, the next argument, --conf, is consumed as 
> its value, and `spark.driver.maxResultSize=15g` is then treated as the 
> application jar name, which produces the error below:
> {quote}20/10/19 20:19:18 WARN DependencyUtils: Local jar 
> \{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
>  does not exist, skipping.
>  20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
> org.apache.spark.examples.SparkPi.
>  java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
> {quote}
> To reproduce the bug, run any of the Spark-provided examples on Mesos in 
> cluster mode:
> {code:java}
>  ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
> mesos://:7077 --deploy-mode cluster --conf 
> spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
> /examples/jars/spark-examples_2.12-3.0.1.jar 100
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (SPARK-33201) Mesos dispatcher service is not working due to empty --pyFiles conf in cluster mode which is the default

2020-10-20 Thread Amandeep (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amandeep updated SPARK-33201:
-
Description: 
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by [Git commit | 
https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
{quote}--total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
{quote}
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
{quote}20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
{quote}
To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
{code:java}
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100
{code}
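
As a hedged sketch of one defensive direction a fix could take (an 
assumption, not the actual patch): forward optional flags only when their 
values are non-empty.
{code:scala}
// Hypothetical helper: append a flag/value pair only when the value is
// present and non-empty, so an unset --py-files never reaches spark-submit.
def appendOpt(cmd: Seq[String], flag: String, value: Option[String]): Seq[String] =
  value.filter(_.nonEmpty).fold(cmd)(v => cmd ++ Seq(flag, v))

val base = Seq("--total-executor-cores", "2")
val cmd  = appendOpt(base, "--py-files", Some(""))  // empty -> flag omitted
println(cmd.mkString(" "))                          // --total-executor-cores 2
{code}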

  was:
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by 
[https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
{quote}
 --total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
{quote}
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
{quote}
 20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
{quote}

To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
{code}
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100
{code}


> Mesos dispatcher service is not working due to empty --pyFiles conf in 
> cluster mode which is the default
> 
>
> Key: SPARK-33201
> URL: https://issues.apache.org/jira/browse/SPARK-33201
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 3.0.0, 3.0.1
>Reporter: Amandeep
>Priority: Major
>
> In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
> defaults to an empty string, which causes spark-submit to use the wrong jar 
> name. This issue was introduced by [Git commit | 
> https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]
> For example:
> {quote}--total-executor-cores
>  2
>  --py-files
>  --conf
>  spark.driver.maxResultSize=15g
> {quote}
> Because --py-files has no value, the next argument, --conf, is consumed as 
> its value, and `spark.driver.maxResultSize=15g` is then treated as the 
> application jar name, which produces the error below:
> {quote}20/10/19 20:19:18 WARN DependencyUtils: Local jar 
> \{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
>  does not exist, skipping.
>  20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
> org.apache.spark.examples.SparkPi.
>  java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
> {quote}
> To reproduce the bug, run any of the Spark-provided examples on Mesos in 
> cluster mode:
> {code:java}
>  ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
> mesos://:7077 --deploy-mode cluster --conf 
> spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
> /examples/jars/spark-examples_2.12-3.0.1.jar 100
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (SPARK-33201) Mesos dispatcher service is not working due to empty --pyFiles conf in cluster mode which is the default

2020-10-20 Thread Amandeep (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amandeep updated SPARK-33201:
-
Description: 
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by 
[https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
{quote}
 --total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
{quote}
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
{quote}
 20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
{quote}

To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
{code}
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100
{code}

  was:
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by 
[https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
 --total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
 20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi

 

To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100


> Mesos dispatcher service is not working due to empty --pyFiles conf in 
> cluster mode which is the default
> 
>
> Key: SPARK-33201
> URL: https://issues.apache.org/jira/browse/SPARK-33201
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 3.0.0, 3.0.1
>Reporter: Amandeep
>Priority: Major
>
> In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
> defaults to an empty string, which causes spark-submit to use the wrong jar 
> name. This issue was introduced by 
> [https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]
> For example:
> {quote}
>  --total-executor-cores
>  2
>  --py-files
>  --conf
>  spark.driver.maxResultSize=15g
> {quote}
> Because --py-files has no value, the next argument, --conf, is consumed as 
> its value, and `spark.driver.maxResultSize=15g` is then treated as the 
> application jar name, which produces the error below:
> {quote}
>  20/10/19 20:19:18 WARN DependencyUtils: Local jar 
> \{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
>  does not exist, skipping.
>  20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
> org.apache.spark.examples.SparkPi.
>  java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
> {quote}
> To reproduce the bug, run any of the Spark-provided examples on Mesos in 
> cluster mode:
> {code}
>  ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
> mesos://:7077 --deploy-mode cluster --conf 
> spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
> /examples/jars/spark-examples_2.12-3.0.1.jar 100
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (SPARK-33201) Mesos dispatcher service is not working due to empty --pyFiles conf in cluster mode which is the default

2020-10-20 Thread Amandeep (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-33201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amandeep updated SPARK-33201:
-
Description: 
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by 
[https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
 --total-executor-cores
 2
 --py-files
 --conf
 spark.driver.maxResultSize=15g
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
 20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
 20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
 java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi

 

To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077 --deploy-mode cluster --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100

  was:
In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by 
[https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
--total-executor-cores
2
--py-files
--conf
spark.driver.maxResultSize=15g
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi


 

To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077  --deploy-mode cluster  --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100


> Mesos dispatcher service is not working due to empty --pyFiles conf in 
> cluster mode which is the default
> 
>
> Key: SPARK-33201
> URL: https://issues.apache.org/jira/browse/SPARK-33201
> Project: Spark
>  Issue Type: Bug
>  Components: Mesos
>Affects Versions: 3.0.0, 3.0.1
>Reporter: Amandeep
>Priority: Major
>
> In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
> defaults to an empty string, which causes spark-submit to use the wrong jar 
> name. This issue was introduced by 
> [https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]
> For example:
>  --total-executor-cores
>  2
>  --py-files
>  --conf
>  spark.driver.maxResultSize=15g
> Because --py-files has no value, the next argument, --conf, is consumed as 
> its value, and `spark.driver.maxResultSize=15g` is then treated as the 
> application jar name, which produces the error below:
>  20/10/19 20:19:18 WARN DependencyUtils: Local jar 
> \{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
>  does not exist, skipping.
>  20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
> org.apache.spark.examples.SparkPi.
>  java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
>  
> To reproduce the bug, run any of the Spark-provided examples on Mesos in 
> cluster mode:
>  ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
> mesos://:7077 --deploy-mode cluster --conf 
> spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
> /examples/jars/spark-examples_2.12-3.0.1.jar 100



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-33201) Mesos dispatcher service is not working due to empty --pyFiles conf in cluster mode which is the default

2020-10-20 Thread Amandeep (Jira)
Amandeep created SPARK-33201:


 Summary: Mesos dispatcher service is not working due to empty 
--pyFiles conf in cluster mode which is the default
 Key: SPARK-33201
 URL: https://issues.apache.org/jira/browse/SPARK-33201
 Project: Spark
  Issue Type: Bug
  Components: Mesos
Affects Versions: 3.0.1, 3.0.0
Reporter: Amandeep


In Mesos cluster mode, all Spark jobs fail to run because "--py-files" 
defaults to an empty string, which causes spark-submit to use the wrong jar 
name. This issue was introduced by 
[https://github.com/apache/spark/commit/38f030725c561979ca98b2a6cc7ca6c02a1f80ed]

For example:
--total-executor-cores
2
--py-files
--conf
spark.driver.maxResultSize=15g
Because --py-files has no value, the next argument, --conf, is consumed as 
its value, and `spark.driver.maxResultSize=15g` is then treated as the 
application jar name, which produces the error below:
20/10/19 20:19:18 WARN DependencyUtils: Local jar 
\{dir}/slaves/d9971b08-4929-4b00-9677-b14088c38603-S13/frameworks/86ff1f10-79fd-44f0-b807-71741091b457-/executors/driver-20201019200043-0014-retry-9/runs/3697919f-c3dc-4a13-8928-2fb3a20ac98d/spark.driver.maxResultSize=15g
 does not exist, skipping.
20/10/19 20:19:18 WARN SparkSubmit$$anon$2: Failed to load 
org.apache.spark.examples.SparkPi.
java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi


 

To reproduce the bug, run any of the Spark-provided examples on Mesos in 
cluster mode:
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master 
mesos://:7077  --deploy-mode cluster  --conf 
spark.master.rest.enabled=true --executor-memory 5G --total-executor-cores 2 
/examples/jars/spark-examples_2.12-3.0.1.jar 100



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Issue Comment Deleted] (SPARK-31431) CalendarInterval encoder support

2020-04-20 Thread Amandeep (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-31431?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Amandeep updated SPARK-31431:
-
Comment: was deleted

(was: Hi Team,

I would like to work on this improvement; please let me know how to proceed.

Thanks,

Amandeep)

> CalendarInterval encoder support
> 
>
> Key: SPARK-31431
> URL: https://issues.apache.org/jira/browse/SPARK-31431
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 3.1.0
>Reporter: Kent Yao
>Priority: Major
>
> CalendarInterval can be converted to/from the internal Spark SQL 
> representation when it is a member of a Scala product type (e.g. tuples or 
> case classes), but not as a primitive type.
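
A hedged Scala sketch of the reported gap (assumes a local SparkSession; the 
exact encoder error text may vary by version):
{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.unsafe.types.CalendarInterval

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val interval = new CalendarInterval(1, 2, 3000000L) // months, days, microseconds

// Works: CalendarInterval as a field of a product type (a tuple).
Seq(("row1", interval)).toDF("id", "interval").printSchema()

// Does not compile: no implicit Encoder[CalendarInterval] for the bare type.
// Seq(interval).toDS()  // "Unable to find encoder for type ..."
{code}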



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-31431) CalendarInterval encoder support

2020-04-20 Thread Amandeep (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-31431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17087881#comment-17087881
 ] 

Amandeep commented on SPARK-31431:
--

Hi Team,

I would like to work on this improvement; please let me know how to proceed.

Thanks,

Amandeep

> CalendarInterval encoder support
> 
>
> Key: SPARK-31431
> URL: https://issues.apache.org/jira/browse/SPARK-31431
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 3.1.0
>Reporter: Kent Yao
>Priority: Major
>
> CalendarInterval can be converted to/from the internal Spark SQL 
> representation when it is a member of a Scala product type (e.g. tuples or 
> case classes), but not as a primitive type.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org