[jira] [Assigned] (SPARK-32160) Executors should not be able to create SparkContext.

2020-07-08 Thread Hyukjin Kwon (Jira)


[ https://issues.apache.org/jira/browse/SPARK-32160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-32160:


Assignee: Takuya Ueshin

> Executors should not be able to create SparkContext.
> 
>
> Key: SPARK-32160
> URL: https://issues.apache.org/jira/browse/SPARK-32160
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 3.0.0
> Reporter: Takuya Ueshin
> Assignee: Takuya Ueshin
> Priority: Major
>
> Currently, executors can create a SparkContext, but they should not be able
> to.
> {code:scala}
> sc.range(0, 1).foreach { _ =>
>   new SparkContext(new SparkConf().setAppName("test").setMaster("local"))
> }
> {code}
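One way to enforce this restriction is to detect executor-side calls via `TaskContext.get()`, which returns the active task's context inside an executor task thread and `null` on the driver. The sketch below illustrates that idea only; it is an assumption about the shape of a fix, not the actual SPARK-32160 patch.

{code:scala}
import org.apache.spark.TaskContext

// Sketch (assumption, not the merged change): reject SparkContext creation
// anywhere an active TaskContext exists, i.e. inside a task on an executor.
def assertOnDriver(): Unit = {
  if (TaskContext.get() != null) {
    throw new IllegalStateException(
      "SparkContext should only be created and accessed on the driver.")
  }
}
{code}

Invoked at the start of SparkContext construction, a check like this would make the reproduction above fail fast with a clear error on the executor instead of silently spawning a nested context.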



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-32160) Executors should not be able to create SparkContext.

2020-07-02 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-32160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-32160:


Assignee: Apache Spark







[jira] [Assigned] (SPARK-32160) Executors should not be able to create SparkContext.

2020-07-02 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-32160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-32160:


Assignee: (was: Apache Spark)



