[jira] [Assigned] (SPARK-15618) Use SparkSession.builder.sparkContext(...) in tests where possible

2016-05-27  Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-15618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-15618:


Assignee: Apache Spark  (was: Dongjoon Hyun)

> Use SparkSession.builder.sparkContext(...) in tests where possible
> -------------------------------------------------------------------
>
> Key: SPARK-15618
> URL: https://issues.apache.org/jira/browse/SPARK-15618
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.0.0
>Reporter: Andrew Or
>Assignee: Apache Spark
>Priority: Minor
>
> There are many places in tests where we could be explicit about the particular
> underlying SparkContext we want, but we just call
> `SparkSession.builder.getOrCreate()` anyway. It is better to be explicit in the code.
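
For illustration, here is a minimal sketch (not code from the Spark test suites) of the two patterns the description contrasts. It assumes the snippet lives inside an org.apache.spark package, since SparkSession.Builder.sparkContext(...) is package-private, and the master/app-name values are placeholders.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    // The SparkContext the test actually wants to exercise (placeholder settings).
    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("test"))

    // Implicit: getOrCreate() silently reuses whatever session/context happens to be active.
    val implicitSession = SparkSession.builder().getOrCreate()

    // Explicit: the session is built on exactly the SparkContext created above.
    val explicitSession = SparkSession.builder().sparkContext(sc).getOrCreate()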



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-15618) Use SparkSession.builder.sparkContext(...) in tests where possible

2016-05-27  Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-15618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-15618:


Assignee: Dongjoon Hyun  (was: Apache Spark)

> Use SparkSession.builder.sparkContext(...) in tests where possible
> -------------------------------------------------------------------
>
> Key: SPARK-15618
> URL: https://issues.apache.org/jira/browse/SPARK-15618
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 2.0.0
>Reporter: Andrew Or
>Assignee: Dongjoon Hyun
>Priority: Minor
>
> There are many places in tests where we could be explicit about the particular
> underlying SparkContext we want, but we just call
> `SparkSession.builder.getOrCreate()` anyway. It is better to be explicit in the code.


