[ https://issues.apache.org/jira/browse/SPARK-22308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16211291#comment-16211291 ]

Nathan Kronenfeld commented on SPARK-22308:
-------------------------------------------

That's a perfect example of what I'm talking about.

Take a look at 
https://github.com/holdenk/spark-testing-base/blob/master/src/main/2.0/scala/com/holdenkarau/spark/testing/SharedSparkContext.scala
 - it's essentially a copy of the SharedSparkContext in the Spark code base.

That copy isn't even necessary now, as SharedSparkContext already _is_ published.
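
For reference, using the published trait from a FunSuite looks something 
like this - a minimal sketch, assuming the trait is the one at 
org.apache.spark.SharedSparkContext and exposes the shared context as 
sc, as it does in the Spark code base (WordCountSuite is just an 
illustrative name):

    import org.apache.spark.SharedSparkContext
    import org.scalatest.FunSuite

    // Minimal sketch: mix the published trait into a FunSuite and use
    // the shared SparkContext it provides as `sc`.
    class WordCountSuite extends FunSuite with SharedSparkContext {
      test("counts words with the shared context") {
        val counts = sc.parallelize(Seq("a", "b", "a")).countByValue()
        assert(counts("a") == 2L)
      }
    }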

If you take a look at my associated PR, it barely touches 
SharedSparkContext at all - that one is already pretty much fine.  Even 
SharedSQLContext is mostly fine.

All the PR is doing is pulling SharedSQLContext apart into the part that 
needs to be a FunSuite and the part that just needs to be a Suite (the 
latter becoming SharedSessionContext), so that it can be used with other 
styles of tests.
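
Roughly, the Suite-level half ends up looking something like this - a 
hedged sketch, not the PR verbatim; the SharedSessionContext name 
follows the PR, but the body here is illustrative:

    import org.apache.spark.sql.SparkSession
    import org.scalatest.{BeforeAndAfterAll, Suite}

    // Sketch of the Suite-only half: it depends only on
    // org.scalatest.Suite, so it can be mixed into any test style.
    trait SharedSessionContext extends BeforeAndAfterAll { self: Suite =>
      @transient private var _spark: SparkSession = _
      protected def spark: SparkSession = _spark

      override def beforeAll(): Unit = {
        super.beforeAll()
        _spark = SparkSession.builder()
          .master("local[2]")
          .appName(suiteName)
          .getOrCreate()
      }

      override def afterAll(): Unit = {
        try {
          if (_spark != null) {
            _spark.stop()
            _spark = null
          }
        } finally {
          super.afterAll()
        }
      }
    }

The point of the split is that the lifecycle management above only needs 
the Suite self-type, not FunSuite.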

In other words, we are already publishing this stuff; I'm just trying to 
make it more usable.
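
For example, with the split in place, something like this should work 
with a WordSpec (again an illustrative sketch; TempViewSpec is a made-up 
name):

    import org.scalatest.WordSpec

    // With the shared-session logic in a plain Suite mixin, any
    // ScalaTest style works, not just FunSuite.
    class TempViewSpec extends WordSpec with SharedSessionContext {
      "a shared session" should {
        "run SQL against a temp view" in {
          spark.range(3).createOrReplaceTempView("t")
          assert(spark.sql("SELECT count(*) FROM t").head().getLong(0) == 3L)
        }
      }
    }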

> Support unit tests of spark code using ScalaTest using suites other than 
> FunSuite
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-22308
>                 URL: https://issues.apache.org/jira/browse/SPARK-22308
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, Spark Core, SQL, Tests
>    Affects Versions: 2.2.0
>            Reporter: Nathan Kronenfeld
>            Priority: Minor
>              Labels: scalatest, test-suite, test_issue
>
> External codebases that contain Spark code can test it using 
> SharedSparkContext no matter how they write their ScalaTest suites - 
> based on FunSuite, FunSpec, FlatSpec, or WordSpec.
> SharedSQLContext, however, only supports FunSuite.


