Hi Assaf,
No idea (and I don't remember ever wondering about it before), but why
not do this (untested):
trait MySparkTestTrait {
  lazy val spark: SparkSession =
    SparkSession.builder().getOrCreate() // <-- are you sure you don't need master?
  import spark.implicits._
}
Wouldn't that import work?
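For what it's worth, here is a minimal, Spark-free sketch of the scoping question (untested against Spark itself; all names below are made up for illustration). In Scala 2 an import inside a trait body is lexically scoped: it is visible inside that body, but it is not inherited by classes that mix the trait in, which may be why extending SQLImplicits (where the implicits are members) is attractive.

```scala
// Hypothetical stand-in for spark.implicits._ (names invented for this sketch).
object FakeImplicits {
  implicit class RichInt(val i: Int) extends AnyVal {
    def doubled: Int = i * 2
  }
}

trait HasLocalImport {
  import FakeImplicits._            // lexically scoped: visible only in this body
  def insideTrait: Int = 2.doubled  // compiles here, where the import is in scope
}

class Mixer extends HasLocalImport {
  // 3.doubled would NOT compile here on its own: imports are not inherited.
  def outside: Int = { import FakeImplicits._; 3.doubled }
}

object Demo extends App {
  val m = new Mixer
  println(m.insideTrait) // 4
  println(m.outside)     // 6
}
```

So the import would help inside the trait's own body, but a test class mixing the trait in would still need its own import — if I'm reading the scoping rules right.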
Pozdrawiam,
Jacek Laskowski
https://about.me/JacekLaskowski
Mastering Spark SQL https://bit.ly/mastering-spark-sql
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Kafka Streams https://bit.ly/mastering-kafka-streams
Follow me at https://twitter.com/jaceklaskowski
On Sun, Aug 5, 2018 at 5:34 PM, assaf.mendelson wrote:
> Hi all,
>
> I have been playing a bit with SQLImplicits and noticed that it is an
> abstract class. I was wondering why that is, since it has no constructor.
>
> Because it is an abstract class, a test trait cannot extend it and still
> remain a trait.
>
> Consider the following:
>
> trait MySparkTestTrait extends SQLImplicits {
>   lazy val spark: SparkSession = SparkSession.builder().getOrCreate()
>   protected override def _sqlContext: SQLContext = spark.sqlContext
> }
>
>
> This would mean I could do something like this:
>
> class MyTestClass extends FunSuite with MySparkTestTrait {
>   test("SomeTest") {
>     // use Spark implicits without needing to import spark.implicits._
>   }
> }
>
> Is there a reason for this being an abstract class?
>
>
>
> --
> Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>