Re: SQLContext and "stable identifier required"

2016-05-03 Thread Reynold Xin
Probably not. Want to submit a pull request?

On Tuesday, May 3, 2016, Koert Kuipers wrote:

> Yes, it works fine if I switch to using the implicits on the SparkSession
> (which is a val).
>
> But do we want to break the old way of doing the import?
>


Re: SQLContext and "stable identifier required"

2016-05-03 Thread Koert Kuipers
Sure, I can do that.

On Tue, May 3, 2016 at 1:21 PM, Reynold Xin wrote:

> Probably not. Want to submit a pull request?


Re: SQLContext and "stable identifier required"

2016-05-03 Thread Koert Kuipers
Yes, it works fine if I switch to using the implicits on the SparkSession
(which is a val).

But do we want to break the old way of doing the import?

On Tue, May 3, 2016 at 12:56 PM, Ted Yu wrote:

> Have you tried the following?
>
> scala> import spark.implicits._
> import spark.implicits._
>
> scala> spark
> res0: org.apache.spark.sql.SparkSession =
> org.apache.spark.sql.SparkSession@323d1fa2
>
> Cheers
>


Re: SQLContext and "stable identifier required"

2016-05-03 Thread Ted Yu
Have you tried the following?

scala> import spark.implicits._
import spark.implicits._

scala> spark
res0: org.apache.spark.sql.SparkSession =
org.apache.spark.sql.SparkSession@323d1fa2

Cheers

On Tue, May 3, 2016 at 9:16 AM, Koert Kuipers wrote:

> With the introduction of SparkSession, SQLContext changed from being a lazy
> val to a def.
> However, this is troublesome if you want to do:
>
> import someDataset.sqlContext.implicits._
>
> because it is no longer a stable identifier, I think? I get:
> stable identifier required, but someDataset.sqlContext.implicits found.
>
> Anyone else seen this?
>
>
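For readers hitting the same error: Scala only allows wildcard imports from a stable identifier, i.e. a path made entirely of vals and objects, because the compiler must be able to refer to one fixed instance. A def can return a different object on every call, so importing through it is rejected. A minimal sketch of the rule in plain Scala, no Spark required (all names here are hypothetical, invented for illustration):

```scala
object StableIdDemo {
  class Implicits {
    implicit val greeting: String = "ok" // some implicit we want in scope
  }

  // Mirrors the SparkSession case: member is a val, so the path is stable.
  class SessionWithVal { val implicits = new Implicits }

  // Mirrors the new SQLContext case: member is a def, so the path is not stable.
  class SessionWithDef { def implicits = new Implicits }

  val v = new SessionWithVal
  val d = new SessionWithDef

  // Compiles: every step of the path (StableIdDemo.v.implicits) is a val/object.
  import v.implicits._

  // Does not compile -- uncommenting yields:
  //   stable identifier required, but StableIdDemo.d.implicits found.
  // import d.implicits._

  // Workaround for the def case: bind the result to a val first, e.g.
  //   val ctx = d.implicits; import ctx._
  // The path `ctx` is then a stable identifier.

  def main(args: Array[String]): Unit =
    println(implicitly[String]) // resolves the imported implicit val greeting
}
```

The same val-binding workaround applies to the thread's original example: `val sqlContext = someDataset.sqlContext` restores a stable path for `import sqlContext.implicits._`.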