Re: Spark 2 and existing code with sqlContext

2016-08-12 Thread Koert Kuipers
You can get it from the SparkSession for backwards compatibility:
val sqlContext = spark.sqlContext
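A minimal sketch of that approach, assuming the default spark session that the Spark 2 shell provides:

// spark.sqlContext returns the session's SQLContext, so existing
// sqlContext-based code keeps working unchanged
val sqlContext = spark.sqlContext
sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')")
  .collect
  .foreach(println)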

On Mon, Aug 8, 2016 at 9:11 AM, Mich Talebzadeh wrote:

> Hi,
>
> In Spark 1.6.1 this worked:
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
> [08/08/2016 14:07:22.22]
>
> Spark 2 should give the same result due to backward compatibility, but I get:
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
> <console>:24: error: not found: value sqlContext
>        sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
>
> Now we can change it to HiveContext and it works.
>
> However, what is the best solution, if any, as we have loads of sqlContext
> in our code?
>
> Thanks
>
> Dr Mich Talebzadeh


Re: Spark 2 and existing code with sqlContext

2016-08-12 Thread Jacek Laskowski
Hi,

Also, in spark-shell you have the sql function available without the object.
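A quick sketch of that: spark-shell runs import spark.sql at startup, so sql(...) resolves without any prefix:

scala> sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)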

Jacek

On 8 Aug 2016 6:11 a.m., "Mich Talebzadeh" wrote:

> Hi,
>
> In Spark 1.6.1 this worked:
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
> [08/08/2016 14:07:22.22]
>
> Spark 2 should give the same result due to backward compatibility, but I get:
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
> <console>:24: error: not found: value sqlContext
>        sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
>
> Now we can change it to HiveContext and it works.
>
> However, what is the best solution, if any, as we have loads of sqlContext
> in our code?
>
> Thanks
>
> Dr Mich Talebzadeh


Re: Spark 2 and existing code with sqlContext

2016-08-12 Thread Jacek Laskowski
What about the following:

val sqlContext = spark

?
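That is only a rename, so the resulting sqlContext value is really a SparkSession, but plain sqlContext.sql(...) calls keep working because SparkSession defines the same sql method; a minimal sketch:

// sqlContext here is a SparkSession in disguise: only the methods the
// two classes share, such as sql(...), line up with the old code
val sqlContext = spark
sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").show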

On 8 Aug 2016 6:11 a.m., "Mich Talebzadeh" wrote:

> Hi,
>
> In Spark 1.6.1 this worked:
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
> [08/08/2016 14:07:22.22]
>
> Spark 2 should give the same result due to backward compatibility, but I get:
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
> <console>:24: error: not found: value sqlContext
>        sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
>
> Now we can change it to HiveContext and it works.
>
> However, what is the best solution, if any, as we have loads of sqlContext
> in our code?
>
> Thanks
>
> Dr Mich Talebzadeh


Spark 2 and existing code with sqlContext

2016-08-08 Thread Mich Talebzadeh
Hi,

In Spark 1.6.1 this worked:

scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
[08/08/2016 14:07:22.22]

Spark 2 should give the same result due to backward compatibility, but I get:

scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
<console>:24: error: not found: value sqlContext
       sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)

Now we can change it to HiveContext and it works.

However, what is the best solution, if any, as we have loads of sqlContext in
our code?
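For reference, a minimal sketch of that HiveContext workaround in the Spark 2 shell, assuming sc is the shell's SparkContext; the class is deprecated in Spark 2 but still available:

// Deprecated in Spark 2 in favour of SparkSession.builder.enableHiveSupport,
// but kept for backward compatibility with 1.x code
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)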

Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.