Hi,

In Spark 1.6.1 this worked:

scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
[08/08/2016 14:07:22.22]

Shouldn't Spark 2 give the same result, due to backward compatibility?

But I get

scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
<console>:24: error: not found: value sqlContext
       sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)
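
It looks like the Spark 2 shell no longer creates sqlContext automatically; the new entry point is the SparkSession, bound in the shell as spark, so the same query can be run as:

scala> spark.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)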

Alternatively, we can change it to a HiveContext and it works.

However, what is the best solution, if any? We have loads of sqlContext
references in our code.
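
One low-churn option, just a sketch assuming Spark 2.x where the SparkSession replaces both SQLContext and HiveContext, is to build the session once and alias its backward-compatibility SQLContext under the old name, so the existing sqlContext.sql(...) calls compile unchanged (the appName below is hypothetical):

import org.apache.spark.sql.SparkSession

// Build the Spark 2 entry point; enableHiveSupport covers what HiveContext did
val spark = SparkSession.builder()
  .appName("MyApp")       // hypothetical name, substitute your own
  .enableHiveSupport()
  .getOrCreate()

// spark.sqlContext is the backward-compatibility wrapper around the session,
// so the old sqlContext.sql(...) calls keep working as-is
val sqlContext = spark.sqlContext

sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").collect.foreach(println)

That might beat rewriting everything against HiveContext, which I understand is deprecated in 2.0.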

Thanks

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
