You can get it from the SparkSession for backwards compatibility:
val sqlContext = spark.sqlContext
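A minimal sketch of that pattern, assuming a spark-shell 2.x session (where `spark: SparkSession` is predefined):

```scala
// In spark-shell 2.x, `spark` (a SparkSession) is created for you.
// The legacy SQLContext is still reachable through it:
val sqlContext = spark.sqlContext

// The 1.6-style query then works unchanged:
sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')")
  .collect
  .foreach(println)
```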
On Mon, Aug 8, 2016 at 9:11 AM, Mich Talebzadeh wrote:
> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') ").collect.foreach(println)
Hi,
Also, in shell you have sql function available without the object.
Jacek
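A sketch of what that looks like, assuming a spark-shell 2.x session (the shell runs `import spark.sql` at startup, so the function is in scope without any object prefix):

```scala
// No sqlContext.* or spark.* prefix needed inside spark-shell:
sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')").show()
```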
On 8 Aug 2016 6:11 a.m., "Mich Talebzadeh" wrote:
> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') ").collect.foreach(println)
What about the following:
val sqlContext = spark
?
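For what it's worth, that assignment does compile, and calls such as `.sql(...)` still work because `SparkSession` also exposes a `sql` method, but the resulting value is a `SparkSession`, not a `SQLContext`. A sketch of the distinction, assuming a spark-shell session (the variable names are illustrative only):

```scala
// Aliasing the session works for .sql(...) calls, but the
// type is SparkSession, not the legacy SQLContext:
val sessionAlias = spark            // type: SparkSession

// Code that genuinely needs a SQLContext should use the accessor:
val sqlContext = spark.sqlContext   // type: SQLContext
```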
On 8 Aug 2016 6:11 a.m., "Mich Talebzadeh" wrote:
> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') ").collect.foreach(println)
>
Hi,
In Spark 1.6.1 this worked
scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') ").collect.foreach(println)
[08/08/2016 14:07:22.22]
Spark 2 should give the same result due to backward compatibility?
But I get
scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') ").collect.foreach(println)