Re: Spark 2 and existing code with sqlContext

2016-08-12 Thread Koert Kuipers
You can get it from the SparkSession for backwards compatibility:

val sqlContext = spark.sqlContext

On Mon, Aug 8, 2016 at 9:11 AM, Mich Talebzadeh wrote:
> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(),
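A minimal sketch of this approach in the Spark 2 shell (assuming the default `spark` SparkSession is in scope; the query mirrors the one from the thread):

```scala
// The Spark 2.x shell already provides a SparkSession bound to `spark`.
// Legacy code written against a SQLContext can obtain one from it:
val sqlContext = spark.sqlContext

// Existing 1.6-style call sites then run unchanged:
sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')")
  .collect
  .foreach(println)
```

This keeps old code compiling without a rewrite; the returned `SQLContext` is backed by the same session, so temporary views and configuration are shared with `spark`.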

Re: Spark 2 and existing code with sqlContext

2016-08-12 Thread Jacek Laskowski
Hi,

Also, in the shell you have the sql function available without the object.

Jacek

On 8 Aug 2016 6:11 a.m., "Mich Talebzadeh" wrote:
> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
> HH:mm:ss.ss')
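This works because spark-shell performs `import spark.sql` at startup, so a bare `sql(...)` call resolves to the session's method. A sketch, assuming the shell's default session:

```scala
// spark-shell runs `import spark.sql` when it starts, so the
// function can be invoked without naming any object:
sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')")
  .collect
  .foreach(println)
```

Outside the shell (e.g. in a compiled application) there is no such implicit import, so you would call `spark.sql(...)` explicitly.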

Re: Spark 2 and existing code with sqlContext

2016-08-12 Thread Jacek Laskowski
What about the following:

val sqlContext = spark

?

On 8 Aug 2016 6:11 a.m., "Mich Talebzadeh" wrote:
> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
> HH:mm:ss.ss') ").collect.foreach(println)
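This alias can suffice for code that only calls `sqlContext.sql(...)`, because SparkSession exposes a `sql` method of the same shape as the old SQLContext one. A sketch, assuming the shell's `spark` session:

```scala
// SparkSession also has sql(...), so binding it under the old name
// keeps sql-only call sites compiling without other changes:
val sqlContext = spark
sqlContext.sql("SELECT current_timestamp()").show()
```

The alias has type SparkSession, not SQLContext, so code that touches SQLContext-specific members would still need `spark.sqlContext` as in the earlier reply.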

Spark 2 and existing code with sqlContext

2016-08-08 Thread Mich Talebzadeh
Hi,

In Spark 1.6.1 this worked:

scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss') ").collect.foreach(println)
[08/08/2016 14:07:22.22]

Spark 2 should give the same due to backward compatibility? But I get

scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(),