BTW, I forgot to mention that this was added through SPARK-11736, which went
into the upcoming 1.6.0 release.

FYI
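
For anyone on a pre-1.6 release, the function is reachable through the
DataFrame API rather than SQL. A minimal sketch, assuming a spark-shell
sqlContext and a registered table "t" (the column name "id" is just a
placeholder):

import org.apache.spark.sql.functions.monotonicallyIncreasingId

// Appends a unique, monotonically increasing (but not consecutive) id column.
val withId = sqlContext.table("t").withColumn("id", monotonicallyIncreasingId())
withId.show()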

On Mon, Dec 7, 2015 at 12:53 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> scala> val test = sqlContext.sql("select monotonically_increasing_id() from t").show
> +---+
> |_c0|
> +---+
> |  0|
> |  1|
> |  2|
> +---+
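>
> The generated ids are unique and monotonically increasing, but not
> necessarily consecutive. To give the column a readable name, alias it:
>
> scala> sqlContext.sql("select monotonically_increasing_id() as id from t").show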
>
> Cheers
>
> On Mon, Dec 7, 2015 at 12:48 PM, sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi Ted,
>>
>> It gave an exception; am I following the right approach?
>>
>> val test=sqlContext.sql("select *,  monotonicallyIncreasingId()  from kali")
>>
>>
>> On Mon, Dec 7, 2015 at 4:52 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Have you tried using monotonicallyIncreasingId?
>>>
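>>> If you need row_number() itself in SQL, window functions work through
>>> HiveContext; a rough sketch (column1 is a placeholder ordering column):
>>>
>>>   sqlContext.sql("select *, row_number() over (order by column1) as rn from kali")
>>>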
>>> Cheers
>>>
>>> On Mon, Dec 7, 2015 at 7:56 AM, Sri <kali.tumm...@gmail.com> wrote:
>>>
>>>> Thanks, I found the right function: current_timestamp().
>>>>
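>>>> e.g. (a quick sketch against the same test table):
>>>>
>>>>   sqlContext.sql("select column1, current_timestamp() as TIME_STAMP from TestTable limit 10").show
>>>>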
>>>> A different question: is there a row_number() function in Spark SQL? Not
>>>> in the DataFrame API, just Spark SQL?
>>>>
>>>>
>>>> Thanks
>>>> Sri
>>>>
>>>> Sent from my iPhone
>>>>
>>>> On 7 Dec 2015, at 15:49, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>
>>>> Does unix_timestamp() satisfy your needs?
>>>> See
>>>> sql/core/src/test/scala/org/apache/spark/sql/DateFunctionsSuite.scala
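>>>>
>>>> A quick sketch (assuming Spark 1.5+, where both functions exist; "t" is a
>>>> placeholder table):
>>>>
>>>>   sqlContext.sql("select unix_timestamp(), current_timestamp() from t").show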
>>>>
>>>> On Mon, Dec 7, 2015 at 6:54 AM, kali.tumm...@gmail.com <
>>>> kali.tumm...@gmail.com> wrote:
>>>>
>>>>> I found a way out.
>>>>>
>>>>> import java.text.SimpleDateFormat
>>>>> import java.util.Date
>>>>>
>>>>> // MM = zero-padded month, HH = 24-hour clock (hh would give 12-hour time).
>>>>> val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
>>>>>
>>>>> val testsql = sqlContext.sql(
>>>>>   "select column1, column2, column3, column4, column5, '%s' as TIME_STAMP from TestTable limit 10"
>>>>>     .format(format.format(new Date())))
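>>>>>
>>>>> (Note this bakes a single driver-side timestamp, taken once when the query
>>>>> string is built, into every row.)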
>>>>>
>>>>>
>>>>> Thanks
>>>>> Sri
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Thanks & Regards
>> Sri Tummala
>>
>>
>
