Hi,

We currently do not support distinct clauses in window functions. Nor is
such functionality planned.

Spark 2.0 uses native Spark UDAFs (instead of Hive window functions) and
allows you to plug in your own UDAFs, so implementing a distinct count/sum
yourself is straightforward in that case.
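To make the semantics concrete, here is a minimal plain-Python sketch (not
Spark code) of what the original query is asking for: a running distinct
count of column b, ordered by a, over ROWS BETWEEN UNBOUNDED PRECEDING AND
CURRENT ROW. The function name and sample data are hypothetical, purely for
illustration.

```python
def running_distinct_count(rows):
    """Running distinct count of b over an unbounded-preceding window.

    rows: list of (a, b) tuples, assumed already sorted by a.
    Returns a list of (a, distinct_count_so_far) tuples.
    """
    seen = set()
    out = []
    for a, b in rows:
        seen.add(b)           # track every b value seen so far
        out.append((a, len(seen)))
    return out

rows = [(1, "x"), (2, "y"), (3, "x"), (4, "z")]
print(running_distinct_count(rows))
# [(1, 1), (2, 2), (3, 2), (4, 3)]
```

A UDAF doing the same thing in Spark would carry the set of seen values as
its aggregation buffer, which is why a distinct aggregate over a window is
more expensive than a plain count.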

Kind regards,

Herman van Hövell

2016-01-27 13:25 GMT+01:00 Akhil Das <ak...@sigmoidanalytics.com>:

> Does it support the OVER clause? I couldn't find it in the documentation:
> http://spark.apache.org/docs/latest/sql-programming-guide.html#supported-hive-features
>
> Thanks
> Best Regards
>
> On Fri, Jan 22, 2016 at 2:31 PM, 汪洋 <tiandiwo...@icloud.com> wrote:
>
>> I don't think that can be right.
>>
>> On Jan 22, 2016, at 4:53 PM, 汪洋 <tiandiwo...@icloud.com> wrote:
>>
>> Hi,
>>
>> Do we support distinct count in the OVER clause in Spark SQL?
>>
>> I ran a SQL query like this:
>>
>> select a, count(distinct b) over ( order by a rows between unbounded
>> preceding and current row) from table limit 10
>>
>> Currently, it returns an error saying: expression 'a' is neither present
>> in the group by, nor is it an aggregate function. Add to group by or wrap
>> in first() if you don't care which value you get.;
>>
>> Yang
>>
>>
>>
>