Does it support OVER? I couldn't find it in the documentation:
http://spark.apache.org/docs/latest/sql-programming-guide.html#supported-hive-features
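For reference, the semantics the query below asks for (a per-row running distinct count, i.e. ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) can be sketched in plain Python, independent of Spark. This is only an illustration of the intended result, not Spark's implementation; the function name and sample data are made up:

```python
def running_distinct_count(rows):
    """For each (a, b) row ordered by a, count the distinct b values
    seen so far -- what count(distinct b) over an unbounded-preceding
    window frame would return."""
    seen = set()
    out = []
    for a, b in sorted(rows, key=lambda r: r[0]):
        seen.add(b)
        out.append((a, len(seen)))
    return out

rows = [(1, "x"), (2, "y"), (3, "x"), (4, "z")]
print(running_distinct_count(rows))  # [(1, 1), (2, 2), (3, 2), (4, 3)]
```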

Thanks
Best Regards

On Fri, Jan 22, 2016 at 2:31 PM, 汪洋 <tiandiwo...@icloud.com> wrote:

> I don't think this can be right.
>
> On Jan 22, 2016, at 4:53 PM, 汪洋 <tiandiwo...@icloud.com> wrote:
>
> Hi,
>
> Does Spark SQL support distinct count in the OVER clause?
>
> I ran a SQL query like this:
>
> select a, count(distinct b) over ( order by a rows between unbounded
> preceding and current row) from table limit 10
>
> Currently, it returns an error saying: expression 'a' is neither present in
> the group by, nor is it an aggregate function. Add to group by or wrap in
> first() if you don't care which value you get.;
>
> Yang
>
>
>
