we solve this. I thought repartition would work, but it did not.
I tried to use rowsBetween or rangeBetween, but it raised this error:

pyspark.sql.utils.AnalysisException: u'Window Frame ROWS BETWEEN 1
PRECEDING AND 1 FOLLOWING must match the required frame ROWS BETWEEN
UNBOUNDED PRECEDING AND
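That error is expected: Spark evaluates rank() only over the growing frame it names in the message, so attaching a custom rowsBetween/rangeBetween frame to the window spec is rejected. The fix is to define the window with only partitionBy and orderBy (e.g. Window.partitionBy(...).orderBy(...)) and leave the frame at its default. As a pure-Python sketch of the semantics (this is not the Spark API; the "group" and "score" column names are made up for illustration), rank within a partition looks like this:

```python
from itertools import groupby
from operator import itemgetter

rows = [
    {"group": "a", "score": 30},
    {"group": "a", "score": 30},
    {"group": "a", "score": 10},
    {"group": "b", "score": 50},
    {"group": "b", "score": 40},
]

def rank_within_partitions(rows, part_key, order_key):
    """Mimic rank() OVER (PARTITION BY part_key ORDER BY order_key DESC)."""
    out = []
    # groupby needs the rows pre-sorted by the partition key
    rows = sorted(rows, key=itemgetter(part_key))
    for _, part in groupby(rows, key=itemgetter(part_key)):
        part = sorted(part, key=itemgetter(order_key), reverse=True)
        rank, prev = 0, object()
        for i, r in enumerate(part, start=1):
            # ties share a rank; the next distinct value jumps to its
            # row position (rank, as opposed to dense_rank)
            if r[order_key] != prev:
                rank, prev = i, r[order_key]
            out.append({**r, "rank": rank})
    return out

for r in rank_within_partitions(rows, "group", "score"):
    print(r)
```

Because each row's rank depends only on the rows at or before it in the ordering, the frame is necessarily UNBOUNDED PRECEDING up to the current row; that is why Spark refuses any other frame for rank().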
-- Forwarded message --
From: Dana Ram Meghwal <dana...@saavn.com>
Date: Thu, Feb 23, 2017 at 10:40 PM
Subject: Duplicate Rank for within same partitions
To: user-h...@spark.apache.org
Hey Guys,
I am new to Spark. I am trying to write a Spark script which involves
finding