I have set spark.sql.shuffle.partitions=1000, but it is still failing.
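
For reference, a minimal sketch of how the setting can be applied programmatically (this assumes a Spark 1.x SQLContext/HiveContext named sqlContext; in the spark-sql shell the same thing can be done with SET spark.sql.shuffle.partitions=1000;):

  // Sketch: raise the number of shuffle partitions before running the query.
  sqlContext.setConf("spark.sql.shuffle.partitions", "1000")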


On Tue, Aug 25, 2015 at 11:36 AM, Raghavendra Pandey <
raghavendra.pan...@gmail.com> wrote:

> Did you try increasing the SQL shuffle partitions?
>
> On Tue, Aug 25, 2015 at 11:06 AM, kundan kumar <iitr.kun...@gmail.com>
> wrote:
>
>> I am running this query on 4 billion rows and getting an
>> org.apache.spark.shuffle.FetchFailedException error.
>>
>> SELECT adid, position, userid, price
>> FROM (
>>   SELECT adid, position, userid, price,
>>          dense_rank() OVER (PARTITION BY adlocationid ORDER BY price DESC) AS rank
>>   FROM trainInfo) AS tmp
>> WHERE rank <= 2
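>>
>> For comparison, a rough DataFrame-API equivalent of the query (a sketch only;
>> it assumes Spark 1.6+ where dense_rank is exposed in
>> org.apache.spark.sql.functions, and a SQLContext named sqlContext with
>> trainInfo registered as a table):
>>
>>   import org.apache.spark.sql.expressions.Window
>>   import org.apache.spark.sql.functions.{col, dense_rank}
>>
>>   // Rank rows within each adlocationid by descending price, keep the top 2.
>>   val w = Window.partitionBy("adlocationid").orderBy(col("price").desc)
>>   val top2 = sqlContext.table("trainInfo")
>>     .withColumn("rank", dense_rank().over(w))
>>     .where(col("rank") <= 2)
>>     .select("adid", "position", "userid", "price")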
>>
>>
>> I have attached the error logs from spark-sql terminal.
>>
>> Please suggest what might be causing this kind of error and how I can
>> resolve it.
>>
>>
>> Regards,
>> Kundan
>>
>>
>
>
