[jira] [Updated] (SPARK-40412) limit(x,y) + subquery causes data loss and wrong row order

2022-09-13 Thread Yuming Wang (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40412?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-40412:

Fix Version/s: (was: 2.4.5)

> limit(x,y) + subquery causes data loss and wrong row order
> 
>
> Key: SPARK-40412
> URL: https://issues.apache.org/jira/browse/SPARK-40412
> Project: Spark
>  Issue Type: Bug
>  Components: Shuffle
>Affects Versions: 2.4.5
> Environment: Hive on Spark
> Hive 3.1.0
> Spark 2.4.5
>Reporter: FengJia
>Priority: Major
>  Labels: hiveonspark, limit
>
> select *
> from (
>   select * from table
>   limit 10,20
> ) t
> The result has only 10 rows; they are not rows 11 through 20, and the row order is also wrong.
>  
> select * from table
> limit 10,20
> The result has 20 rows, in order from row 11 to row 30.
> select *
> from (
>   select * from table
>   order by id
>   limit 10,20
> ) t
> The result has 20 rows, and the order is likewise rows 11 to 30.
>  
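In a distributed engine, limit x,y over a table with no order by has no defined row order, so the offset selects an arbitrary subset of rows; this is consistent with the third query returning the expected window once order by id imposes a total order. Below is a minimal sketch of an alternative deterministic-pagination workaround using a window function, assuming a sortable, unique column id as in the third query above; the alias t and the bound values are illustrative, not part of the original report.

select *
from (
  -- row_number() gives each row a stable position under the explicit
  -- total order, so the rows-11-to-30 window is well-defined no matter
  -- how rows are distributed across partitions
  select *, row_number() over (order by id) as rn
  from table
) t
where rn > 10 and rn <= 30;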



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-40412) limit(x,y) + subquery causes data loss and wrong row order

2022-09-13 Thread Yuming Wang (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-40412?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-40412:

Target Version/s:   (was: 2.4.5)
