[ https://issues.apache.org/jira/browse/SPARK-35168?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kent Yao reassigned SPARK-35168:
--------------------------------

    Assignee: Kent Yao

> mapred.reduce.tasks should map to spark.sql.shuffle.partitions, not 
> spark.sql.adaptive.coalescePartitions.initialPartitionNum
> ----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-35168
>                 URL: https://issues.apache.org/jira/browse/SPARK-35168
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.2, 3.1.1, 3.2.0
>            Reporter: Kent Yao
>            Assignee: Kent Yao
>            Priority: Minor
>
> {code:java}
> spark-sql> set spark.sql.adaptive.coalescePartitions.initialPartitionNum=1;
> spark.sql.adaptive.coalescePartitions.initialPartitionNum     1
> Time taken: 2.18 seconds, Fetched 1 row(s)
> spark-sql> set mapred.reduce.tasks;
> 21/04/21 14:27:11 WARN SetCommand: Property mapred.reduce.tasks is deprecated, showing spark.sql.shuffle.partitions instead.
> spark.sql.shuffle.partitions  1
> Time taken: 0.03 seconds, Fetched 1 row(s)
> spark-sql> set spark.sql.shuffle.partitions;
> spark.sql.shuffle.partitions  200
> Time taken: 0.024 seconds, Fetched 1 row(s)
> spark-sql> set mapred.reduce.tasks=2;
> 21/04/21 14:31:52 WARN SetCommand: Property mapred.reduce.tasks is deprecated, automatically converted to spark.sql.shuffle.partitions instead.
> spark.sql.shuffle.partitions  2
> Time taken: 0.017 seconds, Fetched 1 row(s)
> spark-sql> set mapred.reduce.tasks;
> 21/04/21 14:31:55 WARN SetCommand: Property mapred.reduce.tasks is deprecated, showing spark.sql.shuffle.partitions instead.
> spark.sql.shuffle.partitions  1
> Time taken: 0.017 seconds, Fetched 1 row(s)
> spark-sql>
> {code}
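>
> In the transcript above, the WARN claims to show spark.sql.shuffle.partitions, but the value printed (1) is the one set for spark.sql.adaptive.coalescePartitions.initialPartitionNum; querying spark.sql.shuffle.partitions directly still returns 200. Even after "set mapred.reduce.tasks=2" converts the key and reports 2, a follow-up read of the deprecated key falls back to 1 again. Below is a minimal, self-contained sketch of the suspected lookup path; the names DeprecatedKeyDemo, SQLConfLike, numShufflePartitions, buggyShow, and fixedShow are illustrative only, not the actual SetCommand internals:
> {code:scala}
> // Simplified model of resolving the deprecated key mapred.reduce.tasks.
> // All names here are hypothetical; only the config keys are real.
> object DeprecatedKeyDemo {
>   final case class SQLConfLike(settings: Map[String, String]) {
>     def get(key: String, default: String): String =
>       settings.getOrElse(key, default)
>
>     // The *effective* shuffle parallelism: with AQE coalescing, a set
>     // initialPartitionNum overrides spark.sql.shuffle.partitions.
>     def numShufflePartitions: Int =
>       get("spark.sql.adaptive.coalescePartitions.initialPartitionNum",
>           get("spark.sql.shuffle.partitions", "200")).toInt
>   }
>
>   // Behavior seen above: the deprecated key is answered with the
>   // effective value, which leaks initialPartitionNum.
>   def buggyShow(conf: SQLConfLike): (String, String) =
>     ("spark.sql.shuffle.partitions", conf.numShufflePartitions.toString)
>
>   // Expected behavior: report the raw spark.sql.shuffle.partitions setting.
>   def fixedShow(conf: SQLConfLike): (String, String) =
>     ("spark.sql.shuffle.partitions",
>      conf.get("spark.sql.shuffle.partitions", "200"))
>
>   def main(args: Array[String]): Unit = {
>     val conf = SQLConfLike(Map(
>       "spark.sql.adaptive.coalescePartitions.initialPartitionNum" -> "1"))
>     println(buggyShow(conf)) // (spark.sql.shuffle.partitions,1) <- the bug
>     println(fixedShow(conf)) // (spark.sql.shuffle.partitions,200)
>   }
> }
> {code}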


