[ https://issues.apache.org/jira/browse/SPARK-32978?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17201516#comment-17201516 ]
Chen Zhang commented on SPARK-32978:
------------------------------------

Hello, [~yumwang]. I ran this code on Spark with the default configuration and could not reproduce the issue:

{code:none}
number of written files: 50
written output: 55.4 KiB
number of output rows: 10,000
number of dynamic part: 50
{code}

> Incorrect number of dynamic part metric
> ---------------------------------------
>
>                 Key: SPARK-32978
>                 URL: https://issues.apache.org/jira/browse/SPARK-32978
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Priority: Major
>         Attachments: screenshot-1.png
>
> How to reproduce this issue:
> {code:sql}
> create table dynamic_partition(i bigint, part bigint) using parquet
> partitioned by (part);
> insert overwrite table dynamic_partition partition(part) select id, id % 50
> as part from range(10000);
> {code}
> The number of dynamic part should be 50, but it is 800.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
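For reference, the expected value of the metric follows from plain arithmetic: the repro writes `part = id % 50` over `range(10000)`, which yields exactly 50 distinct partition values, so a correct "number of dynamic part" metric must report 50. A minimal sketch of that check in plain Python (not Spark itself):

```python
# The repro query writes rows with part = id % 50 for id in [0, 10000).
# The number of distinct partition values is what a correct
# "number of dynamic part" metric should report.
distinct_parts = {i % 50 for i in range(10_000)}
print(len(distinct_parts))  # 50, matching the metric Chen Zhang observed
```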