[ https://issues.apache.org/jira/browse/SPARK-23304?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16350423#comment-16350423 ]
Thomas Graves commented on SPARK-23304:
---------------------------------------

It doesn't look like sql("xyz").rdd.partitions.length comes back correct in either Spark 2.2 or 2.3. But if I change the query from SELECT COUNT(DISTINCT(bcookie)) to just SELECT bcookie, then partitions.length works, so perhaps it is something with the count.

Spark 2.3, SELECT COUNT(DISTINCT(bcookie)):

scala> query.rdd.partitions.length
res4: Int = 1

scala> query.count()
[Stage 5:===============================================> (15420 + 619) / 16039]

Spark 2.2, SELECT COUNT(DISTINCT(bcookie)):

scala> query.rdd.partitions.length
res0: Int = 1

scala> query.count()
[Stage 0:==========> (1136 + 1600) / 5346]

Spark 2.2, query with just SELECT bcookie:

scala> query.rdd.partitions.length
res1: Int = 5346

Spark 2.3, query with just SELECT bcookie:

scala> query.rdd.partitions.length
res9: Int = 16039

Note that if I change it to just SELECT DISTINCT(bcookie), then I get 200:

scala> query.rdd.partitions.length
res10: Int = 200

> Spark SQL coalesce() against hive not working
> ---------------------------------------------
>
>                 Key: SPARK-23304
>                 URL: https://issues.apache.org/jira/browse/SPARK-23304
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.1, 2.3.0
>            Reporter: Thomas Graves
>            Assignee: Xiao Li
>            Priority: Major
>         Attachments: spark22_oldorc_explain.txt, spark23_oldorc_explain.txt, spark23_oldorc_explain_convermetastoreorcfalse.txt
>
> The query below seems to ignore the coalesce(). This is running Spark 2.2 or Spark 2.3 against Hive, reading ORC:
>
> Query:
> spark.sql("SELECT COUNT(DISTINCT(something)) FROM sometable WHERE dt >=
> '20170301' AND dt <= '20170331' AND something IS NOT
> NULL").coalesce(160000).show()
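For context on the numbers above, here is a minimal sketch of one plausible reading of them. It assumes a Hive-enabled SparkSession and reuses the table/column names from the report (sometable, bcookie, dt); the object name and session setup are hypothetical and not part of the original reproduction:

import org.apache.spark.sql.SparkSession

// Sketch only: hypothetical driver, assuming the Hive table from the report.
object Spark23304Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("SPARK-23304 partition-count sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Plain projection: partitions.length mirrors the input splits
    // (5346 on Spark 2.2 and 16039 on Spark 2.3 in the transcripts above).
    val scan = spark.sql("SELECT bcookie FROM sometable WHERE dt = '20170301'")
    println(s"scan: ${scan.rdd.partitions.length} partitions")

    // DISTINCT ends in a shuffle, so the default value of
    // spark.sql.shuffle.partitions (200) shows up, matching res10 above.
    val distinct = spark.sql(
      "SELECT DISTINCT(bcookie) FROM sometable WHERE dt = '20170301'")
    println(s"distinct: ${distinct.rdd.partitions.length} partitions")

    // A global aggregate collapses to a single-partition final stage, so
    // partitions.length reports 1 even though the scan stage is wide.
    val agg = spark.sql(
      "SELECT COUNT(DISTINCT(bcookie)) FROM sometable WHERE dt = '20170301'")
    println(s"aggregate: ${agg.rdd.partitions.length} partition(s)")

    // coalesce() can only reduce the partition count; on a 1-partition
    // result, coalesce(160000) is a no-op and still prints 1.
    println(s"after coalesce: ${agg.coalesce(160000).rdd.partitions.length}")

    spark.stop()
  }
}

On this reading, partitions.length describes the final stage of the physical plan rather than the scan, and coalesce() can only lower that count, which would explain why coalesce(160000) on a single-partition aggregate result appears to be ignored.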