[GitHub] spark pull request #15676: [SPARK-18167] [SQL] Add debug code for SQLQuerySu...
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/15676

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.

---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Github user yhuai commented on a diff in the pull request: https://github.com/apache/spark/pull/15676#discussion_r85627218

--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala ---
@@ -585,7 +586,19 @@ private[client] class Shim_v0_13 extends Shim_v0_12 {
       getAllPartitionsMethod.invoke(hive, table).asInstanceOf[JSet[Partition]]
     } else {
       logDebug(s"Hive metastore filter is '$filter'.")
-      getPartitionsByFilterMethod.invoke(hive, table, filter).asInstanceOf[JArrayList[Partition]]
+      try {
+        getPartitionsByFilterMethod.invoke(hive, table, filter)
+          .asInstanceOf[JArrayList[Partition]]
+      } catch {
+        case e: InvocationTargetException =>
+          // SPARK-18167 retry to investigate the flaky test. This should be reverted before
+          // the release is cut.
+          val retry = Try(getPartitionsByFilterMethod.invoke(hive, table, filter))
+          val full = Try(getAllPartitionsMethod.invoke(hive, table))
+          logError("getPartitionsByFilter failed, retry success = " + retry.isSuccess)
+          logError("getPartitionsByFilter failed, full fetch success = " + full.isSuccess)

--- End diff ---

I am wondering if we should also log all of the partition specs?
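The debug pattern in the diff above wraps each fallback attempt in scala.util.Try and logs only whether it succeeded. Below is a minimal, self-contained sketch of that same pattern; the fetch functions here are hypothetical stand-ins for the reflective Hive calls (getPartitionsByFilterMethod.invoke and getAllPartitionsMethod.invoke), not the actual Spark code:

```scala
import scala.util.{Try, Success, Failure}

object RetryDebug {
  // Hypothetical stand-in for the filtered partition fetch: fails on the
  // first call to simulate the transient "Invalid character string format
  // for type DECIMAL" error, then succeeds on retry.
  private var firstCall = true
  def filteredFetch(): Seq[String] = {
    if (firstCall) {
      firstCall = false
      throw new RuntimeException("Invalid character string format for type DECIMAL")
    }
    Seq("part=1")
  }

  // Hypothetical stand-in for the unfiltered full fetch, which always succeeds.
  def fullFetch(): Seq[String] = Seq("part=1", "part=2")

  def main(args: Array[String]): Unit = {
    Try(filteredFetch()) match {
      case Success(parts) =>
        println(s"filtered fetch ok: $parts")
      case Failure(_) =>
        // Mirror the patch's debug logic: retry the filtered call and also
        // attempt a full fetch, logging only whether each attempt succeeded.
        val retry = Try(filteredFetch())
        val full = Try(fullFetch())
        println("getPartitionsByFilter failed, retry success = " + retry.isSuccess)
        println("getPartitionsByFilter failed, full fetch success = " + full.isSuccess)
    }
  }
}
```

Because Try captures the exception instead of propagating it, the retry and full-fetch outcomes can both be recorded even when the first call fails, which is what lets the log distinguish a transient Derby hiccup from genuinely corrupt partition data.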
GitHub user ericl opened a pull request: https://github.com/apache/spark/pull/15676

[SPARK-18167] [SQL] Add debug code for SQLQuerySuite flakiness when metastore partition pruning is enabled

## What changes were proposed in this pull request?

org.apache.spark.sql.hive.execution.SQLQuerySuite is flaky when Hive metastore partition pruning is enabled. Based on the stack traces, it appears to be an old issue where Hive fails to cast a numeric partition column ("Invalid character string format for type DECIMAL"). There are two possibilities: either we are somehow corrupting the partition table so that this column contains non-decimal values, or there is a transient issue with Derby. This PR logs the result of retrying the call when this exception is encountered, so we can confirm which is happening.

## How was this patch tested?

n/a

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ericl/spark spark-18167

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15676.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #15676

commit 0d62dbd96dddf65a263532269f534383ddc32456
Author: Eric Liang
Date: 2016-10-28T22:43:06Z

    Fri Oct 28 15:43:06 PDT 2016