[ https://issues.apache.org/jira/browse/SPARK-5617?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin resolved SPARK-5617.
--------------------------------
       Resolution: Fixed
    Fix Version/s: 1.3.0
         Assignee: wangfei

> test failure of SQLQuerySuite
> -----------------------------
>
>                 Key: SPARK-5617
>                 URL: https://issues.apache.org/jira/browse/SPARK-5617
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: wangfei
>            Assignee: wangfei
>             Fix For: 1.3.0
>
>
> SQLQuerySuite test failure: 
> [info] - simple select (22 milliseconds)
> [info] - sorting (722 milliseconds)
> [info] - external sorting (728 milliseconds)
> [info] - limit (95 milliseconds)
> [info] - date row *** FAILED *** (35 milliseconds)
> [info]   Results do not match for query:
> [info]   'Limit 1
> [info]    'Project [CAST(2015-01-28, DateType) AS c0#3630]
> [info]     'UnresolvedRelation [testData], None
> [info]   
> [info]   == Analyzed Plan ==
> [info]   Limit 1
> [info]    Project [CAST(2015-01-28, DateType) AS c0#3630]
> [info]     LogicalRDD [key#0,value#1], MapPartitionsRDD[1] at mapPartitions at ExistingRDD.scala:35
> [info]   
> [info]   == Physical Plan ==
> [info]   Limit 1
> [info]    Project [16463 AS c0#3630]
> [info]     PhysicalRDD [key#0,value#1], MapPartitionsRDD[1] at mapPartitions at ExistingRDD.scala:35
> [info]   
> [info]   == Results ==
> [info]   !== Correct Answer - 1 ==   == Spark Answer - 1 ==
> [info]   ![2015-01-28]               [2015-01-27] (QueryTest.scala:77)
> [info]   org.scalatest.exceptions.TestFailedException:
> [info]   at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:495)
> [info]   at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
> [info]   at org.scalatest.Assertions$class.fail(Assertions.scala:1328)
> [info]   at org.scalatest.FunSuite.fail(FunSuite.scala:1555)
> [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:77)
> [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:95)
> [info]   at org.apache.spark.sql.SQLQuerySuite$$anonfun$23.apply$mcV$sp(SQLQuerySuite.scala:300)
> [info]   at org.apache.spark.sql.SQLQuerySuite$$anonfun$23.apply(SQLQuerySuite.scala:300)
> [info]   at org.apache.spark.sql.SQLQuerySuite$$anonfun$23.apply(SQLQuerySuite.scala:300)
> [info]   at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
> [info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
> [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
> [info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
> [info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
> [info]   at org.scalatest.FunSuite.withFixture(FunSuite.scala:1555)
> [info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
> [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
> [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
> [info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
> [info]   at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
> [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
> [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
> [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNode
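The physical plan above shows the cast already constant-folded to 16463, which is 2015-01-28 expressed as days since the Unix epoch (45 * 365 + 11 leap days + 27 = 16463), yet the answer read back out is 2015-01-27. An off-by-one like this typically comes from converting a timezone-local java.sql.Date to days since epoch with a plain millisecond division, which drops the JVM's default-timezone offset on machines ahead of UTC. The Scala sketch below only illustrates that failure mode under that assumption; DateRoundTrip, toDaysNaive and toDaysLocal are hypothetical names, not the Spark code under test or the actual 1.3.0 fix.

import java.sql.Date
import java.util.TimeZone
import java.util.concurrent.TimeUnit

// Minimal sketch of the suspected failure mode, NOT Spark's actual cast code:
// java.sql.Date.getTime is midnight in the JVM's default timezone, so dividing
// it directly by millis-per-day drops the local offset and, in a timezone ahead
// of UTC, lands one day early (16462 -> 2015-01-27 instead of 16463 -> 2015-01-28).
object DateRoundTrip {
  private val MillisPerDay: Long = TimeUnit.DAYS.toMillis(1)

  // Naive conversion: plain epoch-millis division, sensitive to the default timezone.
  def toDaysNaive(d: Date): Int = (d.getTime / MillisPerDay).toInt

  // Offset-aware conversion: shift to local millis first, so the local
  // calendar day is preserved regardless of where the machine runs.
  def toDaysLocal(d: Date): Int = {
    val localMillis = d.getTime + TimeZone.getDefault.getOffset(d.getTime)
    Math.floorDiv(localMillis, MillisPerDay).toInt
  }

  def main(args: Array[String]): Unit = {
    TimeZone.setDefault(TimeZone.getTimeZone("Asia/Shanghai")) // UTC+8, reproduces the report
    val d = Date.valueOf("2015-01-28")
    println(s"naive: ${toDaysNaive(d)}")  // 16462 -> reads back as 2015-01-27
    println(s"local: ${toDaysLocal(d)}")  // 16463 -> reads back as 2015-01-28
  }
}

Whatever the actual change shipped in 1.3.0 looks like, the test can only be stable if the date-to-days round trip is insensitive to the build machine's default timezone.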



