Github user dilipbiswal commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22270#discussion_r215484920
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameFunctionsSuite.scala ---
    @@ -85,12 +85,12 @@ class DataFrameFunctionsSuite extends QueryTest with SharedSQLContext {
         }
     
         val df5 = Seq((Seq("a", null), Seq(1, 2))).toDF("k", "v")
    -    intercept[RuntimeException] {
    +    intercept[Exception] {
           df5.select(map_from_arrays($"k", $"v")).collect
    --- End diff ---
    
    @maropu We get a SparkException here, which in turn wraps a RuntimeException. When ConvertToLocalRelation is active, the RuntimeException is raised on the driver. When we disable that rule, the error is raised on the executor, and a SparkException is the top-level exception.
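
    For what it's worth, here is a minimal sketch of an assertion that stays agnostic to the wrapping. The `rootCause` helper and the exact error-message check are illustrative assumptions, not the code in this PR:

        // Assumes the surrounding test-suite context (SharedSQLContext,
        // spark.implicits._, org.apache.spark.sql.functions._).
        import org.apache.spark.SparkException

        val df5 = Seq((Seq("a", null), Seq(1, 2))).toDF("k", "v")
        val e = intercept[Exception] {
          df5.select(map_from_arrays($"k", $"v")).collect()
        }

        // Walk the cause chain to the innermost throwable, since the
        // RuntimeException may be wrapped in a SparkException when the
        // failure happens on an executor.
        def rootCause(t: Throwable): Throwable =
          if (t.getCause == null) t else rootCause(t.getCause)

        // Accept either surface type, then check the underlying cause.
        assert(e.isInstanceOf[RuntimeException] || e.isInstanceOf[SparkException])
        // The message fragment below is an assumed example, not necessarily
        // the exact text produced by map_from_arrays.
        assert(rootCause(e).getMessage.contains("null"))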


---
