[ 
https://issues.apache.org/jira/browse/SPARK-15491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15297385#comment-15297385
 ] 

Marc Prud'hommeaux edited comment on SPARK-15491 at 5/24/16 1:00 PM:
---------------------------------------------------------------------

Adding some debug statements shows the following just above the assertion failure:

{code}
fieldNames(5): List(url, table, parts, properties, sparkSession)

fieldValues(4): Stream(jdbc:postgresql://REDACTED, categories, [Lorg.apache.spark.Partition;@657b3b, {user=REDACTED, password=REDACTED, url=jdbc:postgresql://REDACTED, dbtable=categories})
{code}

It looks like "sparkSession" is the field name with no corresponding value: there are 5 field names but only 4 field values.

Also, commenting out the assertion at TreeNode.scala:598 seems to work: JSON is output and no assertion error is thrown.
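For what it's worth, the 5-names-vs-4-values mismatch would be consistent with the relation class taking sparkSession in a second (curried) parameter list: productIterator only covers the first parameter list, while constructor reflection sees every parameter. A minimal sketch of that behavior (hypothetical {{Rel}} class, not the actual Spark code):

```scala
// Hypothetical sketch: a case class with a second parameter list exposes
// fewer product elements than its primary constructor has parameters,
// which is exactly the kind of arity mismatch the assertion checks for.
case class Rel(url: String, table: String)(session: String)

object AritySketch extends App {
  val r = Rel("jdbc:postgresql://host/db", "categories")("fakeSession")

  // productArity/productIterator only cover the first parameter list...
  println(r.productArity)                                      // 2
  // ...but the primary constructor carries all three parameters
  println(classOf[Rel].getConstructors.head.getParameterCount) // 3
}
```

If that is what's happening here, zipping the reflected parameter names against the product values would come up one short, tripping the assertion.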



> JSON serialization fails for JDBC DataFrames
> --------------------------------------------
>
>                 Key: SPARK-15491
>                 URL: https://issues.apache.org/jira/browse/SPARK-15491
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>         Environment: MacOS 10.11.5, Spark 2.0.0-preview
>            Reporter: Marc Prud'hommeaux
>
> The TreeNode.toJSON feature implemented in SPARK-12321 fails with an 
> assertion error on DataFrames that use JDBC in Spark 2.0.0-preview:
> {code}
> scala> sqlContext.read.json("examples/src/main/resources/people.json").select("name", "age").agg(avg("age"), count("name")).filter(avg("age") > 10).queryExecution.logical.toJSON
> res113: String = [{"class":"org.apache.spark.sql.catalyst.plans.logical.Filter","num-children":1,"condition":[{"class":"org.apache.spark.sql.catalyst.expressions.GreaterThan","num-children":2,"left":0,"right":1},{"class":"org.apache.spark.sql.catalyst.expressions.aggregate.AggregateExpression","num-children":1,...
> scala> sqlContext.read.format("jdbc").options(db + ("dbtable" -> "categories")).load().queryExecution.logical.simpleString
> res120: String = Relation[category#2148,categoryname#2149] JDBCRelation(categories)
> scala> sqlContext.read.format("jdbc").options(db + ("dbtable" -> "categories")).load().queryExecution.logical.toJSON
> java.lang.AssertionError: assertion failed
>   at scala.Predef$.assert(Predef.scala:156)
>   at org.apache.spark.sql.catalyst.trees.TreeNode.org$apache$spark$sql$catalyst$trees$TreeNode$$parseToJson(TreeNode.scala:598)
>   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$jsonFields$2.apply(TreeNode.scala:562)
>   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$jsonFields$2.apply(TreeNode.scala:553)
>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>   at scala.collection.immutable.List.foreach(List.scala:381)
>   at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>   at scala.collection.immutable.List.map(List.scala:285)
>   at org.apache.spark.sql.catalyst.trees.TreeNode.jsonFields(TreeNode.scala:553)
>   at org.apache.spark.sql.catalyst.trees.TreeNode.org$apache$spark$sql$catalyst$trees$TreeNode$$collectJsonValue$1(TreeNode.scala:538)
>   at org.apache.spark.sql.catalyst.trees.TreeNode.jsonValue(TreeNode.scala:543)
>   at org.apache.spark.sql.catalyst.trees.TreeNode.toJSON(TreeNode.scala:529)
>   ... 48 elided
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
