Hey Spark developers,

Is there a good reason for JsonRDD being a Scala object as opposed to a
class? Most other RDDs seem to be classes and can be extended.

The reason I'm asking is that there's a Hive interoperability problem with
JSON DataFrames: jsonFile generates a case-sensitive schema, while Hive
expects case-insensitive column names and fails with an exception during
saveAsTable if two columns have names that differ only in case.
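
For concreteness, here is roughly how it reproduces for me (a minimal
sketch against the Spark 1.3-style API in spark-shell; the JSON path and
table name are just made-up examples):

// in spark-shell, where sc is already defined
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// /tmp/dup.json holds a record whose field names differ only in case,
// e.g. {"value": 1, "Value": 2}
val df = hiveContext.jsonFile("/tmp/dup.json")

// jsonFile infers a case-sensitive schema containing both "value" and
// "Value"; Hive treats column names case-insensitively, so this throws
df.saveAsTable("dup_table")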

I'm trying to resolve the problem, but that requires extending JsonRDD,
which I can't do. Other RDDs are subclass-friendly; why is JsonRDD
different?
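
Just to illustrate what I mean, here's a toy example of the restriction
(made-up names, not the real JsonRDD API):

object CaseSensitiveInference {
  def normalize(field: String): String = field
}
// class MyInference extends CaseSensitiveInference  // won't compile: objects can't be subclassed

class CaseSensitiveInferenceBase {
  def normalize(field: String): String = field
}
class CaseInsensitiveInference extends CaseSensitiveInferenceBase {
  // the kind of override I'd want to make during schema inference
  override def normalize(field: String): String = field.toLowerCase
}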

Dan
