Hi,
Thanks for the quick replies. I've tried those suggestions but Eclipse is
showing:
Unable to find encoder for type stored in a Dataset. Primitive
types (Int, String, etc) and Product types (case classes) are supported by
importing sqlContext.implicits._ Support for serializing other types will
be added in future releases.
Hi,
I think encoders for case classes are already provided in spark. You'll
just need to import them.
val sql = new SQLContext(sc)
import sql.implicits._
And then do the cast to Dataset.
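A minimal sketch of that pattern (the `Table1` case class and its `fooBar` field are illustrative assumptions, not from a real schema). Note that the case class must be defined at top level, outside any method, or Spark cannot derive an encoder for it:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// Must be top-level so Spark can derive a Product encoder for it.
case class Table1(fooBar: String)

object Example {
  def run(sc: SparkContext): Unit = {
    val sql = new SQLContext(sc)
    // Brings implicit Encoders for primitives and case classes into scope.
    import sql.implicits._

    val ds = sql
      .sql("select foo_bar as `fooBar` from table1")
      .as[Table1] // DataFrame -> Dataset[Table1]
    ds.show()
  }
}
```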
2016-06-06 14:13 GMT+02:00 Dave Maughan :
> Hi,
>
> I've figured out how to select data from a remote ...
Hi,
I've figured out how to select data from a remote Hive instance and encode
the DataFrame -> Dataset using a Java POJO class:
TestHive.sql("select foo_bar as `fooBar` from table1")
  .as(Encoders.bean(classOf[Table1]))
  .show()
However, I'm struggling to find out how to do the equivalent in Scala,
i.e. using a case class instead of a Java POJO.
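For reference, a sketch of what the case-class equivalent of the bean-encoder call above could look like (the `Table1` case class and its field name are assumptions to mirror the POJO example):

```scala
import org.apache.spark.sql.hive.test.TestHive

// Case-class counterpart of the Java POJO; defined at top level
// so Spark can derive a Product encoder for it.
case class Table1(fooBar: String)

object CaseClassEquivalent {
  def main(args: Array[String]): Unit = {
    // TestHive is a SQLContext, so its implicits provide the encoder.
    import TestHive.implicits._

    TestHive.sql("select foo_bar as `fooBar` from table1")
      .as[Table1] // replaces .as(Encoders.bean(classOf[Table1]))
      .show()
  }
}
```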