The implicit conversion function mentioned by Hao is createSchemaRDD in
SQLContext/HiveContext.

You can import it like this:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// Or new org.apache.spark.sql.hive.HiveContext(sc) for HiveContext
import sqlContext.createSchemaRDD
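
For example, here is a minimal sketch against the Spark 1.1 API (the case
class Record and its fields are placeholders, not from the original thread):

// Hypothetical case class with primitive fields and a Map, as in Kevin's
// question; any RDD of a Product type works with the implicit conversion.
case class Record(id: Int, name: String, props: Map[String, String])

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD[Product] => SchemaRDD

val rdd = sc.parallelize(Seq(
  Record(1, "a", Map("k" -> "v")),
  Record(2, "b", Map.empty[String, String])
))

// The implicit conversion applies wherever a SchemaRDD is expected,
// e.g. when registering the RDD as a temporary table:
rdd.registerTempTable("records")
sqlContext.sql("SELECT name FROM records WHERE id = 1").collect()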



On Wed, Oct 22, 2014 at 8:03 AM, Cheng, Hao <hao.ch...@intel.com> wrote:

>  You needn’t do anything; the implicit conversion should do this for you.
>
>
>
>
> https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala#L103
>
>
> https://github.com/apache/spark/blob/2ac40da3f9fa6d45a59bb45b41606f1931ac5e81/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala#L35
>
>
>
> Just be sure you import the right implicit conversion function.
>
>
>
> *From:* Dai, Kevin [mailto:yun...@ebay.com]
> *Sent:* Wednesday, October 22, 2014 4:17 PM
> *To:* user@spark.apache.org
> *Subject:* SchemaRDD Convert
>
>
>
> Hi, all,
>
>
>
> I have an RDD of a case class T, which contains several primitive types
> and a Map.
>
> How can I convert this to a SchemaRDD?
>
>
>
> Best Regards,
>
> Kevin.
>
