Great, I got that to work following your example! Thanks.

A follow-up question: if I had a custom SQL type (UserDefinedType<T>),
how would I map values of that type from the RDD into the DataFrame?
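To make the question concrete, here is the shape of type I have in mind (a self-contained Java sketch; Point and PointUDT are made-up names, and SketchUDT merely mirrors the sqlType/serialize/deserialize/userClass contract of Spark's UserDefinedType<T> so the example compiles without Spark on the classpath):

```java
// Hypothetical sketch of the UserDefinedType<T> contract, written without
// the Spark dependency. Point and PointUDT are invented names; the real
// base class lives in org.apache.spark.sql.types.

final class Point {
    final double x;
    final double y;
    Point(double x, double y) { this.x = x; this.y = y; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return p.x == x && p.y == y;
    }
    @Override public int hashCode() {
        return Double.hashCode(x) * 31 + Double.hashCode(y);
    }
}

// Mirrors the shape of Spark's UserDefinedType<T>: sqlType() describes the
// underlying storage schema, while serialize/deserialize convert between
// the user class and that storage form.
abstract class SketchUDT<T> {
    abstract String sqlType();            // stand-in for Spark's DataType
    abstract Object serialize(T obj);     // user object -> storage form
    abstract T deserialize(Object datum); // storage form -> user object
    abstract Class<T> userClass();
}

final class PointUDT extends SketchUDT<Point> {
    @Override String sqlType() { return "array<double>"; }
    @Override Object serialize(Point p) { return new double[] { p.x, p.y }; }
    @Override Point deserialize(Object datum) {
        double[] a = (double[]) datum;
        return new Point(a[0], a[1]);
    }
    @Override Class<Point> userClass() { return Point.class; }
}
```

What I'm after is how Spark would apply such a serialize/deserialize pair when building the DataFrame from the RDD.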

Regards

On Mon, Jan 18, 2016 at 1:35 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> By SparkSQLContext, I assume you mean SQLContext.
> From the doc for SQLContext#createDataFrame():
>
>    *  dataFrame.registerTempTable("people")
>    *  sqlContext.sql("select name from people").collect.foreach(println)
>
> If you want to persist the table externally, you need Hive, etc.
>
> Regards
>
> On Mon, Jan 18, 2016 at 10:28 AM, Raghu Ganti <raghuki...@gmail.com>
> wrote:
>
>> This requires Hive to be installed and uses HiveContext, right?
>>
>> What is the SparkSQLContext useful for?
>>
>> On Mon, Jan 18, 2016 at 1:27 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Please take a look at
>>> sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveDataFrameSuite.scala
>>>
>>> On Mon, Jan 18, 2016 at 9:57 AM, raghukiran <raghuki...@gmail.com>
>>> wrote:
>>>
>>>> Is creating a table using the SparkSQLContext currently supported?
>>>>
>>>> Regards,
>>>> Raghu
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context:
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-create-table-tp25996.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>
>>>>
>>>
>>
>
