toDF() works very well -- thanks

On Sun, Mar 15, 2015 at 6:12 AM, David Mitchell <jdavidmitch...@gmail.com>
wrote:

>
> Thank you for your help.  "toDF()" solved my first problem.  And, the
> second issue was a non-issue, since the second example worked without any
> modification.
>
> David
>
>
> On Sun, Mar 15, 2015 at 1:37 AM, Rishi Yadav <ri...@infoobjects.com>
> wrote:
>
>> Programmatically specifying the schema needs
>>
>>  import org.apache.spark.sql.types._
>>
>> (note "types", plural) for StructType and StructField to resolve.
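>> A minimal spark-shell sketch of that import in use (assuming the
>> sqlContext and schemaString names from the 1.3.0 programming guide):

```scala
// Building a schema programmatically (Spark 1.3 API).
// Assumes an existing SQLContext named sqlContext, as in spark-shell.
import org.apache.spark.sql.types.{StructType, StructField, StringType}

// A space-separated list of column names, as in the guide's example.
val schemaString = "name age"

// Map each name to a nullable StringType field and wrap in a StructType.
val schema = StructType(
  schemaString.split(" ").map(fieldName =>
    StructField(fieldName, StringType, nullable = true)))
```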
>>
>> On Sat, Mar 14, 2015 at 10:07 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>>> Yes I think this was already just fixed by:
>>>
>>> https://github.com/apache/spark/pull/4977
>>>
>>> a ".toDF()" is missing
>>>
>>> On Sat, Mar 14, 2015 at 4:16 PM, Nick Pentreath
>>> <nick.pentre...@gmail.com> wrote:
>>> > I've found that people.toDF gives you a DataFrame (roughly
>>> > equivalent to the previous Row RDD), and you can then call
>>> > registerTempTable on that DataFrame.
>>> >
>>> > So people.toDF.registerTempTable("people") should work.
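>>> > For reference, a sketch of the full reflection example on Spark 1.3
>>> > (assuming the sc, sqlContext, and people.txt setup from the guide):

```scala
// Inferring the schema using reflection (Spark 1.3 API).
// Assumes a SparkContext sc and a SQLContext sqlContext, as in spark-shell.
import sqlContext.implicits._  // brings toDF() into scope for RDDs of case classes

case class Person(name: String, age: Int)

val people = sc.textFile("examples/src/main/resources/people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// In 1.3 an RDD of case classes is no longer implicitly a table;
// convert it to a DataFrame explicitly before registering it.
people.toDF().registerTempTable("people")

val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")
```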
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > On Sat, Mar 14, 2015 at 5:33 PM, David Mitchell <
>>> jdavidmitch...@gmail.com>
>>> > wrote:
>>> >>
>>> >>
>>> >> I am pleased with the release of the DataFrame API.  However, I
>>> >> started playing with it, and neither of the two main examples in
>>> >> the documentation work:
>>> >> http://spark.apache.org/docs/1.3.0/sql-programming-guide.html
>>> >>
>>> >> Specifically:
>>> >>
>>> >> Inferring the Schema Using Reflection
>>> >> Programmatically Specifying the Schema
>>> >>
>>> >>
>>> >> Scala 2.11.6
>>> >> Spark 1.3.0 prebuilt for Hadoop 2.4 and later
>>> >>
>>> >> Inferring the Schema Using Reflection
>>> >> scala>     people.registerTempTable("people")
>>> >> <console>:31: error: value registerTempTable is not a member of
>>> >> org.apache.spark.rdd.RDD[Person]
>>> >>                   people.registerTempTable("people")
>>> >>                          ^
>>> >>
>>> >> Programmatically Specifying the Schema
>>> >> scala> val peopleDataFrame = sqlContext.createDataFrame(people, schema)
>>> >> <console>:41: error: overloaded method value createDataFrame with
>>> >> alternatives:
>>> >>   (rdd: org.apache.spark.api.java.JavaRDD[_],beanClass:
>>> >>     Class[_])org.apache.spark.sql.DataFrame <and>
>>> >>   (rdd: org.apache.spark.rdd.RDD[_],beanClass:
>>> >>     Class[_])org.apache.spark.sql.DataFrame <and>
>>> >>   (rowRDD: org.apache.spark.api.java.JavaRDD[org.apache.spark.sql.Row],
>>> >>     columns: java.util.List[String])org.apache.spark.sql.DataFrame <and>
>>> >>   (rowRDD: org.apache.spark.api.java.JavaRDD[org.apache.spark.sql.Row],
>>> >>     schema: org.apache.spark.sql.types.StructType)org.apache.spark.sql.DataFrame <and>
>>> >>   (rowRDD: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row],
>>> >>     schema: org.apache.spark.sql.types.StructType)org.apache.spark.sql.DataFrame
>>> >>  cannot be applied to (org.apache.spark.rdd.RDD[String],
>>> >>     org.apache.spark.sql.types.StructType)
>>> >>        val df = sqlContext.createDataFrame(people, schema)
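>>> >> [The overload error above is because people is an RDD[String];
>>> >> createDataFrame(rdd, schema) wants an RDD[Row]. A sketch of the
>>> >> conversion, assuming the people and schema values from the guide:]

```scala
// Converting an RDD[String] of CSV-like lines into an RDD[Row]
// so that createDataFrame(rowRDD, schema) applies (Spark 1.3 API).
import org.apache.spark.sql.Row

// Split each line on commas and wrap the fields in a Row.
val rowRDD = people.map(_.split(",")).map(p => Row(p(0), p(1).trim))

// This overload matches: (RDD[Row], StructType) => DataFrame.
val peopleDataFrame = sqlContext.createDataFrame(rowRDD, schema)
```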
>>> >>
>>> >> Any help would be appreciated.
>>> >>
>>> >> David
>>> >>
>>> >
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
>
>
