
Thank you! Silly me... this is the one thing I missed in the
tutorial... orz

Thanks,
Tim

2014-12-04 16:49 GMT-06:00 Michael Armbrust <mich...@databricks.com>:

> You need to import sqlContext._
>
> On Thu, Dec 4, 2014 at 2:26 PM, Tim Chou <timchou....@gmail.com> wrote:
>
>> I have been trying to use the where and filter functions on a SchemaRDD.
>>
>> I have built a case class for the tuples/records in the table, like this:
>> case class Region(num:Int, str1:String, str2:String)
>>
>> I also successfully created a SchemaRDD.
>>
>> scala> val results = sqlContext.sql("select * from region")
>> results: org.apache.spark.sql.SchemaRDD =
>> SchemaRDD[99] at RDD at SchemaRDD.scala:103
>> == Query Plan ==
>> == Physical Plan ==
>> ExistingRdd [num#0,str1#1,str2#2], MapPartitionsRDD[4] at mapPartitions
>> at BasicOperators.scala:208
>>
>> But I cannot use a symbol in the where and filter functions. Here is the error:
>>
>> scala> results.where('num === 1)
>> <console>:22: error: value === is not a member of Symbol
>>               results.where('num === 1)
>>                                   ^
>>
>> I don't know why.
>> Any suggestions?
>>
>> Thanks,
>> Tim
>>
>
>
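
For anyone else who hits this error, here is a minimal sketch of the fix Michael
points to, assuming a Spark 1.x shell (SchemaRDD API) where `sc` is already
defined; the region.txt file name, the comma-separated layout, and the sample
query are illustrative only:

    // Minimal sketch, assuming Spark 1.x in spark-shell where `sc` exists.
    // The input file name and column layout are made up for illustration.
    import org.apache.spark.sql.SQLContext

    case class Region(num: Int, str1: String, str2: String)

    val sqlContext = new SQLContext(sc)
    // This import brings in the Spark SQL DSL implicits, including the one
    // that lets a Symbol like 'num support === and other column operators.
    import sqlContext._

    val regions = sc.textFile("region.txt")              // hypothetical input
      .map(_.split(","))
      .map(p => Region(p(0).trim.toInt, p(1), p(2)))
    regions.registerTempTable("region")                  // implicit RDD -> SchemaRDD

    val results = sqlContext.sql("SELECT * FROM region")

    // Without `import sqlContext._` the next line fails with
    // "value === is not a member of Symbol".
    results.where('num === 1).collect().foreach(println)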
