You shouldn't need to set SPARK_HIVE=true unless you want to use the
JavaHiveContext.  You should be able to access
org.apache.spark.sql.api.java.JavaSQLContext with the default build.

How are you building your application?
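
If it's Maven, a missing spark-sql dependency would explain the "package
does not exist" error at compile time. A hedged sketch — the artifact name
is taken from the master branch, and the version shown is a placeholder
that must match whatever snapshot you built:

```xml
<!-- Sketch only: add the Spark SQL module next to spark-core.
     Version is a placeholder; use the one from your local build. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</dependency>
```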

Michael


On Thu, Apr 24, 2014 at 9:17 AM, Andrew Or <and...@databricks.com> wrote:

> Did you build it with SPARK_HIVE=true?
>
>
> On Thu, Apr 24, 2014 at 7:00 AM, diplomatic Guru <diplomaticg...@gmail.com
> > wrote:
>
>> Hi Matei,
>>
>> I checked out the git repository and built it. However, I'm still getting
>> the error below: it can't find the SQL packages. Please advise.
>>
>> package org.apache.spark.sql.api.java does not exist
>> [ERROR]
>> /home/VirtualBoxImages.com/Documents/projects/errCount/src/main/java/errorCount/TransDriverSQL.java:[49,8]
>> cannot find symbol
>> [ERROR] symbol  : class JavaSchemaRDD
>>
>> Kind regards,
>>
>> Raj.
>>
>>
>>
>> On 23 April 2014 22:09, Matei Zaharia <matei.zaha...@gmail.com> wrote:
>>
>>> It's currently in the master branch, on https://github.com/apache/spark.
>>> You can check that out from git, build it with sbt/sbt assembly, and then
>>> try it out. We're also going to post some release candidates soon that will
>>> be pre-built.
>>>
>>> Matei
>>>
>>> On Apr 23, 2014, at 1:30 PM, diplomatic Guru <diplomaticg...@gmail.com>
>>> wrote:
>>>
>>> > Hello Team,
>>> >
>>> > I'm new to Spark and just came across Spark SQL, which looks
>>> > interesting, but I'm not sure how to get it.
>>> >
>>> > I know it's an alpha version, but I'm not sure whether it's available
>>> > to the community yet.
>>> >
>>> > Many thanks.
>>> >
>>> > Raj.
>>>
>>>
>>
>
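
For reference, the checkout-and-build steps Matei describes above can be
sketched as follows (assumes a Unix shell; sbt fetches its own
dependencies on first run):

```shell
# Clone the master branch and build an assembly jar, as described above.
git clone https://github.com/apache/spark.git
cd spark
sbt/sbt assembly   # produces the assembled Spark jar under assembly/target
```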
