You don't need to set anything up; HiveContext will create a local Hive
metastore by default if you don't explicitly configure one.
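
For example, here is a minimal sketch of a standalone app that runs entirely
on a laptop, assuming the spark-hive 1.4.1 dependency is already in
build.sbt (the object name, app name, and query are just illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object LocalHiveTest {
  def main(args: Array[String]): Unit = {
    // Run entirely locally; no cluster or Hive server is needed.
    val conf = new SparkConf().setAppName("local-hive-test").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // The first use of HiveContext creates an embedded, Derby-backed
    // metastore (metastore_db/) in the current working directory.
    val sqlContext = new HiveContext(sc)

    sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    sqlContext.sql("SHOW TABLES").show()

    sc.stop()
  }
}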

On Thu, Sep 17, 2015 at 11:45 AM, Cui Lin <icecreamlc...@gmail.com> wrote:

> Hi, Michael,
>
> It works for me! Thanks a lot!
> If I use spark-hive or HiveContext, do I have to set up Hive on a server?
> Can I run this on my local laptop?
>
> On Thu, Sep 17, 2015 at 11:02 AM, Michael Armbrust <mich...@databricks.com
> > wrote:
>
>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.1"
>>
>> Though I would consider using spark-hive and HiveContext, as the query
>> parser is more powerful and you'll have access to window functions and
>> other features.
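>>
>> If you go that route, the sbt line has the same shape, just with the
>> spark-hive artifact (use whatever version matches your Spark build):
>>
>> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.4.1"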
>>
>>
>> On Thu, Sep 17, 2015 at 10:59 AM, Cui Lin <icecreamlc...@gmail.com>
>> wrote:
>>
>>> Hello,
>>>
>>> I got stuck adding Spark SQL to my standalone application.
>>> My build.sbt is defined as:
>>>
>>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
>>>
>>>
>>> I got the following error when building the package:
>>>
>>> [error] /data/workspace/test/src/main/scala/TestMain.scala:6: object sql
>>> is not a member of package org.apache.spark
>>> [error] import org.apache.spark.sql.SQLContext;
>>> [error]                         ^
>>> [error] /data/workspace/test/src/main/scala/TestMain.scala:19: object sql
>>> is not a member of package org.apache.spark
>>> [error]     val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>>> [error]                                           ^
>>> [error] two errors found
>>> [error] (compile:compile) Compilation failed
>>>
>>>
>>> So Spark SQL is not part of the spark-core package? I have no issue when
>>> testing my code in spark-shell. Thanks for the help!
>>>
>>>
>>>
>>> --
>>> Best regards!
>>>
>>> Lin,Cui
>>>
>>
>>
>
>
> --
> Best regards!
>
> Lin,Cui
>
