Hello,
I'm stuck adding Spark SQL to my standalone application.
The build.sbt is defined as:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
I got the following error when building the package:
[error] /data/workspace/test/src/main/scala/TestMain.scala:6: object
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.1"
That said, I would consider using spark-hive and HiveContext, as the
query parser is more powerful and you'll have access to window
functions and other features.
On Thu, Sep 17, 2015 at 10:59 AM, Cui Lin wrote:
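Pulling the suggestions in this thread together, a complete build.sbt might look like the sketch below. The project name and scalaVersion are assumptions, and the spark-hive artifact is only needed if you take the HiveContext route:

```scala
name := "test"

version := "0.1"

scalaVersion := "2.10.4"  // assumption: the stock 1.4.1 artifacts target Scala 2.10

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1",
  "org.apache.spark" %% "spark-sql"  % "1.4.1",
  "org.apache.spark" %% "spark-hive" % "1.4.1"  // optional: only needed for HiveContext
)
```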
You don't need to set anything up; it'll create a local Hive metastore by
default if you don't explicitly configure one.
On Thu, Sep 17, 2015 at 11:45 AM, Cui Lin wrote:
> Hi, Michael,
>
> It works for me! Thanks a lot!
> If I use spark-hive or HiveContext, do I have to set up Hive on a
> server? Can I run this on my local laptop?
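To illustrate both points, here is a minimal sketch, assuming the spark-hive dependency is on the classpath. Run with a local master it needs no Hive installation at all; a Derby-backed metastore_db directory simply appears in the working directory on first use. The data and column names are made up, and note that in the 1.4.x API the ranking function is spelled rowNumber (it was renamed row_number in later releases):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.rowNumber
import org.apache.spark.sql.hive.HiveContext

object LocalHiveDemo {
  def main(args: Array[String]): Unit = {
    // Local master: no cluster and no Hive server required.
    val sc = new SparkContext(
      new SparkConf().setAppName("local-hive-demo").setMaster("local[*]"))
    val sqlContext = new HiveContext(sc)  // creates ./metastore_db on first use
    import sqlContext.implicits._

    // Hypothetical data: keep the latest event per user via a window function.
    val events = sc.parallelize(Seq(("alice", 1), ("alice", 3), ("bob", 2)))
      .toDF("user", "ts")
    val w = Window.partitionBy("user").orderBy($"ts".desc)
    events.withColumn("rn", rowNumber().over(w))
      .filter($"rn" === 1)
      .show()
  }
}
```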
Hi, Michael,
It works for me! Thanks a lot!
If I use spark-hive or HiveContext, do I have to set up Hive on a server? Can
I run this on my local laptop?
On Thu, Sep 17, 2015 at 11:02 AM, Michael Armbrust wrote:
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.1"