I could not import org.apache.spark.LocalSparkContext correctly.

I use sbt in IntelliJ for development; here is my build.sbt.

```
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.0"

libraryDependencies += "com.clearspring.analytics" % "stream" % "2.7.0"

libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```
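
One thing I am not sure about is whether I also need to depend on Spark's published test artifacts to get LocalSparkContext onto my test classpath. Assuming they are published for 1.2.0 under the usual `tests` classifier (I have not verified this), I imagine it would look roughly like:

```
// Hypothetical: pulls in spark-core's test classes (which declare LocalSparkContext),
// assuming a "tests"-classified artifact is actually published for this version.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "test" classifier "tests"
```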

I think I may have made some mistakes in the library settings. As a new Spark
application developer, I also wonder what the standard procedure is for
developing and testing a Spark application.

Any reply is appreciated.


Alcaid


2015-01-21 2:05 GMT+08:00 Will Benton <wi...@redhat.com>:

> It's declared here:
>
>
> https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/LocalSparkContext.scala
>
> I assume you're already importing LocalSparkContext, but since the test
> classes aren't included in Spark packages, you'll also need to package them
> up in order to use them in your application (viz., outside of Spark).
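>
> If repackaging the test classes is inconvenient, another option is to write a
> small stand-in yourself. The sketch below is illustrative only (it is not
> Spark's actual LocalSparkContext, and the names are mine): a ScalaTest mixin
> that stops the SparkContext after each test.
>
> ```
> import org.apache.spark.SparkContext
> import org.scalatest.{BeforeAndAfterEach, Suite}
>
> // Illustrative stand-in for the test-only LocalSparkContext trait:
> // each test creates its own local SparkContext, and this mixin cleans it up.
> trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>
>   @transient var sc: SparkContext = _
>
>   override def afterEach() {
>     if (sc != null) {
>       sc.stop()
>       sc = null
>     }
>     // clear the driver port so the next context can start cleanly
>     System.clearProperty("spark.driver.port")
>     super.afterEach()
>   }
> }
> ```
>
> A suite mixing this in would then create the context itself, e.g.
> `sc = new SparkContext("local", "test")` at the start of each test.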
>
>
>
> best,
> wb
>
> ----- Original Message -----
> > From: "James" <alcaid1...@gmail.com>
> > To: dev@spark.apache.org
> > Sent: Tuesday, January 20, 2015 6:35:07 AM
> > Subject: not found: type LocalSparkContext
> >
> > Hi all,
> >
> > When I was trying to write a test on my spark application I met
> >
> > ```
> > Error:(14, 43) not found: type LocalSparkContext
> > class HyperANFSuite extends FunSuite with LocalSparkContext {
> > ```
> >
> > In the source code of spark-core I could not find "LocalSparkContext",
> > thus I wonder how to write a test like [this](
> > https://github.com/apache/spark/blob/master/graphx/src/test/scala/org/apache/spark/graphx/lib/ConnectedComponentsSuite.scala
> > )
> >
> > Alcaid
> >
>
