The easiest way is to import an SBT project into IntelliJ that has the
Spark jars as dependencies. Then you can work in local mode (--master
local[*]). You could start with the build files in my "spark workshop" project,
https://github.com/deanwampler/spark-workshop, and go from there.
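
For reference, a minimal build.sbt along those lines might look like the following (the project name and version numbers are just placeholders; match the Scala and Spark versions to the release you're actually targeting):

name := "spark-sandbox"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"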

I haven't done much with Spark in IDE worksheets. You would need to create
the SparkContext yourself, e.g.:

import org.apache.spark.SparkContext
// "local[*]" = run in local mode on all available cores; "app" names the application
val sc = new SparkContext("local[*]", "app")

Rerunning the worksheet will be tricky, though: you'll have to stop and
recreate the sc every time, so put sc.stop() at the end of the page.
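
That is, the tail of the worksheet would look something like this (the toy RDD is just an illustration):

// ... experiments using sc, for example:
val evens = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()

sc.stop()  // frees the context so the next worksheet run can create a new one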

dean

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Mon, Dec 21, 2015 at 11:21 AM, Eran Witkon <eranwit...@gmail.com> wrote:

> Any pointers on how to use IntelliJ for Spark development?
> Any way to run a Scala worksheet like spark-shell?
>
