For IntelliJ + SBT, you can also follow the directions at
http://jayunit100.blogspot.com/2014/07/set-up-spark-application-devleopment.html
. It's really easy to run Spark in an IDE. The process for Eclipse is
virtually identical.
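
For reference, a minimal build.sbt along those lines might look like this
(the Spark and Scala versions here are just illustrative for that era; use
whatever matches your setup):

name := "spark-local-example"

scalaVersion := "2.10.4"

// spark-core is enough to run in local mode from the IDE
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"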

On Fri, Oct 3, 2014 at 10:03 AM, Sanjay Subramanian <
sanjaysubraman...@yahoo.com.invalid> wrote:

> cool thanks, will set this up and report back how things went
> regards
> sanjay
>   ------------------------------
>  *From:* Daniel Siegmann <daniel.siegm...@velos.io>
> *To:* Ashish Jain <ashish....@gmail.com>
> *Cc:* Sanjay Subramanian <sanjaysubraman...@yahoo.com>; "
> user@spark.apache.org" <user@spark.apache.org>
> *Sent:* Thursday, October 2, 2014 6:52 AM
> *Subject:* Re: Spark inside Eclipse
>
> You don't need to do anything special to run in local mode from within
> Eclipse. Just create a simple SparkConf and create a SparkContext from
> that. I have unit tests which execute on a local SparkContext, and they
> work from inside Eclipse as well as SBT.
>
> val conf = new SparkConf().setMaster("local").setAppName("Whatever")
> val sc = new SparkContext(conf)
>
> Keep in mind you can only have one local SparkContext at a time,
> otherwise you will get some weird errors. If you have tests running
> sequentially, make sure to close the SparkContext in your tear down
> method. If tests run in parallel you'll need to share the SparkContext
> between tests.
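>
> As a rough sketch of that pattern (this assumes ScalaTest, which is not
> named in this thread; the suite name is made up for illustration):
>
> import org.apache.spark.{SparkConf, SparkContext}
> import org.scalatest.{BeforeAndAfterAll, FunSuite}
>
> class MySparkSuite extends FunSuite with BeforeAndAfterAll {
>   // One local SparkContext shared across the whole suite
>   private var sc: SparkContext = _
>
>   override def beforeAll(): Unit = {
>     val conf = new SparkConf().setMaster("local").setAppName("test")
>     sc = new SparkContext(conf)
>   }
>
>   // Stop the context in tear down so the next suite can create its own
>   override def afterAll(): Unit = {
>     sc.stop()
>   }
> }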
>
> For unit testing, you can make use of SparkContext.parallelize to set up
> your test inputs and RDD.collect to retrieve the outputs.
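>
> For example, a test inside the suite sketched above (the data and the
> doubling function are purely illustrative):
>
> test("doubling each element") {
>   val input = sc.parallelize(Seq(1, 2, 3))
>   val output = input.map(_ * 2).collect().toSeq
>   assert(output.sorted === Seq(2, 4, 6))
> }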
>
>
>
>
> On Wed, Oct 1, 2014 at 7:43 PM, Ashish Jain <ashish....@gmail.com> wrote:
>
> Hello Sanjay,
> This can be done, and is a very effective way to debug.
> 1) Compile and package your project to get a fat jar
> 2) In your SparkConf use setJars and give the location of this jar. Also
> set your master to local in the SparkConf (see the sketch below)
> 3) Use this SparkConf when creating a JavaSparkContext
> 4) Debug your program like you would any normal program.
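>
> A minimal sketch of steps 2 and 3 (shown in Scala with a plain
> SparkContext; the jar path is just a placeholder, and the same SparkConf
> calls work from Java with a JavaSparkContext):
>
> import org.apache.spark.{SparkConf, SparkContext}
>
> val conf = new SparkConf()
>   .setMaster("local")
>   .setAppName("DebugFromEclipse")
>   // Location of the fat jar built in step 1 (adjust to your build output)
>   .setJars(Seq("target/scala-2.10/myapp-assembly-1.0.jar"))
> val sc = new SparkContext(conf)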
> Hope this helps.
> Thanks
> Ashish
> On Oct 1, 2014 4:35 PM, "Sanjay Subramanian"
> <sanjaysubraman...@yahoo.com.invalid> wrote:
>
> hey guys
>
> Is there a way to run Spark in local mode from within Eclipse?
> I am running Eclipse Kepler on a MacBook Pro with Mavericks.
> I'd like to run Spark applications from within Eclipse, just as one can
> run Hadoop map/reduce applications there, to debug and learn.
>
> thanks
>
> sanjay
>
>
>
>
> --
> Daniel Siegmann, Software Developer
> Velos
> Accelerating Machine Learning
>
> 440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
> E: daniel.siegm...@velos.io W: www.velos.io
>
>
>


-- 
jay vyas
