How to run spark programs in eclipse like mapreduce

2015-04-20 Thread sandeep vura
Hi Sparkers, I have written code in Python in Eclipse; now that code should execute on a Spark cluster, like MapReduce jobs run on a Hadoop cluster. Can anyone please help me with instructions? Regards, Sandeep.v

Re: How to run spark programs in eclipse like mapreduce

2015-04-20 Thread ๏̯͡๏
I just do Run As Application / Debug As Application on the main program. On Mon, Apr 20, 2015 at 12:14 PM, sandeep vura sandeepv...@gmail.com wrote: Hi Sparkers, I have written code in Python in Eclipse; now that code should execute on a Spark cluster, like MapReduce jobs run on a Hadoop cluster. Can
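[Editor's note: a minimal sketch of the kind of driver program "Run As Application" launches from Eclipse, assuming a Scala project with the Spark dependencies on the build path; the object name, input path, and local[*] master are illustrative, not from the thread.]

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical minimal Spark driver that Eclipse can launch directly via
// Run As > Application, assuming spark-core is on the project's classpath.
// "local[*]" runs Spark inside the same JVM, so no cluster is needed for a
// quick test; swap in a cluster master URL to run against a real cluster.
object WordCountApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("WordCountApp")
      .setMaster("local[*]")

    val sc = new SparkContext(conf)

    val counts = sc.textFile("input.txt")   // illustrative input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    sc.stop()
  }
}
```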

Re: How to run spark programs in eclipse like mapreduce

2015-04-20 Thread Akhil Das
Why not build the project and submit the built jar with spark-submit? If you want to run it within Eclipse, then all you have to do is create a SparkContext pointing to your cluster, do a sc.addJar(/path/to/your/project/jar), and then you can hit the run button to run the job (note that network
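[Editor's note: a sketch of the second approach described above, assuming a Scala driver and a standalone Spark cluster; the master URL and jar path are placeholders to be replaced with your own values.]

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of running from Eclipse against a cluster: point the SparkContext
// at the cluster's master URL and ship the compiled project jar with
// sc.addJar so the executors can load your classes.
object RunOnClusterFromEclipse {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("RunOnClusterFromEclipse")
      .setMaster("spark://your-master-host:7077") // placeholder cluster URL

    val sc = new SparkContext(conf)

    // Distribute the project jar to the executors before running the job.
    sc.addJar("/path/to/your/project/jar")

    val result = sc.parallelize(1 to 100).map(_ * 2).sum()
    println(s"Sum: $result")

    sc.stop()
  }
}
```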