Hi Sparkers,
I have written code in Python in Eclipse. Now that code should execute on a
Spark cluster, the way MapReduce jobs run on a Hadoop cluster. Can anyone
please help me with instructions?
Regards,
Sandeep.v
I just do Run As Application / Debug As Application on the main program.
On Mon, Apr 20, 2015 at 12:14 PM, sandeep vura sandeepv...@gmail.com
wrote:
Why not build the project and submit the built jar with spark-submit?
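For a Python job like the one in the original question, the spark-submit route might look like this (a minimal sketch; the master URL and file names are placeholders, not from the thread):

```shell
# Submit a Python script to a standalone Spark cluster.
# spark://master-host:7077, dependencies.zip and my_job.py are placeholders.
spark-submit \
  --master spark://master-host:7077 \
  --py-files dependencies.zip \
  my_job.py
```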
If you want to run it within Eclipse, then all you have to do is create a
SparkContext pointing to your cluster, call
sc.addJar("/path/to/your/project/jar"), and then hit the run button
to run the job (note that network
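Since the original code is Python, the in-IDE approach above would use addPyFile rather than addJar. A minimal sketch, assuming pyspark is installed locally; build_cluster_context, the master URL, and the dependency paths are hypothetical names, not from the thread:

```python
def build_cluster_context(master_url, py_deps=()):
    """Create a SparkContext pointing at a remote cluster and ship
    Python dependencies to the executors (addPyFile is the PySpark
    counterpart of addJar for jars)."""
    from pyspark import SparkConf, SparkContext  # requires pyspark installed

    conf = SparkConf().setAppName("eclipse-job").setMaster(master_url)
    sc = SparkContext(conf=conf)
    for path in py_deps:
        sc.addPyFile(path)  # e.g. a zip/egg containing your project's modules
    return sc

# Hitting Run in Eclipse on a main program that calls something like
# build_cluster_context("spark://master-host:7077", ["/path/to/deps.zip"])
# would then execute the job against the cluster.
```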