1.      It depends on what you want to do. Don't worry about singletons and 
wiring the beans; that is largely taken care of by the Spark framework itself. 
In fact, doing so can run you into issues like serialization errors.
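To illustrate why injecting Spring-managed beans into Spark functions tends to fail: Spark serializes function objects to ship them to executors, and a function holding a reference to a non-serializable injected bean cannot be serialized. A minimal sketch of the problem in plain Java (the names `InjectedService`, `BadFunction`, and `GoodFunction` are hypothetical stand-ins, and plain `ObjectOutputStream` serialization is used here in place of Spark's closure shipping):

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for a Spring-managed bean; like most such
// beans, it does not implement Serializable.
class InjectedService {
    int lookup(int x) { return x * 2; }
}

// Holds a reference to the injected bean -> fails serialization,
// because the referenced field is not Serializable.
class BadFunction implements Serializable {
    private final InjectedService service;
    BadFunction(InjectedService s) { this.service = s; }
    public Integer call(Integer v) { return service.lookup(v); }
}

// Self-contained, no injected state -> serializes fine.
class GoodFunction implements Serializable {
    public Integer call(Integer v) { return v * 2; }
}

public class SerializationDemo {
    // Returns true if the object survives Java serialization,
    // false if a NotSerializableException is thrown.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new BadFunction(new InjectedService())));
        System.out.println(serializes(new GoodFunction()));
    }
}
```

The same failure mode is what surfaces as `Task not serializable` errors in Spark when a function captures a wired-in bean; keeping functions self-contained avoids it.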



2.      You can write your code in Scala or Python using the spark shell or a 
notebook such as IPython or Zeppelin; or, if you have written an application in 
Scala/Java using the Spark API, you can build a jar and run it with 
spark-submit.
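For the jar route, a typical spark-submit invocation looks roughly like the following (the class name `com.example.MyApp`, the jar path, and the arguments are placeholders for your own build artifacts; `--master` would be your cluster URL, `yarn`, or `local[*]` for local testing):

```shell
# Hypothetical names: com.example.MyApp and my-app-1.0.jar stand in
# for your application's main class and assembled jar.
spark-submit \
  --class com.example.MyApp \
  --master "local[4]" \
  target/my-app-1.0.jar arg1 arg2
```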

From: HARSH TAKKAR [mailto:takkarha...@gmail.com]
Sent: Monday, February 01, 2016 10:00 AM
To: user@spark.apache.org
Subject: Re: Using Java spring injection with spark


Hi

Please can anyone reply on this.

On Mon, 1 Feb 2016, 4:28 p.m. HARSH TAKKAR 
<takkarha...@gmail.com<mailto:takkarha...@gmail.com>> wrote:
Hi
I am new to Apache Spark and big data analytics. Before starting to code 
against Spark data frames and RDDs, I just wanted to confirm the following:
1. Can we create an implementation of java.api.Function as a singleton bean 
using the Spring framework, and can it be injected via autowiring into other 
classes?
2. What is the best way to submit jobs to Spark: using the API or using the 
shell script?
Looking forward to your help,

Kind Regards
Harsh
