Hi,

To start a Spark job remotely, we are using:


1) spark-submit

2) Spark Job Server


For quick testing of my application in the IDE, I instead call a function that creates the SparkContext, sets all the Spark configuration and the master, and executes the flow, and I can see the output in my driver.
I was wondering: what if we don't use spark-submit / Spark Job Server at all and simply call that function to execute the job?
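
Roughly, the call looks like the sketch below (Scala; the object name, app name, and the toy job are only illustrative, not our actual code):

import org.apache.spark.{SparkConf, SparkContext}

object QuickTest {
  def runJob(): Unit = {
    // Build the configuration in code instead of passing it via spark-submit
    val conf = new SparkConf()
      .setAppName("quick-test")
      .setMaster("local[*]")  // or spark://host:7077 for a remote master

    val sc = new SparkContext(conf)
    try {
      // Execute the flow and see the result directly in the driver
      val result = sc.parallelize(1 to 100).map(_ * 2).sum()
      println(s"Result seen in driver: $result")
    } finally {
      sc.stop()
    }
  }

  def main(args: Array[String]): Unit = runJob()
}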

Will this have any implications in a production environment? Am I missing any important points?


Thank You,
Prateek

"DISCLAIMER: This message is proprietary to Aricent and is intended solely for 
the use of the individual to whom it is addressed. It may contain privileged or 
confidential information and should not be circulated or used for any purpose 
other than for what it is intended. If you have received this message in error, 
please notify the originator immediately. If you are not the intended 
recipient, you are notified that you are strictly prohibited from using, 
copying, altering, or disclosing the contents of this message. Aricent accepts 
no responsibility for loss or damage arising from the use of the information 
transmitted by this email including damage from virus."
