Thanks all for the information! What Pietro mentioned seems to be the 
appropriate solution. I also found a slide 
deck<http://www.slideshare.net/EvanChan2/spark-summit-2014-spark-job-server-talk>
 that talks about it.
Several quick questions:

1. Is it already available in the Spark main branch? (It seems not, but I am 
not sure whether it is planned.)

2. It seems that the current job server can only accept Java (or Scala, I 
guess?) jars - is there any plan to support Python in the future? A rough 
sketch of what I understand a job to look like is below.
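
(For question 2, my understanding from the project README is that a job is a 
small Scala class compiled into a jar, roughly like the word-count example 
below. This is only a sketch of the classic SparkJob API as I read it, and 
the names are illustrative:)

  import com.typesafe.config.Config
  import org.apache.spark.SparkContext
  import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

  // Minimal word-count job, modeled on the README example.
  object WordCountExample extends SparkJob {

    // The job server calls validate() first to sanity-check the request config.
    override def validate(sc: SparkContext, config: Config): SparkJobValidation =
      if (config.hasPath("input.string")) SparkJobValid
      else SparkJobInvalid("Missing config key: input.string")

    // runJob() does the actual work; its return value is serialized into the
    // REST response for the job.
    override def runJob(sc: SparkContext, config: Config): Any =
      sc.parallelize(config.getString("input.string").split(" ").toSeq).countByValue()
  }

The jar then seems to be uploaded and run over REST (something like 
POST /jars/<appName> followed by POST /jobs?appName=...&classPath=..., if I 
read the README correctly), which is why Python support is not obvious to me.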
Thanks - any information would be appreciated!

Xiaoyong

From: Pietro Gentile [mailto:pietro.gentil...@gmail.com]
Sent: Monday, December 15, 2014 10:33 PM
To: Xiaoyong Zhu
Subject: R: is there a way to interact with Spark clusters remotely?

Hi,

try this https://github.com/spark-jobserver/spark-jobserver .

Best Regards,

Pietro Gentile


From: Xiaoyong Zhu [mailto:xiaoy...@microsoft.com]
Sent: Monday, December 15, 2014 3:17 PM
To: user@spark.apache.org<mailto:user@spark.apache.org>
Subject: is there a way to interact with Spark clusters remotely?

Hi experts

I am wondering if there is a way to interact with Spark remotely, i.e., 
without direct access to the cluster - submitting Python/Scala scripts to the 
cluster and getting results back via (REST) APIs.
That would facilitate the development process a lot.

Xiaoyong


