Sorry, my mistake (quick copy-paste): Livy doesn't let me submit
applications the classic way (with assembly jars) and would force me to
change all my current applications.

------------------------------

*Mariano Semelman*
P13N - IT
Av. Corrientes Nº 746 - piso 13 - C.A.B.A. (C1043AAU)
Teléfono (54) 11- *4894-3500*


*Despegar.com*
The best price for your trip.


On 29 September 2016 at 01:08, Ofer Eliassaf <ofer.elias...@gmail.com>
wrote:

> Are you sure that Livy doesn't support standalone cluster mode?
>
> On Thu, Sep 29, 2016 at 1:42 AM, Mariano Semelman <
> mariano.semel...@despegar.com> wrote:
>
>> Hello everybody,
>>
>> I'm developing an application to submit batch and streaming apps in a
>> fault-tolerant fashion. For that I need a programmatic way to submit and
>> monitor my apps and relaunch them in case of failure. Right now I'm using
>> Spark standalone (1.6.x) and submitting in cluster mode. The options I have
>> explored so far are:
>>
>> SparkLauncher.java [1]: It has two modes:
>>     - 1) launch(): doesn't give me the application id needed for
>> monitoring (via the Spark master REST API). I would have to infer it from
>> the application name and startTime in api/v1/applications [9].
>>     - 2) startApplication(...): only works when submitting in client or
>> local mode (BTW, the fact that it only works in client or local mode is
>> not documented in the package summary page [1], which cost me many, many
>> wasted hours).
>>
>> Spark-Jobserver [2]:
>>     Doesn't support standalone cluster mode
>>
>> Livy [3]:
>>     Doesn't support standalone cluster mode
>>
>> Spark Submission REST API [4,5,6]:
>>     It seems the most sensible way, but it's black magic for the user:
>> it's not documented and there's no official client, only one unofficial
>> one [7]. It also occurred to me to copy the RestSubmissionClient [8] into
>> my own project.
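For reference, a minimal sketch of a submission via that hidden REST API, with the field names taken from [5] and the RestSubmissionClient source [8] (the protocol is undocumented, so treat the endpoint and fields as assumptions to verify against your Spark version):

```python
import json
from urllib.request import Request, urlopen

def build_submission(jar_url, main_class, args, master, spark_version="1.6.1"):
    """Build the JSON body for POST <rest_url>/v1/submissions/create."""
    return {
        "action": "CreateSubmissionRequest",
        "appResource": jar_url,
        "mainClass": main_class,
        "appArgs": list(args),
        "clientSparkVersion": spark_version,
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "sparkProperties": {
            "spark.master": master,            # e.g. "spark://master:7077"
            "spark.app.name": main_class,
            "spark.jars": jar_url,
            "spark.submit.deployMode": "cluster",
            "spark.driver.supervise": "true",  # relaunch the driver on failure
        },
    }

def submit(rest_url, body):
    """POST to the master's REST submission port (6066 by default)."""
    req = Request(rest_url + "/v1/submissions/create",
                  data=json.dumps(body).encode("utf-8"),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)  # the response includes a submissionId on success
```

The returned submissionId can then be polled via /v1/submissions/status/<submissionId>, which is what makes this route attractive despite being unofficial.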
>>
>>
>> I'm torn between using launch() and inferring the appId, and using the
>> Spark Submission REST API, but neither seems a proper way to solve this.
>> If someone could give me advice on how to approach this, I would
>> appreciate it.
>>
>> Thanks in advance,
>>
>> Mariano
>>
>>
>> [1] https://spark.apache.org/docs/1.6.1/api/java/org/apache/spark/launcher/package-summary.html
>> [2] https://github.com/spark-jobserver/spark-jobserver
>> [3] http://livy.io/
>> [4] http://stackoverflow.com/questions/28992802/triggering-spark-jobs-with-rest (most voted answer)
>> [5] http://arturmkrtchyan.com/apache-spark-hidden-rest-api
>> [6] https://issues.apache.org/jira/browse/SPARK-5388
>> [7] https://github.com/ywilkof/spark-jobs-rest-client
>> [8] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala
>> [9] http://spark.apache.org/docs/latest/monitoring.html
>>
>>
>>
>>
>
>
> --
> Regards,
> Ofer Eliassaf
>
