Sent: 16/09/2014 21:18
To: Matei Zaharia <matei.zaha...@gmail.com>; user@spark.apache.org
Subject: RE: Spark as a Library

Hello,

Thanks for the response, and great to hear it is possible. But how do I
connect to Spark without using the script? Are there other ways to connect
to Spark? I can't find in the docs anything other than using the script.
Thanks!

Best, Oliver

From: Matei Zaharia [mailto:matei.zaha...@gmail.com]
Sent: Tuesday, September 16, 2014 1:31 PM
To: Ruebenacker, Oliver A; user@spark.apache.org
Subject: Re: Spark as a Library

If you want to run the computation on just one machine (using Spark's local
mode), it can probably run in a container. Otherwise you can create a
SparkContext there and connect it to a cluster outside. Note that I haven't
tried this though, so the security policies of the container might be too
restrictive.
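
A minimal sketch of that second approach, assuming the Spark 1.x Scala API
and a hypothetical standalone cluster at spark://cluster-host:7077 (the app
name and jar path below are placeholders too):

    import org.apache.spark.{SparkConf, SparkContext}

    // Point the context at an external cluster instead of going through
    // the spark-submit script.
    val conf = new SparkConf()
      .setAppName("webapp-spark")                    // placeholder app name
      .setMaster("spark://cluster-host:7077")        // or "local[*]" for local mode
      .setJars(Seq("/path/to/webapp-assembly.jar"))  // ship the app's classes to executors

    val sc = new SparkContext(conf)  // created inside the web container
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    println(s"evens = $evens")
    sc.stop()

The same sketch covers the local-mode case: swapping the master URL for
"local[*]" keeps the whole computation inside the container's JVM.
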
It depends on what you want to do with Spark. The following has worked for
me.
Let the container handle the HTTP request and then talk to Spark using
another HTTP/REST interface. You can use the Spark Job Server for this.
Embedding Spark inside the container is not a great long-term solution,
IMO.
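
A minimal sketch of that hand-off, assuming a Spark Job Server instance at a
hypothetical jobserver-host:8090 with the job jar already uploaded; the
POST /jobs route and its appName/classPath parameters follow the
spark-jobserver README, but the names here are placeholders:

    import java.net.{HttpURLConnection, URL}
    import scala.io.Source

    // Submit a job over REST; the web container never embeds a SparkContext.
    val url = new URL("http://jobserver-host:8090/jobs" +
      "?appName=myApp&classPath=com.example.MyJob")
    val conn = url.openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setDoOutput(true)

    // Job input goes in the request body, in Typesafe Config syntax.
    val out = conn.getOutputStream
    out.write("input.string = \"hello\"".getBytes("UTF-8"))
    out.close()

    // The server answers with JSON describing the submitted job.
    val response = Source.fromInputStream(conn.getInputStream).mkString
    println(response)
    conn.disconnect()
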
Hello,
Suppose I want to use Spark from an application that I already submit to run
in another container (e.g. Tomcat). Is this at all possible? Or do I have to
split the app into two components, and submit one to Spark and one to the other
container? In that case, what is the preferred way for the two components to
communicate?