No, you usually run Spark apps via the spark-submit script, and the
Spark machinery is already deployed on a cluster. Although it's
possible to embed the driver and get it working that way, it's not
supported.
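To illustrate the usual path Sean describes: the application is packaged as a JAR and handed to an already-running cluster via spark-submit, rather than embedding the driver. A minimal sketch, where the class name, master URL, and JAR path are placeholder values, not details from this thread:

```shell
# Typical deployment: package the app, then submit it to an existing cluster.
# com.example.MyApp, spark://host:7077, and the JAR path are placeholders.
./bin/spark-submit \
  --class com.example.MyApp \
  --master spark://host:7077 \
  --deploy-mode cluster \
  target/my-app-assembly.jar
```

Here `--deploy-mode cluster` runs the driver inside the cluster rather than in the submitting process, which is the opposite of the embedded-driver approach the thread asks about.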
On Fri, Feb 20, 2015 at 4:48 PM, Niranda Perera
niranda.per...@gmail.com wrote:
Hi Sean,
Does this mean that embedding Spark in other products is discouraged?
On Fri, Feb 20, 2015 at 3:29 PM, Sean Owen so...@cloudera.com wrote:
I don't think an OSGi bundle makes sense for Spark. It's part JAR,
part lifecycle manager. Spark has its own lifecycle management and is
not generally embeddable. Packaging is generally 'out of scope' for
the core project beyond the standard Maven and assembly releases.
On Fri, Feb 20, 2015 at 8:33 AM, Niranda Perera
niranda.per...@gmail.com wrote:
Hi,
I am interested in a Spark OSGi bundle.
While checking the Maven repository, I found that one has not yet been
published.
Can we expect an OSGi bundle to be released soon? Is it on the Spark
project roadmap?
Rgds
--
Niranda
-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org