You can run a Spark app on Dataproc, Google's managed Spark and
Hadoop service:

https://cloud.google.com/dataproc/docs/

basically, you (see the gcloud sketch after this list):

* assemble a jar
* create a cluster
* submit a job to that cluster (with the jar)
* delete the cluster when the job is done
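
In gcloud terms it looks roughly like this - cluster name, bucket,
jar path and main class are all placeholders, so adjust them to your
project:

  # build the fat jar (assuming an sbt project with the sbt-assembly
  # plugin) and copy it to Cloud Storage
  sbt assembly
  gsutil cp target/scala-2.11/my-app-assembly.jar gs://my-bucket/

  # create a small cluster (2 workers here, pick what your job needs)
  gcloud dataproc clusters create my-cluster --num-workers 2

  # submit the Spark job, pointing at the jar in Cloud Storage
  gcloud dataproc jobs submit spark --cluster my-cluster \
      --class com.example.Main \
      --jars gs://my-bucket/my-app-assembly.jar

  # tear the cluster down once the job is done
  gcloud dataproc clusters delete my-cluster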

Before all that, you have to create a Cloud Platform project, enable
billing, and enable the Dataproc API - but all this is explained in the
docs.
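
If you prefer the command line for that part too, with a recent gcloud
it is something along these lines (the project ID is made up; billing
itself is set up in the console):

  gcloud projects create my-project-id    # or reuse an existing project
  gcloud config set project my-project-id
  gcloud services enable dataproc.googleapis.com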

Cheers,
Dinko


On 4 January 2017 at 17:34, Anahita Talebi <anahita.t.am...@gmail.com> wrote:
>
> To whom it might concern,
>
> I have a question about running Spark code on Google Cloud.
>
> Actually, I have a Spark application and would like to run it on multiple
> machines on Google Cloud. Unfortunately, I couldn't find good
> documentation about how to do it.
>
> Do you have any hints which could help me to solve my problem?
>
> Have a nice day,
>
> Anahita
>
>
