Spark is a distributed computation engine: it runs your jobs across a set of
nodes, but it does not ship with any hardware. A "Spark cluster" is just the
set of machines you run Spark on. You must "bring your own" hardware,
although of course there are hosted solutions available.
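To make that concrete, here is a minimal sketch of pointing an application at
a standalone cluster you provisioned yourself (the master host name and
script name below are hypothetical placeholders):

```shell
# Submit an app to a standalone cluster whose master runs on machines
# you provide -- "spark-master.example.com" is a hypothetical host.
spark-submit \
  --master spark://spark-master.example.com:7077 \
  --deploy-mode cluster \
  my_app.py

# For quick local testing, no cluster hardware is needed at all:
spark-submit --master "local[*]" my_app.py
```

The --master URL is the only thing that changes between "my own cluster" and
"local testing"; the engine itself is the same either way.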

On Sat, Oct 10, 2020 at 9:24 AM Santosh74 <sardesaisant...@gmail.com> wrote:

> Is Spark a compute engine only, or is it also a cluster that comes with a
> set of hardware/nodes? What exactly is a Spark cluster?
