500 GB of data will yield roughly 3900 partitions (one partition per HDFS
block, assuming the default 128 MB block size: 500 GB / 128 MB ≈ 3900). If
you can get close to that many cores and around 500 GB of total memory
across the cluster, things will be lightning fast. :)
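
For a quick sanity check, here is a minimal sketch of reading the data and
inspecting the partition count (the HDFS path and the coalesce target are
hypothetical placeholders; adjust for your cluster):

import org.apache.spark.{SparkConf, SparkContext}

object PartitionCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("PartitionCheck")
    val sc = new SparkContext(conf)

    // textFile creates one partition per HDFS block by default,
    // so ~3900 partitions for 500 GB with 128 MB blocks
    val data = sc.textFile("hdfs:///path/to/500gb-dataset")
    println(s"Partitions: ${data.partitions.length}")

    // With far fewer cores than blocks, coalesce down to roughly
    // 2-3 partitions per core to avoid excessive task overhead
    val fewer = data.coalesce(400)
    println(s"After coalesce: ${fewer.partitions.length}")

    sc.stop()
  }
}

Coalescing avoids a full shuffle, so it is usually the cheaper way to
reduce the partition count when your core count is much smaller than the
block count.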

Thanks
Best Regards

On Sun, May 3, 2015 at 12:49 PM, sherine ahmed <sherine.sha...@hotmail.com>
wrote:

> I need to use Spark to load 500 GB of data from Hadoop on a
> standalone-mode cluster. What are the minimum hardware requirements,
> given that it will be used for advanced analysis (social network
> analysis)?
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Hardware-requirements-tp22744.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
