Try tuning options such as spark.storage.memoryFraction and spark.executor.memory, documented here:
http://spark.apache.org/docs/latest/configuration.html
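For example (just a sketch: the app name and the 4g / 0.4 values below are placeholders you would tune for your data and cluster), in a standalone driver you can set those keys on a SparkConf before the context is created:

    import org.apache.spark.{SparkConf, SparkContext}

    // Give each executor more heap and shrink the fraction reserved for
    // cached RDDs, leaving more room for shuffles and computation.
    // Placeholder values; adjust for your cluster.
    val conf = new SparkConf()
      .setAppName("WikipediaJob")
      .set("spark.executor.memory", "4g")
      .set("spark.storage.memoryFraction", "0.4")

    val sc = new SparkContext(conf)

From the spark-shell the SparkContext is created for you, so the same keys have to be supplied when the shell is launched rather than set in code.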

Thanks

Prashant Sharma


On Mon, May 5, 2014 at 9:34 PM, Ajay Nair <prodig...@gmail.com> wrote:

> Hi,
>
> Is there any way to overcome this error? I am running this from the
> spark-shell; is that a cause for concern?
>
