We have hit the same issue in the Spark shell when registering a temp table. We
observed it happening for users on JDK 6, and the problem went away after
installing JDK 8. This only affected the tutorial materials, which involve
loading a Parquet file.
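
For reference, the tutorial step that triggered it was roughly the following
(a minimal sketch against the Spark 1.4 shell API; the path and table name
below are placeholders, not the actual tutorial files):

// In the Spark 1.4 shell, sqlContext is already created for you.
val df = sqlContext.read.parquet("/path/to/data.parquet")  // load the Parquet file
df.registerTempTable("my_table")                           // register it as a temp table
sqlContext.sql("SELECT COUNT(*) FROM my_table").show()     // query to force evaluation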

Regards
Andy

On Sat, Jul 4, 2015 at 2:54 AM, sim <s...@swoop.com> wrote:

> @bipin, in my case the error happens immediately in a fresh shell in 1.4.0.
>


-- 
Andy Huang | Managing Consultant | Servian Pty Ltd | t: 02 9376 0700 |
f: 02 9376 0730| m: 0433221979
