From: Dhaval Patel [mailto:dhaval1...@gmail.com]
Sent: Saturday, November 7, 2015 12:26 AM
To: Spark User Group
Subject: [sparkR] Any insight on java.lang.OutOfMemoryError: GC overhead limit exceeded
I have been struggling with this error for the past 3 days and have tried all the suggestions people have provided on Stack Overflow and here in this group.
I am trying to read a Parquet file using SparkR and convert it into an R data frame for further use. The file size is not that
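For reference, a minimal sketch of the flow being described, assuming the SparkR 2.x+ session API (the 1.x API of that era used `sparkR.init()`/`sqlContext` instead). The memory settings and file path are illustrative, not from the original message; note that `spark.driver.memory` must be set before the JVM starts (e.g. here at session creation or via `--driver-memory`), since `collect()` pulls the whole dataset onto the driver, which is the usual trigger for the GC overhead limit error:

```r
library(SparkR)

# Hypothetical driver settings -- collect() materializes all rows on the driver,
# so the driver heap, not the executors, is what usually needs raising here.
sparkR.session(sparkConfig = list(
  spark.driver.memory = "8g",
  spark.driver.maxResultSize = "4g"
))

# Read the Parquet file as a distributed SparkDataFrame (path is a placeholder).
df <- read.df("path/to/file.parquet", source = "parquet")

# Convert to a base R data.frame; this is the step that can exhaust driver memory.
rdf <- collect(df)
```

If the data does not fit in driver memory even with a larger heap, the usual alternatives are to filter or aggregate on the Spark side before collecting, or to keep the data as a SparkDataFrame and avoid `collect()` entirely.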