Yes, agreed. It seems to be an issue with mapping the text file contents to case
classes, though I'm not sure.
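For reference, the kind of mapping being discussed might look like the sketch below. This is a minimal, hypothetical example; the `Record` case class, the comma-separated format, and the field names are assumptions, not details from the thread. A malformed line that throws during parsing can fail a task, so parsing defensively with `Option` is one common pattern:

```scala
// Hypothetical example of mapping text-file lines to a case class.
case class Record(id: Int, name: String)

def parseLine(line: String): Option[Record] =
  line.split(",") match {
    // Only accept lines with exactly two fields; wrap the Int
    // conversion in Try so bad data is skipped, not thrown.
    case Array(id, name) =>
      scala.util.Try(Record(id.trim.toInt, name.trim)).toOption
    case _ => None
  }

// In a Spark job this would typically be applied as:
//   val records = sc.textFile("input.txt").flatMap(parseLine)
```

Using `flatMap` over `Option` drops unparseable lines instead of aborting the stage, which at least rules out bad records as the cause of the failure.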
On Thu, Aug 4, 2016 at 8:17 PM, $iddhe$h Divekar wrote:
Hi Deepak,
My files are always > 50MB.
I would think there would be a small config to overcome this.
I have tried almost everything I could find after searching online.
Any help from the mailing list would be appreciated.
On Thu, Aug 4, 2016 at 7:43 AM, Deepak Sharma wrote:
I am facing the same issue with Spark 1.5.2.
If the file being processed by Spark is 10-12 MB in size, it throws an
out-of-memory error.
But if the same file is within the 5 MB limit, it runs fine.
I am using a Spark configuration with 7 GB of memory and 3 cores for executors
in a cluster of 8.
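For comparison, a configuration like the one described above might be submitted roughly as follows. This is a sketch only; the actual flags, class name, and jar were not shown in the thread:

```shell
# Hypothetical spark-submit matching the description above:
# 7 GB executor memory, 3 executor cores, running on YARN.
# The class and jar names below are placeholders.
spark-submit \
  --master yarn \
  --executor-memory 7g \
  --executor-cores 3 \
  --class com.example.MyJob \
  my-job.jar input.txt
```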
Hi,
I am running Spark jobs using Apache Oozie in yarn-client mode.
My job.properties has a sparkConf entry which gets used in workflow.xml.
I have tried increasing MaxPermSize via sparkConf in job.properties,
but that does not resolve the issue.
*sparkConf*=--verbose --driver-java-options
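A fuller version of that property might look like the fragment below; the specific values are illustrative assumptions, not taken from the original message. One thing worth checking: in yarn-client mode under Oozie, the driver runs inside the Oozie launcher JVM, which is already started before Spark sees `--driver-java-options`, so the launcher's own JVM options may need raising as well:

```properties
# Hypothetical job.properties fragment -- values are illustrative only.
sparkConf=--verbose --driver-java-options -XX:MaxPermSize=512m

# Because the yarn-client driver lives in the Oozie launcher JVM, the
# launcher's heap and PermGen may also need increasing via the
# oozie.launcher.* passthrough (assumed values):
oozie.launcher.mapreduce.map.memory.mb=3072
oozie.launcher.mapreduce.map.java.opts=-Xmx2048m -XX:MaxPermSize=512m
```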