Hi,
I have a three-node cluster with 30 GB of memory. I am trying to analyze 200 MB of data and I am running out of memory every time. This is the command I am using:
Driver memory = 10G
Executor memory = 10G
sc <- sparkR.session(master =
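For context, a session with those memory settings would normally be opened along these lines; this is only a sketch, and the master URL `spark://master:7077` is a placeholder assumption since the original command is cut off:

```r
library(SparkR)

# Sketch of a sparkR.session call with the memory settings above.
# "spark://master:7077" is a placeholder, not the poster's real master URL.
sparkR.session(
  master = "spark://master:7077",
  sparkConfig = list(
    spark.driver.memory   = "10g",
    spark.executor.memory = "10g"
  )
)
```

Note that `spark.driver.memory` is one of the few properties that must be set via `sparkConfig` at session start (or in `spark-defaults.conf`); setting it after the JVM has launched has no effect. A common cause of driver OOM even on small inputs is calling `collect()` on a result that is larger than it looks.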
We have set up a Spark cluster on NFS shared storage. There are no permission issues with the NFS storage; all users are able to write to it. When I run the write.df command in SparkR, I get the error below. Can someone please help me fix this issue?
16/09/17 08:03:28 ERROR
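For reference, a minimal write.df call looks like the sketch below; the output path `/nfs/shared/output` and the Parquet format are assumptions for illustration, since the original command is not shown:

```r
library(SparkR)

# Minimal sketch of writing a SparkDataFrame out to shared storage.
# "/nfs/shared/output" and "parquet" are placeholder assumptions.
df <- as.DataFrame(faithful)  # faithful is a built-in R dataset
write.df(df,
         path   = "/nfs/shared/output",
         source = "parquet",
         mode   = "overwrite")
```

With NFS-backed storage, the driver and every executor must see the same path, and the OS user running the executor JVMs (not just the submitting user) needs write access to it.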