I am launching SparkR with the following script:

./sparkR --driver-memory 12G
Then I try to load a local 3 GB CSV file with the following code:

a = read.transactions("/home/admin/datamining/data.csv", sep="\t", format="single", cols=c(1,2))

but I encounter this error:

could not allocate memory (2048 Mb) in C function 'R_AllocStringBuffer'

I have allocated 12 GB of driver memory, so I am not sure why it complains that it could not allocate 2 GB. Could someone help me? Thanks!