Hello list

Suppose I have a file whose size is 10GB. The total RAM of the cluster is 24GB across three nodes, so each node has only 8GB. If I load this file into Spark as an RDD via the sc.textFile interface, will this operation run into an "out of memory" issue?
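
For reference, here is roughly what I mean, as a minimal sketch in Scala (the app name and file path below are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholder setup for the scenario: 3 nodes, ~8GB RAM each.
    val conf = new SparkConf().setAppName("LoadLargeFile")
    val sc = new SparkContext(conf)

    // Load the ~10GB file as an RDD of lines. textFile is lazy, so
    // nothing is read yet; the file is split into partitions.
    val lines = sc.textFile("hdfs:///path/to/10gb-file.txt")

    // An action such as count() processes the partitions rather than
    // materializing the whole file at once.
    println(lines.count())

In particular, I am unsure whether an action like count() above, or a later cache() call, would try to hold more than the 8GB available on any single node.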

Thank you.
