Hi,
I'm trying to run the SimpleApp example (
http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala)
on a larger dataset.
The input file is about 1GB, but when I run the Spark program, it fails
with: java.lang.OutOfMemoryError: GC overhead limit exceeded. The full
error output:
If you run locally then Spark doesn't launch remote executors. However,
in this case you can set the memory with the --driver-memory flag to
spark-submit. Does that work?
- Patrick
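
[Editor's note: a minimal sketch of the suggested invocation. The jar path
and application class are hypothetical placeholders, not from the thread;
the real flag on spark-submit is --driver-memory.]

```shell
# Hypothetical example: adjust the jar path and class name to your build.
# In local mode the tasks run inside the driver JVM, so raising the
# driver heap with --driver-memory is what prevents the GC-overhead OOM.
spark-submit \
  --class SimpleApp \
  --master "local[4]" \
  --driver-memory 4g \
  target/scala-2.10/simple-project_2.10-1.0.jar
```

The same setting can also be placed in conf/spark-defaults.conf as
spark.driver.memory if you prefer not to pass it on every run.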
On Mon, Jun 9, 2014 at 3:24 PM, Henggang Cui cuihengg...@gmail.com wrote: