To: user@spark.apache.org
Subject: RE: com.esotericsoftware.kryo.KryoException: java.io.IOException:
File too large vs FileNotFoundException (Too many open files) on spark 1.2.1
Have you checked the ulimit for the user running Spark on your nodes?
Can you run ulimit -a as the user who is running Spark on the executor
node? Does the result make sense for the data you are trying to process?
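The check suggested above can be run like this; a minimal sketch, assuming a POSIX shell on the executor node (run it as the same account that launches the executors, since limits are per-user):

```shell
# Inspect all resource limits for the current user
ulimit -a

# "Too many open files" maps to the open-file limit (nofile):
ulimit -n

# Raise the soft limit for this shell session only; this works up to the
# hard limit. A permanent change needs /etc/security/limits.conf and a
# fresh login session.
ulimit -Sn "$(ulimit -Hn)"
ulimit -n
```

Note that the shell-level change does not persist and does not affect executors launched by a daemon (e.g. a standalone worker started at boot), which inherit limits from their parent process.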
Yong
From: szheng.c...@gmail.com
To: user@spark.apache.org
Subject:
Assuming you are on Linux, what is the nofile soft limit (the maximum
number of open file handles) set to in your /etc/security/limits.conf?
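For reference, a typical limits.conf entry raising the open-file limit looks like the fragment below; the user name "spark" and the value 65536 are assumptions to adjust for the account that actually runs the executors:

```
# /etc/security/limits.conf — raise the open-file limit for the spark user
# (hypothetical user name and value; takes effect on the next login session)
spark  soft  nofile  65536
spark  hard  nofile  65536
```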
On Fri, Mar 20, 2015 at 3:29 PM Shuai Zheng szheng.c...@gmail.com wrote:
Hi All,
I tried to run a simple sortBy on Spark 1.2.1, and it always gives me the
two errors below:
1,