Subject: RE: com.esotericsoftware.kryo.KryoException: java.io.IOException:
File too large vs FileNotFoundException (Too many open files) on spark 1.2.1
Did you check the ulimit for the user running Spark on your nodes?
Can you run ulimit -a as the user who is running Spark?
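The check suggested above can be run directly. A minimal sketch, assuming the workers run as a user named "yarn" (substitute whichever account actually launches your executors):

```shell
#!/bin/sh
# Show all resource limits for the current user; the "open files" line
# is the one that matters for "Too many open files".
ulimit -a

# Query the open-file limit directly; -n prints only the nofile value.
ulimit -n

# To inspect the limits of a different account (e.g. the Spark worker user),
# run the same command under that user. "yarn" is an assumption here:
# sudo -u yarn sh -c 'ulimit -n'
```

Note that limits are per-session, so the value seen in an interactive shell may differ from what the worker daemon actually inherited at startup.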
Date: Fri, 20 Mar 2015 15:28:26 -0400
Hi All,
I tried to run a simple sortBy on 1.2.1, and it always gives me the two
errors below:
1, 15/03/20 17:48:29 WARN TaskSetManager: Lost task 2.0 in stage 1.0 (TID
35, ip-10-169-217-47.ec2.internal): java.io.FileNotFoundException:
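A sort triggers a wide shuffle, and on Spark 1.2.1 the shuffle can open on the order of one file per map task per reduce partition, which quickly exhausts a low nofile limit. A rough way to watch this, sketched here against the current shell's own PID for runnability (for Spark you would substitute the executor's PID, found with jps or ps):

```shell
#!/bin/sh
# Count the file descriptors a process currently holds open, via /proc.
# $$ (this shell) is used only so the example runs standalone; replace it
# with the Spark executor's PID to watch the shuffle's file usage grow.
ls /proc/$$/fd | wc -l
```

If that count climbs toward the ulimit -n value during the shuffle, the FileNotFoundException (Too many open files) is the expected failure mode.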
Assuming you are on Linux, what is your /etc/security/limits.conf set for
nofile/soft (number of open file handles)?
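If the soft limit there turns out to be low, it can be raised in that file. An illustrative fragment, where the user name and the value 65536 are placeholders rather than recommendations from this thread:

```
# /etc/security/limits.conf -- raise the open-file limit for the Spark user.
# "sparkuser" and 65536 are example values; pick the account and size
# appropriate for your cluster.
sparkuser  soft  nofile  65536
sparkuser  hard  nofile  65536
```

The new limits only take effect for sessions started after the change (and only if pam_limits is applied to the login path the worker uses), so the Spark daemons must be restarted under a fresh login for it to matter.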
On Fri, Mar 20, 2015 at 3:29 PM Shuai Zheng szheng.c...@gmail.com wrote:
Hi All,
I tried to run a simple sortBy on 1.2.1, and it always gives me the two
errors below:
1,