Hi,
I am trying to get Spark SQL 1.1 to work as a replacement for part of our ETL
processes that are currently done with Hive 0.12.

A common problem I keep running into is the "Too many open files"
error; once it occurs, the query simply fails. I tried raising the file
descriptor limit by starting the shell with "ulimit -n 4096 & spark-shell",
but it still throws the same error.
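
For completeness, a minimal sketch of what I believe the launch should look
like (assuming bash; with a single "&" the ulimit runs as a separate
background job and would not apply to spark-shell, so "&&" may be needed to
keep both in the same shell):

    # raise the per-process file descriptor limit for this shell session,
    # then launch spark-shell in the same session so it inherits the limit
    ulimit -n 4096 && spark-shell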

Any solutions?

Many thanks.


Bill


