How do you determine the number of partitions? For example, I have 16
workers, and the number of cores and the worker memory set in spark-env.sh
are:

SPARK_WORKER_CORES=8
SPARK_WORKER_MEMORY=16g

The .csv data I have is about 500MB, but I am eventually going to use a file
that is about 15GB.
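
For concreteness, here is a sketch in Scala of what I mean; the path and
the two-partitions-per-core multiplier are placeholders, not something I
actually run:

    import org.apache.spark.{SparkConf, SparkContext}

    object PartitionSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("partition-sketch"))

        // 16 workers x 8 cores each = 128 cores in the cluster.
        // A common rule of thumb is 2-4 partitions per core.
        val numPartitions = 16 * 8 * 2

        // textFile accepts a minimum-partitions hint; the path is hypothetical.
        val lines = sc.textFile("hdfs:///data/input.csv", numPartitions)
        println("partitions: " + lines.partitions.length)

        sc.stop()
      }
    }

Is passing a hint like this the right way to control the partition count,
or should it be derived from the file size instead?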

Is the SPARK_WORKER_MEMORY variable in spark-env.sh different from the
spark.executor.memory setting you mentioned? If they are different, how do
I set spark.executor.memory?
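
From what I have read so far, SPARK_WORKER_MEMORY caps how much memory the
worker daemon can hand out on its machine, while spark.executor.memory is
a per-application request that has to fit under that cap. If that is
right, I would expect something like the following to work (the 8g value
is just an example):

    import org.apache.spark.SparkConf

    // Per-application executor memory; must stay within the
    // SPARK_WORKER_MEMORY limit configured in spark-env.sh.
    val conf = new SparkConf()
      .setAppName("memory-sketch")
      .set("spark.executor.memory", "8g")

or, equivalently, passing --executor-memory 8g to spark-submit. Is that
the intended way to set it?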


