Hi,

I'm building a pipeline using SystemML, but after a certain period of time the 
performance degrades.


1) We see this warning message: WARN TaskSetManager: Stage 25254 contains a task 
of very large size (3954 KB). The maximum recommended task size is 100 KB.


2) For Spark, we are using these settings:

    spark.executor.memory      2048g
    spark.driver.memory        2048g
    spark.driver.maxResultSize 2048

Is this good enough, or can we do something else to improve performance? We 
tried the Spark configuration suggested in the documentation, but it didn't 
help much.
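
For reference, here is how we pass them, sketched as a spark-submit invocation 
(the script name is a placeholder; note that spark.driver.maxResultSize normally 
takes a size with a unit suffix, e.g. 2048m, and that executor/driver memory 
larger than the machine's physical RAM cannot actually be granted):

    spark-submit \
      --conf spark.executor.memory=2048g \
      --conf spark.driver.memory=2048g \
      --conf spark.driver.maxResultSize=2048m \
      SystemML.jar -f our_script.dml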


3) We are running on a system with 244 GB of RAM, 32 cores, and 100 GB of disk 
space.


It would be great if anyone could guide me on how to improve the performance.


Thank you!

Arijit
